Across the world we are spending billions of dollars a year developing technology for self-driving or autonomous cars. The attractions of fully autonomous vehicles are clear. Most road accidents (perhaps as many as 90%) are due to human error, so fully or semi-autonomous vehicles could save lives, time and money. Interestingly, many of the technologies also offer advantages for wider industrial use.
Some projects have been running for over ten years and the technology developed has already appeared in newer cars. Many current production vehicles now offer driver-assistance modes that help with parking, lane keeping and emergency braking. It is probable that further developments will continue to filter down onto new vehicles, but with the driver remaining in control. We are some way from reliable fully autonomous cars, although Google expects to be in production within a couple of years.
SAE International (formerly the Society of Automotive Engineers) produces a visual chart for use with its J3016 “Levels of Driving Automation” standard, which defines six levels of driving automation. They range from Level 0 (no automation) to Level 5 (full automation). Currently, Tesla’s Autopilot offers Level 2 “hands off”. One or two makers are claiming Level 3 “eyes off”, but even these use cameras to ensure the driver is not asleep. Only Waymo (Google) is claiming Level 4, saying a driver is not necessary (although drivers are currently present for safety reasons).
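As a rough illustration, the six J3016 levels can be summarised in a simple lookup. The informal “hands off / eyes off / mind off” labels are common shorthand rather than part of the standard, and the code below is only a sketch of the classification, not any vendor’s software:

```python
from enum import IntEnum

class SAELevel(IntEnum):
    """SAE J3016 driving automation levels (0 to 5)."""
    NO_AUTOMATION = 0           # the driver does everything
    DRIVER_ASSISTANCE = 1       # e.g. adaptive cruise OR lane keeping, not both
    PARTIAL_AUTOMATION = 2      # "hands off": combined steering and speed, driver supervises
    CONDITIONAL_AUTOMATION = 3  # "eyes off": system drives, driver must take over on request
    HIGH_AUTOMATION = 4         # "mind off": no driver needed within a defined operating area
    FULL_AUTOMATION = 5         # drives anywhere, in all conditions

print(SAELevel(2).name)  # PARTIAL_AUTOMATION
```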
IoT Connected Autonomous Cars
Current autonomous vehicles use edge computing and a permanent connection to the Internet. They rely on information from IoT-based sensors, together with GPS mapping, traffic and weather data, to function. They also need 5G communications, a boot full of computers and a car covered in sensors. Moreover, they need high levels of artificial intelligence to handle unexpected or unplanned events.
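A minimal sketch of how those external feeds might be bundled together on the vehicle is shown below. The field names and the connectivity check are illustrative assumptions, not a real vehicle API; the point is that the cloud data helps, but the on-board (edge) systems must keep working if the link drops:

```python
from dataclasses import dataclass

@dataclass
class DriveContext:
    """Illustrative bundle of the external data an autonomous vehicle fuses.
    All field names are assumptions for this sketch, not a real interface."""
    gps_position: tuple        # (latitude, longitude) from GNSS
    map_tile_id: str           # HD-map tile currently loaded
    traffic_speed_kph: float   # live traffic feed for the current road segment
    weather: str               # e.g. "rain" or "fog" from a weather service
    connectivity: str          # "5G", "LTE" or "offline"

def cloud_data_available(ctx: DriveContext) -> bool:
    """Edge-computing principle: cloud feeds are an aid, not a dependency."""
    return ctx.connectivity in ("5G", "LTE")

ctx = DriveContext((51.5074, -0.1278), "tile_42", 32.0, "rain", "5G")
print(cloud_data_available(ctx))  # True now, but the car must still cope when this is False
```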
Developing the algorithms needed to make sense of the data is unquestionably the most complex part of the development. The software is where and how decisions are made, and where any errors occur. Layers of algorithms are combined to provide the AI needed to operate the vehicle safely from the data collected.
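As a rough sketch of that layering, autonomous-driving stacks are often described as a pipeline in which each stage narrows the data from the one before: perception, prediction, planning and control. The stage names and the toy logic below are a common way of describing such stacks, not a description of any particular vendor’s code:

```python
# Hedged sketch of a layered driving pipeline: each layer consumes the output
# of the previous one, turning raw sensor frames into a single control command.
def perception(raw_sensor_frames):
    """Turn camera/LIDAR/radar frames into a list of detected objects."""
    return [{"type": "pedestrian", "distance_m": 12.0, "bearing_deg": 5.0}]

def prediction(objects):
    """Estimate what each detected object will do over the next few seconds."""
    return [{**obj, "predicted_path": "crossing"} for obj in objects]

def planning(predicted_objects, route):
    """Choose a manoeuvre that keeps the vehicle on route and clear of objects."""
    if any(o["predicted_path"] == "crossing" and o["distance_m"] < 20
           for o in predicted_objects):
        return "brake"
    return "follow_route"

def control(manoeuvre):
    """Translate the chosen manoeuvre into actuator commands."""
    return {"brake": 0.6} if manoeuvre == "brake" else {"throttle": 0.2}

frames = ["camera_frame", "lidar_sweep"]  # placeholders for real sensor data
print(control(planning(prediction(perception(frames)), route="A40")))
```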
Two of the leading developers, Google and Tesla, have taken slightly different approaches to gathering data. Tesla collects data about the area around the vehicle using a high-tech camera system; its “Autopilot” software then analyses the image data. Google, on the other hand, uses LIDAR (Light Detection and Ranging) sensors. LIDAR delivers 360° self-contained 3D mapping, object detection and identification, and navigation regardless of the local conditions.
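To give a feel for what “360° 3D mapping” means in practice: a single LIDAR return is just a distance measured at a known horizontal and vertical beam angle, and converting it into a 3D point is simple trigonometry. Millions of such points per second build up the point cloud used for mapping and object detection. This is a sketch of the geometry only, not any vendor’s processing code:

```python
import math

def lidar_return_to_point(range_m: float, azimuth_deg: float, elevation_deg: float):
    """Convert one LIDAR return (range plus beam angles) into an (x, y, z)
    point in the sensor's own coordinate frame."""
    az = math.radians(azimuth_deg)
    el = math.radians(elevation_deg)
    x = range_m * math.cos(el) * math.cos(az)   # forward
    y = range_m * math.cos(el) * math.sin(az)   # left
    z = range_m * math.sin(el)                  # up
    return x, y, z

print(lidar_return_to_point(15.0, 30.0, -2.0))
```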
Most sensing systems combine multiple 360° image-sensing cameras, LIDARs, radar and ultrasonic modules placed around the vehicle to ensure multiple redundancy. The sensors themselves are undergoing significant and rapid development. Early LIDAR units resembled an inverted bucket on the car roof, but digitisation means current versions are merely teacup sized. Future developments will refine this and the other technologies still further.
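One simple way to picture the redundancy point is that an obstacle should normally be confirmed by more than one independent sensor type before the vehicle trusts the detection, so a single faulty or blinded sensor cannot dominate the decision. The sensor names and threshold below are illustrative assumptions, not a real voting scheme:

```python
def confirmed_detection(detections: dict, minimum_agreeing: int = 2) -> bool:
    """Require at least `minimum_agreeing` independent sensors to report the
    same obstacle before treating the detection as confirmed."""
    return sum(1 for seen in detections.values() if seen) >= minimum_agreeing

readings = {"camera": True, "lidar": True, "radar": False, "ultrasonic": False}
print(confirmed_detection(readings))  # True: two independent sensors agree
```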
Technology transfer
As the technologies prove themselves, digitisation and mass production will ensure prices fall to commodity levels. In factories, the hardware will be readily adopted for AGVs, safety zones and robots; moreover, the proven algorithms and AI developed to ensure reliability and safety will also transfer, as the sketch below illustrates.
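As one small illustration of that transfer, the same ranging data used for automotive emergency braking maps directly onto a safety zone around a factory AGV: slow down when something enters the warning zone, stop when it enters the protective zone. The zone sizes and speeds here are made up for the sketch and are not taken from any safety standard:

```python
def agv_speed_command(nearest_obstacle_m: float) -> float:
    """Reuse of an automotive-style ranging sensor as an AGV safety zone.
    Returns a commanded speed in metres per second (values are illustrative)."""
    PROTECTIVE_ZONE_M = 0.5   # stop immediately
    WARNING_ZONE_M = 2.0      # creep forward slowly
    if nearest_obstacle_m <= PROTECTIVE_ZONE_M:
        return 0.0
    if nearest_obstacle_m <= WARNING_ZONE_M:
        return 0.3
    return 1.0  # clear path, full speed

for distance in (3.0, 1.2, 0.3):
    print(distance, "m ->", agv_speed_command(distance), "m/s")
```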
Finally, explaining the complex technologies used in autonomous vehicles in detail is not the purpose of this article; there are many expert sources available on the Internet.