Every year, we’re told that this is the year we’ll get self-driving cars. And every year, come December, those autonomous vehicles fail to materialise anywhere beyond the research facility. But now, a report from California has outlined some of the things self-driving cars still need to learn before they can be let loose on the public.
In the Golden State, some of the companies pioneering self-driving cars have been testing their autonomous vehicles out on the streets. Firms like Cruise, Waymo and Apple have all sent fleets of autonomous vehicles onto the roads of California to test their mettle.
And now, the California Department of Motor Vehicles has published a report outlining every issue these self-driving cars faced in 2021.
The DMV has strict rules for anyone testing self-driving cars in the state. As such, every time a test vehicle is out on the road and a driver has to take over for any reason, the incident must be logged. At the end of the year, these incidents are all compiled in the Disengagement Report, which includes more than 2,500 incidents from the past 12 months.
The Disengagement Report shows that there are 25 companies licensed to test their autonomous cars on the streets of California. OEMs like Toyota, Mercedes and Nissan are on the list, alongside tech firms like Qualcomm and NVIDIA.
But no matter the company, every autonomous vehicle tester in California's report seems to be encountering similar issues, all falling under the three Ps: perception, prediction and planning.
Object perception is about what the software driving the autonomous car thinks is on the road ahead. The issues self-driving cars faced in this regard all involve a car mistaking an object for something else, like a red traffic light for a green one.
Everything from “small objects in the road” to “incorrectly perceived rain” led to unwanted braking. At other times, the cars were late to apply the brakes. In one test, a self-driving car was “late to perceive” an animal crossing the road and the test driver had to slam on the anchors.
Then there are the prediction issues, which are all about the way self-driving cars “guess” how the objects they observe will behave. Here, test drivers were forced to step in when the cars couldn’t correctly predict how pedestrians would behave, how other cars in traffic would act, or that a parked car wouldn’t move. In each instance, an incorrect prediction produced an “undesirable motion plan” and forced the test driver to take over.
Then there are the planning issues. Rather than predicting what objects will do, these concern how the car itself decides to respond to other road users, such as other cars, trucks, pedestrians crossing the road, or even cyclists.
So here, it’s all about how the car plans to react to vehicles changing lanes on a highway, trucks making wide turns, or pedestrians “making illegal crossings.”
Away from the three Ps, self-driving cars also had issues maintaining the correct speed on various roads. Test drivers reported taking the wheel when the self-driving car was following the speed limit but was judged to be driving “too slow or too fast given the traffic and road conditions.”
There’s also the whole “map discrepancy” issue, which seemingly only affects Apple-operated vehicles. I guess that’s just more Apple Maps woes, which is something we’ll all have to learn to live with.
Then, there are also a lot of general hardware issues.
Sometimes, drivers were forced to take the wheel when data recorders failed, when certain components went offline, or when a software glitch prompted the test driver to take over. Some companies also reported “precautionary” takeovers when approaching pedestrians, traffic signals or certain stopped vehicles. And finally, there are all the times test drivers were forced to take the wheel after encountering a “recklessly behaving road user.” Because, of course, you can program an autonomous car to follow the rules of the road, but you’ll sadly never get some people to do the same.