Why You Shouldn't Be Too Quick To Cheer Self-Driving Cars

One of the clear automotive technology trends at CES this year is cars that drive themselves. From Audi to Lexus to Ford, the world's largest car companies are beginning to follow Google's lead in an effort to produce cars smart enough to drive themselves.

The thought is that autonomous cars will reduce the number of traffic deaths — more than 100 people per day, currently — while simultaneously allowing car owners to do more productive things on their car trips, like work or read. All of this sounds magical, especially to a traffic-jammed Angeleno like myself, but let's get real: How soon do we actually think state and federal legislators are going to cotton to the idea of robot cars all over the roads?

To be sure, self-driving consumer cars will initially be prohibitively expensive for the vast majority of drivers (some guesses put the price above a Ferrari's), but if that price comes down, expect them to become as prevalent as hybrids are today. The one thing standing in the way of that prevalence, of course, could be legal hang-ups.

Optimists will tell you that robot cars have already breezed into street-legality in Nevada, Florida, and California. And that's true — but in each case they did so conditionally, and with a whole lot of uncertainty still lingering over their futures. For instance, in all three states, a self-driving car must at all times be operated by an in-car driver, but how vigilant that driver should be is still a grey area. Under Nevada's law, anyone operating a "driverless" car is, unlike regular drivers, allowed to text. They are not allowed to drink alcohol, however, meaning that Nevada thinks self-driving cars should allow people freedom to not pay attention — but only to a certain point.

Further complicating things is the question of who will be at fault when autonomous cars, imperfect as they inevitably will be, get into accidents. Say a self-driving car on its way to pick up its owner blows through a red light: who pays the ticket for that violation? The owner? The car manufacturer? The people who wrote the navigation software? The question only gets thornier as the potential harms grow. Who is liable if an autonomous car rear-ends a conventional car with a driver? Will the robot always be presumed at fault? Worse still, what if a malfunctioning robot car veers momentarily onto a footpath and kills a child? Who pays for that tragedy?

Unfortunately, despite the huge number of very serious questions about what sort of laws will eventually govern autonomous cars, some politicians refuse even to acknowledge that such issues exist. In October of last year, when California Governor Jerry Brown signed the bill paving the way for legal self-driving cars in California, a reporter asked who would be held responsible if a robot car did run a red light. The governor waved the question off as trivial. "I don't know — whoever owns the car, I would think. But we will work that out," he said. "That will be the easiest thing to work out."

I don't know what kind of government bureaucracy Governor Brown is used to, but from what I've seen of the American political system, very few policy questions are "easy" anymore, especially not ones relating to emerging technologies and public safety. To be sure, cars that can drive themselves will be amazing, and I look forward to one day programming a Prius to come pick me up from the airport or a bar at last call. But it's probably best to hold off on celebrating the coming fleets of robot cars until we have a serious conversation about the rules and regulations that will almost surely slow their public adoption, if not halt it entirely for years.


Comments

    Why is no one asking the question: if these robot cars are all networked and the Cylons hack the system, we could have killer cars all over our highways. What then? You think John Connor's gonna save you hurtling along at 200+ in your Lexus?

      I don't remember any characters named John Connor being in any of the Battlestar series. Could it be that you are confused?

    I don't think I'd ever be confident enough to let one of those self-parking cars you can get now do their thing. I'd be too paranoid that if it hit one of the other cars, I'd be liable.

      My understanding of the self-parking ones is that you still control the throttle, so it would be your fault if you hit something.

    Good, I'm glad they are ignoring these questions. Red tape. We need driverless cars NOW. No more drunk drivers, no more eyes off the road because your kid is screaming, etc. These "who will pay for it if it hits someone by some freak chance" questions are just rubbish. We need to look at this as a numbers thing rather than a human thing. There will be thousands of lives saved when things are going right, rather than the 1% of the time when a car accidentally goes through a red light and bumps someone's arm. This obsession with liability is what is going to stifle the advancement of this technology: the manic thinking about what if some freak event happens, while ignoring the thousands of lives that will be saved for certain by implementing driverless cars.

    In the future, they will look back on these days and be horrified: the idea that there are millions of cars on the roads, driven by emotional humans, who can just swerve at their leisure into oncoming traffic, without rails, without protective barriers, without computer assistance. The current state of traffic, where the onus is on individuals to be skilled enough to control cars without crashing into each other, is actually mind-boggling.

      And the bugs in the systems that drive robotic cars were programmed by humans. So we'd better get more robots to programme those robots. But then we will need robots to programme the robots that programme the robots. Just as long as there are no humans involved. They make mistakes.

      I imagine, since the car could be entirely controlled by a computer, it wouldn't be much of a stretch to have them all connected wirelessly via 3G or similar. That way they could be centrally controlled by the manufacturer, which could disable self-driving if any sort of freak, accident-causing fault existed.
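
      Purely as an illustration of that idea, here is a minimal sketch in Python of a car checking a manufacturer "kill switch" before engaging autonomy. Everything in it is hypothetical: the endpoint URL, the vehicle ID, and the response fields are made up, and a real system would presumably use an authenticated, signed channel rather than a plain web request.

      ```python
      import json
      import urllib.request

      # Hypothetical manufacturer endpoint and vehicle ID, for illustration only.
      RECALL_ENDPOINT = "https://example-carmaker.test/api/self-drive-status"
      VEHICLE_ID = "VIN-1234567890"

      def self_driving_allowed() -> bool:
          """Ask the manufacturer whether self-driving may be engaged.

          Fails safe: any network error or unexpected response means
          the car stays in manual mode.
          """
          try:
              with urllib.request.urlopen(
                  f"{RECALL_ENDPOINT}?vehicle={VEHICLE_ID}", timeout=2
              ) as resp:
                  status = json.load(resp)
              return status.get("self_drive_enabled") is True
          except Exception:
              # No answer from the manufacturer: assume a fault flag is set.
              return False

      if self_driving_allowed():
          print("Engaging self-driving mode.")
      else:
          print("Self-driving disabled or server unreachable; manual control only.")
      ```

      The one design choice worth noting: if the server cannot be reached, the sketch falls back to manual control rather than assuming autonomy is safe, which is the fail-safe behaviour the comment above implies.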


      "These "who will pay for it if it hits someone by some freak chance" are just rubbish"

      Yep, that's clearly something we should just sort out WHEN it happens! Then whoever has the most expensive lawyers can win the case and set the precedent, and we'll be stuck with whatever the result is, whether good or incredibly bad. Let's blindly hurl ourselves into the future and ignore the consequences! o.0

    I don't know if I'd ever get a driverless car. I very much appreciate being in full control of the car, but I'm not opposed to the idea. Bring on a driverless world; I could send packages to people without having to physically go there, haha.

    Liability should fall with the manufacturer, the same as with any product that causes harm due to a manufacturing defect. The only time it wouldn't be the manufacturer's responsibility is when the fault is due to the owner neglecting prescribed maintenance.

    I really don't see it taking off, however, as I can't see human-operated cars and robotic cars being able to accurately account for each other's behaviour on the road. I'd peg most accidents after their introduction as involving a human-driven car and a robotic car, rather than human on human or robot on robot.

    Brown is right, though; questions of legal liability in the event that things go wrong ARE the easiest questions. Our courts are already very good at handling this, and there is a large number of analogous cases to draw from.

    The question for government is not in assessing what happens when things go wrong, but in correctly determining how LIKELY they are to go wrong, and what society's tolerance for accidents in an enterprise like this is. Mathematically, it should obviously be the case that if robot cars are less accident-prone on average than human drivers, then they're acceptable. Unfortunately, humans are messy creatures, and I expect we'll have far less tolerance for error in machines than we do for human drivers.

    Talk about ignoring the elephant in the room: hacking! Driverless cars WILL be on the internet. If your computer is hacked and crashes, there's probably not going to be any loss of life. Compare that with the effect of a hacked Googlemobile.
    How could you ever be certain your car was fit to drive itself? Or not operating under the influence of al-Qaeda? Would it be a legal defence to plead that your car had been hacked if it ran down someone? Probably not.

    If driverless cars saved one of your family members from a drunk, speeding, fatigued, texting, phone-wielding driver, then it would all be worth it. It sounds like we want to put our faith in a Model T Ford and be content. I may not see it, but there will come a time when driverless cars are the norm and racetracks are the only place where you have a right to drive a car. Let's face it: we humans are pretty good, but when it comes to driving, there are too many distractions, temptations, deadlines and egos behind the wheel, all of which technology eliminates. I pray for the day when road fatalities are zero. Is that really achievable while humans are still driving?
