We Probably Should Be Worried About Driverless Cars

The advent of electric vehicles and increasingly effective battery storage heralds a new age of propulsion, but interestingly, that change seems to have coincided with another major alteration to how we move ourselves around: the potential removal of human control from the operation of the vehicle. This won’t have a big impact on emissions, but it will have big impacts on safety, perception of risk and culture.

I’ve previously explored the fairly secure prediction that, in the long-run, handing over driving control to computers will significantly improve safety. In that analysis, I found that humans are responsible for around 90% of all traffic incidents. The human brain is good at a lot of things, but sadly, combining the flaws of human perception with a metal box powered by explosions results in injury and death. A machine would probably serve us better, in this particular task.

There’s a possibility that seems a little under-discussed, though. Commuting will still entail spending millions of collective hours on the road, in a very large variety of situations. I suspect there’s a collection of real risks that will briefly emerge during the early years of this major technological shift, along with some predictably irritating political reactions.

When we made the process of navigation autonomous, initially through GPS units and now mostly through software on our smartphones, there were plenty of instances of cars plunging into rivers. This has been largely smoothed over, but it caused real, quantifiable harm to a small subset of drivers. In the early years of automation and the rapid spread of new technology, the sheer novelty of situations is almost impossible to pre-empt, and so I suspect there will be some harm incurred.

You can minimise these initial problems through an iterative and cautious approach. A fair few cars already feature autonomous extras, such as self-parking or lane-steering, like Tesla’s Autopilot. But even that feature can hit its limits early on - are we sure every manufacturer will know the limits of its own technology? Tesla seem quite responsible, but I wonder about new companies that might seek to cash in on a trend without an eye for risk management.


Recently, I was travelling through the Scottish Highlands on a day of fairly serious weather. Storm Frank caused some insane flooding, and we had to navigate busy roads that also happened to be coated with surprisingly deep water. We were fine, but being driven through those terrifying corners, I couldn’t help thinking about how a robot brain would see this situation, and how it might react. And a human being, sitting in a seat with no steering wheel or pedals, would be seriously terrified. We will almost certainly develop the computational skill to deal with these situations. But what happens during our trip up the learning curve? As an avowed, unashamedly-irrational early adopter, I can’t help but feel a twinge of hesitancy with this technology.

One of the major advantages of autonomous vehicles is the ability to network - install devices that enable telemetry and transmission, and you can create amazingly efficient intersections, and significantly decrease traffic congestion by enabling vehicles to drive very close to one another. Yet, I wonder about the privacy and security implications of this change. It’s not surprising that exciting new capabilities create new risks, but this particular change makes me consider that it might be worth mixing my excitement with caution. We’re climbing inside these things, and propelling our fleshy selves at very high speeds. The fundamental problem is that the organic matter we’re made of really does not like to go from a high speed to a low speed in a short period of time, and this problem still exists with driverless cars.
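The congestion claim is easy to sanity-check with back-of-envelope arithmetic: a lane's throughput is roughly speed divided by the space each car occupies, and most of that space is the time-gap the follower keeps. The sketch below uses illustrative headway figures of my own choosing (they're assumptions, not sourced measurements) to show why tighter, networked following distances pay off so dramatically.

```python
def lane_capacity(speed_kmh, headway_s, car_length_m=4.5):
    """Rough vehicles-per-hour a single lane can carry, assuming every
    car holds a fixed time headway behind the car in front.
    All figures here are illustrative assumptions, not measured data."""
    speed_ms = speed_kmh / 3.6
    gap_m = speed_ms * headway_s  # following distance at this speed
    return 3600 * speed_ms / (car_length_m + gap_m)

# At 100 km/h: a human-ish 1.5 s headway vs a hypothetical 0.3 s
# headway for closely networked vehicles.
human = lane_capacity(100, headway_s=1.5)
networked = lane_capacity(100, headway_s=0.3)
```

Under these assumed numbers, the networked lane carries a few times more vehicles than the human one - which is the intuition behind the "drive very close together" advantage, even before clever intersections enter the picture.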

Another potential problem is the tweaking of vehicle software by users. Perhaps there will be some regulation or agreement to ensure you can’t put your car in ‘hothead’ mode - but surely someone will crack it, and uncracked vehicles won’t have been programmed to deal with that scenario.

Some incident with a driverless car will garner intense media coverage, there will be a rapid and poorly-considered political reaction, and regulation will be passed that isn’t an effective method of reducing risk, but certainly creates the impression that it is. Useful legislative changes that are related to new technology and informed by a range of experts are almost always stuck in ‘development hell’ for years, and implemented long after technology has changed again. I don’t doubt this will very much be the case with driverless cars.

I’m quite confident that handing over the controls to a collection of computers will have a net positive impact on road safety. But I do worry about that brief period, where bugs that aren’t yet ironed out will have surprisingly serious consequences. I also worry that we won’t respond to these roadblocks with a good mix of reason and haste - these events will likely feed into already-established patterns around new technology and flawed risk perception. Our initial reactions will be too hasty, and our useful reactions will take too long.

I suspect driverless cars will be one of the biggest and most exciting technological changes in my lifetime, and it’s going to be almost impossible to quell the thrill of this shift. But the more I dwell on it, the more it seems logical to include a healthy dose of caution in there, too.


Comments

    "A machine would probably serve us better, in this particular task." You are assuming safety is the goal but surely the goal is to get where you are going safely. When it comes to getting where you are going, autonomous cars are going to be slow, much slower than driving yourself. It's something I am actually looking forward to - autonomous cars will be the most easily intimidated cars on the road, they'll baulk at every hint of danger, allowing me to plough on through the traffic with ease.

      Wow.. just.. wow

        Another fine post from S.O. Idiot, with many of his/her trademark sweeping, unsourced, declarations.

          What? Do you seriously think for a moment they will be able to pro-actively protect their place in a queue? Of course they won't. When another car veers towards them, they will cede their spot in a heartbeat. There is simply no way they will be programmed to drive aggressively in any way. They will only recognise obstacles/danger, not opportunity.

    Driverless cars are not as scary as driverless trucks. Automated, driverless multi-tonne vehicles carrying payloads across the country at over 110 km/h... it's a cruise control missile!!! What payloads - food, chemicals, hazardous materials, fuel... all travelling across the land with software built by an industry that lies about emissions, faulty ignitions, weak tyres, impact-exploding fuel tanks, shrapnel-grenade airbags and busted seatbelts. People will die.

      To the contrary, I think this will be one of the biggest benefits of autonomous vehicles.

      Without a human driver, fatigue would no longer be a concern to the freight companies. No more stories of drivers using drugs to stay awake. No more split-second decisions that go wrong and take out an entire family driving in the other direction. Not meaning to paint truck drivers as incapable - most of them are highly professional - but they (and I) cannot compete with a well-programmed computer in terms of speed, accuracy and resistance to fatigue.

      I would envisage that, at least at first, trucks could move autonomously between depots set up outside metropolitan areas. That way they can stick to highways/freeways, where the difficulty of navigation is lower (as there are a greater number of variables to consider in densely populated areas). Human drivers could take over the "last mile" between depots and the final delivery point until the software is up to scratch. Even if the trucks are limited to lower speeds for safety, they'll still be quicker than a human driver over long distances, as they won't need to pull over and sleep once they've hit their limit of hours on the road. Just pull in at a predetermined refuelling point, fill up, and carry on; rinse, repeat.

        I don't mind if this technology is used to create advanced co-pilot-style safety software... but DRIVERLESS!!! That's the scary thing: they are telling people that these vehicles don't need and shouldn't need drivers, and that's not okay. This technology needs supervision... hell, even a train, which is near automatic, has a person hanging onto a dead man's switch for safety.

        The only automatic navigating technology in the world at this moment that is NOT supervised... is military missile systems, and it's even scary to me that they call them "Fire and Forget" - you seriously shouldn't forget where a warhead is going... and neither should a car or truck.

        The number of accidents caused by people blindly following GPS is remarkable... low bridges, wrong way down one-way streets, wrong speeds, speeding in school zones - hell, people have driven into buildings because the little voice said turn left.

        People are going to get into these cars at 10pm at night in Sydney, set them to drive to Brisbane, go to sleep and wake up in another state... that's scary.

      Dirtyshadow says 'People will die' - yup. For a start, these driverless things *will* be programmed to kill people. As experts have already said - they have to be.

      In the event of a collision, the car will need to be designed to calculate who to kill. If a group of schoolkids leaps out unexpectedly in front of you, either through mischief or by accident, the car must have the capacity to minimise the deaths - and if you are the sole occupant and the options include going head-on into traffic and killing someone else as well as you, hitting a group of schoolkids, or driving straight into a concrete pylon, then... pylon it will be.

      Are people happy with this? I have a big heavy car, as I'd rather use another car as my crumple zone. Sorry, but my life is more important to me than it is to anyone else. To have an idiot kid amusing themselves go under the wheels would be tragic, but I'd rather it be them than me. I do not want an AI deciding I should die on a calculation.

    My concern is when an automated car makes the decision between squashing a rabbit running over the road or driving me into oncoming traffic, off the road into a tree, or catapulting me through the windscreen (or garrotting me with the seat belt) to avoid it.

    When your car has the ability to decide what is the safest solution to a problem, you may find it decides to kill you to save others. A noble gesture for sure, but not too comforting.

      Until the day comes when a self-driving car can reliably determine that something outside the car is a lot more valuable than you are (a very long way off) - it will prioritise saving you. You'll get the benefit of the doubt for many decades at least.

      And even then, ethicists usually agree that you shouldn't be sacrificed in such a situation without being given any choice in the matter.

        have a read of this:
        http://indaily.com.au/news/2015/11/09/designed-to-kill-the-uncomfortable-ethics-of-driverless-cars/

        “We want to encourage motorcyclists to wear motorcycle helmets, but if you’ve got the choice of crashing into (one of) two motorcyclists, it’s less likely to produce a fatality if you crash into the guy who’s wearing the motorcycle helmet,” said Sparrow.

        “So, if you’re a law-abiding cyclist, then you’ve just made yourself a target for the autonomous vehicles.”

          These are the opinions of an academic who's not even in the industry, let alone actually designing autonomous systems. The hypothetical situations he's talking about have very little to do with the decision-making processes of current self-driving vehicles.

        Correct me if I'm wrong, but my understanding is that the autonomous vehicle "sees" the rabbit running around well in advance and adjusts its speed to safely navigate past it. Even if it means slowing to 20 km/h or less. The cars are programmed to be super cautious, so they aren't going to barrel along at 100 km/h if there's something unpredictable in the vicinity.

          Maybe so, but they're certainly not going to kill their passenger doing it. No self-driving car would jam on the brakes like that if by doing so it would be hit by a car from behind. That's a lot more predictable than a small obstacle somewhere in front.

          That in itself makes me *not* want to buy one. While I agree that slowing down somewhat for children or large animals near the road isn't a bad idea, imagine cruising along the highway and being forced down to 20 km/h because there's a rabbit 200 metres ahead. Or how about a plastic bag blowing across the road? Will the computer rightly ignore it, or will it brake, thinking it's a hazard?

          I'd assume (hope) that there would be some logic built in that the car wouldn't actually slow down in situations like that but who knows? Like the author of the article suggests, there will definitely be teething problems for some time. Especially while there is mixed road use (driverless and regular vehicles).

            Of course. As with anything like this there will be 'teething problems', but whether people like it or not, driverless cars ARE coming. Car and tech companies around the world aren't investing billions of dollars for nothing. They are well aware of what is on the line, with critics constantly in their ears; it's in their best interest to get it right. Hence the millions of kilometres of testing and learning (and that's just Google!).

            I'm not suggesting that everyone will be on the streets in driverless cars in the next 5 years - heck, probably not even the next 15 years - but 20 years from now I'd be surprised if driverless cars are not outnumbering 'manually driven' cars. The Tesla cars are already on the roads in limited driverless capacity. The future is coming.

    I doubt that there will be "driverless" cars without steering wheels until well into the technology's development. Faulty GPS won't be too much of a problem, as the cars don't blindly follow directions but rely on onboard sensors to assess imminent risk and danger. In cases of abnormal driving conditions - e.g. floods, heavy pedestrian traffic on the road - I believe that driverless cars will, at least to begin with, revert to manual driving.

    The article is just a bit alarmist imo; the percentage of accidents involving driverless vehicles is going to be insignificant compared to human drivers, and most of the accidents driverless cars do get into will still be caused by humans.

      The author agrees with you, as he says a couple of times in the article. The only point was, until the bugs are smoothed out, we'll inevitably get some accidents that could have been avoided, so let's take our time and minimise that.

      Personally though, I tend to think that even if self-driving cars do get involved in avoidable accidents through faulty software, as long as their overall accident rate is significantly less than humans', then we should be trying to replace fallible drivers as fast as possible. Humans get into a lot of avoidable accidents.

    I think the most important aspect of the autonomous vehicle equation is going to be automatic testing and servicing. Judging by how some people look after their cars, poorly maintained autonomous cars would be disastrous!

    I think Autonomous Vehicles can only function in a place where all vehicles are autonomous. Where they all communicate position, speed, destination etc with each other. Otherwise there is an underlying uncertainty with too many unknown variables.

    "there will be a rapid and poorly-considered political reaction, and regulation will be passed that isn’t an effective method of reducing risk, but certainly creates the impression that it is. "

    So, business as usual from our political lords and masters, then. It's the 'We-must-be-seen-to-be-doing-something' syndrome that all government bodies seem to suffer from.

    The best solution for the rabbits is to have an autonomous turret hidden in the grille... but on Halloween, if you dress like a rabbit, too bad for you.
