A Google Self-Driving Car Got Into a Crash With a Bus (And That's OK)

Well, it finally happened. One of Google's autonomous vehicles might have caused a minor fender-bender. (Luckily, it appears nobody was hurt.) And guess what — it's probably going to happen again. And that's fine.

In a California DMV report discovered by Mark Harris, one of Google's autonomous Lexus cars was stopped at a Mountain View intersection on February 14 when it needed to manoeuvre around sandbags placed in the right-hand lane. The car assumed — as a human driver might have — that as it nudged back into traffic, a slowing city bus was allowing the car to merge. It wasn't, and the car struck the bus.

According to the DMV report, Google's car "sustained body damage to the left front fender, the left front wheel and one of its driver's-side sensors". No injuries were reported.

This is exactly the sort of situation Google is using to teach its cars to think more like humans. Google's explanation of the incident will appear in its monthly report, due out tomorrow, but here's what the company said about the crash in a copy given to The Verge this morning:

Our test driver, who had been watching the bus in the mirror, also expected the bus to slow or stop. And we can imagine the bus driver assumed we were going to stay put. Unfortunately, all these assumptions led us to the same spot in the lane at the same time. This type of misunderstanding happens between human drivers on the road every day.

This is a classic example of the negotiation that's a normal part of driving — we're all trying to predict each other's movements. In this case, we clearly bear some responsibility, because if our car hadn't moved there wouldn't have been a collision. That said, our test driver believed the bus was going to slow or stop to allow us to merge into the traffic, and that there would be sufficient space to do that.

This morning, at a transportation event sponsored by the Los Angeles Times, Google's self-driving project lead Chris Urmson spoke about safety as the primary motivator for the self-driving car project, with the goal of reducing the estimated 40,000 roadway deaths that occur on US streets every year, 94 per cent of which are caused by human error.

Urmson also fielded several questions from audience members this morning who were clearly worried about robot drivers swarming the streets (including one person who asked about the theoretical "how many kids would you kill" ethical dilemma known as the Trolley Problem). The chances of that particular situation arising are exceedingly small, said Urmson, which "reduces the chances of that ethical dilemma to almost nothing". There is, however, a hierarchy. "We try hardest to avoid unprotected road users," he said, like pedestrians and bicyclists, followed by moving objects on the road, followed by static objects.
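
To make that hierarchy concrete, here is a minimal sketch of how a planner might weight potential collisions by road-user class. To be clear, this is not Google's code; the class names, weights and function are illustrative assumptions:

    # Hypothetical sketch of the avoidance hierarchy Urmson describes.
    # Class names and weights are illustrative assumptions, not Google's code.
    AVOIDANCE_WEIGHT = {
        "pedestrian": 100.0,    # unprotected road users: avoided hardest
        "bicyclist": 100.0,
        "vehicle": 10.0,        # moving objects on the road
        "static_object": 1.0,   # static objects: lowest priority
    }

    def collision_cost(obstacle_class: str, collision_probability: float) -> float:
        """Score one obstacle's risk for a candidate trajectory; a planner
        would sum this over all obstacles and pick the lowest-cost path."""
        return AVOIDANCE_WEIGHT[obstacle_class] * collision_probability

    # Example: at equal collision probability, clipping a static object
    # (cost 0.05) is preferred over endangering a bicyclist (cost 5.0).
    print(collision_cost("bicyclist", 0.05))      # 5.0
    print(collision_cost("static_object", 0.05))  # 0.05

Under a scheme like this, a car would rather scrape a kerb or a sandbag than brake into the path of a cyclist, which matches the ordering Urmson describes.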

One might make the argument that the only way autonomous vehicles can truly be safe is if every single car (and city bus) on the road is self-driving. But Urmson said that even a single self-driving car is improving conditions for all. "Having one of them on the road makes that person safer and makes everyone around them safer."

[The Verge and @meharris]

Top image: a Google car on the streets. AP Photo/Tony Avelar


Comments

    Always expect the unexpected and you will always be prepared for it. How do you program that? The only way to know whether the bus was going to allow something to happen would be for an actual indication to be given, e.g. when merging in slow-moving traffic, someone might briefly flash their lights or whatnot (a rough sketch of the idea follows below).
    It'll have fun in Adelaide with people absolutely not allowing any merging, and being surprised when you do, even after signalling for a LONG time...
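
    Something like this might be a starting point (a made-up sketch; the thresholds and the flashed-lights detection are assumptions): treat only an explicit signal, or clear braking plus a real gap, as consent to merge.

        # Hypothetical merge-consent check. All names and thresholds here
        # are invented for illustration; merely "slowing" is not consent.
        from dataclasses import dataclass

        @dataclass
        class TrackedVehicle:
            speed_mps: float      # current speed in metres/second
            accel_mps2: float     # observed acceleration (negative = braking)
            gap_m: float          # gap it is leaving at the merge point
            flashed_lights: bool  # explicit courtesy signal, if detected

        def may_merge(other: TrackedVehicle,
                      required_gap_m: float = 8.0,
                      braking_threshold: float = -0.5) -> bool:
            """Merge only on a positive indication of yielding."""
            if other.flashed_lights:
                return True
            yielding = other.accel_mps2 <= braking_threshold
            return yielding and other.gap_m >= required_gap_m

        # The bus in the DMV report was slowing slightly but not leaving
        # a gap; under this rule the car would have stayed put.
        bus = TrackedVehicle(speed_mps=4.0, accel_mps2=-0.2,
                             gap_m=3.0, flashed_lights=False)
        print(may_merge(bus))  # False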

      I thought it was just us in WA having problems with merging. Worse is people treating merging lanes as overtaking lanes.

      During my 4 years living in Adelaide I was always amused by the race to stop someone merging.
      But it's not as frustrating as the Sydney trick of stopping in the merging lane to wait for a gap because the mean old cars are going too fast.

    It is fine, is it? How many lives is it 'acceptable' to lose to smart-car crashes in Australia? The road toll at the moment averages under 400 a year, but why don't you put your whole family together in one room and imagine they are the road toll... How many is acceptable now? ZERO!

    So maybe you shouldn't promote that it's OK, unless you have an "it's fine as long as it doesn't affect me personally" mentality.

    Also, what time is the DJI Phantom 4 announcement? I am very excited to see what's new.

      Solution: Ban everything in the world. EVERYTHING. If you can't do anything, no-one gets hurt.

      What happens if driverless cars bring that 400 toll down to 200, or 100, or 20? Where did you get the 400 figure, by the way? I'm seeing it's still over 1000 a year. Not a biggie, just curious.

      There are people who will expect perfection when it's simply not possible, so at what point is it acceptable? That's the key question the legislators need to decide on, and what they need to work with.

      In 1975, there were roughly 25 deaths per 100,000 population; today it's down to 5/100k. If this got it down to 1/100k, isn't that significantly better?

      Nobody was hurt and it seems the human involved was more at fault than anyone.

      The development of autonomous cars is designed to lower the road toll. Sure, at this point they're aiming (apparently successfully) at matching the safety level of a human driver; improving on that is the next step.

      As has been said elsewhere, it sounds like an aggressive asshole of a bus driver. If he were replaced with an autonomous vehicle that could communicate, then the crash likely wouldn't have occurred, unlike with two humans involved.

        In the US, it's over 32,000 deaths a year. If that gets down to 1000, we'd be saving 31,000 lives a year. However, the families of those 1000 would all be suing Google.

        "Nobody was hurt and it seems the human involved was more at fault than anyone. "
        And if they are programming the cars to behave more like humans, then there are going to be more of these crashes.

          Gee, way to take one bit really out of context.

          They're designed to think like a human in terms of basic problem solving. If you reach that baseline then you can further improve safety to get beyond the status quo.

          Honestly there's probably an accident like this happening right now and pretty much nobody gives a shit.

      How many is acceptable now?
      As a morality question, zero.
      As a realistic pursuit, while zero is great, it's not realistic. People die all the time, from a lot of various things. That's just the reality of life; you can't even guarantee 0 deaths from living in a bubble. We should all use due diligence, but at some point you're going to have to accept that death is a part of life and deal with it.
      How active are you about suicide prevention? It's far more preventable than car accidents, yet you are almost three times as likely to die from suicide as from a car accident.

        We know how to get the death toll down to zero: we put the speed limit down to 10kph. A head-on at 10kph is extremely unlikely to cause death or even injury. And we ban bicycles, as they do not have a good solid metal cage around them; after all, we humans are not like kangaroos, we're not built for colliding with things at any great speed.

        But there's the convenience issue. My guess about self-driving cars is that should the roads become populated with them, they'll all travel very slowly.

        I've also been thinking about the liability issues, and I can see how the companies could resolve this: you don't own a car, you just lease one. That takes care of who bent whose wing mirror. They may even go a share-car route: park your car and walk away, find another when you need one. Hope you didn't leave anything in it...

          Doesn't really make sense. If self-driving cars made up 100% of road-going traffic, I'd expect speeds to increase, not decrease, due to better safety and efficiency.

      Not a problem for you, because you're a big supporter of public transport, right? Due to the low fatality rate of public transport, your family will be much safer than in cars. So while the rest of us adopt smart cars and reduce the fatality rate of cars, you, my friend, will be statistically safer on a bus.

    Just another case of an asshole of a bus driver, being an inconsiderate asshole.

    Well, at least no one died in the accident. Next time all cars should be self-driving, and everyone should take public transport.

    In an ideal world the bus would've also been AI-controlled, on the same network as the car, allowing the two to communicate and avoid a crash altogether (something like the sketch below).
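
    A minimal sketch of that idea, assuming a hypothetical intent-broadcast protocol between vehicles (the message fields and names are invented for illustration):

        # Hypothetical vehicle-to-vehicle merge negotiation. The message
        # format is invented; no such shared network existed in this incident.
        import json

        def merge_request(vehicle_id: str, lane: int, eta_s: float) -> str:
            """Broadcast an intent to merge into `lane` in `eta_s` seconds."""
            return json.dumps({"type": "MERGE_REQUEST", "from": vehicle_id,
                               "lane": lane, "eta_s": eta_s})

        def respond(msg: str, own_id: str, will_yield: bool) -> str:
            """The receiving vehicle explicitly grants or denies the merge."""
            req = json.loads(msg)
            return json.dumps({"type": "MERGE_GRANT" if will_yield else "MERGE_DENY",
                               "from": own_id, "to": req["from"]})

        # The car would merge only on an explicit MERGE_GRANT; silence or a
        # MERGE_DENY means stay put, with no guessing about the bus's intent.
        req = merge_request("google_lexus_01", lane=1, eta_s=2.0)
        print(respond(req, "city_bus_22", will_yield=False))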

    This accident could've been avoided if the bus driver had been more careful. Sure, both vehicles were at fault, but it shows that autonomous vehicles are the future.

    I quite embrace the idea of autonomous vehicles; there are just way too many accidents and uneducated drivers on the roads these days. We need fewer idiots/uneducated drivers and more smart drivers on the road...

    And I can already see people disagreeing with my statement...


      Were they both at fault? It sounds like the Google vehicle was changing lanes and made an incorrect assumption as to whether the bus driver was letting them in or not. Was the bus obliged to give way? The article doesn't seem to suggest it was.

        When a vehicle is changing lanes you don't have a legal right to ram into them.

        I wish more drivers realised this. Similarly, speeding up to prevent people changing lanes is also illegal; if a car in front of you is indicating to change lanes, then you should (if it's safe) be allowing them to do so.

          No, but you change lanes when you have space to do so while factoring in what's going on around you. While the traffic wasn't going fast, the Google vehicle was going substantially slower than the traffic it pulled into. Both the report and the article indicate a judgement error, and the human witness (a Google employee) isn't overly convincing in supporting the belief that the bus driver was at fault. They "thought" the bus would slow or stop, but the report doesn't indicate they witnessed any braking to back that up.

          The report says the car hit the side of the bus, which makes it sound as though the bus was parallel with the car at the time of the incident, and not a case of the car merging into a space and then being T-boned or rear-ended by the bus.

          None of that's to say the bus shouldn't have stopped. I'm just not convinced they are at fault.


            It sounded like the right lane was closed due to some roadworks or other obstruction, which implies a basic merge; generally, merging requires traffic to slow down slightly.

            We're both making assumptions, but honestly it feels like the Google car is getting a lot of blame that's unwarranted. At worst it's probably both their faults, but I lean towards a rude, impatient bus driver before I'll think the Google car (with a much better safety record) is fully at fault.

              Honestly, while I blame the Google car, their track record has been pretty amazing, so I'm not all that fussed. A minor ding here or there, while unwanted, is a good learning experience and will go a long way towards improving the technology.

              What would be interesting is to know how the same scenario would have played out if the bus were replaced with a second Google vehicle. Would the autonomous vehicle have let the other one in, or assumed ownership of the lane as the bus driver did? I assume they aren't programming them to be overly aggressive and it would let a car in, but it would be interesting to know for sure.

