How Do You Police Cars That Drive Themselves?

Google is already testing its autonomous cars on the roads of California, and plenty of other manufacturers are starting to muscle in on the act too. But when they hit the roads, how do you go about policing a city full of self-driven cars?

The New York Times reports that that very question is causing lawyers and government officials to break out in a cold sweat. It's not just working out how the police could pull such a car over, either. How well would those cars interact with normal ones? And how the hell do you insure a car that drives itself in case it crashes?

Add to that the fact that human road users tend to, uh, bend the rules a little now and then, and the problems become more evident. On that topic, Sven A. Beiker, executive director of the Center for Automotive Research at Stanford University, told the New York Times:

"Everybody might be bending the rules a little bit. This is what the researchers are telling me — because the car is so polite it might be sitting at a four-way intersection forever, because no one else is coming to a stop."

Sadly, there isn't an obvious answer to any of the questions. The simple truth is that the federal government doesn't — hell, nobody does — have enough information to know exactly what to do.

In truth, we won't see our roads full of autonomous vehicles until the artificial intelligence involved reaches a point where people are truly convinced that things are safe. That could be anything from five to 20 years away. Fortunately, that gives the powers that be enough time to try and figure out some of the finer points, like insurance and liability, properly.

In the meantime, we can probably expect cars to become gradually more autonomous, all the while still having a proper driver who can at least take over if things go wrong. Brad Templeton, a software designer and a consultant for the Google project, suggested as much to the New York Times. "It won't truly be an autonomous vehicle," he said, "until you instruct it to drive to work and it heads to the beach instead." [New York Times]

Image: Jurvetson


Comments

    a true autonomous car wouldn't need pulling over...

    I don't know... "How Do You Police Cars That Drive Themselves?"

    I ask myself this question every morning.

      "How do you headlines that write themselves?"

      WTF Seriously?

      Paul, is that you? Are you working for Kotaku as an editor now?

      At first I thought that the headline was just very poor English, but after reading it a couple of times I think it makes sense. If you take "police" to mean "monitor" and not a car with lights and sirens, then the sentence becomes:

      "How do you monitor cars that drive themselves".

        The headline makes perfect sense if you have even a basic grasp of the English language.

          Agreed. I don't know how anyone with a decent grasp of the English language could have trouble reading it.

          Exactly. Like most headlines, it's a play on words: "police" means "monitor" while also implying that it's a law enforcement article ("who gets charged if they break the law?"). Clever. To those that are confused, "u can go back 2 ur txting and fb".

    I'm intrigued about what the point would be of the police pulling an automated vehicle over. What are they going to do, spank the computer for being naughty?

    There are serious questions about liability, though. If an automated vehicle causes a fatality, then who is at fault? The owner, Google, the government that registered the vehicle? And how much liability can they be expected to take on? I doubt a manslaughter charge would be prosecutable under these circumstances.

      Well, who has to accept blame if an aircraft crashes due to a control software fault, as opposed to pilot error? Aircraft are mostly robotic these days.

        It's my understanding that the airline assumes responsibility if it's a maintenance issue; if it's a design flaw, then it would have to be the manufacturer.
        In the case of a personal car, I guess that would translate to the owner for a poorly maintained vehicle, or again, the manufacturer if it's a design flaw.
        Now that I think about it, these automated cars would have to be fitted with "black box" data recorders so that if accidents do occur, investigators can figure out what went wrong.

      There are plenty of reasons for Police to pull a vehicle over, and the fact that the car is automatically driven doesn't remove the bulk of them. Even limiting the discussion to the car itself (and not the potential for non-traffic related criminality of the driver - e.g. pulling someone over who has outstanding warrants for other offences), the car itself could be unroadworthy for instance.

    It's the same question as medical bots capable of making a diagnosis.
    If they get something wrong, is it the fault of the owner? The doctor standing over it? The programmer?

    I think in the case of cars or medibots you'd have either a driver or supervising doctor in most cases, but if we started having automatic trucks with no driver it'd have to be the company owning the truck, who (assuming the truck was in good condition), should be able to then make the manufacturer liable.

    Even if you're not controlling the vehicle directly, as the "driver", you're still ultimately responsible.

    An autonomous vehicle is unlikely to be trying to break any road rules.
    Only humans seem to feel the need to break rules for their own benefit.
    The whole point of automation is to remove the human factor.
    Autonomous vehicles are far more likely to be programmed to obey the rules, not let themselves get sucked into a race between the lights.
    Or so you would hope.

    Today: if you're driving up to a set of traffic lights and the brakes fail, causing you to plough through and hit someone who is crossing, whose fault is that? Will the driver walk free, or will they be charged?
    Future: I believe that although the cars will drive themselves, the driver will still have ultimate control over what the car does or intends to do. If the car is going a little too fast coming up to a set of traffic lights, a quick press of the brake will slow it down. If you're texting while your car is driving around looking for a parking spot, you're responsible. If you hit someone, although the car might not have picked it up, you still could have.

    As for breaking the law: why would an automated car speed, unless it was hacked or told to by the driver? Driver's fault.

