U.S. Tesla Driver on Trial for Autopilot Crash That Killed Two

A judge in California has ruled that the driver of a Tesla Model S that was on Autopilot when it was involved in the deaths of two motorists must stand trial for manslaughter. This is the first felony prosecution in the U.S. against a driver using a partially automated driving system, according to the Associated Press.

It’s also the first such case involving Autopilot, one of the most widely available semi-autonomous driving systems on the market.

State prosecutors believe there’s enough evidence to try Kevin George Aziz Riad, 27, on two felony counts of vehicular manslaughter, each carrying a prison sentence of up to six years in California. Prosecutors initially charged Riad in January 2022, but it was unclear at the time whether the case would proceed to trial.

The case dates back to December 29, 2019, when Riad’s Tesla Model S exited a freeway in Gardena and ran a red light. The Tesla was travelling at 119 km/h through the intersection when it crashed into a Honda Civic.

The Civic’s occupants, Gilberto Alcazar Lopez, 40, and Maria Guadalupe Nieves-Lopez, 39, died at the scene of the crash. Their relatives later told reporters the two were on their first date that night. Riad and an unidentified woman in the Tesla were hospitalized with non-life-threatening injuries.

State prosecutors said that two of Tesla’s automated systems were active at the time of the collision: Autosteer and Traffic Aware Cruise Control. A Tesla engineer testified that sensors in the Model S showed Riad had a hand on the steering wheel for several minutes leading up to the moment of impact, a safety measure Tesla requires to ensure the driver stays engaged while Autopilot is active.

Crash data showed that, while Riad had a hand on the wheel, he had not applied the brakes in the six minutes before the crash. A police officer testified that Riad drove past several traffic signs warning drivers to slow down as they exited the freeway.

Tesla’s official position in such cases is that Autopilot and its “Full Self-Driving” systems can’t drive the car themselves, so drivers must be prepared to retake control at any time. But that warning doesn’t always come across in the way Tesla’s CEO, Elon Musk, talks about so-called FSD.

The debate over fault and liability when partially automated cars injure or kill people is still ongoing. It has prompted Congress and the National Highway Traffic Safety Administration (NHTSA) to scrutinise automated driving systems in the U.S., while lawmakers abroad try to agree on regulations of their own.

Photo: David Zalubowski, AP
