There Is Absolutely No Reason To Trust The Safety Record Of Tesla’s Autopilot System

Tesla has long lurked in a category of its own in the self-driving car race; where Uber and Google’s Waymo are building fully autonomous vehicles essentially from the ground up, Elon Musk’s electric car company is slouching towards autonomy through a series of increasingly sophisticated updates to its semi-autonomous Autopilot system.

Because Teslas are not totally self-driving, and because they are already on the roads, the company occupies a sort of grey area — even greyer than the already grey area where standard autonomous vehicles dwell — when it comes to regulation and oversight.

This is a problem. Musk is a nonstop booster and font of optimism for Autopilot’s self-driving capabilities, he has millions of diehard devotees and customers, and Autopilot has so far been enabled during at least four fatal Tesla crashes.

It’s a volatile and increasingly dangerous situation, especially as Musk continues to vouch for the system’s safety and make claims like his prediction that there will be a million autonomous Teslas on the road next year.

Meanwhile, no one really has any good data about how and in what circumstances crashes that involve Autopilot happen. US states don’t require that kind of data be collected because Teslas aren’t technically autonomous cars. So investigators have access to only small slivers of said data, and Tesla refuses to share any of its own trove.

All of which is why Matt Drange’s epic investigation into Autopilot’s safety record for The Information should absolutely be making a bigger splash than, at least from what I’ve personally seen, it has so far.

Perhaps it’s because it’s hard to get anything to stand out these days. Perhaps it’s because it literally costs hundreds of dollars to subscribe to The Information and read public-interest stories like this one. Who knows.

But while the entire piece is full of good reporting and interesting insights about the many, many challenges regulators and safety officials face in coming to grips with the Autopilot situation, one thing stuck out: Tesla’s refusal to comment on the record, in any official capacity, about Autopilot’s safety.

Not only does Tesla apparently refuse to make the Autopilot data public or share it with regulators; it wouldn’t even discuss the numbers with Drange.

Musk has in the past been lauded for his transparency — see: his detailed, elaborate plans for bringing Tesla to the mainstream, or his open-sourcing of tossed-off Hyperloop specs — and has also been chastised for being too transparent. See: his Twitter feed, which is perpetually on the brink of a very public and very expensive train wreck.

So the fact that he will not cough up any of the data about Autopilot feels pretty telling. If it put Tesla in a positive light, there seems to be little question Musk would loose it upon the world. That’s what he does.

Instead, Tesla publishes its own quarterly vehicle safety reports that purport to demonstrate how driving with Autopilot is much safer than driving without it.

As Drange notes, “the reports only show a rate of collisions on a per-miles-driven basis and don’t disclose what caused the crash and whether the Tesla driver was at fault.”

The experts he cites aren’t buying it either. “It’s obviously a misrepresentation,” Hemant Bhargava, a UC Davis professor of technology management, told Drange. “You’re only in autonomous mode in the best scenarios, so the number of crashes will be lower.”

(Tesla wouldn’t share any data with me, either — a spokesperson referred me to the same Vehicle Safety Report.)

As such, there is absolutely no reason to trust the safety record of Tesla’s Autopilot system — and the stakes are only getting higher. In his public appearances, Musk continues to all but encourage users to switch on Autopilot and let the software take over.

He remains so full-bore bullish on Autopilot, so deeply convinced of its safety, that it can seem at times as though his own staff pulled a White House-staff-in-Japan and somehow hid from his feeds the news that four people have died while driving with the software enabled. Because at this point, it’s approaching outright delusion.

Until recently, Tesla even sold its cars with the promise that they all came equipped with “full self-driving hardware” — a phrase that was plastered on its website when I wrote about the nascent industry’s recklessness a few months ago.

After I argued that promoting the feature might be creating a culture of belief in the system, Tesla’s PR team angrily contested the accusation at length; it was probably one of the most contentious conversations I’ve fielded in my entire career. (Now, it looks as though they’ve at least adjusted the language on the website.)

But drivers have already absorbed Tesla’s and Musk’s techno-optimism about these systems, which, again, we have little reason to trust.

Before we can, Tesla has to get real about Autopilot — share its collision data with safety investigators, or better yet make it public; tone down the aggressive and unrealistic autonomy rhetoric; be honest about the state of its self-driving ambitions — or a fifth Autopilot fatality is all but inevitable.

