Report: Most Model 3 Owners Feel Autopilot Makes Them Safer

We here at Gizmodo and our sister sites have done our fair share of Autopilot-bashing. Or, more specifically, we’ve bashed the way Tesla promotes Autopilot as “Full Self-Driving” when it is most definitely not, to say nothing of the people who blatantly misuse the technology by falling asleep at the wheel and committing other nonsense.

But a new survey by Bloomberg sheds some light on how (presumably) normal Tesla owners feel about Autopilot. And the overwhelming consensus is they believe Autopilot makes them safer.

To find this out, Bloomberg polled 5,000 Model 3 owners about various aspects of the car, including quality, service, and charging. When asked about Autopilot specifically, Bloomberg reports that “more than 90% of owners said driving with Autopilot makes them safer.”

Of course, technology isn’t perfect, and the survey reflected that. Thirteen per cent of owners told Bloomberg that Autopilot “has put them in a dangerous situation,” but more than double that number, 28 per cent, said Autopilot saved them from a dangerous situation.

Paradoxically, the survey respondents who told Bloomberg that Autopilot makes them feel safer included, in Bloomberg’s words, “the respondents who simultaneously faulted the software for creating dangerous situations.” So, Autopilot makes them feel safer, but also creates dangerous situations. Go figure!

Just spitballing here, but I suspect the reason for this contradiction is that Autopilot’s implementation in the real world is too complex to summarise with a single review, star-rating, or takeaway. Sometimes it works really well, and sometimes it really doesn’t. At the bottom of Bloomberg’s post is a really neat graphic with thousands of comments from Autopilot users that illustrates this well. I highly recommend checking it out.

The upshot is, if drivers use Autopilot a lot, they will have a range of experiences that makes it difficult to distill the feature into a single rating. Some people will rate it highly if it prevents more crashes than it causes; others might feel one dangerous situation instigated by Autopilot is too many. And sometimes Autopilot will screw up without causing any crash at all, because the human driver is actually paying attention to the road as he or she should; some humans might shrug this off as an acceptable risk of using Autopilot, while others might deem it troublesome. In any event, different people will weigh these pros and cons in various ways against all the hours in which Autopilot works exactly as intended, and come up with one rating.

It’s worth remembering, though, that Tesla owners are, by and large, early adopters and fans of technology in general. There’s nothing wrong with that, but it does suggest they’ll be more forgiving than others might be of technological errors, viewing them as small missteps on the long road to progress.

And it also suggests they’re likely more forgiving of Autopilot’s mistakes than they would be if a human made the same ones. Here’s something my colleague Jason Torchinsky brought up when we were discussing the survey results: Consider, for instance, if someone’s chauffeur made these exact same driving mistakes. Imagine your chauffeur (yes, you have one, you rich, pretty person) didn’t slow down enough for a tight bend in the road, drifted into the oncoming lane, and hit a truck. That is what one person’s Model 3 did, according to a comment Bloomberg collected.

That Tesla owner still gave Autopilot four out of five stars for safety, as if they’d simply deducted a star for the near-fatal crash. Do you suppose that person would have given a chauffeur four out of five stars for safety if the chauffeur had done exactly the same thing? Or would the chauffeur have been fired on the spot?

Bloomberg’s takeaway from all this will be familiar to anyone who has read my or my colleague Jason Torchinsky’s work on Level 2 automation:

These Autopilot stories illustrate the messy middle ground in which the automotive world now finds itself. Ever-vigilant vehicles running automated-driving technology can perform superhuman manoeuvres to keep drivers safe—and can also fail in decidedly sub-human ways. Close supervision is needed at all times, which is easy to forget when Autopilot is able to drive for long stretches without intervention.

Hell, Jason wrote a whole chapter in his book about this problem, a book that is good and that you should buy (I also wrote an article about it, but Jason’s book is better).

Humans are, in general, very bad at providing “close supervision… at all times,” as Bloomberg says Autopilot requires, especially on long, uneventful highway stretches. Which is precisely why Jason, yours truly, and a little company called Waymo believe in skipping Level 2 automation altogether, viewing it as not only unnecessary but dangerous. Nevertheless, it doesn’t look like that’s going to happen. Autopilot is here to stay, for better or for worse.

