Tesla Begins Deploying Full Self-Driving Beta To Select Customers But It Is In No Way ‘Full Self-Driving’

This week, hardcore Tesla fans got complicated feelings in their bathing suit areas as Tesla announced the limited deployment of their new Full Self-Driving (Beta) software, which, based on the name and the hype, sure sounds like it’s a fully autonomous driving system for Teslas.

Except it’s not. At all.

Yes, it does plenty of things and is extremely technically impressive, but it’s still a Level 2 system, a very advanced driver-assist system, not by any means fully autonomous. There’s a lot of confusion about this, confusion that Tesla themselves are causing, and it could be dangerous.

Elon Musk, Tesla’s CEO and man who keeps his non-running project car in heliocentric orbit, announced the release with a note of caution in a tweet:

This tweet is a nice encapsulation of the confusing way Tesla is approaching this, where caution is mentioned, but at the same time the extremely deceptive and confusing term “FSD” (for Full Self-Driving) is used, something that can only suggest a fully autonomous car, one capable of, you know, driving itself.

But it’s not capable of driving itself, not really or safely. This is referenced in the description of the software on a dialog box on the car’s display screen:

The text reads:

Full Self-Driving is in early limited access Beta and must be used with additional caution. It may do the wrong thing at the worst time, so you must always keep your hands on the wheel and pay extra attention to the road. Do not become complacent.

When Full Self-Driving is enabled your vehicle will make lane changes off highway, select forks to follow your navigation route, navigate around other vehicles and objects, and make left and right turns. Use Full Self-Driving in limited Beta only if you will pay constant attention to the road, and be prepared to act immediately, especially around blind corners, crossings, intersections, and in narrow driving situations.

What this is saying is that the system, for all it can do, is still a Level 2 driver-assist system, just like every other semi-autonomous system on the market.

It doesn’t fundamentally matter if it can change lanes, go around objects, follow forks in the road, or whatever. Those are all extremely impressive technical achievements, but the system requires the driver to remain vigilant and ready to take over at any second should the system become confused or fail.
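For context, the SAE levels everyone throws around really boil down to two questions: who watches the road, and who catches the car when it fails. Here’s my own rough paraphrase of that taxonomy in code form (the wording is mine, not SAE J3016’s official language):

```python
# A rough paraphrase of the SAE J3016 driving-automation levels,
# reduced to the two questions that matter here. Wording is mine.
SAE_LEVELS = {
    0: {"monitors_road": "human",  "fallback": "human",  "note": "no automation"},
    1: {"monitors_road": "human",  "fallback": "human",  "note": "steering OR speed assist"},
    2: {"monitors_road": "human",  "fallback": "human",  "note": "steering AND speed assist"},
    3: {"monitors_road": "system", "fallback": "human, on request", "note": "in a defined domain"},
    4: {"monitors_road": "system", "fallback": "system", "note": "in a defined domain"},
    5: {"monitors_road": "system", "fallback": "system", "note": "everywhere"},
}

# However many turns and lane changes it can make, 'FSD Beta' keeps
# the human as both the monitor and the fallback. That's Level 2.
print(SAE_LEVELS[2])
```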

That’s by definition not “full self-driving,” and Tesla’s continued use of that descriptor is dangerous.

It’s dangerous because all Level 2 systems are inherently problematic, not because of the silicon computers but because of the wet biological computers held in the bone cases we like to grow hair on and put hats on. Level 2 systems make demands of people that humans simply aren’t good at, and this is a problem.

So many wrecks with Teslas on Autopilot happen because the deceptive and confusing way the system is marketed, named, and discussed makes people think they don’t really need to pay attention. And so they don’t.

It’s not just me saying this; autonomy and automation researchers have understood this basic concept for decades: when you switch humans from an active task, like driving, to a “vigilance task” like monitoring a system that’s doing about 80 to 90 per cent of the driving tasks, they do not perform well at all.

Here’s what Dr. Michael Nees of Lafayette College, an expert on human interaction with automated systems, has to say about this kind of thing:

“We think of automation as a machine doing a task that a human used to do… you might think that means a human does nothing. But in fact there’s abundant literature that shows the human is not incurring no workload, the human is now doing a different task and that task tends to be monitoring, a vigilance task, looking for rare events…that is a task that humans are not well-equipped to do.”

When that automated system is a car driving down public roads, surrounded by other cars at normal speeds, the reaction time needed to take back control is measured in whole seconds if you’re lucky, which means that unless the person behind the wheel is really staying alert and ready, there could be trouble.
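To make that concrete, here’s some back-of-the-envelope arithmetic. The takeover times below are my own illustrative assumptions (human-factors studies report anything from a couple of seconds to well over ten), but the maths is just speed times time:

```python
# Back-of-the-envelope: how far a car travels while a checked-out
# driver notices a problem and actually retakes control.
# The takeover times are illustrative assumptions, not figures
# from Tesla or from any specific study.

def takeover_distance(speed_kmh: float, takeover_s: float) -> float:
    """Metres travelled before the human is back in control."""
    return (speed_kmh / 3.6) * takeover_s  # km/h -> m/s, then d = v * t

for takeover_s in (2.0, 5.0, 10.0):
    metres = takeover_distance(100.0, takeover_s)
    print(f"At 100 km/h, a {takeover_s:.0f}-second takeover covers {metres:.0f} m")
```

Even in the optimistic two-second case, that’s more than 50 metres of road covered before anyone is actually driving.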

FSD Autopilot is bad because it promises something it can’t deliver: if you actually use it the way customers want to use it and the way it’s marketed, you’ll always be using it wrong, and wrong in a way that could put the Tesla’s owner and others in real danger.

The inherent problems are clear whenever an accurate description of the system is given, like in this Electrek article (emphasis mine):

“As expected, the new update enables turns on city streets and links to Tesla’s Navigate on Autopilot features, delivering a virtually feature-complete self-driving system that drivers need to constantly monitor.”

Look, if you need to “constantly monitor” it, you really can’t call it a “feature-complete self-driving system.” Because it’s not really self-driving. It’s doing most of what it needs to be self-driving, until it can’t, and then, without any warning, the driver has to jump in.

I’m not saying the system isn’t technically impressive — it absolutely is! Videos we’ve seen from users are impressive:

https://twitter.com/a/status/1319149428091932674

But, since the system has no failover, that just makes it worse, because the more it does, the more the driver feels like the car is actually driving itself, the more trust gets put into the system, the more complacent the driver becomes, until they’re no longer really paying attention and something goes wrong.

And, remember, these Teslas are using the existing hardware setup of cameras and radar emitters on the car, with no lidar or any mechanisms to keep the cameras clean. Driving through a puddle or a swarm of bugs or getting splashed by a truck or heavy rain or some snow or ice or any number of things that happen all the time when driving could compromise the system enough to require immediate driver input.

Or, the system could get confused by a fucking shadow, which happens all the time. I’m sure improvements have been made, but, fundamentally, it doesn’t matter, because this is still a Level 2 system, despite all the hype.

The fact that Tesla is deploying this as a “Full Self-Driving” Beta on public roads feels questionable at best. It seems strange that these systems get a free pass to go out and have a go when other companies testing autonomous driving systems have to get permits in most states, including Tesla’s home state of California. If this is really supposed to be “full self-driving,” why wouldn’t these cars need to be permitted?

It’s possible there’s a blanket permit for them, which still seems pretty lax, considering the owners/drivers aren’t individually vetted or permitted, the way every other kind of driving permit works.

The NHTSA is at least aware of the situation, and released this statement:

Attributable to NHTSA: NHTSA has been briefed on Tesla’s new feature, which represents an expansion of its existing driver assistance system. The agency will monitor the new technology closely and will not hesitate to take action to protect the public against unreasonable risks to safety. As we have stated consistently, no vehicle available for purchase today is capable of driving itself. The most advanced vehicle technologies available for purchase today provide driver assistance and require a fully attentive human driver at all times performing the driving task and monitoring the surrounding environment. Abusing these technologies is, at a minimum, distracted driving. Every State in the Nation holds the driver responsible for the safe operation of the vehicle.

The expanded feature set of this Beta FSD Autopilot will likely just mean that now we’ll get to see videos of Tesla owners sleeping while their cars drive on city streets instead of just highways. I’m not sure that’s progress.

What would have been progress is spending more development effort on making Autopilot able to fail over in a safe and controlled way, without requiring immediate input from the person behind the wheel.

The problem is that doing that is a lot less fun and sexy than making the car able to mostly navigate more complex surroundings, and I totally get that. None of this is easy, but building more complex behaviours at least gives you something you can show off and have fun with; safe failover is just a boring thing that only matters when everything goes wrong.

Plus, in many ways, getting to Level 3 or 4, where immediate driver takeover is not required, may be a harder problem to solve than making the car better at driving. For a safe, controlled failover, the car has to be able to deal with whatever caused the confusion or issue (bad sensor input, a baffling situation, etc.) and still find a way to get itself out of harm’s way and stop.
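Just to sketch the shape of that problem, here’s a toy structure I’ve invented for illustration; it’s not Tesla’s code or any real AV stack’s logic:

```python
from enum import Enum, auto

class FallbackState(Enum):
    NOMINAL = auto()
    PULL_OVER = auto()     # best case: reach the shoulder and stop
    STOP_IN_LANE = auto()  # worse: controlled stop in place, hazards on
    NEED_HELP = auto()     # worst: no safe option the car can execute alone

def plan_fallback(sensors_trustworthy: bool,
                  shoulder_reachable: bool,
                  lane_clear_behind: bool) -> FallbackState:
    """Toy decision logic for a Level 3/4-style safe failover.

    The catch: a real car has to answer these questions using the
    very sensors that may have just failed or gotten confused.
    """
    if sensors_trustworthy:
        return FallbackState.NOMINAL
    if shoulder_reachable:
        return FallbackState.PULL_OVER
    if lane_clear_behind:
        return FallbackState.STOP_IN_LANE
    # This is where coordination with surrounding vehicles would
    # have to come in -- the messy, slow, standards part.
    return FallbackState.NEED_HELP

print(plan_fallback(False, True, False))  # -> FallbackState.PULL_OVER
```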

That’s really, really complicated. If the car is compromised, it might require communication with surrounding cars and/or other automated or even human-controlled systems, and that means standards development and cooperation and all kinds of messy, slow things. It’s not just an individual technical problem at that point; it’s the point where we have to start thinking about AVs on a larger, systems-level scale, which would mean Tesla and all the other companies playing in this space would need to talk and share and work together, and all of that is hard and unsexy.

So, of course, Tesla doesn’t want to do that. It’s much more fun and exciting to write new code that lets Teslas navigate intersections and change lanes and skillfully whip around overturned dumpsters or whatever, at least until they can’t.

There are other reasons why more and more advanced Level 2 systems are getting deployed, even if long-term they don’t make sense:

Who would have thought?

So, while you’re reading all the excited tweets from the Tesla stans and watching their impressive videos and the breathless articles, remember, this is still just a very fancy driver-assist system. It is not a full self-driving system. It’s going to be confusing, and every fibre of you may want to believe, but don’t.

It doesn’t matter how much Level 2 systems will be able to do. If a human is always the immediate failover, widespread autonomy simply will not happen. Don’t believe the hype, and don’t feed this already out-of-control bullshit.

Oh, I reached out to Tesla for comment, and you know the rest.

