Tesla to Expand Full Self-Driving Beta, But Top Safety Official Says It Needs to Tackle ‘Basic Safety Issues’ First

Photo: Justin Sullivan, Getty Images

Tesla is getting ready to roll out a sweeping update for its “Full Self-Driving” mode that would extend the feature’s beta testing to more customers and areas. But before it does, the automaker needs to address some “basic safety issues,” said Jennifer Homendy, head of the U.S. National Transportation Safety Board, in a recent interview with the Wall Street Journal.

Full Self-Driving is a more advanced version of Autopilot, Tesla’s driver-assistance system designed for navigating highways. Despite their names, neither version of Tesla’s driver-assistance software is fully autonomous, and Tesla warns that a human driver must remain alert at the wheel and ready to take over at any moment.

Homendy called it “misleading and irresponsible” for Tesla to advertise its software as “full self-driving,” adding that the company has “clearly misled numerous people to misuse and abuse technology.”

A beta version of Full Self-Driving mode launched in October 2020 for a select few Tesla drivers. After announcing plans for a wider release by the end of September, Tesla CEO Elon Musk said Friday that drivers who want to try out the latest version of Full Self-Driving mode will have access to a “beta request button” around Oct. 1.

“Beta button will request permission to assess driving behaviour using Tesla insurance calculator,” he wrote on Twitter. “If driving behaviour is good for 7 days, beta access will be granted.”

The update is also expected to add new tools to help drivers navigate city streets as well as highways. But Homendy believes this move is dangerously premature:

“Basic safety issues have to be addressed before they’re then expanding it to other city streets and other areas,” she told the Journal.

The NTSB, which can conduct investigations and share recommendations but has no regulatory authority, has previously investigated three fatal Tesla crashes involving the company’s Autopilot system. It launched a fourth inquiry on Friday after two people were killed in a crash involving a Tesla Model 3 in Coral Gables, Florida. In February 2020, the board determined that Tesla’s Autopilot software was one of the possible causes of a fatal 2018 crash in Mountain View, California, in which the driver was playing a mobile game at the time of the collision.

In 2017, the NTSB advised Tesla and five other automakers to improve the safety of their semi-autonomous vehicles so that drivers would find them more difficult to misuse. The other five companies responded and agreed to adopt more stringent safeguards. Tesla alone ignored the NTSB’s recommendations, though it has refined some of its safety features in the years since, such as increasing the frequency of alerts when a driver using Autopilot takes their hands off the wheel.

Tesla did not immediately respond to Gizmodo’s request for comment. The company has largely stopped addressing media inquiries since dissolving its PR department in October 2020.