Imagine this scenario. You’re driving down the highway in a spiffy new Tesla, legs stretched out as the Autopilot driver-assistance feature maintains your speed, when suddenly the car’s stereo starts blasting at full volume. During your panicked scramble to pull over, you notice your windows strangely start lowering as if controlled by a ghost. Then, out of nowhere, the doors spring open.
Those are just some of the high-stakes hijinks a 19-year-old security researcher says he can pull after remotely hacking into at least 25 Teslas spread out across 13 countries. The researcher, named David Colombo, posted some details in a Twitter thread on Tuesday in which he claimed he could remotely run commands on the affected vehicles without their owners’ knowledge. In addition to adjusting stereo volume and manipulating the vehicles’ doors and windows, Colombo claimed he could also start the vehicles remotely, obtain their exact location, and determine whether or not a driver was present in the car.
I could also query the exact location, see if a driver is present and so on. The list is pretty long.
And yes, I also could remotely rick roll the affected owners by playing Rick Astley on Youtube in their Tesla‘s
— David Colombo (@david_colombo_) January 11, 2022
Colombo did not provide specific details on how he obtained access to the vehicles’ systems but noted it wasn’t the result of a vulnerability in Tesla’s underlying infrastructure. The researcher claimed he was actively trying to notify the owners of the affected vehicles, and that he would release more technical details once the affected drivers were “able to take appropriate measures.”
Colombo did not immediately respond to Gizmodo’s request for comment and Gizmodo could not independently confirm the veracity of his findings, so please take them with a grain of salt.
Yes, I potentially could unlock the doors and start driving the affected Tesla‘s.
No I can not intervene with someone driving (other than starting music at max volume or flashing lights) and I also can not drive these Tesla‘s remotely.
— David Colombo (@david_colombo_) January 11, 2022
That caveat aside, the researcher’s findings seem to have gotten Tesla’s attention. In an update, the researcher said Tesla’s security team had reached out and was launching its own investigation into his findings. Gizmodo reached out to Tesla as well but hasn’t heard back. (Tesla shut down its PR department in 2020 and limits its public comments.)
If Colombo’s claims are true, this wouldn’t be the first time hackers and researchers have gained remote access to Tesla vehicles. In 2020, a security researcher from the U.K. named Lennert Wouters demonstrated how a vulnerability in Tesla’s keyless entry feature could potentially allow bad actors to rewrite a key fob’s firmware over Bluetooth to unlock and potentially steal a Model X vehicle.
Then, last year, a pair of security researchers were able to remotely hack into a Tesla’s infotainment system using a drone. In that case, the researchers were reportedly able to remotely unlock doors, change seat positions, play music, and mess with the climate control settings.
Tesla addressed each of these vulnerabilities in the past and maintains an active bug bounty program through which pre-approved security researchers can register vehicles for testing. Those researchers can reportedly receive anywhere from $US100 ($138) to $US15,000 ($20,636) for discovering a qualifying vulnerability. It’s unclear whether or not these rewards would apply to Colombo’s findings.
Regardless, if Colombo’s findings are legitimate, they could add yet another headache for Tesla, which in recent months has had to contend with multiple recalls, a federal investigation, and reports of a first major crash involving its Full Self-Driving beta feature.
The most recent recall, which involved 356,309 Model 3 sedans and 119,009 Model S vehicles, revolved around issues with a rearview camera harness and a misaligned latch in the front trunk of certain vehicles. Unlike a previous recall of 11,704 vehicles that Tesla was able to patch via an over-the-air update, it appears some vehicles implicated in the more recent recall required physical repairs in service centres.
Tesla’s FSD beta is also coming under renewed scrutiny this week in California, where the state’s Department of Motor Vehicles is reportedly reviewing the feature to determine whether or not it meets the legal definition of “autonomous,” The Washington Post notes. That’s potentially significant because recognition of Tesla’s vehicles as autonomous under California law could open the company up to new rules and regulations.
“The DMV has notified Tesla that the department will be initiating further review of the technology on their vehicles, including any expansion of the current programs or features,” a DMV spokeswoman told the Post. “If the capabilities of the features meet the definition of an autonomous vehicle according to California law and regulations, DMV will take steps to make certain that Tesla operates under the appropriate autonomous vehicle permits.”
Though Tesla and CEO Elon Musk have exaggerated the capabilities of the company’s driver-assistance features in the past, the company has walked those claims back and stated that FSD isn’t currently capable of full autonomy.