A video appears to show Tesla's Autopilot system steering a Model S toward the divider where a Tesla Model X crashed on March 23. Tesla later disclosed that the driver in that deadly crash was also using Autopilot at the time. The new video, filmed by a Tesla Model S owner, suggests that Autopilot may be vulnerable in similar situations.
The video was uploaded on April 2 by YouTube user Shantanu Joshi. In it, Joshi drives his Model S along Highway 101 to the point where it splits off toward Highway 85, the site of the crash. The car can be seen veering left toward a divider, apparently following a white lane line, before Joshi takes control of the steering wheel. Autopilot does not appear to warn Joshi to take control at any point in the video, though the system does tell drivers to pay attention and keep their hands on the wheel when it is activated.
A Tesla spokesperson gave a statement to Jalopnik when asked for comment (link to ABC 7's reporting is mine):
Autopilot does not, as ABC 7's reporting suggests, make a Tesla an autonomous car or allow a driver to abdicate responsibility. To review it as such reflects a misrepresentation of our system and is exactly the kind of misinformation that threatens to harm consumer safety. We have been very clear that Autopilot is a driver assistance system that requires the driver to pay attention to the road at all times, and it has been found by NHTSA to reduce accident rates by 40%. It would be highly unfortunate if news stories like this influenced people to not use a system that adds to safety.
Tesla's owner's manuals include multiple warnings that drivers using Autopilot must still pay attention to the road while the system is engaged and take corrective action if needed, and Tesla maintains that when driving a Tesla equipped with the system "you are 3.7 times less likely to be involved in a fatal accident." If you crash a Tesla while using Autopilot, in other words, it's still your fault.