Fact: there are exactly zero fully self-driving vehicles available for purchase in the United States in 2021. But if you talked to an average Tesla owner, they might seem confused by this statement. That could be because they paid $10,000 for a feature called Autopilot with "Full Self-Driving Capability," a feature that, as of this writing, does not function as its name suggests. Tesla's fine print clearly states that the system is not fully autonomous and requires driver intervention, but that hasn't prevented a string of Autopilot-related crashes, including several fatal incidents.

According to an AP News report, the National Highway Traffic Safety Administration (NHTSA) has opened an investigation into Tesla's Autopilot system. This is not the first time the NHTSA has investigated a Tesla crash, but this probe is unusually extensive, covering 765,000 vehicles (every Tesla Model Y, X, S, and 3 sold from 2014 to 2021).

The NHTSA has pinpointed 11 crashes since 2018 in which a Tesla collided with another vehicle at a scene where first responders were present and using flashing lights, flares, or hazard cones. One such incident happened in March of this year, when a Tesla Model 3 struck a parked police car, though it was unclear whether Autopilot was engaged in that specific case. Across the crashes identified by the NHTSA, 17 people were injured and one was killed.

"The investigation will assess the technologies and methods used to monitor, assist and enforce the driver's engagement with the dynamic driving task during Autopilot operation," NHTSA said. "NHTSA reminds the public that no commercially available motor vehicles today are capable of driving themselves. Every available vehicle requires a human driver to be in control at all times, and all state laws hold human drivers responsible for operation of their vehicles."

Systems like Tesla's Autopilot are capable of saving lives, as when one safely brought a car to a stop after its drunk driver passed out. However, they can also be quite dangerous when users don't understand the limits of their capabilities. We've already seen how easy it is to fool Autopilot into operating without a driver behind the wheel by using a basic cheat device.

It's unclear what will come of this latest NHTSA investigation, but it signals that the agency could crack down on Level 2 driver-assistance systems. The agency may limit Autopilot use to roads where it is safer, such as highways, or require Tesla to verify that drivers are paying attention while the system is active.