Over the last few years, we've reported on multiple Tesla investigations linked to its Autopilot and Full Self-Driving (FSD) suites. Last month, we covered a federal Autopilot investigation encompassing no fewer than 830,000 vehicles, and just last week, we reported on a fatal crash involving a Tesla Model S in Florida that killed two people.

Over the last six or so years, the National Highway Traffic Safety Administration (NHTSA) has opened 37 investigations into Tesla crashes involving 18 deaths, all suspected of being linked to Autopilot. We have also seen Tesla's FSD Beta software repeatedly miss the mark, sometimes nearly causing collisions with other road users. Unfortunately, Tesla's troubles in this area are far from over, as the NHTSA is now opening its 38th investigation after a motorcyclist was killed in a collision with a Model Y earlier this month.

The precise details of the July 7 collision between the motorcyclist and the 2021 Tesla Model Y remain unclear, but we do have a basic idea of what unfolded. My LA News reported that the motorcyclist was riding in the HOV lane when the Model Y approached and struck the motorcycle from behind. The rider of the Yamaha was thrown onto the freeway and later pronounced dead. The Tesla driver was uninjured, and alcohol is not believed to have been a factor.

This incident now joins an alarmingly lengthy list of fatal crashes involving Teslas. According to Reuters, Tesla declined to comment on the issue and, while the NHTSA did not specify which crash it was investigating this time, the latest California one seems to be a likely candidate.

Two other NHTSA probes have been opened recently: one involves a Florida crash in which two elderly occupants of a Tesla were killed, and another involves a collision between a 2018 Tesla Model 3 and a pedestrian. While human error can't be unequivocally ruled out in every case, the pattern strongly suggests that the EV automaker's advanced driver-assist systems are problematic.

This week, a German court ordered Tesla to pay a Model X owner over $100,000 after finding that Autopilot posed a "massive danger" to the driver and other road users. Specific problems included the system's tendency to brake too often, increasing the risk of being rear-ended, and its failure to reliably recognize hazards such as narrowing lanes.

We would love to get behind technology that improves the driving experience and makes cars safer, but the sheer number of ongoing recalls and investigations serves as proof that Tesla's "self-driving" tech is accomplishing the opposite.