Model 3
Make: Tesla
Segment: Sedan

This month, the National Highway Traffic Safety Administration (NHTSA) released new data on 11 fatal crashes from mid-May to September 2022.

In the report, the NHTSA found that 10 of the 11 fatal crashes involved a Tesla vehicle, such as the Model 3. We note that the NHTSA did not specifically name Tesla's Autopilot or Full Self-Driving, and the reports do not say whether human error was the cause.

Additionally, the 11th crash involved a Ford pickup, which was reportedly not using Ford's own automated driving technology.

Of the 10 fatal Tesla accidents, four involved motorcycles. As previously reported, the NHTSA is currently investigating 38 cases involving Autopilot. Musk's launch date for Full Self-Driving has also been pushed back, as regulators are not yet convinced the tech should be allowed to roam freely.

The NHTSA also includes data from earlier in 2022, showing that six people were killed and five more were "seriously injured" in crashes involving autonomous driving software.

Concerning the latest deaths, the NHTSA stated that five occurred in Teslas and one in a Ford. Of those six deaths, the NHTSA says that advanced driver assistance systems (ADAS) were active only in the five Teslas.

The NHTSA is not just analyzing Tesla's culpability in autonomy-related crashes. It is building a database to assess the overall safety of all manufacturers' autonomous and ADAS systems. The brands include Tesla, GM, Ford, and others.

Automotive technology companies like Waymo must report all crashes involving autonomous vehicles, as well as those involving driver-assist systems.


The NHTSA points out another reason for the Tesla-heavy skew in its data, at least for now.

Tesla uses telematics to monitor all of the 830,000+ cars it has on the road, obtaining crash reports in near real time. Other brands have yet to field this tech, so their reports trickle in more slowly.

The NHTSA still has plenty of questions about Tesla's systems, however. It notes that the technology was being used in places where it is less reliable than usual. The agency also found that drivers weren't taking steps to avoid a potential accident despite warnings from their cars.

Remember that Tesla's driver-assist systems are rated at SAE Level 2, which requires the driver to keep their hands on the wheel and pay attention at all times.


Meanwhile, Elon Musk insists that the systems in his cars are still safer than human drivers, based on the rate of crashes relative to total miles driven. At Tesla's AI Day in September, he said: "At the point of which you believe that adding autonomy reduces injury and death, I think you have a moral obligation to deploy it."

"Even though you're going to get sued and blamed by a lot of people. Because the people whose lives you saved don't know that their lives were saved. And the people who do occasionally die or get injured, they definitely know, or their state does, that it was, whatever, there was a problem with Autopilot."

He claims Teslas with FSD and Autopilot have covered north of 3 million miles. "That's a lot of miles driven every day. And it's not going to be perfect. But what matters is that it is very clearly safer than not deploying it."
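To make the arithmetic behind that claim concrete, the comparison reduces to crashes per mile driven on each side. The short Python sketch below illustrates the calculation; every figure in it is a placeholder for illustration only, not NHTSA or Tesla data.

```python
# Hypothetical illustration of the crashes-per-mile comparison Musk invokes.
# All numbers below are made up; they only demonstrate the arithmetic.

def crashes_per_million_miles(crashes: int, miles: float) -> float:
    """Normalize a crash count by the distance driven."""
    return crashes / (miles / 1_000_000)

# Placeholder figures (NOT real data):
adas_rate = crashes_per_million_miles(crashes=5, miles=3_000_000)        # ADAS fleet
human_rate = crashes_per_million_miles(crashes=1_500, miles=500_000_000)  # human baseline

print(f"ADAS fleet:    {adas_rate:.2f} crashes per million miles")
print(f"Human drivers: {human_rate:.2f} crashes per million miles")
```

Of course, such a comparison only holds if both rates are measured over similar roads and driving conditions, which is part of what the NHTSA's new cross-manufacturer database is meant to enable.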
