This is something that an over-the-air update can't fix.
As details of the recent fatal crash involving a Tesla Model S driving itself on Autopilot begin to emerge, the news will hit us in waves. What ensues will change the future landscape of the automotive world, since it is tragedies like these that usually spur necessary technological change. What we do know about the accident so far is that the rare situation that blinded the sensors to the semi-truck the Model S struck is one that can happen again.
What's more, emerging reports point to a hole in Tesla's array of sensors meant to read the car's surroundings. Currently, the Model S has 12 ultrasonic sensors, a forward-facing camera, and a radar sensor to help it navigate freeways and follow cars at a safe distance. The camera distinguishes lane markings to keep the Tesla in its lane, while the ultrasonic sensors, which provide a 360-degree view at a range of 16 feet, are called upon when changing lanes or using the self-park feature. Problematically, as the recent fatality in Florida and a low-speed accident in Utah are revealing, the Model S, and likely the Model X, seems to be blind above the hood line of the car.
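To make that coverage gap concrete, here is a toy sketch in Python of the logic the article implies: an obstacle registers only if it sits both within the ultrasonic ring's 16-foot range and below the hood line. The hood-line height, function name, and thresholds are our illustrative assumptions, not Tesla's specifications.

```python
# Toy model of the coverage gap: the ultrasonic ring only "sees"
# obstacles within ~16 ft and below the hood line, so a high-riding
# trailer bed can sit squarely in the car's path yet never register.
# All names and thresholds are illustrative, not Tesla's.

ULTRASONIC_RANGE_FT = 16.0    # stated range of the ultrasonic ring
HOOD_LINE_FT = 3.5            # assumed coverage ceiling (illustrative)

def ultrasonics_detect(distance_ft: float, obstacle_clearance_ft: float) -> bool:
    """Return True if the obstacle's lowest point falls inside coverage."""
    in_range = distance_ft <= ULTRASONIC_RANGE_FT
    below_hood = obstacle_clearance_ft < HOOD_LINE_FT
    return in_range and below_hood

# A sedan bumper 10 ft ahead, low to the ground: detected.
print(ultrasonics_detect(10.0, 1.5))   # True
# A trailer bed 10 ft ahead riding 4 ft off the ground: missed.
print(ultrasonics_detect(10.0, 4.0))   # False
```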
Even Tesla's owner's manual backs this up with a disclaimer for its Autopilot and Autopark functionality. It reads, "Please note that the vehicle may not detect certain obstacles, including those that are very narrow (e.g. bikes), lower than the fascia, or hanging from the ceiling." This disclaimer goes a long way toward explaining why the Autopark sensors failed to sense a tall tractor-trailer in Utah when an owner summoned the car. Essentially, aside from the location of impact on the truck, the Utah crash mirrors the fatal accident in Florida that claimed the life of former Navy SEAL and tech buff Joshua Brown. Given the height of the truck, the impact point strikes the Tesla right in the middle of the A-pillars, an especially deadly type of crash given the lack of structural reinforcement there.
The lack of reinforcement, along with the fact that the Tesla has a blind spot in that vulnerable area, goes to show that the car is designed well for autonomous driving only within certain parameters. That works for a beta version of Autopilot that is only supposed to be operated with the driver's full attention on the road, but it will not do for fully autonomous versions of the software that allow drivers to be distracted on the highway or city streets. Furthermore, Tesla supplier Mobileye, which sells the EV maker the sensors used in its Autopilot-enabled cars, has spoken out to confirm that those sensors are not designed to spot a laterally crossing vehicle entering the Tesla's path.
Lateral Turn Across Path (LTAP) functionality should debut on these sensors beginning in 2018, with Euro NCAP requiring it by 2020, but for now no such capability exists. In the Florida crash, Tesla blames the camera for failing to distinguish the tractor's white trailer against an unusually bright sky. The ultrasonic sensors would have been useless as well: their placement low to the ground only contributes to the blind spot above, and even if they could have seen the trailer, their short 16-foot range would have detected it far too late to stop in time. What this points to more than anything is a hardware problem. Tesla has arguably proven itself capable of making autonomous software that works well, especially since it is continually updated.
The problem is that if Tesla expects to simply update its existing fleet to full autonomy, it will be making a mistake. For autonomous vehicles to gain public acceptance and go mainstream, they need to eliminate as many holes as possible, especially blind spots that an alert driver would easily catch. That's why Tesla needs to either add more sensors or upgrade the existing hardware so that it can see the car's entire surroundings, not just the areas most likely to be hit. It doesn't matter how unlikely the accident is; the point of an autonomous vehicle is for it to be safer than a human driver. Statistically, Autopilot already is, having gone 130 million miles before its first fatality.
Measured against the U.S. average of roughly one traffic fatality every 94 million vehicle miles, the baseline Tesla itself cited, that works out to about 36 million additional miles of fatality-free driving over a human pilot, a start, but far from what the end goal should be. To double or quadruple that safety factor, or even eliminate deaths entirely, Tesla needs to look into improving its eyes on the road. By 2017, Tesla expects to roll out its most important car yet, the $35,000 Model 3 sedan. With nearly half a million preorders, the Autopilot-enabled Model 3 will be one of the first vehicles to introduce autonomous driving to the masses, and Tesla needs to have all of the kinks worked out by then. Of course, putting everything on hold to redesign the sensors would cause massive delays, but unlike the Falcon Wing doors, that's a delay we can all stand.
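For the curious, the arithmetic behind those figures is simple enough to sketch in a few lines of Python. The 94-million-mile baseline is the U.S. average Tesla cited; as ever, a single fatality is far too small a sample to support firm statistical conclusions, so treat this as back-of-the-envelope only.

```python
# Back-of-the-envelope comparison of Autopilot's early safety record
# against the cited U.S. average: one fatality per ~94M vehicle miles
# versus ~130M Autopilot miles before the first fatality.
# (Illustrative only; one data point is not a meaningful sample.)

US_MILES_PER_FATALITY = 94_000_000   # U.S. average, all vehicles
AUTOPILOT_MILES = 130_000_000        # miles driven before first fatality

extra_miles = AUTOPILOT_MILES - US_MILES_PER_FATALITY
safety_factor = AUTOPILOT_MILES / US_MILES_PER_FATALITY

print(f"Additional fatality-free miles vs. average: {extra_miles:,}")  # 36,000,000
print(f"Implied safety factor: {safety_factor:.2f}x")                  # ~1.38x
```

That roughly 1.4x factor is the number the article argues Tesla must double or quadruple before full autonomy can credibly go mainstream.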