Please exercise caution when using it.
Tesla released the ninth version of its Full Self-Driving Beta this weekend, news we felt deserved a critical public safety announcement: there are no self-driving cars on sale in 2021. Zero, zilch, nada. Tesla's "Full Self-Driving (FSD)" name might make it sound fully autonomous, but even Tesla will admit (likely for legal reasons) that it's only an advanced Level 2 driver-assistance system, like GM's Super Cruise. When people think of a "self-driving car," they typically imagine Level 4 (where the car handles all driving itself, but only under certain conditions) or Level 5 (where the car drives itself everywhere and needs no steering wheel or pedals).
So how far is FSD from Level 5? Based on videos demonstrating the V9 software in action, we'd say very far. One video, filmed in downtown San Francisco using a Tesla Model 3, shows the software making multiple mistakes that could have caused a collision had the driver not been paying attention.
The first mistake comes just 14 seconds into the video, when the software pulls the car into a bus lane. Next, the system disengages after driving down the wrong lane, then scrapes a bush that has grown out over the road. It proceeds to drive on a shoulder lined with a bus lane and parking spots before nearly driving off the road during a left turn. For a system marketed as "self-driving," the driver had to take over quite a few times.
Other videos show FSD V9 navigating more successfully, but the system clearly needs more work. Tesla will point out that FSD is still in beta, with only a few thousand people granted access to test it. Public beta testing is fairly common in the video game world, where it helps developers iron out small bugs, but this is the first time we've seen it in the automotive sector. There's a good reason for that.
While a video game glitch might frustrate the player, an FSD failure could kill the driver or another person on the road. Beta testing works because users opt in to test pre-release software, speeding up its development with real-world use. Tesla owners knowingly opted into the FSD program, but what about everyone else on the road? Did we opt in to test Tesla's unfinished software? No, we didn't.
As a reminder, Tesla charges customers $10,000 for FSD Capability, with the understanding that the car will eventually drive itself completely. The current system offers Navigate on Autopilot, Auto Lane Change, Autopark, Summon, and the ability to read traffic lights and stop signs. Later this year, Tesla claims, it will add Autosteer on city streets.
Other Level 2 systems use lidar to map the road, but Tesla's "Pure Vision" Autopilot with FSD relies on cameras and artificial intelligence alone. The V9 update lets it detect other vehicles' brake lights, and in the future Tesla says it will read turn signals, hazard lights, emergency vehicle lights, and hand signals.
Tesla admits, "The currently enabled features require active driver supervision and do not make the vehicle autonomous. The activation and use of these features are dependent on achieving reliability far in excess of human drivers, as demonstrated by billions of miles of experience, as well as regulatory approval, which may take longer in some jurisdictions. As these self-driving features evolve, your car will be continuously upgraded through over-the-air software updates."
In the meantime, we'd like to see Tesla pay engineers to test this software on the road rather than leave it to customers, who may (or may not) use it properly. And until the FSD software does what was originally promised and drives the car itself, Tesla should stop charging $10,000 for it.