Maybe Tesla should invest in this instead of camera-based Tesla Vision.
SiLC Technologies, a cutting-edge company based in California, has announced its latest development in the world of LiDAR. Optimizations made to its Eyeonic Vision Sensor have given it the ability to detect and identify obstacles that are more than 3,280 feet (0.62 miles) away.
The company claims it is the first to achieve this distance with LiDAR, but AEye's system comes rather close. SiLC has essentially doubled the range of its Eyeonic Vision Sensor. First previewed at the Consumer Electronics Show (CES) 2022, the system previously demonstrated a detection range of 1,640 feet, half of what the updated version can achieve.
An advancement of this nature could revolutionize autonomous driving by predicting hazards even further down the road than current systems are capable of seeing.
The Eyeonic Vision Sensor is a trailblazer in terms of LiDAR technology. Dr. Mehdi Asghari, founder and CEO of SiLC, explained why this development is so important. "Our technology platform is flexible enough to address ultra-long-range to ultra-short-range applications which speak to our understanding of what is needed to truly make machine vision as good or better than human vision."
Of course, this brings untold benefits to the automotive industry. As the world's car companies throw their weight behind the self-driving future, this will certainly make autonomous vehicles safer and better able to perceive potential dangers on the road ahead. So, how does it work? At the center of the Eyeonic Vision Sensor is SiLC's silicon photonic chip, which incorporates FMCW (frequency-modulated continuous-wave) LiDAR functionality into a tiny chip.
Despite the compact dimensions, the technology boasts some incredible capabilities. While lesser systems merely measure reflected light, the Eyeonic Vision Sensor goes further than that. As per MotorTrend, it can detect more details in the photons and tell the difference between various materials, such as a human being's skin or another vehicle's metal.
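To give a rough sense of how FMCW LiDAR recovers both distance and instantaneous velocity from a single return, here is a minimal sketch of the standard triangular-chirp math. The laser and chirp parameters below are illustrative assumptions, not SiLC's actual specifications, and the function name is hypothetical.

```python
# Illustrative FMCW LiDAR math: a triangular frequency chirp is mixed with
# its own reflection, and the beat frequencies on the up-slope and
# down-slope encode range and Doppler velocity simultaneously.

C = 3.0e8             # speed of light, m/s
WAVELENGTH = 1.55e-6  # telecom-band laser wavelength, m (assumed)
BANDWIDTH = 4.0e9     # chirp bandwidth, Hz (assumed)
CHIRP_TIME = 10e-6    # duration of one chirp slope, s (assumed)

def range_and_velocity(f_up, f_down):
    """Recover target range (m) and radial velocity (m/s) from the beat
    frequencies (Hz) measured on the up-chirp and down-chirp.
    Sign convention here: positive velocity means the target is approaching."""
    f_range = (f_up + f_down) / 2    # range-induced beat component
    f_doppler = (f_down - f_up) / 2  # Doppler-induced component
    rng = f_range * C * CHIRP_TIME / (2 * BANDWIDTH)
    vel = f_doppler * WAVELENGTH / 2
    return rng, vel

# Example: synthesize the beats for a target 1,000 m away closing at 30 m/s,
# then invert them to confirm the math round-trips.
f_range = 2 * 1000 * BANDWIDTH / (C * CHIRP_TIME)  # ~2.67 GHz
f_dopp = 2 * 30 / WAVELENGTH                        # ~38.7 MHz
rng, vel = range_and_velocity(f_range - f_dopp, f_range + f_dopp)
print(round(rng, 3), round(vel, 3))  # → 1000.0 30.0
```

Because velocity falls out of the Doppler term directly, an FMCW sensor measures it per-pulse rather than inferring it from frame-to-frame position changes, which is the property Dr. Asghari highlights below.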
What's more, the small packaging should make it easy to implement for automotive use, without requiring manufacturers to redesign their vehicles. But the technology can be implemented in so much more than self-driving applications. For example, it can also be used to monitor deforestation and guide flying drones away from each other.
"The highly detailed, accurate instantaneous velocity and ultra-long-range information that our Eyeonic Vision Sensor provides is the key to helping robots classify and predict their environment - in the same way that the human eye and brain process together," added Dr. Asghari.
We've already seen the benefits of LiDAR technology versus a basic camera setup. At this year's CES event, Luminar demonstrated the efficacy of its system, pitting a LiDAR-equipped Lexus against a Tesla Model 3 in a pedestrian detection test. While the Lexus detected the child dummy and came to a stop, the Tesla (which uses a camera-based system) proceeded and ran the safety dummy down.
The Eyeonic Vision Sensor has already been put to use in a fleet of autonomous Chrysler Pacifica minivans, controlled by AutoX, a RoboTaxi company. If it proves successful, it could be widely adopted by more brands. The question is whether the likes of Tesla will change their tune on using cameras over LiDAR, given the new technology's capabilities.