Be Smart, Avoid a Vision-Only Self-Driving System When Visibility Is Low. Simple.

https://www.bbc.com/news/articles/cg75zv4gny2o
https://electrek.co/2025/03/23/everyones-missing-the-point-of-the-tesla-vision-vs-lidar-wile-e-coyote-video/
https://www.youtube.com/watch?v=_W1JBAfV4Io

I’m not here to debate whether Elon Musk is right to insist on vision-only AI for self-driving, but the logic is straightforward: if it’s a vision-based system, avoid using it when visibility is low. The limitations of such systems become painfully apparent in adverse conditions: fog, heavy rain, or darkness. In these scenarios, the training data meant to help the AI recognize and respond to its environment falls short; the system may not have seen enough examples of these extreme situations, leading to poor decision-making and increased risk.

Moreover, vision-only systems can be compromised by environmental factors. For example, strong infrared or laser beams can effectively blind cameras, rendering them useless precisely when they are needed most. This vulnerability raises serious safety concerns, especially given the potential consequences of a vehicle that cannot accurately perceive its surroundings.

During my rides in Waymo-style robotaxis in China, I felt a tangible sense of safety that I attribute to their comprehensive sensor suites. These vehicles carry not just cameras but also LiDAR, radar, and ultrasonic sensors, which together provide a far richer picture of the environment. It’s comforting to know that these systems can detect objects and measure distances by several independent means, rather than relying solely on visual data.

The added sensor cost is exactly what I expect from a commercial taxi service. When passengers are entrusting their lives to these vehicles, a multi-faceted approach to sensing is not just an enhancement; it’s a necessity. Integrating diverse sensory inputs allows for more accurate object detection, better situational awareness, and ultimately a safer ride.
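To make the fusion idea concrete, here is a toy sketch of inverse-variance weighting, one simple way redundant sensors can be combined. This is purely illustrative (the sensor names, numbers, and weighting scheme are my assumptions, not any vendor’s actual algorithm): when the camera’s estimate becomes noisy in fog, the other sensors automatically dominate the fused result.

```python
def fuse_estimates(readings):
    """Fuse (distance_m, variance) pairs; lower variance earns more trust.

    A vision-only stack is a single point of failure. With fusion, a
    degraded sensor simply reports high variance and loses influence.
    """
    valid = [(d, v) for d, v in readings if v is not None and v > 0]
    if not valid:
        raise ValueError("no usable sensor readings")
    weights = [1.0 / v for _, v in valid]
    return sum(d * w for (d, _), w in zip(valid, weights)) / sum(weights)

# Hypothetical readings: (estimated distance in meters, variance).
# Clear weather: camera, LiDAR, and radar roughly agree.
clear = [(20.0, 0.5), (20.3, 0.2), (19.8, 1.0)]
# Fog: the camera's estimate drifts and its variance blows up;
# LiDAR and radar carry the fused estimate.
fog = [(35.0, 50.0), (20.2, 0.2), (19.9, 1.0)]

print(round(fuse_estimates(clear), 2))  # fused distance, clear weather
print(round(fuse_estimates(fog), 2))    # stays near 20 m despite the camera
```

Note how in the fog case the camera reports 35 m, yet the fused estimate stays close to the LiDAR/radar consensus, which is the graceful degradation that a vision-only system cannot offer.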

As we look toward the future of autonomous vehicles, it becomes clear that a vision-only approach is insufficient in ensuring safety and reliability. It’s imperative for manufacturers and developers to prioritize systems that utilize a full spectrum of sensors, especially in conditions where visibility is compromised. This is not just about technological advancement; it’s about protecting lives and enhancing the trust that users place in autonomous systems.

Other Notes:

  • The notion that the world is designed for human visual consumption sounds like a good argument, but it can still be a poor design. Our visual capabilities are not perfect and degrade over time. How can a one-size-fits-all, fixed-forever design be the right answer?
  • Roads, even with traffic lights, roundabouts, etc., were not designed for fast modern vehicles, and certainly not for autonomous driving.