When you drive a car, you're performing a constant miracle of perception and judgement — reading road markings, anticipating the driver in front, interpreting a cyclist's wobble, understanding that a football bouncing into the road probably means a child is about to follow it. You do all this effortlessly because you have a brain that evolved over millions of years to model the world. Self-driving cars have to do all of it from scratch, using sensors and software.
Step 1: Sensing the world
Self-driving cars carry a suite of sensors that feed them a constant picture of their surroundings. Cameras capture visual information in every direction. Radar measures the speed and distance of nearby objects. LiDAR (Light Detection and Ranging) fires thousands of laser pulses per second and measures how long they take to bounce back, building a precise 3D map of everything around the car. Together, these sensors produce a rich, real-time model of the environment.
🦇 Bats navigate by echolocation — they emit sound pulses and listen to the echoes to build a picture of their surroundings in the dark. LiDAR works on the same principle, just with laser pulses instead of sound. While you're reading the road visually, a self-driving car is doing something closer to what a bat does — pinging the world constantly and listening for what bounces back.
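The time-of-flight idea behind LiDAR (and echolocation) reduces to one line of arithmetic: distance is half the round trip at the speed of light. A minimal sketch, with an illustrative function name rather than any vendor's API:

```python
# Time-of-flight: a pulse travels out and back, so the one-way
# distance is (speed of light * round-trip time) / 2.
C = 299_792_458  # speed of light in m/s

def pulse_distance(round_trip_seconds: float) -> float:
    """Distance in metres to whatever reflected the pulse."""
    return C * round_trip_seconds / 2

# A pulse returning after ~200 nanoseconds hit something ~30 m away.
print(round(pulse_distance(200e-9), 1))  # → 30.0
```

Repeating this for thousands of pulses per second, each fired in a known direction, is what builds up the 3D point cloud.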
Step 2: Understanding what it sees
Raw sensor data isn't useful until the car knows what it's looking at. This is where AI comes in. Machine learning models — trained on millions of examples — classify every object in the scene: this is a pedestrian, this is a traffic light (and it's red), this is a cyclist, this is a parked car. The system also predicts what each object will do next — that pedestrian is walking towards the crossing, that car is about to change lanes.
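The output of this stage is essentially a list of labelled, tracked objects with predicted trajectories. Here is a toy sketch of that data structure with the simplest possible motion model (constant velocity); the class and field names are illustrative, not from a real perception stack, and a real system would use learned, uncertainty-aware predictors:

```python
from dataclasses import dataclass

@dataclass
class TrackedObject:
    label: str   # e.g. "pedestrian", "cyclist" — assigned by a trained classifier
    x: float     # position along the road, metres
    y: float     # lateral offset from the car's path, metres
    vx: float    # estimated velocity, m/s
    vy: float

def predict_position(obj: TrackedObject, seconds_ahead: float) -> tuple[float, float]:
    """Naive constant-velocity guess at where the object will be."""
    return (obj.x + obj.vx * seconds_ahead, obj.y + obj.vy * seconds_ahead)

# A pedestrian 3 m from the car's path, walking towards it at 1.5 m/s:
pedestrian = TrackedObject("pedestrian", x=20.0, y=3.0, vx=0.0, vy=-1.5)
print(predict_position(pedestrian, 2.0))  # → (20.0, 0.0) — in the car's path in 2 s
```

Even this crude model shows why prediction matters: the pedestrian is not in the road now, but will be in two seconds.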
Step 3: Deciding what to do
Given a model of the world and predictions about how it will change, the car's planning system decides on a course of action: maintain speed, brake, steer slightly left. It has to weigh up dozens of competing factors simultaneously — safety, traffic rules, the comfort of passengers, and the behaviour of other road users who may not behave predictably.
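One common way to weigh competing factors is to score each candidate action with a weighted sum of costs and pick the cheapest. The sketch below is deliberately simplified: the actions, cost values, and weights are all invented for illustration, and real planners search over thousands of candidate trajectories rather than three named actions:

```python
# Toy planner: score candidate actions by a weighted sum of costs.
actions = {
    "maintain_speed": {"collision_risk": 0.6, "rule_violation": 0.0, "discomfort": 0.0},
    "brake":          {"collision_risk": 0.1, "rule_violation": 0.0, "discomfort": 0.4},
    "steer_left":     {"collision_risk": 0.3, "rule_violation": 0.2, "discomfort": 0.2},
}

# Safety dominates; traffic rules matter more than passenger comfort.
weights = {"collision_risk": 10.0, "rule_violation": 3.0, "discomfort": 1.0}

def total_cost(costs: dict) -> float:
    return sum(weights[k] * v for k, v in costs.items())

best = min(actions, key=lambda a: total_cost(actions[a]))
print(best)  # → brake (cost 1.4, vs 6.0 to maintain speed and 3.8 to steer)
```

The hard part in practice is not the arithmetic but choosing the costs and weights so the car behaves sensibly across every situation it meets.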
Why is it so hard?
The so-called "long tail" of unusual situations is the real challenge. A system can handle 99% of driving scenarios reliably. The problem is that the remaining 1% — a traffic warden in an unexpected position, collapsed scaffolding, a child's balloon drifting across the road — is endless and unpredictable. Humans deal with novelty using general intelligence and common sense; current AI systems struggle with it. That is why truly driverless cars remain confined to specific, well-mapped cities, and why "full self-driving" has been promised and then delayed for over a decade.