
How do self-driving cars “see”? – Sajan Saini

It’s late, pitch dark, and a self-driving car winds down a narrow country road. Suddenly, three hazards appear at the same time. What happens next? Before it can navigate this onslaught of obstacles, the car has to detect them, gleaning enough information about their size, shape, and position so that its control algorithms can plot the safest course. With no human at the wheel, the car needs smart eyes: sensors that’ll resolve these details, no matter the environment, the weather, or how dark it is, all in a split second. That’s a tall order, but there’s a solution that partners two things: a special kind of laser-based probe called LIDAR, and a miniature version of the communications technology that keeps the internet humming, called integrated photonics.

To understand LIDAR, it helps to start with a related technology: radar. In aviation, radar antennas launch pulses of radio or microwaves at planes to learn their locations by timing how long the beams take to bounce back. That’s a limited way of seeing, though, because the large beam size can’t visualize fine details. In contrast, a self-driving car’s LIDAR system, which stands for Light Detection and Ranging, uses a narrow, invisible infrared laser. It can image features as small as the button on a pedestrian’s shirt across the street.
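Both radar and LIDAR ranging rest on the same time-of-flight arithmetic: the round-trip time of a reflected pulse, multiplied by the speed of light and halved, gives the distance to the target. Here is a minimal sketch of that relation in Python; the function name and sample timing are illustrative, not taken from any real LIDAR API:

```python
# Time-of-flight ranging: distance from the round-trip time of a pulse.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(round_trip_seconds: float) -> float:
    """Distance to a target, given how long a pulse took to bounce back.

    The pulse travels out and back, so the one-way distance is half
    the total path: d = c * t / 2.
    """
    return SPEED_OF_LIGHT * round_trip_seconds / 2

# A return after ~200 nanoseconds puts the target about 30 meters away.
print(f"{distance_from_round_trip(200e-9):.1f} m")  # -> 30.0 m
```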
But how do we determine the shape, or depth, of these features? LIDAR fires a train of super-short laser pulses to give depth resolution. Take the moose on the country road. As the car drives by, one LIDAR pulse scatters off the base of its antlers, while the next may travel to the tip of one antler before bouncing back. Measuring how much longer the second pulse takes to return provides data about the antler’s shape. With a lot of short pulses, a LIDAR system quickly renders a detailed profile.
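The antler example boils down to differencing two time-of-flight measurements. A hedged sketch, reusing the same c·t/2 relation; the timing numbers are invented for illustration:

```python
# Depth profiling: the difference between two pulse round-trip times
# reveals how much deeper one surface sits than another.
SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def depth_difference(t1_seconds: float, t2_seconds: float) -> float:
    """Extra depth of the second surface relative to the first, in meters."""
    return SPEED_OF_LIGHT * (t2_seconds - t1_seconds) / 2

# If the pulse off the antler tip returns ~3.3 nanoseconds later than
# the pulse off the antler base, the tip is about half a meter farther away.
print(f"{depth_difference(0.0, 3.3e-9):.2f} m")  # -> 0.49 m
```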
The most obvious way to create a pulse of light is to switch a laser on and off. But this makes a laser unstable and affects the precise timing of its pulses, which limits depth resolution. Better to leave it on, and use something else to block the light periodically, reliably, and rapidly.
That’s where integrated photonics comes in. The digital data of the internet is carried by precision-timed pulses of light, some as short as a hundred picoseconds. One way to create these pulses is with a Mach-Zehnder modulator. This device takes advantage of a particular wave property called interference. Imagine dropping pebbles into a pond: as the ripples spread and overlap, a pattern forms. In some places, wave peaks add up to become very large; in other places, they completely cancel out. The Mach-Zehnder modulator does something similar. It splits waves of light along two parallel arms and eventually rejoins them. If the light is slowed down and delayed in one arm, the waves recombine out of sync and cancel, blocking the light. By toggling this delay in one arm, the modulator acts like an on/off switch, emitting pulses of light.
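In idealized form, the modulator’s output follows from the phase difference between the two arms: equal paths interfere constructively and pass the light, while a half-wavelength delay cancels it. A rough sketch of that textbook relation, not a model of any particular device:

```python
import math

def mzm_transmission(phase_delay_radians: float) -> float:
    """Fraction of input light that exits an ideal Mach-Zehnder modulator.

    Light is split equally between two arms and recombined; the relative
    phase delay between the arms sets the interference:
        T = cos^2(delta_phi / 2)
    Zero delay -> fully on; a pi-radian (half-wave) delay -> fully off.
    """
    return math.cos(phase_delay_radians / 2) ** 2

print(mzm_transmission(0.0))      # 1.0  -> switch "on", light passes
print(mzm_transmission(math.pi))  # ~0.0 -> switch "off", light blocked
```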
A light pulse lasting a hundred picoseconds leads to a depth resolution of a few centimeters, but tomorrow’s cars will need to see better than that. By pairing the modulator with a super-sensitive, fast-acting light detector, the resolution can be refined to a millimeter. That’s more than a hundred times better than what we can make out with 20/20 vision from across a street.
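That few-centimeter figure follows straight from the speed of light: the depth uncertainty is roughly the distance light travels during the pulse, halved for the round trip:

```latex
\Delta d = \frac{c\,\tau}{2}
         = \frac{(3\times10^{8}\,\mathrm{m/s})\,(100\times10^{-12}\,\mathrm{s})}{2}
         = 1.5\,\mathrm{cm}
```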
The first generation of automobile LIDAR has relied on complex spinning assemblies that scan from rooftops or hoods. With integrated photonics, modulators and detectors are being shrunk to less than a tenth of a millimeter and packed into tiny chips that’ll one day fit inside a car’s lights.

These chips will also include a clever variation on the modulator to help do away with moving parts and scan at rapid speeds. By slowing the light in a modulator arm only a tiny bit, this additional device will act more like a dimmer than an on/off switch. If an array of many such arms, each with a tiny controlled delay, is stacked in parallel, something novel can be designed: a steerable laser beam.
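Such an arrangement is an optical phased array, and the steering geometry is the same as in phased-array radar: a constant phase step between neighboring emitters tilts the outgoing wavefront. A hedged sketch of the idealized far-field formula; the emitter spacing and wavelength values are illustrative:

```python
import math

def steering_angle_degrees(phase_step_radians: float,
                           spacing_m: float,
                           wavelength_m: float) -> float:
    """Beam-steering angle of an ideal optical phased array.

    A constant phase step between neighboring emitters tilts the
    combined wavefront by an angle theta, where
        sin(theta) = phase_step * wavelength / (2 * pi * spacing)
    """
    s = phase_step_radians * wavelength_m / (2 * math.pi * spacing_m)
    return math.degrees(math.asin(s))

# Emitters 2 micrometers apart at a 1550 nm infrared wavelength:
# a quarter-cycle phase step steers the beam by about 11 degrees.
print(f"{steering_angle_degrees(math.pi / 2, 2e-6, 1550e-9):.1f} deg")
```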
From their new vantage, these smart eyes will probe and see more thoroughly than anything nature could’ve imagined, and help navigate any number of obstacles. All without anyone breaking a sweat, except for maybe one disoriented moose.


100 thoughts on “How do self-driving cars “see”? – Sajan Saini”

  1. I'm actually a team leader for a self-driving racing car competition in the UK, 'FS-AI'. This is a top-quality video; however, LiDAR hasn't quite been explained perfectly, because even the best LiDAR sensors only see a very limited slice of the world. Whilst we have LiDAR, it's kinda useless: the limited slice that the LiDAR can see makes it very difficult for object recognition, as there simply isn't enough data coming from the LiDAR to make any reasonable assumption about the object. We use the Robosense V16 LiDAR, which, despite its high price tag, is simply not usable. To get better resolution you have to pay vast amounts of money, even though they fundamentally have the same issue. What we use instead is stereo vision (Zedd Camera): you're able to make better object detections using each image from the camera, and using machine vision and the principles of the Fundamental Matrix, we're able to rebuild a higher-quality map of the world.

  2. “Lidar is a fool’s errand. Anyone relying on lidar is doomed. Doomed! They are expensive sensors that are unnecessary. It’s like having a whole bunch of expensive appendices. Like, one appendix is bad, well now you have a whole bunch of them, it’s ridiculous, you’ll see.”

  3. But it shouldn’t even be a car then. It should just be something without a wheel that also takes you from point A to point B.

  4. So, why do the cars need such fine vision resolution? If something is smaller than a few centimeters, it can probably be ignored.

  5. this is great and all, but what if the moose has a panic attack, reacting faster than the car, if you know what I mean? You've seen the pictures where a car and a moose become one. If a car is going to drive, it does have a feature to honk its own horn, but it should have done that first. Nature is unpredictable, living or not; you can't be certain that the moose will stay. I know this is part of a thought experiment, but still. God bless everyone and have a wonderful day

  6. 1:53 I would suggest thickening the beam, and using a color other than green to represent light bouncing back. The green beam is harder to see because it is against a blue background, and it practically disappears if viewed on the lowest resolution, 144p.

  7. Yeah, LiDAR is useless; a camera is better in this scenario, and using radar is far more important for seeing in the invisible photon range, unlike LiDAR, which is in a visible photon range. If you allow the car to see better than us, then we will be safe using them. If you really believe in redundancy, you might include the LiDAR as a backup vision system, but RADAR is the best for almost any spatial measuring and detection system, and radar also has less interference from the sun than lasers.

  8. Honestly, I don’t think humans need driver-less cars. I think we’re fine the way we are, and that research for this idea is a waste. No offense to anyone, that’s just my opinion.

  9. How did I drive to work today? I used my eyes as visual sensors and learned intelligence. Tesla is doing this with Autopilot and AI. With AI, the only thing standing between us and perfect driving is time.

  10. I like how this video completely ignores the contribution of Machine Learning, Computer Vision and Automatic Control in self-driving cars and only tells the story on the sensor level. It's like talking about human vision on the eye level and completely ignoring the brain. But what do you expect from a photonics guy….

  11. Didn't Elon Musk say in his last presentation that LIDAR is obsolete and that every company that sticks with it will fail?

  12. Ted-Ed describes things with so much detail, in a way that no other science YouTube channel does. Thank you for that. You are an amazing teacher and educator

  13. The way Ted-Ed describes science to me is like Hank Pym explaining quantum physics to Antman. All those fancy words

  14. 1:53 That moose just shot a laser out of its eye… Well… Humanity's dead
    P.S. I know that's not what's actually happening

  15. Collecting a cloud of several million precisely positioned 3D points is kind of easy. Figuring out the actual object from that cloud, now THAT is a challenge!

  16. The man's voice is handsome. I'm not a native English speaker, so I don't know the different kinds of accents, but this voice is very elegant.

  17. For the same reasons, and the cost, Tesla, a leader in self-driving, is NOT using LIDAR, but radar and camera footage combined

  18. 00:27 when the hazards were dark red, I thought they were blood stains and that the car had just crushed three people or something

  19. Ok, but to respond to a traffic light, the car needs to ACTUALLY see, not just measure distance. How does that work?

  20. So the proposed LIDAR system would be similar to phased array radar. Wicked. I've read on a few sites that they're trying to make phased arrays much smaller, as well as SAR (Synthetic Aperture Radar) for similar applications, and also a wearable vision system for people with vision issues, proposed for years down the road and inspired by the visor worn by the Star Trek: The Next Generation character Geordi La Forge. Gotta love remote sensing tech!!

  21. My English teacher told me that we're supposed to use double inverted commas only for direct speech

  22. Ignores that the only production self-driving cars on the road, hundreds of thousands of them, don't use lidar…

  23. Great job; you have placed a great deal of emphasis on the sensor part of a self-driving car. However, you did not mention at all the decision-making mechanism, which is, in my opinion, the real challenge of self-driving. I think a video on deep learning would be highly appreciated.

  24. My English is bad, so… I hope you guys can understand my meaning.
    I have a question: if a super-short laser pulse from one car goes into another car's LIDAR system, will that cause a problem?

  25. All you really had to show was a car and a bunch of people literally burning in a big fire because that is exactly what happens.

  26. When it snows… nothing goes… and when these self-driving cars get into accidents, who is going to be legally responsible? This technology is never going to take over, as it cannot do certain things, like work in snow. When the sensors are covered with snow or ice, my adaptive cruise does not work. The auto wipers do not work. The bright-light auto control does not work. The lane detection warnings do not work. Heavy rain also has its effects, rendering much of this supposedly advanced technology worthless.
