LIDAR Perception Challenges

March 09, 2020 // By Sarven Ipek, Analog Devices
A successful autonomous vehicle will need a tightly integrated system of sensors to replicate the driving abilities of a human being. The typical human driver uses two eyes, two ears, and the tactile feedback of the car, processing all of this information in real time against a vast database of driving experience. The sensors required to replicate human driving include radar, LIDAR, cameras, inertial measurement units (IMUs), and ultrasonic sensors. Each of these systems has its strengths, but also its own blind spots.

It is highly unlikely any one sensor will ever be refined enough to negate the need for the others. In this article, we will look at the top-level design considerations for LIDAR, a sensor that will provide significant data to any autonomous driving solution.

Figure 1. Spider chart comparing vision, radar, and LIDAR.

LIDAR is a close partner of radar in an autonomous vehicle. Both of these technologies operate without visible light, which is crucial for night driving or low light conditions. Radar is good for long distance detection and tracking, while LIDAR provides higher angular resolution allowing for object recognition and classification. Put another way, radar is good for detecting there might be something out there, while LIDAR can tell you more about that something once the radar finds it.
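To make the angular resolution advantage concrete, consider a rough illustration (the specific numbers below are assumptions for illustration, not taken from the article): a time-of-flight LIDAR estimates distance from the round-trip travel time of a laser pulse, and the lateral spacing between adjacent measurement points grows linearly with range, so a finer angular step directly translates into a denser point cloud on a distant object.

```python
# Illustrative sketch: time-of-flight ranging and the effect of
# angular resolution on point spacing at distance. The 0.1 deg and
# 1 deg figures below are assumed, typical-order-of-magnitude values.
import math

C = 299_792_458.0  # speed of light, m/s

def range_from_tof(round_trip_s: float) -> float:
    """Distance to the target from the round-trip pulse travel time."""
    return C * round_trip_s / 2.0

def point_spacing(range_m: float, angular_res_deg: float) -> float:
    """Lateral spacing between adjacent beams at a given range."""
    return range_m * math.radians(angular_res_deg)

# A pulse returning after ~667 ns corresponds to a target ~100 m away.
print(range_from_tof(667e-9))          # ~100 m
# At 100 m, a 0.1 deg beam step spaces points ~0.17 m apart,
# while a ~1 deg step would space them ~1.7 m apart.
print(point_spacing(100.0, 0.1))
print(point_spacing(100.0, 1.0))
```

The order-of-magnitude gap in point spacing is what lets LIDAR classify an object that radar can only flag as present.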

Figure 2. LIDAR perception for autonomous vehicles.

There are technical challenges when designing a LIDAR system, the most obvious being staying below the eye safety limits for near-infrared wavelengths outlined in IEC 60825-1. Setting eye safety aside in what follows is not to diminish its importance; all of the aspects discussed here feed into decisions that affect it. There are many different LIDAR system topologies, with varying degrees of design complexity, each with its own advantages and disadvantages.

At the core, all designs have the same fundamental aspects that need attention. Let's focus on the considerations other than eye safety that affect system design: maximizing signal-to-noise ratio (SNR), detection requirements, field of view, thermal considerations, power consumption, and dead reckoning.
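Of these, maximizing SNR is tightly coupled to detection range. As a hedged illustration (this is a textbook simplification, not a formula from the article): for an extended Lambertian target that fills the beam, the optical return power falls off roughly as 1/R², so doubling the required detection range costs roughly 6 dB of signal, which must be recovered through transmit power, receive aperture, or averaging, all within the eye safety budget.

```python
# Simplified link-budget sketch (illustrative assumptions, not from the
# article): return power vs. range for a beam-filling Lambertian target,
# P_rx ~ P_tx * rho * eta * A_rx / (pi * R^2). All parameter values
# (100 W peak, 10% reflectivity, 25 mm aperture) are assumed.
import math

def received_power(p_tx_w: float, range_m: float, reflectivity: float,
                   aperture_d_m: float, efficiency: float = 0.9) -> float:
    """Approximate optical return power for an extended Lambertian target."""
    a_rx = math.pi * (aperture_d_m / 2.0) ** 2
    return p_tx_w * reflectivity * efficiency * a_rx / (math.pi * range_m ** 2)

p_50m = received_power(100.0, 50.0, 0.1, 0.025)
p_100m = received_power(100.0, 100.0, 0.1, 0.025)
# Doubling the range quarters the return power: about 6 dB.
print(10 * math.log10(p_50m / p_100m))  # ~6.0 dB
```

This 1/R² scaling is why detection requirements (range, target reflectivity) drive so many downstream choices in laser power, optics, and detector sensitivity.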
