Agile sensor technology may surpass lidar

December 11, 2017 // By Christoph Hammerschmidt
Robotic perception startup AEye (Pleasanton, CA) has demonstrated what it calls a new form of intelligent data collection through optical sensors: iDAR. The acronym is not a typo; it stands for “Intelligent Detection and Ranging,” and the technology is designed to enable rapid, dynamic perception and path planning for autonomous vehicles and robots.

The technology combines an “agile” micro-opto-electro-mechanical systems (MOEMS) LiDAR, pre-fused with a low-light camera and embedded artificial intelligence, creating software-definable and extensible hardware that can dynamically adapt to real-time demands. iDAR promises to deliver higher accuracy and longer range than existing lidar designs, enabling improved autonomous vehicle safety and performance at a reduced cost.

According to AEye, a shortcoming of traditional LiDAR is that most systems oversample less important information, such as the sky, the road surface, and trees, while undersampling critical information such as a fast-approaching vehicle. They then have to spend significant processing power and time extracting critical objects like pedestrians, cyclists, cars, and animals. AEye’s iDAR technology mimics how a human’s visual cortex focuses on and evaluates potential driving hazards: it uses a distributed architecture and at-the-edge processing to dynamically track targets and objects of interest, while continuously assessing the general surroundings.
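The adaptive-sampling idea can be illustrated with a small, hypothetical sketch, which is not AEye’s actual implementation: a fixed per-frame point budget is distributed across scene regions in proportion to a priority score, so a detected pedestrian or fast-approaching car receives far denser coverage than the sky or open road, while a minimum share keeps the whole scene under observation. All region names, priority values, and the point budget below are illustrative assumptions.

```python
# Conceptual sketch of priority-weighted scan allocation (illustrative only;
# not AEye's implementation). Each scene region receives a share of a fixed
# per-frame LiDAR point budget proportional to its priority score.

from dataclasses import dataclass


@dataclass
class Region:
    name: str        # hypothetical label, e.g. from a camera-based classifier
    priority: float  # higher = more safety-critical


def allocate_points(regions, budget=100_000, floor=0.01):
    """Split a per-frame point budget across regions by priority.

    A small 'floor' share keeps every region minimally sampled, so the
    general surroundings are still assessed on every frame.
    """
    total = sum(max(r.priority, floor) for r in regions)
    return {
        r.name: round(budget * max(r.priority, floor) / total)
        for r in regions
    }


if __name__ == "__main__":
    scene = [
        Region("sky", 0.02),
        Region("road_surface", 0.10),
        Region("roadside_trees", 0.05),
        Region("fast_approaching_vehicle", 1.00),
        Region("pedestrian_near_curb", 0.90),
    ]
    for name, pts in allocate_points(scene).items():
        print(f"{name}: {pts} points this frame")
```

In this toy model the fast-approaching vehicle and the pedestrian absorb most of the budget, while the sky and trees are revisited only sparsely, mirroring the foveation behavior the article describes.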
