Californian startup AEye revisits LiDARs with AI: Page 2 of 3

January 02, 2018 // By Julien Happich
Forget about brute-force high resolution LiDARs that scan everything under the sun as fast as they can, churning out 1's and 0's for processors to make sense of gigabytes of indiscriminately collected data. Californian startup AEye Inc. promises to change the LiDAR landscape with what it describes as a 2D colour and 3D perception engine, dubbed iDAR (Intelligent Detection and Ranging).

Today's latency-laden serial path planning loop.

"What is important to understand is that we call it agile LiDAR because it is a random access LiDAR. By software we can drive any type of scanning pattern you want, the system then figures out what is the fastest way to do it. You can control 20 parameters completely by software so the scanning patterns can adapt to the environment as it changes. At high speed on the highway, that may be long range and situational awareness. In a city, that may be all around awareness. For us, it is just a software switch, this is a big departure from what other startups are doing" explained Dussan.

The CEO also envisages the iDAR platform serving as the primary sensor, augmented by independent views from other sensor inputs such as radar and ultrasonic sensors.

"Because we have introduced AI at the edge, the iDAR can intelligently decide what it needs to do. That makes huge savings in processing power, system cost and it operates with a reduced data bandwidth. In comparison, today's LiDARs rely on a fixed path planning route, sending data to a central processing unit, that's not fast enough to perform decision analytics.

AEye's iDAR path planning loop.

"We are solving the main problem of our customers, which is that LiDARs are either under-sampling or over-sampling. How do you get the right perception of a dynamic environment if you are having to scan the sky at the same resolution as a small brick or an obstacle on the road ahead? This is a very inefficient use of photons, lots of 1's and 0's that are useless. In the end, 80 to 97% of the data is thrown away because it cannot be processed within the time available.

"We are the first to fix that problem: the camera guides the LiDAR where to look, and the LiDAR can find objects that the camera can't always see," continued the CEO.
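The camera-guided sampling the CEO describes can be illustrated as a budget-allocation problem: instead of scanning every direction at uniform density, regions flagged as interesting by the camera receive a larger share of the per-frame shot budget. This is a hypothetical sketch; the region names, weights, and function below are invented for illustration and are not AEye's implementation.

```python
def allocate_shots(total_shots: int,
                   region_weights: dict[str, float]) -> dict[str, int]:
    """Split a fixed per-frame shot budget across regions in
    proportion to their interest weight, so low-value regions
    (e.g. the sky) are not sampled at full density."""
    total_w = sum(region_weights.values())
    return {region: int(total_shots * w / total_w)
            for region, w in region_weights.items()}

# Camera perception flags the road and a detected obstacle as
# high-interest; the sky gets only a sparse "watchdog" share.
weights = {"sky": 0.05, "road": 0.45, "obstacle": 0.50}
budget = allocate_shots(100_000, weights)
```

Under this scheme the same photon budget yields dense returns where decisions are made and only sparse coverage elsewhere, which is one way to read the article's claim that uniform scanning wastes 80 to 97% of the collected data.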
