Thermal is the missing link
Events like last year’s tragic Uber accident in Arizona show how challenging it is for AV systems to “see” and react to pedestrians in all conditions, whether on a dark country road or in a cluttered city environment, and especially in inclement weather such as thick fog or blinding sun glare. In these uncommon but very real scenarios, thermal cameras can be the most effective sensor for quickly identifying and classifying potential hazards both near and far, helping the vehicle react accordingly.
For visible cameras in particular, classification is challenging in poor lighting, nighttime driving, sun glare, and poor weather (Fig. 2). Because thermal sensors detect longer wavelengths of the electromagnetic spectrum than visible cameras, they work equally well in darkness and daylight, detecting and reliably classifying potential road hazards such as vehicles, people, bicyclists, animals, and other objects at distances of up to 200 meters.
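The physics behind this wavelength advantage can be sketched with Wien’s displacement law, which relates an object’s temperature to its peak emission wavelength. The constants below are standard physics values, not figures from this article, and the snippet is only an illustration of why body heat falls in the band thermal cameras sense:

```python
# Wien's displacement law: a blackbody at temperature T emits most
# strongly at lambda_max = b / T, where b ~= 2898 um*K.
WIEN_B_UM_K = 2898.0  # Wien's displacement constant, micrometer-kelvin

def peak_wavelength_um(temp_kelvin: float) -> float:
    """Peak blackbody emission wavelength, in micrometers."""
    return WIEN_B_UM_K / temp_kelvin

# Human skin (~305 K) radiates most strongly near 9.5 um -- squarely
# inside the 8-14 um long-wave infrared band that thermal cameras
# detect, and far beyond the ~0.4-0.7 um range of visible cameras.
print(round(peak_wavelength_um(305.0), 1))  # -> 9.5
```

Because that 9.5 µm emission is produced by the pedestrian’s own heat rather than by reflected sunlight or headlights, the signal is present day and night.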
Furthermore, thermal cameras provide data that is redundant with, yet independent of, visible-camera, LiDAR, and radar systems. For example, a radar or LiDAR return from a pedestrian can be masked by a nearby vehicle’s competing signal or by other objects in a cluttered environment. If a pedestrian is crossing between two cars or is partially obstructed by foliage, there may be little to no reflected signal with which to detect the pedestrian, or the reflected data may confuse the perception systems that inform the vehicle’s movements.
In such cases, thermal cameras can see through light foliage by detecting the heat of a person or animal against the cooler surrounding environment. This distinct advantage, combined with machine-learning classification, makes a person or animal stand out from the background so the vehicle can behave appropriately (Fig. 3).
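The idea of a warm body “standing out” from the background can be sketched with a toy example. The scene, temperatures, and threshold rule below are all hypothetical illustrations, not the article’s actual pipeline; production systems layer machine-learning classifiers on top of this kind of raw thermal contrast:

```python
import numpy as np

# Hypothetical 8x8 "thermal image" in kelvin: a cool background with a
# person-sized warm patch. In a real sensor each pixel encodes radiance.
scene = np.full((8, 8), 290.0)   # background at ~290 K (17 C)
scene[2:5, 3:5] = 305.0          # warm region at body temperature

# Simple statistical cutoff: pixels well above the scene average are
# candidate warm bodies. Real systems would classify these regions.
threshold = scene.mean() + 2 * scene.std()
mask = scene > threshold

print(int(mask.sum()))  # -> 6 warm pixels detected
```

Even this crude threshold isolates the warm patch because the pedestrian’s heat signature, unlike a LiDAR or radar return, does not depend on a clear reflected path back to the sensor.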