After eyes, cars get ears for the environment

Modern cars assess their environment with "sense organs" such as radar, lidar or cameras. In the future, they may also have a sense of hearing. Researchers at the Fraunhofer Institute for Digital Media Technology IDMT in Oldenburg (Germany) have developed prototypes for detecting external sounds such as sirens.
By Christoph Hammerschmidt


Driver assistance systems in today’s cars take the strain off the driver. Camera, lidar and radar detect relevant objects in the environment, acting as eyes, so to speak. What automobiles still lack is the sense of hearing – systems that can perceive and classify external sounds. In the future, such “ears” will form the basis for autonomous driving in combination with intelligent radar and camera sensor technology. To this end, researchers at the Fraunhofer IDMT in Oldenburg are developing AI-based technologies for acoustic event recognition.

External acoustic perception systems do not yet exist for autonomous vehicles, despite their high application potential. Such a system could, for example, signal within a fraction of a second that an emergency vehicle is approaching with its siren on, so that the autonomous vehicle knows it must take evasive action. There are numerous other scenarios in which an acoustic early warning system would be helpful: when entering residential streets where children may be playing, for instance, or for detecting dangerous situations or faults. Beyond that, such a system could monitor the condition of the vehicle or serve as an emergency call point via speech recognition.

The challenges in developing an acoustic sensory organ for vehicles include optimal signal capture through sensor positioning, signal pre-processing and enhancement, and noise reduction. Special beamforming algorithms developed at Fraunhofer IDMT enable the dynamic localization of moving sound sources, such as the siren of an approaching emergency vehicle. IDMT’s event detectors are trained in advance, using machine learning methods, on the acoustic signatures of the relevant sounds; dedicated acoustic libraries were compiled for this purpose. The result is intelligent sensor platforms with effective detection performance. AI-based algorithms for audio analysis separate interfering noise from the target sounds. “We apply methods of machine learning. We train our algorithms with a wide variety of previously collected sounds,” says Danilo Hollosi, group leader for acoustic event detection at Fraunhofer IDMT. The first prototypes have already been realized together with industrial partners and should be ready for the market by the middle of the next decade.
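The dynamic localization the researchers describe is, at its core, a beamforming problem: the same sound arrives at each microphone with a slightly different delay, and scanning candidate directions for the delay pattern that best aligns the signals reveals where the source is. The article does not disclose IDMT’s algorithms, so the following is only a minimal delay-and-sum sketch for a linear far-field microphone array; the function name, array geometry and parameters are illustrative assumptions, not the institute’s implementation:

```python
import numpy as np

def delay_and_sum_doa(signals, mic_spacing, fs, c=343.0,
                      angles=np.linspace(-90.0, 90.0, 181)):
    """Estimate the direction of arrival (degrees) of a sound source.

    signals: array of shape (n_mics, n_samples) from a uniform linear array.
    For each candidate angle, the per-microphone delays are compensated and
    the signals summed; the steering angle with the highest output power
    is the best alignment, i.e. the estimated source direction.
    """
    n_mics, _ = signals.shape
    powers = []
    for ang in angles:
        # Inter-microphone delay for a plane wave from this angle (seconds)
        tau = mic_spacing * np.sin(np.deg2rad(ang)) / c
        shifts = np.round(np.arange(n_mics) * tau * fs).astype(int)
        # Undo the propagation delay at each microphone, then average
        aligned = np.stack([np.roll(signals[m], -shifts[m])
                            for m in range(n_mics)])
        beam = aligned.mean(axis=0)
        powers.append(np.mean(beam ** 2))
    return float(angles[int(np.argmax(powers))])
```

In practice, tracking a moving siren means repeating this estimate over short time frames and smoothing the resulting angle trajectory; production systems also use fractional delays and more robust estimators than raw output power.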

The acoustic sensor technology developed by the IDMT researchers consists of encapsulated microphones, a control unit and software. Mounted on the outside of the vehicle, the microphones pick up airborne sound. The sensors transmit the audio data to an ECU, where it is processed into relevant metadata.
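The article does not detail how the ECU turns raw samples into “relevant metadata”, but the step can be pictured as frame-wise analysis that emits compact event records instead of audio. The sketch below is a deliberately simplified stand-in: it flags frames whose level and dominant frequency fall in a plausible siren band. The thresholds, band limits and record fields are assumptions for illustration, not IDMT parameters:

```python
import numpy as np

def frames_to_metadata(audio, fs, frame_len=1024,
                       level_db=-30.0, band=(600.0, 1600.0)):
    """Turn raw microphone samples into a list of event metadata records.

    Each non-overlapping frame is reduced to a level (dB) and a dominant
    frequency; frames that are loud enough and whose peak lies in the
    assumed siren band become 'siren_candidate' events.
    """
    events = []
    freqs = np.fft.rfftfreq(frame_len, 1.0 / fs)
    window = np.hanning(frame_len)
    for start in range(0, len(audio) - frame_len + 1, frame_len):
        frame = audio[start:start + frame_len]
        rms = np.sqrt(np.mean(frame ** 2))
        db = 20.0 * np.log10(rms + 1e-12)          # level in dBFS
        spectrum = np.abs(np.fft.rfft(frame * window))
        peak_hz = float(freqs[int(np.argmax(spectrum))])
        if db > level_db and band[0] <= peak_hz <= band[1]:
            events.append({"t": start / fs,
                           "label": "siren_candidate",
                           "f0_hz": round(peak_hz, 1),
                           "level_db": round(db, 1)})
    return events
```

A real detector would of course replace the hand-set band with a trained classifier, as the article describes; the point here is only the data flow from audio frames to lightweight metadata that downstream driving functions can consume.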

The research team’s computer-based event detection methods can also be used, in adapted variants, in other sectors and markets, for example for quality assurance in industrial production. Here, battery-operated intelligent acoustic sensors process audio signals from machines and plants. From this information, which is transmitted wirelessly to an evaluation unit, conclusions can be drawn about the condition of the production equipment, and possible damage can be avoided. Automatic speech recognizers enable contactless documentation systems for professional applications, for example in turbine maintenance.


Related articles:

Satellite cameras provide more visibility around the car

VCSEL array as core component for Ibeo’s solid-state lidar

