The Fraunhofer Institute for Digital Media Technology IDMT will be presenting a car that can hear at the IAA MOBILITY trade show in Munich. The researchers have developed a prototype that incorporates acoustic sensors into the vehicle technology, thereby enhancing safety and reliability on the road.
Present-day vehicles are equipped with various driver assistance systems, including cameras, lidar and radar, to help with parking and staying in their lane. They act as the car’s “eyes,” registering relevant objects in its surroundings. But what vehicles have lacked so far are ears. “Being able to perceive exterior sounds and attribute them accurately is a crucial part of attentively observing the full traffic environment. After all, many situations on the road are preceded by an acoustic signal. Take an approaching emergency vehicle, for example, which alerts people to its presence by using a siren,” explains Moritz Brandes, who leads The Hearing Car project at Fraunhofer IDMT.
Essential to autonomous driving: acoustic event recognition
At the Oldenburg Branch for Hearing, Speech and Audio Technology HSA, a team of researchers headed by Brandes works closely with automotive manufacturers and suppliers on the sensor and analysis technologies needed for a vehicle that can hear. For their research, they use a special vehicle equipped with a measurement system from Fraunhofer IDMT. The vehicle acts as a rolling demo platform, making it possible to collect important training data.
In the future, the acoustic environment analysis feature will be able to detect not only ambulances but also other sounds, such as human voices or the sounds of children playing, as the car turns into a traffic-calmed area. Unlike optical systems, acoustic sensing does not require a clear line of sight, so the car can literally hear what is coming around the corner. This will allow automated driving systems to respond and operate with increased caution, much like a human driver who hears children playing before they can be seen. In addition, exterior noises will be transmitted into the vehicle interior via the headrest during certain driving maneuvers to call the driver’s attention to important sounds in the surrounding area.
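To make the idea of acoustic event recognition more concrete, the following minimal Python sketch shows one simple way such a detector could work: track the dominant frequency of the exterior microphone signal over short frames and flag a siren-like, periodically sweeping tone. This is an illustrative heuristic only, not Fraunhofer IDMT’s actual algorithm; the frequency band, thresholds and frame sizes are assumptions.

# Illustrative sketch (not the actual Fraunhofer IDMT algorithm): a very
# simple siren detector that tracks the dominant frequency per frame and
# checks whether it sweeps within a typical siren band.
import numpy as np

def dominant_freq_track(signal, sr, frame=2048, hop=512):
    """Return the dominant frequency (Hz) of each analysis frame."""
    freqs = np.fft.rfftfreq(frame, d=1.0 / sr)
    window = np.hanning(frame)
    track = []
    for start in range(0, len(signal) - frame, hop):
        spectrum = np.abs(np.fft.rfft(signal[start:start + frame] * window))
        track.append(freqs[np.argmax(spectrum)])
    return np.array(track)

def looks_like_siren(signal, sr, band=(500.0, 1800.0)):
    """Heuristic: dominant tone stays in the siren band and keeps moving (sweeping)."""
    track = dominant_freq_track(signal, sr)
    in_band = np.mean((track > band[0]) & (track < band[1]))
    sweeping = np.std(np.diff(track)) > 20.0  # a static tone is not a siren sweep
    return in_band > 0.8 and sweeping

A production system would of course use trained classifiers on many sound categories rather than a single hand-tuned rule, but the basic pipeline of framing, spectral analysis and decision logic is the same.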
Talking to the car
“Hey car, open the trunk” — people will be able to communicate with their vehicles like this and in similar ways in the future. The new speaker verification function also makes it possible to restrict interactions with the vehicle to authorized persons only. The researchers are not only developing AI algorithms for acoustic event recognition but are also working on optimal signal capture through sensor positioning, signal pre-processing, signal enhancement and noise reduction.
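As an illustration of the speaker verification idea described above, the sketch below enrolls voice embeddings for authorized drivers and accepts a spoken command only if a new utterance is similar enough to one of them. The voice_embedding() function is a stand-in invented for this example: a real system would use a trained speaker-embedding model rather than the coarse spectral statistics used here, and the similarity threshold is an assumption.

# Illustrative sketch of speaker verification by embedding similarity.
import numpy as np

def voice_embedding(audio):
    # Placeholder feature extractor: reduce the signal to 32 normalized
    # spectral band energies. A real system would use a learned model.
    spectrum = np.abs(np.fft.rfft(audio))
    bands = np.array_split(spectrum, 32)
    vec = np.array([band.mean() for band in bands])
    return vec / (np.linalg.norm(vec) + 1e-9)

def is_authorized(utterance, enrolled_vecs, threshold=0.85):
    """Accept the command only if the voice matches an enrolled driver profile."""
    probe = voice_embedding(utterance)
    scores = [float(np.dot(probe, ref)) for ref in enrolled_vecs]
    return max(scores) >= threshold

The design point is that the vehicle compares voices against stored profiles of authorized users instead of merely recognizing the spoken words.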
Microphone technology that stands up to wind and weather
For these technologies to be incorporated and work together, there need to be high-quality microphones built into the vehicle’s exterior and connected to the onboard electrical system. To minimize the influence of airflow sounds, the team working on The Hearing Car is also developing and testing suitable housings and screens for sensors for airborne sounds. Brandes explains: “The number and placement of the microphones are crucial to detecting sounds from the vehicle’s environment. Our team has developed solutions that can stand up to wind and weather and work at extreme temperatures. We’ve used our demo vehicle for testing at several locations between Portugal and the Arctic Circle in order to trial the technologies in different conditions. The results are really promising, and they show the potential of our developments for the future of autonomous driving.”
Intelligent attention measurement using mobile EEG systems and personalized sound experiences with YourSound
The interior features of The Hearing Car are also impressive: several technologies are designed to unlock a new level of health monitoring and driver-state detection inside the vehicle. For example, a short-range radar sensor collects information on the driver’s vital signs, providing contactless monitoring of limb movement, respiratory rate and heart rate through innovative analysis algorithms. A mobile EEG sensor system developed by Fraunhofer IDMT-HSA measures the electrical activity in the driver’s brain to detect changes in attention levels, especially during monotonous drives. In addition, a feature that analyzes occupants’ voices detects stress and excitement and reports this information back to the occupants.
YourSound technology, a system for individual improvement of the sound inside the vehicle, ensures top-notch entertainment while in the car. The new technology offers users of audio devices such as vehicle infotainment systems a playful way to adjust the audio settings to their own listening preferences, without needing to know specific levels or frequencies. The system operates as a virtual assistant, optimizing sound reproduction and thereby improving people’s acoustic comfort.
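The sketch below illustrates the kind of preference-driven personalization described for YourSound: the listener only compares two sound settings at a time, and the equalizer gains gradually move toward the preferred one. The band names, step size and number of rounds are illustrative assumptions, not details of the actual product.

# Illustrative sketch of preference-based sound personalization: the listener
# answers "A or B?" comparisons; no knowledge of levels or frequencies needed.
import random

BANDS = ["bass", "low-mid", "mid", "high-mid", "treble"]

def propose_variant(gains, step=2.0):
    """Create a candidate setting by nudging one random band up or down (in dB)."""
    variant = dict(gains)
    band = random.choice(BANDS)
    variant[band] += random.choice([-step, step])
    return variant

def personalize(gains, ask_listener, rounds=10):
    """ask_listener(a, b) plays both settings and returns the one the listener prefers."""
    for _ in range(rounds):
        candidate = propose_variant(gains)
        gains = ask_listener(gains, candidate)
    return gains

# Example: start from a flat equalizer and refine it through listener feedback.
flat = {band: 0.0 for band in BANDS}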
The development of the technology for The Hearing Car is receiving funding through the Vorab program operated by the Ministry for Science and Culture of Lower Saxony and the Volkswagen Foundation and from the German Federal Ministry of Research, Technology and Space (BMFTR) as part of the Integral Agile E/E Development for Fusion and Standardized Power and Data Wiring Systems research project (short: KI4BoardNet).
From September 9 to 12, researchers from Fraunhofer IDMT in Oldenburg will showcase their demo vehicle The Hearing Car in Hall A2, Booth C10, at the IAA MOBILITY 2025 in Munich.
https://www.fraunhofer.de/en/press/research-news/2025/september-2025/the-hearing...
Image: High-quality microphones incorporated into the vehicle’s exterior and connected to its electrical system. © Fraunhofer IDMT/Leona Hofmann