Cheers! AI augments hearing aids with virtual microphones ‘mounted’ on the forehead
Amid the clinking, chattering and chortling of a cocktail party, wearers of hearing aids struggle to follow any one conversation. Help may be on the way in the form of hearing aids incorporating virtual sensing aided by deep learning, according to a study conducted at the University of Bern in Switzerland.
Most hearing aids and cochlear implants use several microphones situated alongside each other on the housing of the audio processor, Wilhelm Wimmer, PhD, and colleagues point out in a study published in Hearing Research.
Their experimental approach delivers a more natural sound by virtually placing microphones at points that are impractical for real hardware, such as the forehead. That placement, Wimmer and co-authors explain, is what it would take to replicate the unassisted hearing experience of people with normal hearing.
Their virtualized system estimates microphone signals using a deep neural network that doesn’t need to know how the actual microphones are arranged on real-world hearing aids and cochlear implants.
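The core idea can be sketched in code. What follows is a minimal, illustrative toy only, not the authors' actual model: it assumes a small fully connected network that maps a frame of samples from the real device microphones to one estimated sample of a "virtual" microphone (say, one notionally mounted on the forehead). The microphone count, frame length, layer sizes, and random weights are all assumptions for illustration; in the study, a trained deep network produces the virtual signal.

```python
import numpy as np

rng = np.random.default_rng(0)

N_MICS = 4   # real microphones on the device housing (assumed)
FRAME = 32   # samples per microphone per frame (assumed)
HIDDEN = 64  # hidden-layer width (assumed)

# Random weights stand in for a trained network; in practice the network
# would be trained so its output matches what a microphone at the virtual
# position would actually have recorded.
W1 = rng.standard_normal((HIDDEN, N_MICS * FRAME)) * 0.05
b1 = np.zeros(HIDDEN)
W2 = rng.standard_normal((1, HIDDEN)) * 0.05
b2 = np.zeros(1)

def estimate_virtual_sample(frames: np.ndarray) -> float:
    """frames: (N_MICS, FRAME) array of real microphone samples.

    Returns one estimated sample of the virtual microphone signal.
    """
    x = frames.reshape(-1)            # concatenate all mic channels
    h = np.maximum(W1 @ x + b1, 0.0)  # ReLU hidden layer
    return float(W2 @ h + b2)         # one virtual-mic output sample

frames = rng.standard_normal((N_MICS, FRAME))
print(estimate_virtual_sample(frames))
```

Note that the mapping takes only the raw microphone signals as input: nothing in the function depends on where the real microphones sit on the housing, which reflects the paper's point that the network need not know the physical array geometry.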
Wimmer and colleagues tested the approach with objective measures as well as a subjective listening test with 20 participants.
The team found that the virtually sensed microphone signals “significantly improved speech quality, especially in cocktail party scenarios with low signal-to-noise ratios. … [H]earing aid or cochlear implant users might benefit from virtually sensed microphone signals, especially in challenging cocktail party scenarios.”
The study is available in full for free.