Hearing Aids and Artificial Intelligence

Time to read: 10 minutes

To say that there’s been a technological revolution over the past century (and we’re talking the 21st century here) would be quite the understatement. This accelerated growth in technology can be attributed to four factors:

  • Miniaturisation of computer power
  • The advance in battery technology
  • More direct physical interaction with technology (finger-touch controls on smartphones and tablets, for example)
  • The exponential improvement in machine learning and artificial intelligence (AI)

In terms of hearing technology, hearing aids have become smaller and better at processing speech across a wide range of circumstances over the past 20 or so years. Today’s hearing aids offer more programs and user-friendly features, including Bluetooth connectivity and streaming. On top of that, rechargeable batteries offer long-lasting operation without the waste of disposable batteries.

This brings us to the fourth item on the list - Artificial Intelligence. Let’s start by giving it a definition.

Francois Chollet, AI researcher at Google and creator of the machine-learning software library Keras, has said intelligence is tied to a system's ability to adapt and improvise in a new environment, to generalise its knowledge and apply it to unfamiliar scenarios.

"Intelligence is the efficiency with which you acquire new skills at tasks you didn't previously prepare for," he said. "Intelligence is not skill itself, it's not what you can do, it's how well and how efficiently you can learn new things."

It’s no surprise if, after reading that, you have visions of HAL from 2001: A Space Odyssey or Skynet from The Terminator films.

The good news (for now) is that artificial intelligence is far from reaching the ‘self-awareness’ stage.

AI cannot invent things. It has no imagination or creativity, nor can it think outside the box to solve problems. That kind of "general intelligence" is still the province of living creatures, and human beings are masters of it. Invention, you might say, still needs its mother.

It is in the area of Narrow AI where machines excel. They are very, very good - better than people - at working within a very narrow set of parameters. Still, that’s nothing to sneeze at. Siri, Alexa and Cortana are all examples of Narrow AI. So too are those message bots that pop up on some websites to ask if they can help you.

We encounter Narrow AI everywhere. It shapes what we see on search engines, it makes movie recommendations on Netflix, it recognises our faces at passport control at airports.

Now it’s being integrated into hearing aids, and the opportunities to make your life more convenient and safer are growing just as fast as the technology itself.

Let’s take a look at hearing aid manufacturers who are promoting the use of AI in their hearing aids and associated apps.

Starkey

Starkey’s Livio Edge AI uses artificial intelligence in three ways:

Hearing performance - the first and most important job of a hearing aid. Here, artificial intelligence is used to provide superior sound quality: Starkey’s Edge Mode conducts an AI-based analysis of the environment and makes immediate adjustments designed to improve speech audibility in background noise, and even through face masks.
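
To give a rough feel for what ‘analysing the environment and making immediate adjustments’ means in practice, here’s a deliberately simplified sketch in Python. It is our illustration only, not Starkey’s Edge Mode algorithm: the environment labels, thresholds and settings are all invented for the example.

```python
import numpy as np

# Toy "environment classifier" that picks hearing aid settings from a short
# block of audio. Illustration only -- the labels, thresholds and settings
# below are invented and are not Starkey's Edge Mode.

def classify_environment(frame: np.ndarray) -> str:
    """Label a block of audio as 'quiet', 'speech_in_noise' or 'noisy'."""
    rms = np.sqrt(np.mean(frame ** 2))                                  # overall loudness
    zero_crossing_rate = np.mean(np.abs(np.diff(np.sign(frame)))) / 2   # crude "speechiness" proxy
    if rms < 0.01:
        return "quiet"
    return "speech_in_noise" if zero_crossing_rate > 0.05 else "noisy"

def settings_for(environment: str) -> dict:
    """Map the detected environment to example gain / noise-reduction settings."""
    table = {
        "quiet":           {"gain_db": 10, "noise_reduction": "off"},
        "speech_in_noise": {"gain_db": 15, "noise_reduction": "strong"},
        "noisy":           {"gain_db": 12, "noise_reduction": "moderate"},
    }
    return table[environment]

# Example: one second of simulated noise at a 16 kHz sample rate
frame = np.random.randn(16000) * 0.05
environment = classify_environment(frame)
print(environment, settings_for(environment))
```

A real hearing aid makes this kind of decision many times a second and with far more nuance; the point here is simply the pattern of classify, then adjust.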

(Value Hearing Audiologist Emma Russell looks at the Starkey Livio in this video here.)

Starkey also uses AI in what it calls ‘healthable technology’ which, through the Thrive app, features:

  • Body Tracking which tracks your daily steps, measures movement and monitors more vigorous physical activity.
  • Brain Tracking which monitors the brain-health benefits of wearing hearing aids by tracking active listening.
  • Fall Detection and Alerts, which detect when you fall and send an alert message to selected contacts (the sketch after this list shows the general idea).
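
That general idea is easier to picture with a toy example: the hearing aid’s motion sensor registers a sudden impact followed by near-stillness. The sketch below is our simplified illustration only, not Starkey’s actual detection logic, and the thresholds are invented.

```python
# Toy illustration of motion-sensor fall detection. The thresholds are invented
# for this sketch and this is not Starkey's algorithm.
def looks_like_fall(accel_magnitudes_g: list,
                    impact_threshold_g: float = 2.5,
                    stillness_threshold_g: float = 1.1) -> bool:
    """Return True if a sharp impact is followed by near-stillness (about 1 g is just gravity)."""
    for i, g in enumerate(accel_magnitudes_g):
        if g > impact_threshold_g:                          # sudden impact
            after = accel_magnitudes_g[i + 1:i + 21]        # the next ~20 readings
            if after and max(after) < stillness_threshold_g:
                return True                                 # impact, then very little movement
    return False

# Example: normal walking versus a simulated fall (a spike, then stillness)
walking = [1.0, 1.2, 0.9, 1.1] * 10
fall = [1.0, 1.1, 3.4] + [1.0] * 25
print(looks_like_fall(walking), looks_like_fall(fall))      # False True
```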

The Intelligent Assistant offers a range of helpful extras, including a Siri-like assistant that provides quick answers about your hearing aids and sets personal reminders. Another helpful feature is the ability for your clinician to make remote adjustments to your hearing aids as you’re wearing them, no matter where you are.

Other features include language translation, speech-to-text transcription, and find-my-phone and find-my-hearing-aid functions.

Widex

The AI technology in the Widex Moment range is called SoundSense Learn.

SoundSense Learn uses machine learning to personalise your hearing in the moment, calculating the best possible hearing outcome for a given situation in just 20 comparisons. And every time this AI is used, it stores the information it’s given in the cloud, so that it can improve hearing for other users of the feature.

To reach the same result by brute force, a user would have to make almost 2,500,000 comparisons.
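
Where does a number like 2,500,000 come from? One plausible reading, and this is our assumption rather than a published Widex formula, is a three-band equaliser with roughly 13 levels per band: comparing every possible setting against every other, pair by pair, lands almost exactly on that figure.

```python
from math import comb

# Illustration only: we assume a three-band equaliser with 13 levels per band,
# which is our reading of how a figure of roughly 2,500,000 could arise.
levels_per_band = 13
bands = 3

settings = levels_per_band ** bands      # 13^3 = 2,197 possible combinations
pairwise = comb(settings, 2)             # every combination compared with every other
print(settings, pairwise)                # 2197 2412306 -- roughly 2.5 million
```

An adaptive, machine-learning approach avoids that brute force because each A/B answer you give narrows the search, which is how SoundSense Learn can land on a preferred setting in around 20 comparisons.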

(Value Hearing Audiologist Emma Russell reviews Widex Moment here).

Signia

Signia also uses machine learning in its hearing aid range.

Many people with hearing loss cite the sound of their own voice as a downside to using hearing aids. Signia addressed this issue by developing Own Voice Processing (OVP) for a more natural-sounding own voice.

In order to function properly, the artificial intelligence involved in OVP has to “learn” the wearer’s voice. While machine learning could take weeks in previous hearing aid models, the voice recognition program in OVP takes mere seconds.
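
To make ‘learning the wearer’s voice’ a little more concrete, here is a heavily simplified sketch. It is our own illustration, not Signia’s OVP processing: a few seconds of enrolment speech are reduced to an average spectral fingerprint, and incoming sound is then compared against it.

```python
import numpy as np

# Simplified illustration of "own voice" enrolment and detection.
# This is a toy sketch, not Signia's OVP processing.

def spectral_fingerprint(audio: np.ndarray, frame_len: int = 512) -> np.ndarray:
    """Average magnitude spectrum over the recording, normalised to unit length."""
    frames = [audio[i:i + frame_len] for i in range(0, len(audio) - frame_len, frame_len)]
    spectra = [np.abs(np.fft.rfft(f)) for f in frames]
    average = np.mean(spectra, axis=0)
    return average / (np.linalg.norm(average) + 1e-9)

def is_own_voice(stored: np.ndarray, incoming: np.ndarray, threshold: float = 0.9) -> bool:
    """Compare incoming sound against the stored fingerprint (cosine similarity)."""
    return float(np.dot(stored, spectral_fingerprint(incoming))) > threshold

# Enrolment needs only a few seconds of speech; here we fake it with noise.
enrolment = np.random.randn(16000 * 3)            # ~3 seconds at 16 kHz
fingerprint = spectral_fingerprint(enrolment)
print(is_own_voice(fingerprint, enrolment[:16000]))
```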

Combined with the Signia Assistant on your mobile phone, the hearing aids use AI to tailor their settings to your personal preferences for even clearer sound and the best possible speech comprehension in every situation. The assistant also answers your questions about how to handle your hearing aids, so you can easily recall every detail discussed during your appointment with your hearing care professional.

Phonak

Phonak uses AutoSense OS 4.0 to personalise the listening experience in any sound environment. AutoSense OS 4.0 blends the right combination of sound processing features to support a variety of communication needs and lifestyles. New features include Speech Enhancer, Dynamic Noise Cancellation and Motion Sensor Hearing.
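
One way to picture that ‘blending’ is as a weighted mix: the more the current scene resembles a noisy conversation, the more strongly speech enhancement and noise cancellation are applied. The Python sketch below is our own rough analogy, with invented scene labels and weights; it is not Phonak’s AutoSense implementation.

```python
# Rough analogy only: blending feature strengths from soft scene probabilities.
# The scene labels, features and weights are invented for this illustration.
def blend_features(scene_probabilities: dict) -> dict:
    """Mix feature strengths in proportion to how likely each scene is."""
    feature_strengths = {
        "calm":            {"speech_enhancer": 0.2, "noise_cancellation": 0.1},
        "speech_in_noise": {"speech_enhancer": 0.9, "noise_cancellation": 0.7},
        "music":           {"speech_enhancer": 0.1, "noise_cancellation": 0.0},
    }
    blended = {"speech_enhancer": 0.0, "noise_cancellation": 0.0}
    for scene, probability in scene_probabilities.items():
        for feature, strength in feature_strengths[scene].items():
            blended[feature] += probability * strength
    return blended

# Example: the classifier thinks it's mostly a noisy conversation, partly a calm room.
print(blend_features({"calm": 0.2, "speech_in_noise": 0.7, "music": 0.1}))
```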

Oticon

Oticon has trained its Deep Neural Network (DNN) with 12 million real-life sound scenes, and describes it as its most advanced hearing aid technology ever.

The DNN has learned to recognise all types of sounds, their details, and how they should ideally sound.

This makes it much more than standard artificial intelligence software: it’s a dedicated hearing aid technology, developed for real-time operation in everyday life.
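
For readers curious what a deep neural network actually looks like in code, here is a tiny, generic example using PyTorch. It is not Oticon’s network, which is trained on 12 million real-life sound scenes and runs inside the hearing aid; it simply shows the shape of the idea: layers of learned weights that map sound features to a decision about the listening scene.

```python
import torch
import torch.nn as nn

# Generic toy DNN, not Oticon's: maps a vector of audio features
# (e.g. per-band energy levels) to scores for a handful of sound-scene classes.
class TinySceneNet(nn.Module):
    def __init__(self, n_features: int = 64, n_scenes: int = 4):
        super().__init__()
        self.layers = nn.Sequential(
            nn.Linear(n_features, 128),   # learned weights, layer 1
            nn.ReLU(),
            nn.Linear(128, 64),           # learned weights, layer 2
            nn.ReLU(),
            nn.Linear(64, n_scenes),      # one score per scene, e.g. quiet / speech / music / noise
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.layers(x)

# A real system would be trained on millions of labelled sound scenes;
# here we just push one random feature vector through untrained weights.
model = TinySceneNet()
features = torch.randn(1, 64)
print(model(features).softmax(dim=-1))
```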
