Artificial Intelligence (AI) has arrived and the ‘robots’ have much to offer people with hearing loss.
In a world where we log every step taken, navigate in real time with GPS on our phones and leave it up to algorithms to choose our music, it makes sense that something as widely useful as a hearing aid is getting a tech makeover. With millions of Australians currently experiencing some form of hearing loss, the race is on to develop sophisticated products and services to meet our complex hearing care needs.
The first hearing aids were basically just sound amplifiers. But these days manufacturers are turning to AI and machine learning to create highly advanced, customisable, almost invisible amplification systems with an ever-growing number of features and applications.
This is great for consumers, who are spoilt for choice with more options than ever before. But it’s also great for manufacturers who, since the digitisation of hearing aids in the mid-1990s, have been able to gather data about how we use them and apply it to develop the next generation of hearable technologies.
With smart speakers like Amazon Echo or Google Home, we can now ask virtually any question, play music and control our smart homes with the sound of our voices. Well, what if your hearing aids could trigger your lights to switch on in the morning, your coffee machine to start bubbling and your favourite breakfast radio to stream straight to your ears?
What if they could automatically adjust their amplification strategies to maximise hearing experiences in a wide range of noisy environments? What if they could interact with your phone, Fitbit and other devices to offer a range of interactive functions every day? Get ready, because these types of innovations are already happening.
Imagine a personal assistant in your hearing aid. Soon we’ll be able to ‘tap’ our auditory assistants awake, ask them questions, set them tasks and respond to their suggestions with a nod or shake of the head. AI will bring health benefits too, enabling our hearing aids to work in conjunction with other devices to monitor things like heart rate, body temperature, and even oxygen and glucose levels. Sensors can track physical activity, including falls, a major cause of death in seniors.
AI technology is already enabling on-the-spot speech transcription, allowing deaf users to participate in spoken conversations. Real-time language translation and conversion of sign language to text are next. Check out this video of an early phase of Hungarian company SignAll’s body sensor technology in action. US outfit KinTrans is taking this one step further, developing an avatar that can convert speech to signs to communicate back to deaf users.
The ‘robots’ aren’t just coming, they’re already here, and so far they’re pretty handy to have around. Simple AI applications like live lecture streaming from smartphones to hearing aids can have life-changing impacts, and this is just the beginning. With new, highly desirable features being rolled out constantly, hearable technologies are becoming a lot more attractive for reluctant users. As AI transforms hearing aids into multipurpose devices, stigmas evaporate and the conversation shifts from ‘what if?’ to ‘what next?’.
The Amplifon blog is our place to explore ideas and themes of interest. For professional audiology advice, please contact your local clinic for a consultation.