Helen: An AI-Powered Lipreading System for Accessibility and Enhanced Speech Recognition
Hearing-impaired people often struggle to communicate: existing hearing aids either fail to function effectively in noisy environments or are too expensive for mass adoption. Helen, an AI-based lipreader, circumvents these issues by opening up a new dimension of quasi-audio-independent communication. Building on recent advances in visual and audiovisual speech recognition, Helen uses a chain of neural networks to map sequences of lip movements to spoken words, providing a transcription of spoken content even in non-ideal scenarios. While Helen was initially created as an accessibility wearable to supplement hearing aids for the hearing impaired, it became apparent that the system could also substitute for, or supplement, audio speech recognition in noisy environments and enhance the usability of speech-to-text applications. Since debuting in April 2019, Helen has won multiple international awards, including the Global Finals of the IET PATW in London, the James Dyson Award (National Runner-up, Hong Kong), and the HKUST President's Cup research and innovation challenge, among others; its wearable form is currently patent pending.
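The abstract does not describe Helen's internals, but visual speech recognition systems of this kind commonly end with a network that emits per-video-frame character probabilities, which are then collapsed into text. As a purely illustrative sketch (not Helen's actual pipeline), the final decoding step can be shown with a minimal greedy CTC-style decode, assuming a hypothetical alphabet with the blank symbol at index 0:

```python
import numpy as np

def greedy_ctc_decode(frame_probs, alphabet, blank=0):
    """Collapse per-frame character probabilities into a transcript.

    frame_probs: (T, C) array with one row of class probabilities per
    video frame, as a lipreading network might emit.
    """
    best = frame_probs.argmax(axis=1)      # most likely class per frame
    out = []
    prev = None
    for idx in best:
        # Collapse consecutive repeats and drop blank frames,
        # the standard greedy CTC decoding rule.
        if idx != prev and idx != blank:
            out.append(alphabet[idx])
        prev = idx
    return "".join(out)

# Toy example: alphabet with the blank symbol '-' at index 0.
alphabet = ["-", "h", "i"]
probs = np.array([
    [0.1, 0.8, 0.1],   # frame 1 -> 'h'
    [0.1, 0.7, 0.2],   # frame 2 -> 'h' (repeat, collapsed)
    [0.8, 0.1, 0.1],   # frame 3 -> blank
    [0.1, 0.1, 0.8],   # frame 4 -> 'i'
])
print(greedy_ctc_decode(probs, alphabet))  # prints "hi"
```

In a real system the per-frame probabilities would come from a video encoder (e.g. convolutional and recurrent layers over mouth-region crops), and decoding would typically use a beam search with a language model rather than this greedy rule.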
Padmanabhan Krishnamurthy (Paddy) is a third-year Computer Science student and undergraduate researcher at the Hong Kong University of Science and Technology. Hailing from Bangalore, India, he has a keen interest in computer science and looks for ways to apply emergent technology to the environment, healthcare, and other aspects of life. Since moving to Hong Kong in 2017, he has focused on machine learning research and its application across multiple domains: developing neural networks to optimise taxi dispatches by profiling city routines (an effort that won his team hackUST, one of Asia's largest hackathons); leading the development of machine learning systems for quality control and driver integrity verification while interning at Uber; implementing deep learning algorithms for the early detection of cancer tumour cells as an undergraduate research project; and building marine-waste detection systems for ClearBot, an autonomous ocean-cleanup robot that placed second at the finals of the Global Grand Challenges Summit in London in 2019. Most recently, he has worked alongside his friend and batchmate, Amrutavarsh Kinagi, to create Helen, a multiple-award-winning accessibility device that uses deep learning to perform automated lipreading. Outside academia, Padmanabhan is a public speaker, an avid reader and Wodehousian, an overtly critical supporter of Liverpool FC, and a recreational player of the Mridangam (an Indian classical percussion instrument) and the piano.