How AI can empower the blind community
In this demo-based talk, Anirudh will discuss the real-life impact AI is already having on the daily lives of the blind and low-vision community. Drawing on failures and successes in converting research into product, the talk showcases a range of real-world scenarios that can benefit from both classical computer vision and deep learning-based techniques. Separating hype from reality, it also highlights open opportunities for innovation where many traditional datasets and benchmarks do not translate to in-the-wild usage beyond fancy demos. Deep learning techniques can also improve human-computer interaction, which might be the key to making these advances usable. The underlying theme is that developing for differently abled communities can drive innovation for mainstream audiences.
Anirudh Koul is the Head of AI & Research at Aira (a visual interpreter for the blind) and author of the upcoming book 'Practical Deep Learning for Cloud and Mobile'. Previously at Microsoft AI & Research, he founded the Seeing AI app, often considered the de facto app in the blind and low-vision community. With features shipped to a billion people over the past decade, he has also been building tools for communities with visual, hearing, and mobility impairments. Some of his recent work, which IEEE has called 'life-changing', has been honored by CES, the FCC, Cannes Lions, the American Council of the Blind, and more.