While doing a PhD in Cognitive Science and Linguistics at UCSD, I pursued my interest in phonetics and NLP through a dissertation that used neural networks to model how speakers of a language form new words via paradigm patterning and token analogy. After that, I did R&D and product development in conversational speech recognition, including discourse analysis and prosody-based language identification. I spent 12 years at Nuance doing user-centered design, analysis, and user testing of IVRs and in-car interfaces. At Amazon Lab126, I focused on incorporating speech into the design of interactive devices.
At Gamgee, we have created a multimodal virtual assistant platform (22Otters) to help users with procedure prep and ongoing health management. We incorporate speech recognition and NLP to provide hands-free interaction through vocal navigation, vocal topic search, and a question-answering system covering diet, medications, activities, and conditions. I'm particularly interested in creating a user experience that meets the needs of older and novice users.