Driver Monitoring For Connected Semi-autonomous Vehicles & The Future Of Automotive HMI
This session will cover an AI-based driver monitoring system for attention, cognitive awareness, and emotional distraction. We reveal how the technology reads facial micro-expressions in real time to authenticate drivers, detect the seven universal emotions, and estimate gender and age group, while also performing eye tracking, 3D head-pose estimation, and gaze estimation. During the second half of this session, we will cover a number of derived driver metrics that trigger the activation of various reactive support systems, essential to saving lives and improving driving behavior through better Human Machine Interfaces. This session will end with a highly rated one-minute live demo on stage!
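To illustrate how derived driver metrics might trigger a reactive support system, here is a minimal sketch in Python. The metric names, thresholds, and fusion rule are all hypothetical assumptions for illustration, not the actual logic of the system described above.

```python
# Hedged sketch: a toy rule that fuses hypothetical driver-state metrics
# (eye-closure ratio, gaze-off-road time, detected emotion) into a single
# alert decision. All thresholds below are illustrative placeholders.

def should_alert(eye_closure_ratio, gaze_off_road_s, emotion):
    """Return True when any distraction indicator crosses its threshold."""
    DROWSY_EYE_CLOSURE = 0.4        # fraction of recent frames with eyes closed
    MAX_GAZE_OFF_ROAD_S = 2.0       # seconds spent looking away from the road
    RISKY_EMOTIONS = {"anger", "fear"}  # two of the seven universal emotions

    return (eye_closure_ratio > DROWSY_EYE_CLOSURE
            or gaze_off_road_s > MAX_GAZE_OFF_ROAD_S
            or emotion in RISKY_EMOTIONS)

print(should_alert(0.1, 0.5, "happiness"))  # attentive driver -> False
print(should_alert(0.5, 0.5, "neutral"))    # drowsy driver -> True
```

A production system would of course fuse many more signals over time (e.g. head pose, blink rate, cognitive load estimates) rather than apply simple per-metric thresholds.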
Modar will also be presenting on the AI Assistant stage on day one. This session covers recent advancements in vision-based Human Behavior Understanding (HBU) technologies that augment vision-enabled AI assistant interactions and their overall performance. By adding the latest in face analytics and body-language-reading AI into the mix, traditional interactions via speech and language understanding can be augmented with an improved level of contextual accuracy, personalization, and customizable user experiences.
Modar is a serial entrepreneur and expert in AI-based vision software development. He is currently founder and CEO at Eyeris, developer of EmoVu, a Deep Learning-based emotion recognition software that reads facial micro-expressions. Eyeris uses Convolutional Neural Networks (CNNs) as its Deep Learning architecture to train and deploy its algorithms into a number of today's commercial applications. Modar combines a decade of experience across Human Machine Interaction (HMI) and Audience Behavioral Measurement. He is a frequent keynote speaker on "Ambient Intelligence," a winner of several technology and innovation awards, and has been featured in many major publications for his work.