Daniel McDuff


Turning Everyday Devices into Health Sensors

Today's electronic devices contain very sensitive optical and motion sensors that can capture subtle signals resulting from cardiorespiratory activity. I will present how a webcam can be used to measure important physiological parameters without any contact with the body. In addition, I will show how an ordinary smartphone can be turned into a continuous physiological monitor. Both of these techniques reveal the surprising power of the devices that are around us all the time. I will show how deep learning is helping us create highly scalable and low-cost applications based on these sensor measurements.
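To give a flavour of the underlying idea, the sketch below shows one simple, well-known approach to camera-based pulse measurement: average the green channel of each video frame over the face region, band-pass filter the resulting signal around plausible heart-rate frequencies, and read off the dominant frequency. This is a minimal illustration, not the speaker's specific method; the file name and frame rate are assumptions.

import numpy as np
import cv2  # OpenCV for video decoding
from scipy.signal import butter, filtfilt

def estimate_pulse_rate(video_path: str, fps: float = 30.0) -> float:
    """Return an estimated pulse rate in beats per minute from a face video."""
    cap = cv2.VideoCapture(video_path)
    green_means = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # OpenCV frames are BGR; channel 1 is green, which carries the
        # strongest blood-volume signal in skin regions.
        green_means.append(frame[:, :, 1].mean())
    cap.release()

    signal = np.asarray(green_means, dtype=float)
    signal -= signal.mean()

    # Band-pass 0.7-4.0 Hz (42-240 BPM) to isolate the cardiac component.
    nyquist = fps / 2.0
    b, a = butter(3, [0.7 / nyquist, 4.0 / nyquist], btype="band")
    filtered = filtfilt(b, a, signal)

    # Dominant frequency of the filtered signal, converted to beats per minute.
    spectrum = np.abs(np.fft.rfft(filtered))
    freqs = np.fft.rfftfreq(len(filtered), d=1.0 / fps)
    return freqs[np.argmax(spectrum)] * 60.0

# Example (hypothetical file): print(estimate_pulse_rate("face_clip.mp4"))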

Daniel McDuff is a Principal Research Scientist at Affectiva, where he builds and applies scalable computer vision and machine learning tools to enable the automated recognition and analysis of emotions and physiology. At Affectiva, Daniel is building state-of-the-art facial expression recognition software and leading the analysis of the world's largest database of human emotions (currently 8B+ data points). Daniel completed his PhD in the Affective Computing Group at the MIT Media Lab in 2014 and holds a B.A. and a Master's from Cambridge University. His work has received nominations and awards from Popular Science magazine (as one of the top inventions of 2011), South by Southwest Interactive (SXSWi), The Webby Awards, ESOMAR and the Center for Integrated Medicine and Innovative Technology (CIMIT). His work has been covered in many publications, including The Times, The New York Times, The Wall Street Journal, BBC News, New Scientist and Forbes. Daniel is also a Research Affiliate at the MIT Media Lab.


