Taco Cohen


Accelerating algorithmic and hardware advancements for power-efficient on-device AI

Artificial Intelligence (AI), specifically deep learning, is revolutionizing industries, products, and core capabilities by delivering dramatically enhanced experiences. However, today's deep neural networks are growing rapidly in size and consume large amounts of memory, compute, and energy. Moreover, to make AI truly ubiquitous, it needs to run on the end device within a tight power and thermal budget. One approach to addressing these issues is Bayesian deep learning. This talk will discuss:

• Why AI algorithms and hardware need to be energy efficient

• How Bayesian deep learning is making neural networks more power efficient through model compression and quantization (see the sketch after this list)

• How we are doing fundamental research on AI algorithms and hardware to maximize power efficiency
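To make the quantization idea concrete, here is a minimal, generic sketch of uniform 8-bit weight quantization in NumPy. It is purely illustrative and is not the Bayesian scheme covered in the talk, which uses learned uncertainty over weights to guide compression; the function names, the 8-bit setting, and the layer shape are assumptions chosen for this example.

```python
import numpy as np

def quantize_uniform(w, num_bits=8):
    """Uniformly quantize a weight tensor to signed num_bits integers.

    Returns the integer codes plus the scale needed to dequantize.
    Generic post-training quantization for illustration only.
    """
    qmax = 2 ** (num_bits - 1) - 1            # e.g. 127 for 8 bits
    scale = np.max(np.abs(w)) / qmax          # map the largest weight to qmax
    q = np.clip(np.round(w / scale), -qmax - 1, qmax).astype(np.int8)
    return q, scale

def dequantize(q, scale):
    """Recover approximate float weights from integer codes."""
    return q.astype(np.float32) * scale

# Example: quantize a random layer and measure the reconstruction error.
rng = np.random.default_rng(0)
w = rng.normal(scale=0.05, size=(256, 256)).astype(np.float32)
q, scale = quantize_uniform(w)
w_hat = dequantize(q, scale)
print("mean abs error:", np.abs(w - w_hat).mean())
```

Even this simple scheme cuts weight storage roughly fourfold relative to 32-bit floats; Bayesian approaches aim to go further by identifying which weights and how many bits are actually needed.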

Taco Cohen is a machine learning research scientist at Qualcomm Research Netherlands and a PhD student at the University of Amsterdam, supervised by Prof. Max Welling. He was a co-founder of Scyfer, a successful deep learning services company that was acquired by Qualcomm in 2017. He holds a BSc in theoretical computer science from Utrecht University and an MSc in artificial intelligence from the University of Amsterdam (both cum laude). His research focuses on understanding and improving deep representation learning, in particular the learning of equivariant and disentangled representations, data-efficient deep learning, learning on non-Euclidean domains, and applications of group representation theory and non-commutative harmonic analysis. He has completed internships at Google DeepMind (working with Geoff Hinton) and OpenAI. He received the 2014 University of Amsterdam thesis prize and a Google PhD Fellowship.
