Andreas Damianou

Machine Learning Scientist
Amazon

Probability & Uncertainty in Deep Learning

In this talk I will motivate the need for introducing a probabilistic, Bayesian flavor into "traditional" deep learning approaches. For example, a Bayesian treatment of neural network parameters is an elegant way of avoiding overfitting and optimization "heuristics", while providing a solid mathematical grounding. I will also highlight the deep Gaussian process family of approaches, which can be seen as non-parametric Bayesian neural networks. The Bayesian treatment of neural networks comes with mathematical intractabilities, so I will outline some of the approximate inference methods used to tackle these challenges.

I completed my PhD under Neil Lawrence in Sheffield, and subsequently pursued a post-doc at the intersection of machine learning and bio-inspired robotics. I have now moved to Amazon as a machine learning scientist, based in Cambridge, UK. My area of interest is machine learning, and more specifically: Bayesian non-parametrics (focusing on both data efficiency and scalability), representation and transfer learning, and uncertainty quantification. In a recent strand of work I seek to bridge the gap between representation learning and decision-making, with applications in robotics and data science pipelines.


