Jörg Bornschein

Research Scientist
DeepMind

Memory & Rapid Adaptation in Generative Models

One of the most important algorithms in deep learning is stochastic gradient descent and its variants: slowly adapting a model's parameters one mini-batch at a time. But we sometimes face situations where we would like to adapt our models rapidly, based on only a few training examples. This setting is called few-shot learning, and it arises in supervised, unsupervised, and reinforcement learning. In this talk I will cover recent approaches that augment generative models with memory subsystems, how these add few-shot learning capabilities to our models, and how to generate new samples conditioned on very few training examples.
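To make the contrast the abstract draws concrete — slow, gradient-based parameter updates versus a memory that absorbs new examples in a single pass — here is a minimal Python sketch. It assumes a toy key-value memory with dot-product attention; the names sgd_step and EpisodicMemory are illustrative inventions, not the speaker's implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

# Slow adaptation: one SGD step nudges parameters against the gradient,
# one mini-batch at a time.
def sgd_step(params, grads, lr=0.01):
    return {name: p - lr * grads[name] for name, p in params.items()}

params = {"w": rng.normal(size=8)}
grads = {"w": rng.normal(size=8)}
params = sgd_step(params, grads)  # many such steps are needed to adapt

# Rapid adaptation: a key-value memory that absorbs each new example in a
# single write, so a model conditioning on it can use it immediately.
class EpisodicMemory:
    def __init__(self):
        self.keys = []
        self.values = []

    def write(self, key, value):
        self.keys.append(key)
        self.values.append(value)

    def read(self, query):
        # Soft attention over stored keys via dot-product similarity.
        keys = np.stack(self.keys)               # (n, d)
        weights = np.exp(keys @ query)           # unnormalized similarities
        weights /= weights.sum()                 # attention weights
        return weights @ np.stack(self.values)   # weighted blend of values

# Store three "training examples" and retrieve a blend of them at once;
# no gradient steps were needed to adapt to the new data.
memory = EpisodicMemory()
for _ in range(3):
    memory.write(rng.normal(size=8), rng.normal(size=8))
print(memory.read(rng.normal(size=8)))
```

In memory-augmented generative models, reads of this kind condition the generator on the few stored examples, which is what enables sampling from only a handful of data points.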

Jörg Bornschein was previously a Global Scholar with the Canadian Institute for Advanced Research (CIFAR) and a postdoctoral researcher in Yoshua Bengio's machine learning lab at the University of Montreal. He currently concentrates on unsupervised and semi-supervised learning using deep architectures. Before moving to Montreal, Jörg obtained his PhD from the University of Frankfurt, working on large-scale Bayesian inference for non-linear sparse coding, with a focus on building maintainable and massively parallel implementations for HPC clusters. Jörg was also chair and one of the founders of the German hackerspace "Das Labor", which received an award from the German federal government in 2005 for promoting STEM programs to prospective students.



Partners & Attendees

Intel · NVIDIA · Graphcore · IBM Watson Health · Facebook · RBC Research · TwentyBN · Forbes · Maluuba · MIT Technology Review · KDnuggets