Urs Köster

Deep Learning at Scale

Deep learning has had a major impact over the last three years. Interactions with machines that were once unreliable, such as speech recognition, natural language understanding, and image processing, have been made robust by deep learning, and deep learning also holds promise for finding usable structure in large datasets. However, the training process is lengthy and has proven difficult to scale because of constraints in existing compute architectures, and there is a need for standardized tools for building and scaling deep learning solutions. I will outline some of these challenges and show how fundamental changes to the organization of computation and communication can lead to large advances in capabilities.

Urs has over nine years of research experience in machine learning, spanning areas from computer vision and image processing to large-scale neural data analysis. His data science experience ranges from working with national laboratories to apply deep learning to the study of climate change, to helping customers solve challenging computer vision problems in medical imaging. He works on building the fastest implementations of convolutional and recurrent networks. During his postdoc at UC Berkeley, he used unsupervised machine learning algorithms such as Restricted Boltzmann Machines to study the visual system.
