The Design and Implementation of fastai for PyTorch
The recently released fastai library introduces a number of new concepts (such as the Data Block API and consolidated, optimized data augmentation), and takes some existing concepts further than they've been taken before (such as a callbacks system on which every piece of training functionality is built, including mixed precision training, mixup augmentation, and more). In this talk we will outline some of the new approaches in fastai, explain how they impact developers, researchers, and users, and show some of the more interesting implementation details behind them.
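To give a flavor of the callbacks idea described above, here is a minimal sketch of an event-driven training loop in which all extra behavior lives in callbacks. All names here (`Trainer`, `Callback`, the event methods) are illustrative assumptions for this sketch, not fastai's actual API:

```python
class Callback:
    """Base class: subclasses override only the events they care about."""
    def before_fit(self, trainer): pass
    def before_batch(self, trainer): pass
    def after_batch(self, trainer): pass
    def after_fit(self, trainer): pass

class Trainer:
    """Bare training loop that fires callback events at each stage."""
    def __init__(self, callbacks=None):
        self.callbacks = callbacks or []
        self.loss = None

    def _event(self, name):
        # Dispatch the named event to every registered callback in order.
        for cb in self.callbacks:
            getattr(cb, name)(self)

    def fit(self, batches):
        self._event("before_fit")
        for batch in batches:
            self._event("before_batch")
            # Stand-in for a real forward/backward training step.
            self.loss = sum(batch) / len(batch)
            self._event("after_batch")
        self._event("after_fit")

class LossTracker(Callback):
    """Example callback: record the loss after every batch."""
    def __init__(self):
        self.losses = []
    def after_batch(self, trainer):
        self.losses.append(trainer.loss)
```

The appeal of this pattern is that features like mixed precision or mixup need no changes to the training loop itself; each is just another `Callback` subclass hooking the relevant events.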
Jeremy Howard is an entrepreneur, business strategist, developer, and educator. Jeremy is a founding researcher at fast.ai, a research institute dedicated to making deep learning more accessible. He is also a Distinguished Research Scientist at the University of San Francisco, a faculty member at Singularity University, and a Young Global Leader with the World Economic Forum.
Jeremy’s most recent startup, Enlitic, was the first company to apply deep learning to medicine, and has been selected as one of the world’s top 50 smartest companies by MIT Tech Review two years running. He was previously the President and Chief Scientist of the data science platform Kaggle, where he was the top-ranked participant in international machine learning competitions two years running. He was the founding CEO of two successful Australian startups (FastMail, and Optimal Decisions Group, purchased by LexisNexis). Before that, he spent eight years in management consulting at McKinsey & Company and A.T. Kearney. Jeremy has invested in, mentored, and advised many startups, and contributed to many open source projects. He has made many television and other video appearances, including as a regular guest on Australia’s highest-rated breakfast news program, in a popular talk on TED.com, and in data science and web development tutorials and discussions.