Compilers for Deep Learning @ Facebook
With the growing complexity of our modeling tools (new operations, heavily dynamic graphs, etc.), changing numerical demands (new numerical formats, mixed-precision models, etc.), and an exploding hardware ecosystem (custom ASIC/FPGA accelerators, new instructions such as VNNI and WMMA, etc.), it's getting harder for our traditional ML graph interpreters to deliver high performance in a reliable and maintainable fashion. We'll talk about some of our work at Facebook on ML compilers, our production applications, and the exciting research questions and new domains these tools open up.
I'm a research engineer at Facebook, working on the Facebook AI Research and Applied Machine Learning teams to drive the wide range of AI applications at Facebook. At Facebook, I've worked on the large-scale event prediction models powering ads and News Feed ranking, the computer vision models powering image understanding, and many other machine learning projects. I'm a contributor to several deep learning frameworks, including Torch and Caffe. Before Facebook, I obtained a master's in mathematics from the University of Cambridge and a bachelor's in mathematics from the University of Sydney.