Large Scale Machine Learning in Production
Delivering insights on big data requires large-scale machine learning projects. This starts with an obsessive focus on data quantity, data quality, and data refresh, and extends to the core of machine learning: modelling and testing. This process has most certainly outgrown the notebook era, which should be restricted to exploratory data analysis (EDA) and proof-of-concept work. This talk briefly looks at the components of a production system, such as model tracking, concept drift, and model training, and explains why they matter. We will also dive into some of the available machine learning workflows with cloud integration and share resources on how to get started.
• Different types of machine learning workflows
• Basic components of a production environment for machine learning
• Scaling to production via the cloud
Roshini has a background in AI and electronics from the University of Edinburgh. She has more than eight years of experience applying machine learning techniques to design scalable, robust solutions in the fields of e-commerce, travel, and finance. She has worked on user behaviour modelling, predictive models, recommendation systems, and generative adversarial models with deep learning frameworks, and is currently working on models in finance to assess risk. She is very interested in understanding how AI techniques can be applied across industries to make them more efficient and accurate. She is also passionate about encouraging more women to enter and lead in this field, and runs the London chapter of Women in Machine Learning and Data Science. In her free time she dabbles in creating artwork with neural style transfer and travel photography, showing how easily AI can be integrated with day-to-day activities to enhance our creativity.