Stop Operating Your Models in the Dark! Lessons Learned From the Field
As AI becomes ubiquitous, machine learning practitioners face a new challenge: the day after production. Because ML systems are inherently data-dependent, ensuring their proper behaviour "in production" can be thorny: drift, bias, data quality issues, and missing labels all threaten model health.
In this session, we will share best practices for monitoring AI in production in the financial sector and for maximizing the value of your AI program for all stakeholders.
Key Takeaways:
• To end the "black box effect" of models in production, you need to monitor them
• AI assurance is about monitoring metrics and empowering data scientists as well as operations stakeholders
• You can't scale your AI activities without proper assurance
Pearl Lieberman is the head of product marketing at superwise.ai, a startup devoted to assuring the health of AI models in production. With over 10 years' experience translating sophisticated technology into valuable business benefits for the financial sector, Pearl has a track record of bringing teams together across the enterprise to advance and scale innovation.