Emily Curtin

Making Code and Humans GPU-Capable at Mailchimp

What happens when you have a bunch of data scientists, a bunch of new and old projects, a big grab-bag of runtime environments, and you need to get all those humans and all that code access to GPUs? Come see how the ML Eng team at Mailchimp wrestled first with connecting abstract containerized processes to very-not-abstract hardware, then scaled that process across tons of humans and projects. We'll talk through the technical how-to with Docker, Nvidia, and Kubernetes, but all good ML Engineers know that wrangling the tech is only half the battle and that the human factors can be the trickiest part.

3 Key Takeaways
• An overview of the call stack from container through orchestration framework and OS, all the way down to real GPU hardware (see the sketch below)
• How ML Eng at Mailchimp provides GPU-compatible dev environments for many different projects and data scientists
• An experienced take on how to balance data scientists' human needs against heavy system optimization (spoiler alert: favor the humans)
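For context on that first takeaway, here is a minimal sketch of the Kubernetes end of the container-to-GPU path. It is an illustration, not material from the talk: it assumes a cluster running the NVIDIA device plugin (which exposes GPUs as the `nvidia.com/gpu` resource) and uses the Kubernetes Python client; the image, pod name, and namespace are placeholders.

```python
# Hedged sketch: request one Nvidia GPU for a containerized workload.
# Assumes the NVIDIA device plugin is installed on the cluster.
from kubernetes import client, config


def submit_gpu_pod(image: str, namespace: str = "default") -> None:
    config.load_kube_config()  # or config.load_incluster_config() when running in-cluster

    container = client.V1Container(
        name="gpu-job",
        image=image,  # an image built on nvidia/cuda so the CUDA runtime is present
        command=["nvidia-smi"],  # quick check that the container actually sees the GPU
        resources=client.V1ResourceRequirements(
            # The device plugin advertises this resource; the scheduler places the
            # pod on a node with a free GPU and wires the device into the container.
            limits={"nvidia.com/gpu": "1"}
        ),
    )

    pod = client.V1Pod(
        metadata=client.V1ObjectMeta(name="gpu-smoke-test"),
        spec=client.V1PodSpec(containers=[container], restart_policy="Never"),
    )

    client.CoreV1Api().create_namespaced_pod(namespace=namespace, body=pod)


if __name__ == "__main__":
    # Placeholder CUDA base image; any GPU-ready image works here.
    submit_gpu_pod("nvidia/cuda:12.2.0-base-ubuntu22.04")
```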

Emily May Curtin is a Senior Machine Learning Platform Engineer at Mailchimp, which is definitely what she thought she'd be doing back when she went to film school. She combines her wealth of experience in DevOps, data engineering, distributed systems, and "cloud stuff" to enable data scientists at Mailchimp to do their best work. Truthfully, she'd rather be at her easel painting hurricanes and UFOs. Emily lives (and paints) in her hometown of Atlanta, GA, the best city in the world, with her husband Ryan, who's a pretty cool guy.
