Deep Learning in Big Data Infrastructure
Applying artificial intelligence in a global financial institution is a major challenge. It requires knowledge not only of data science, deep learning models, and algorithms, but also of data storage infrastructure, security requirements, transformation pipelines, multi-tenant computation clusters, and model management. These are, among other aspects, some of the points companies must take into account in order to benefit fully from this technology. With all of these tasks in mind, we will see how the life cycle of deep learning projects is handled at BBVA, from data ingestion to model prediction. We will talk about data processing, distributed training of neural networks, open source software, and several use cases.
Emiliano is a senior engineer in the AI division of BBVA's Innovation Labs department. He specializes in distributed computation and functional programming in Scala. He is currently immersed in several projects aimed at implementing and evolving AI technology at scale across the company.