Sebastian Ruder

Cross-Lingual Transfer Learning

Research in natural language processing (NLP) has seen striking advances in recent years, mainly driven by large pretrained language models. However, most of these successes have been achieved in English and a small set of other high-resource languages. In this talk, I will highlight methods that enable us to scale NLP models to more of the world's roughly 7,000 languages, discuss the challenges involved, and outline promising future directions.

Sebastian Ruder is a research scientist on the Language team at DeepMind, London. He completed his PhD in Natural Language Processing and Deep Learning at the Insight Research Centre for Data Analytics while working as a research scientist at the Dublin-based text analytics startup AYLIEN. Previously, he studied Computational Linguistics at the University of Heidelberg, Germany, and at Trinity College, Dublin.
