Language Understanding with Memory Networks
Despite recent advances in AI, deep understanding of natural language by machines remains highly challenging. In this talk, we will present Memory Networks, an attention-based neural network architecture that reads from and writes to an external symbolic memory component to perform reasoning. Memory Networks achieve compelling performance on a variety of question answering and dialogue management tasks, and appear to be a promising avenue toward better machine comprehension of language.
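The core mechanism can be illustrated with a minimal sketch of one attention "hop" over an external memory: the query is matched against each memory slot, a softmax produces attention weights, and a weighted read of the memory is combined with the query. This is a toy NumPy illustration under assumed dimensions and random embeddings, not the exact architecture presented in the talk.

```python
import numpy as np

rng = np.random.default_rng(0)

d = 8            # embedding dimension (assumed toy value)
n_memories = 5   # number of stored memory slots (assumed toy value)

# External memory: e.g. embedded sentences; here random vectors for illustration.
memories = rng.normal(size=(n_memories, d))
# Embedded question.
query = rng.normal(size=(d,))

# Attention step: score each memory slot against the query ...
scores = memories @ query
weights = np.exp(scores - scores.max())
weights /= weights.sum()                 # ... softmax over memory slots

# Read step: weighted sum of memory slots forms the response vector,
# which is combined with the query for the next hop or the final answer.
response = weights @ memories
updated_query = query + response

print(weights.round(3))
```

In the full model, multiple such hops are stacked so the network can chain reads over the memory, which is what supports the multi-step reasoning mentioned above.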
Antoine Bordes is a research scientist at Facebook Artificial Intelligence Research. Prior to joining Facebook in 2014, he was a CNRS researcher in Compiègne, France, and a postdoctoral fellow in Yoshua Bengio's lab at the University of Montreal. He received his PhD in machine learning from Pierre & Marie Curie University in Paris in 2010, with two best-PhD awards, from the French Association for Artificial Intelligence and the French Armament Agency. Antoine's current interests center on natural language understanding with neural networks, with a focus on question answering and dialogue. He has published more than 40 papers, cited more than 3,000 times.