Deep Learning and Cognition
Neural networks and deep learning have been inspired by brains, neuroscience, and cognition from the very beginning, starting with distributed representations, neural computation, and hierarchies of learned features. More recent examples include rectifying non-linearities (ReLU), which enable the training of deeper networks, and soft content-based attention, which allows neural nets to go beyond fixed-size vectors to process a variety of data structures and led to a breakthrough in machine translation. Ongoing research now suggests that brains may use a process similar to backpropagation for estimating gradients, and new inspiration from cognition suggests how to learn deep representations that disentangle the underlying factors of variation, by allowing agents to intervene in their environment and explore how to control some of its elements.
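The two mechanisms named above can be illustrated with a minimal NumPy sketch; the function names, dimensions, and toy data here are purely illustrative, not taken from any particular system:

```python
import numpy as np

def relu(x):
    # Rectifier non-linearity: passes positive values, zeroes out negatives.
    return np.maximum(0.0, x)

def soft_attention(query, keys, values):
    # Soft content-based attention: a convex combination of the values,
    # weighted by how similar each key is to the query.
    scores = keys @ query                   # one similarity score per item
    weights = np.exp(scores - scores.max()) # softmax over the scores
    weights /= weights.sum()
    return weights @ values                 # weighted average of the values

# Toy usage: attend over three items, each with a 2-dimensional value.
keys = np.array([[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]])
values = np.array([[10.0, 0.0], [0.0, 10.0], [5.0, 5.0]])
query = np.array([1.0, 0.0])

print(relu(np.array([-1.0, 2.0])))        # negative input clipped to zero
print(soft_attention(query, keys, values))
```

Because the attention weights are differentiable in the query and keys, the whole mechanism can be trained by backpropagation, which is what lets such networks learn to select among variable-length collections of inputs.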
Yoshua Bengio (PhD in CS, McGill University, 1991) completed post-docs at M.I.T. (with Michael Jordan) and AT&T Bell Labs (with Yann LeCun). He is a CS professor at Université de Montréal, Canada Research Chair in Statistical Learning Algorithms, NSERC Chair, CIFAR Fellow, member of the NIPS foundation board and former program/general chair, and co-creator of the ICLR conference. He has authored two books and over 300 publications, the most cited being in the areas of deep learning, recurrent networks, probabilistic learning, natural language, and manifold learning. He is among the most cited Canadian computer scientists and is or has been associate editor of the top journals in machine learning and neural networks.