UCL Centre for AI is partnering with DeepMind to deliver a Deep Learning Lecture Series.
Deep Learning for Natural Language Processing by Felix Hill
1. Motivation for modelling language with ANNs: language is highly contextual, typically non-compositional, and relies on reconciling many competing sources of information. Elman's Finding Structure in Time and simple recurrent networks. The importance of context: Transformers.
2. Unsupervised / representation learning for language: from Word2Vec to BERT.
3. Situated language understanding: grounding and embodied language learning.

Reading: https://arxiv.org/abs/1912.05877
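The simple recurrent network from Elman's Finding Structure in Time, mentioned in topic 1, can be sketched in a few lines: the hidden state at each timestep combines the current input with the previous hidden state, which is how the network carries context forward. The sketch below is illustrative only; all sizes, weights, and names are assumptions, not material from the lecture.

```python
import numpy as np

# Minimal Elman-style simple recurrent network (SRN) sketch.
# All dimensions and parameter names here are hypothetical.
rng = np.random.default_rng(0)

vocab_size, embed_dim, hidden_dim = 5, 4, 3
E = rng.normal(scale=0.1, size=(vocab_size, embed_dim))     # embedding table
W_xh = rng.normal(scale=0.1, size=(embed_dim, hidden_dim))  # input -> hidden
W_hh = rng.normal(scale=0.1, size=(hidden_dim, hidden_dim)) # hidden -> hidden ("context" loop)

def srn_states(token_ids):
    """Run the SRN over a token sequence, returning all hidden states."""
    h = np.zeros(hidden_dim)
    states = []
    for t in token_ids:
        # New state mixes the current token with the prior context.
        h = np.tanh(E[t] @ W_xh + h @ W_hh)
        states.append(h)
    return np.stack(states)

states = srn_states([0, 3, 1, 4])
print(states.shape)  # one hidden state per input token → (4, 3)
```

Because each state depends recursively on the one before it, even this untrained network encodes sequence order, which is the property Elman exploited to learn grammatical structure from word prediction alone.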
Bio: Felix Hill is a Research Scientist working on grounded language understanding, and has been at DeepMind for almost four years. He studied pure maths as an undergraduate, then became deeply interested in linguistics and psychology after reading the PDP books by McClelland and Rumelhart, which led him to graduate school at the University of Cambridge, where he ended up in the NLP group. To pursue his interest in artificial neural networks, he visited Yoshua Bengio's lab in 2013 and began a series of collaborations with Kyunghyun Cho and Yoshua applying neural networks to text processing. This led to some of the first work on transfer learning with sentence representations (and a neural crossword solver). He also interned at FAIR in NYC with Jason Weston. At DeepMind, he has worked on developing agents that can understand language in the context of interactive 3D worlds, alongside problems relating to mathematical and analogical reasoning.