Thursday 19 March 18:30 - 20:00

UCL AI Centre
Darwin Building B40 LT,
London
WC1E 6XA


UCL x DeepMind Deep Learning Lecture - Attention & Memory in Deep Learning

Science & Technology

UCL x DeepMind Deep Learning Lecture Series - Attention and Memory in Deep Learning by Alex Graves

UCL Centre for AI is partnering with DeepMind to deliver a Deep Learning Lecture Series.

Attention and Memory in Deep Learning by Alex Graves

Attention and memory have emerged as two vital new components of deep learning over the last few years. This lecture covers a broad range of contemporary attention mechanisms, including the implicit attention present in any deep network, as well as both discrete and differentiable variants of explicit attention. It then discusses networks with external memory and explains how attention provides them with selective recall. It briefly reviews transformers, a particularly successful type of attention network, and lastly looks at variable computation time, which can be seen as a form of 'attention by concentration'.
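The differentiable explicit attention the lecture mentions is commonly realised as a softmax over similarity scores between a query and a set of memory slots, so that reading from memory becomes a weighted sum. As a minimal illustration (not from the lecture itself; the function names and shapes here are assumptions for the sketch), scaled dot-product attention can be written as:

```python
import numpy as np

def softmax(x, axis=-1):
    # Numerically stable softmax: subtract the max before exponentiating.
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def attention(queries, keys, values):
    # Scaled dot-product attention: each query scores every key,
    # the softmax turns scores into a distribution over memory slots,
    # and the output is a differentiable weighted read of the values.
    scores = queries @ keys.T / np.sqrt(keys.shape[-1])
    weights = softmax(scores, axis=-1)  # each row sums to 1
    return weights @ values, weights

rng = np.random.default_rng(0)
q = rng.standard_normal((2, 4))  # 2 queries, dimension 4
k = rng.standard_normal((5, 4))  # 5 memory slots (keys)
v = rng.standard_normal((5, 4))  # 5 memory slots (values)
out, w = attention(q, k, v)
print(out.shape)  # (2, 4): one read vector per query
```

Because every step is differentiable, the attention weights can be trained by gradient descent, which is what lets networks with external memory learn the selective recall described above.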

Bio: Alex Graves completed a BSc in Theoretical Physics at the University of Edinburgh, Part III Maths at the University of Cambridge and a PhD in artificial intelligence at IDSIA with Jürgen Schmidhuber, followed by postdocs at the Technical University of Munich and with Geoff Hinton at the University of Toronto. He is now a research scientist at DeepMind. His contributions include the Connectionist Temporal Classification algorithm for sequence labelling (widely used for commercial speech and handwriting recognition), stochastic gradient variational inference, the Neural Turing Machine / Differentiable Neural Computer architectures, and the A2C algorithm for reinforcement learning.
