Transformers in All Their Glorious Details

About this event

In this talk, Lucas will take a deep dive into the Transformer architecture and the attention mechanism. He will provide context around the model and explain how it has recently been adopted by various ML communities and adapted to different modalities. The slides are available at http://lucasb.eyer.be/transformer
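
For those who would like a taste of the core idea before the talk, below is a minimal sketch of scaled dot-product attention, the building block of the Transformer. It is an illustration only, not material from the talk or the slides: NumPy, a single head, no masking, and the function name is ours.

    import numpy as np

    def attention(Q, K, V):
        """Scaled dot-product attention: softmax(Q K^T / sqrt(d_k)) V."""
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)                # (n_queries, n_keys) similarities
        scores -= scores.max(axis=-1, keepdims=True)   # subtract row max for numerical stability
        weights = np.exp(scores)
        weights /= weights.sum(axis=-1, keepdims=True) # softmax over the keys
        return weights @ V                             # weighted sum of the values

    # Toy example: 3 queries attending over 4 key/value pairs of dimension 8.
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(3, 8))
    K = rng.normal(size=(4, 8))
    V = rng.normal(size=(4, 8))
    print(attention(Q, K, V).shape)  # -> (3, 8)

Each query produces a probability distribution over the keys, and the output is the corresponding weighted average of the values; a full Transformer stacks many such heads with learned projections.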

Speaker

Lucas Beyer

Lucas Beyer grew up in Belgium wanting to make video games and their AI, went on to study mechanical engineering at RWTH Aachen in Germany, completed a PhD in robotic perception/computer vision there as well, and now researches representation learning and vision backbones at Google Brain in Zürich.