The Quarks of Attention

Date: February 3, 2022
Time: 2:00 pm
Room: Zoom
Speaker: Pierre Baldi (UCI)
Abstract:

Attention plays a fundamental role in both natural and artificial intelligence systems. In deep learning, several attention-based neural network architectures have been proposed to tackle problems in natural language processing (NLP) and beyond, including transformer architectures, which currently achieve state-of-the-art performance on NLP tasks. In this presentation we will:

  1. identify and classify the most fundamental building blocks (quarks) of attention, both within and beyond the standard model of deep learning;
  2. identify how these building blocks are used in all current attention-based architectures, including transformers;
  3. demonstrate how transformers can effectively be applied to new problems in physics, from particle physics to astronomy; and
  4. present a mathematical theory of attention capacity where, paradoxically, one of the main tools in the proofs is itself an attention mechanism.

Joint work with Roman Vershynin.
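For background, the sketch below shows standard scaled dot-product attention, the core operation inside transformer architectures: Attention(Q, K, V) = softmax(QK^T / sqrt(d_k)) V (Vaswani et al., 2017). It is a generic NumPy illustration only, not the speaker's decomposition of attention into its building blocks; all names and dimensions are assumptions chosen for the example.

    import numpy as np

    def scaled_dot_product_attention(Q, K, V):
        # Standard scaled dot-product attention:
        # softmax(Q K^T / sqrt(d_k)) V
        d_k = Q.shape[-1]
        scores = Q @ K.T / np.sqrt(d_k)  # pairwise query-key similarity
        # Numerically stable row-wise softmax over the key dimension
        weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
        weights /= weights.sum(axis=-1, keepdims=True)
        return weights @ V  # attention-weighted combination of the values

    # Illustrative shapes: 4 queries attending over 6 key/value pairs, d_k = 8
    rng = np.random.default_rng(0)
    Q = rng.normal(size=(4, 8))
    K = rng.normal(size=(6, 8))
    V = rng.normal(size=(6, 8))
    print(scaled_dot_product_attention(Q, K, V).shape)  # (4, 8)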
