Zoom (Please contact amug@uwaterloo.ca for meeting link)
Mohamed Hibat Allah, Perimeter Institute
Title
Language models for many-body physics
Abstract
Despite the inherent exponential complexity of natural language processing (NLP) tasks, language models have achieved phenomenal performance over the last few years, with real-world applications ranging from speech recognition and machine translation to text completion. Famous contemporary examples are ChatGPT and GPT-4, large language models that exhibit human-level performance on a variety of professional and academic tasks. Interestingly, the benefits of language models extend beyond NLP. In this talk, we focus on applications to many-body physics. In particular, we demonstrate the promise of language models for studying quantum many-body systems through the example of recurrent neural networks (RNNs). By re-designing and adapting RNNs, we show that these models can achieve state-of-the-art results and compete with traditional algorithms originally developed within the physics community. We further demonstrate that RNNs supplemented with ideas from physics can tackle challenging combinatorial optimization problems of interest not only to physics but also to biology, medicine, economics, and beyond.
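For context, the sketch below illustrates the general kind of autoregressive RNN ansatz the abstract alludes to: an RNN that samples a spin configuration one site at a time and returns an amplitude from the product of conditional probabilities. The plain tanh cell, the layer sizes, and the spin-1/2 setting are illustrative assumptions for this announcement, not the speaker's actual implementation.

import numpy as np

# Minimal autoregressive RNN "wavefunction" sketch for N spin-1/2 sites.
# All sizes and the vanilla tanh cell are illustrative assumptions.
rng = np.random.default_rng(0)
N, hidden = 10, 16                           # number of spins, hidden-state size
Wh = rng.normal(0, 0.1, (hidden, hidden))    # hidden-to-hidden weights
Wx = rng.normal(0, 0.1, (hidden, 2))         # input (one-hot spin) weights
Wo = rng.normal(0, 0.1, (2, hidden))         # hidden-to-output weights

def softmax(z):
    z = z - z.max()
    e = np.exp(z)
    return e / e.sum()

def sample_and_amplitude():
    """Sample a spin configuration autoregressively and return (configuration, amplitude)."""
    h = np.zeros(hidden)
    x = np.zeros(2)                  # "start" token: no previous spin yet
    config, log_prob = [], 0.0
    for _ in range(N):
        h = np.tanh(Wh @ h + Wx @ x)
        p = softmax(Wo @ h)          # conditional probability of spin down/up at this site
        s = rng.choice(2, p=p)
        config.append(s)
        log_prob += np.log(p[s])
        x = np.eye(2)[s]             # feed the sampled spin back in as a one-hot vector
    # For a positive ansatz, the amplitude is the square root of the joint probability.
    return np.array(config), np.exp(0.5 * log_prob)

spins, amp = sample_and_amplitude()
print(spins, amp)

The appeal of the autoregressive construction is that the joint distribution is normalized by design, so configurations can be sampled exactly without Markov-chain Monte Carlo; a trainable version would add a phase output and optimize the parameters variationally against a Hamiltonian's energy.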