Speaker: Danqi Chen, Princeton University
Abstract: Open-domain question answering, the task of automatically answering natural-language questions posed by humans, usually over a large collection of unstructured documents, has regained popularity in the last few years. In this talk, I will discuss many exciting recent developments that have greatly advanced the field, including several of our own works. In particular, I will discuss the importance of pre-training for question answering, learning dense representations for retrieval in place of sparse models, the role of structured knowledge, and the trade-off between open-book and closed-book models. I will conclude with current limitations and future directions.
Bio: Danqi Chen is an Assistant Professor of Computer Science at Princeton University and co-leads the Princeton NLP Group. Her research focuses on deep learning for natural language processing, especially at the intersection of text understanding and knowledge representation & reasoning, with applications in question answering, information extraction, and conversational systems. Before joining Princeton, Danqi worked as a visiting scientist at Facebook AI Research in Seattle. She received her Ph.D. from Stanford University (2018) and B.E. from Tsinghua University (2012), both in Computer Science. She is a recipient of the 2019 Arthur Samuel Best Doctoral Thesis Award at Stanford University, a Facebook Fellowship, a Microsoft Research Women's Fellowship, and paper awards at ACL'16 and EMNLP'17.