Attentive Student Meets Multi-Task Teacher: Improved Knowledge Distillation For Pretrained Models

Title: Attentive Student Meets Multi-Task Teacher: Improved Knowledge Distillation For Pretrained Models
Publication Type: Journal Article
Year of Publication: 2019
Authors: Liu, L., H. Wang, J. Lin, R. Socher, and C. Xiong
Journal: ArXiv
Volume: abs/1911.03588
URL: http://arxiv.org/abs/1911.03588