PhD Comprehensive Seminar | Yanming Kang, Multi-level Transformer

Monday, June 20, 2022 11:00 AM EDT

MC 6460 and Zoom (please email amgrad@uwaterloo.ca for the Zoom meeting link)

Candidate

Yanming Kang | Applied Mathematics, University of Waterloo

Title

Multi-level Transformer

Abstract

Transformer models have become the most popular choice for NLP tasks. In general, transformers with longer input sequences achieve higher accuracy; however, because of the quadratic space complexity of dot-product attention, hardware constraints limit the maximum input length. Previous work has addressed this problem by applying fixed sparsity patterns to the attention matrix or by using methods such as k-means clustering and locality-sensitive hashing. We present the Multi-level Transformer, which uses a hierarchy of resolutions when computing dot-product attention: information is summarized by convolution to varying degrees, depending on the distance between the input and output tokens. The multi-level attention has O(N log N) complexity in both time and space. We found that, compared to the standard transformer, the Multi-level Transformer requires much less memory and is faster on longer inputs. Our preliminary language-modeling results on WikiText-103 show that the Multi-level Transformer achieves perplexity comparable to that of the standard transformer (6% higher).
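To make the hierarchical idea in the abstract concrete, the sketch below shows one plausible way a multi-level attention layer could be organized: each query attends to its most recent tokens at full resolution and to progressively more distant tokens only through pooled summaries taken at stride 2, 4, 8, and so on. The block size, the causal layout, and the use of average pooling in place of the learned convolution are assumptions made for illustration here; this is not the speaker's actual model.

import numpy as np

def softmax(x):
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

def pool(x, stride):
    # Average consecutive groups of `stride` rows (the first group may be
    # shorter). A crude stand-in for the learned convolutional summarization
    # described in the abstract.
    n = x.shape[0]
    cuts = list(range(n % stride, n + 1, stride))
    if cuts[0] != 0:
        cuts = [0] + cuts
    return np.stack([x[a:b].mean(axis=0) for a, b in zip(cuts[:-1], cuts[1:])])

def multilevel_attention(q, k, v, block=4):
    # Causal multi-level attention sketch: query i sees its last `block`
    # tokens exactly, and earlier tokens only through averages pooled at
    # stride 2, 4, 8, ... -- coarser with distance. Each level contributes
    # about `block` summaries, and there are O(log N) levels, so every query
    # attends to O(block * log N) key/value slots and the whole layer costs
    # O(N log N) time and memory instead of the O(N^2) of full attention.
    n, d = q.shape
    out = np.empty_like(v)
    for i in range(n):
        lo = max(0, i - block + 1)
        ks, vs = [k[lo:i + 1]], [v[lo:i + 1]]
        end, stride = lo, 2
        while end > 0:
            start = max(0, end - block * stride)
            ks.append(pool(k[start:end], stride))
            vs.append(pool(v[start:end], stride))
            end, stride = start, stride * 2
        keys, vals = np.concatenate(ks), np.concatenate(vs)
        weights = softmax(q[i] @ keys.T / np.sqrt(d))
        out[i] = weights @ vals
    return out

# Toy usage: 64 tokens of dimension 16.
rng = np.random.default_rng(0)
q, k, v = (rng.standard_normal((64, 16)) for _ in range(3))
print(multilevel_attention(q, k, v).shape)  # (64, 16)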
