Automatic text generation is an important topic in natural language processing, with applications in tasks such as machine translation and text summarization. In this thesis, we explore the use of deep neural networks for natural language generation. Specifically, we implement two sequence-to-sequence neural variational models: variational autoencoders (VAEs) and variational encoder-decoders (VEDs).
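As an illustration of the variational machinery shared by VAEs and VEDs, the sketch below shows the reparameterization trick and the closed-form KL divergence between a diagonal-Gaussian posterior and a standard-normal prior. This is a minimal NumPy sketch, not the thesis implementation; the function names, latent dimensionality, and toy encoder outputs are assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(0)

def reparameterize(mu, logvar, rng):
    # Sample z = mu + sigma * eps with eps ~ N(0, I), so the sampling
    # step stays differentiable with respect to mu and logvar.
    eps = rng.standard_normal(mu.shape)
    return mu + np.exp(0.5 * logvar) * eps

def kl_divergence(mu, logvar):
    # Closed-form KL( N(mu, diag(exp(logvar))) || N(0, I) ),
    # the regularization term in the variational objective.
    return -0.5 * np.sum(1.0 + logvar - mu**2 - np.exp(logvar))

# Toy "encoder output" for a 4-dimensional latent space (hypothetical values).
mu = np.array([0.1, -0.2, 0.0, 0.3])
logvar = np.array([-0.5, 0.0, 0.1, -1.0])

z = reparameterize(mu, logvar, rng)   # latent sample fed to the decoder
kl = kl_divergence(mu, logvar)        # added to the reconstruction loss
print(z.shape, kl > 0)
```

In a full model, `z` would condition an RNN or transformer decoder, and `kl` would be added (often with an annealing weight) to the reconstruction loss to form the evidence lower bound.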
Thursday, July 26, 2018 — 10:00 AM EDT