Building Controllable and Efficient Natural Language Generation Systems
Large pre-trained language models have enabled rapid progress in natural language generation (NLG). However, existing NLG systems still largely lack control over the content to be generated, and thus suffer from incoherence and unfaithfulness. In this talk, I will first introduce a neural generation framework, built upon large models, that separately tackles the challenges of content planning and surface realization. Experimental results show that the model is effective across a range of tasks: constructing persuasive arguments, writing opinion articles, and generating news stories. It alleviates existing models' tendency to produce bland and incorrect text, a consequence of their lack of global planning. I then discuss how to extend the model to conduct dynamic content planning with mixed language models. Finally, I present our recent work on long-document summarization, where efficient attention mechanisms are designed to handle more than 10k tokens, whereas prior work could only process hundreds of words.
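As background for the last point: full self-attention scales quadratically with sequence length, which is what limits standard models to short inputs. One common family of efficient attention restricts each token to a local window of neighbors, making the cost linear in sequence length. The sketch below is a minimal, generic illustration of windowed attention (the function name, window size, and NumPy implementation are illustrative assumptions, not the speaker's actual model):

```python
import numpy as np

def windowed_attention(q, k, v, window=4):
    """Illustrative local attention: each position attends only to keys
    within `window` positions on either side, so total cost grows
    linearly with sequence length rather than quadratically."""
    n, d = q.shape
    out = np.zeros_like(v)
    for i in range(n):
        # Restrict attention to a local neighborhood of position i.
        lo, hi = max(0, i - window), min(n, i + window + 1)
        scores = q[i] @ k[lo:hi].T / np.sqrt(d)
        # Numerically stable softmax over the window.
        weights = np.exp(scores - scores.max())
        weights /= weights.sum()
        out[i] = weights @ v[lo:hi]
    return out
```

With window size w, each of the n positions does O(w·d) work, so a 10k-token document costs roughly n·w dot products instead of the n² required by full attention.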
Lu Wang has been an Assistant Professor in Computer Science and Engineering at the University of Michigan, Ann Arbor, since 2020. Previously, she was an Assistant Professor in the Khoury College of Computer Sciences at Northeastern University from 2015 to 2020. She received her Ph.D. in Computer Science from Cornell University and her bachelor's degrees in Intelligent Science and Technology and in Economics from Peking University. Her research focuses on designing machine learning models for natural language processing tasks, including language generation, abstractive text summarization, argument mining, and discourse analysis, and on their applications in computational social science (e.g., detecting media bias and polarization). Lu received an outstanding paper award at ACL 2017 and a best paper nomination at SIGDIAL 2012. She won the NSF CAREER award in 2021.