AIM Seminar: Training Neural NLP Models in Minimally Supervised Settings
Speaker: Byron Wallace (CCIS, NU)
Abstract: Modern neural models have achieved remarkably strong results in natural language processing (NLP) in recent years, attaining state-of-the-art performance across a range of domains and tasks. However, such models tend to be data-hungry, requiring large annotated training corpora to work well. This requirement impedes their use in specialized domains where direct supervision is expensive to collect, and hence sparse. I will discuss strategies for training such models with minimal labeled data. In particular, these include strategies for active learning (AL) and approaches that exploit domain knowledge and other sources of indirect supervision. Finally, I will discuss approaches to efficient model transfer, including learning disentangled representations of texts.
Tuesday, October 9, 2018 at 11:15am to 12:15pm
Northeastern University, 511 Lake Hall