Title: End-to-end Neural Structured Prediction

Advisor: Luke Zettlemoyer

Supervisory Committee: Luke Zettlemoyer (Chair), Emily Bender (GSR, Linguistics), Yejin Choi, and Noah Smith

Abstract: Many core challenges in natural language understanding are formulated as structured prediction problems, and recently introduced neural modeling techniques have significantly improved performance in nearly every case. However, these techniques can be challenging to develop, often forcing a trade-off between tractable inference and the use of neural representations for full output structures. We outline two approaches that make this trade-off differently, demonstrated by models for two different tasks. In CCG parsing, an A* parser improves accuracy by recursively modeling entire parse trees while maintaining optimality guarantees. In coreference resolution, where models must scale to long documents, simple inference is critical; there, an end-to-end span-level model that never explicitly models clusters produces state-of-the-art results.
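To make the second approach concrete, the sketch below shows one common form of span-ranking inference for coreference: every span links to its single highest-scoring preceding span, and a dummy antecedent with score zero lets a span link to nothing. This is a minimal illustration, not the talk's actual implementation; the helpers score_mention and score_pair are assumed placeholders for learned neural scoring functions.

    def best_antecedents(spans, score_mention, score_pair):
        """For each span, pick its highest-scoring antecedent.

        A dummy antecedent with score 0 lets the model abstain, so
        non-mentions and discourse-new spans simply link to nothing;
        predicted clusters are the connected components of the links.
        """
        links = {}
        for i in range(len(spans)):
            best, best_score = None, 0.0  # dummy antecedent: score 0
            for j in range(i):  # antecedents must precede the span
                s = (score_mention(spans[i]) + score_mention(spans[j])
                     + score_pair(spans[i], spans[j]))
                if s > best_score:
                    best, best_score = j, s
            links[i] = best  # None means "no antecedent"
        return links

Because each span decides independently and clusters are never scored as whole objects, inference stays cheap enough for long documents while the span and pair scores can still be learned end to end.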

Place: CSE 503
When: Thursday, June 8, 2017, 15:30 to 17:30