Percy Liang

University of California, Berkeley

What is the total population of the ten largest capitals in the US?

Building a system to answer free-form questions such as this requires modeling the deep semantics of language.  But to develop practical, scalable systems, we want to avoid the costly manual annotation of these deep semantic structures and instead learn from just surface-level supervision, e.g., question/answer pairs.  To this end, we develop a new tree-based semantic representation which has favorable linguistic and computational properties, along with an algorithm that induces this hidden representation.  Using our approach, we obtain significantly higher accuracy on the task of question answering compared to existing state-of-the-art methods, despite using less supervision.

Biography

Percy Liang obtained a B.S. (2004) and an M.S. (2005) from MIT and is now completing his Ph.D. at UC Berkeley with Michael Jordan and Dan Klein.  The general theme of his research, which spans machine learning and natural language processing, is learning richly-structured statistical models from limited supervision.  He won a best student paper award at the International Conference on Machine Learning in 2008, has received NSF, GAANN, and NDSEG fellowships, and is a 2010 Siebel Scholar.

4:15pm

B17 Upson Hall

Thursday, March 31, 2011

Refreshments at 3:45pm in the Upson 4th Floor Atrium

Computer Science

Colloquium

Spring 2011

www.cs.cornell.edu/events/colloquium

Deep Semantics from Shallow Supervision