"Task-Independent Language Understanding"
Abstract: This talk addresses the goal of task-independent language understanding: building machine learning models that can learn to do most of the hard work of language understanding before they see a single example of the task they're meant to solve, in service of making the best modern NLP systems both better and more data-efficient. I'll survey the (dramatic!) progress the NLP research community has made toward this goal in the last year. In particular, I'll dwell on GLUE and SuperGLUE, two open-ended shared-task competitions that measure progress toward this goal on sentence-understanding tasks, and I'll preview a few recent analysis papers that attempt to put this progress in perspective.
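To make the recipe in the abstract concrete, here is a minimal sketch of the pretrain-then-adapt workflow behind benchmarks like GLUE: a generically pretrained encoder is fine-tuned on a single GLUE task (SST-2, binary sentiment classification), with only a small classification head trained from scratch. The sketch assumes the Hugging Face `datasets` and `transformers` libraries; the model choice and hyperparameters are illustrative, not taken from the talk.

```python
# Sketch: fine-tune a generically pretrained encoder on one GLUE task.
# Model and hyperparameters below are illustrative assumptions.
import numpy as np
from datasets import load_dataset
from transformers import (AutoModelForSequenceClassification, AutoTokenizer,
                          Trainer, TrainingArguments)

# GLUE's SST-2 task: binary sentiment classification of single sentences.
dataset = load_dataset("glue", "sst2")
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")

def tokenize(batch):
    return tokenizer(batch["sentence"], truncation=True,
                     padding="max_length", max_length=128)

dataset = dataset.map(tokenize, batched=True)

# The pretrained encoder already carries most of the "language understanding";
# only the small task-specific classification head starts from scratch.
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=2)

def accuracy(eval_pred):
    logits, labels = eval_pred
    return {"accuracy": (np.argmax(logits, axis=-1) == labels).mean()}

args = TrainingArguments(output_dir="sst2-out", num_train_epochs=1,
                         per_device_train_batch_size=32)
trainer = Trainer(model=model, args=args,
                  train_dataset=dataset["train"],
                  eval_dataset=dataset["validation"],
                  compute_metrics=accuracy)
trainer.train()
print(trainer.evaluate())
```

The data-efficiency point from the abstract can be probed in this setup by shrinking the training set (e.g. `dataset["train"].select(range(1000))`) and watching how little task-specific data the pretrained model needs to reach strong validation accuracy.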
Bio: Sam Bowman has been on the faculty at NYU since 2016, when he completed his PhD with Chris Manning and Chris Potts at Stanford. At NYU, he is jointly appointed between the new school-level Center for Data Science, which focuses on machine learning, and the Department of Linguistics; he is also a co-PI of the CILVR machine learning lab and an affiliate member of the Courant Institute's Department of Computer Science. His research focuses on data, evaluation techniques, and modeling techniques for sentence and paragraph understanding in natural language processing, and on applications of machine learning to scientific questions in linguistic syntax and semantics. He organized a twenty-three-person research team at JSALT 2018 and received a 2015 EMNLP Best Resource Paper Award and a 2017 Google Faculty Research Award.