Quantifying Availability and Discovery in Recommender Systems via Reachability
Abstract: Personalized preference models mediate access to a large portion of online content through their use in recommendation systems. It is therefore important to look beyond measures of accuracy towards notions of access. In this talk, I propose an evaluation based on reachability, which quantifies the maximum probability of recommending a target piece of content over a set of allowable strategic modifications of a user's features. While much recent work in robust machine learning casts strategic manipulation as undesirable, we view the ability of users to influence the model in a positive light.
The proposed framework allows us to compute an upper bound on the likelihood of a recommendation with minimal assumptions about user behavior. Stochastic reachability can be used to detect biases in the availability of content and diagnose limitations in the opportunities for discovery granted to users. We will see that this metric can be computed efficiently and that it is not inherently at odds with accuracy. We demonstrate evaluations of recommendation algorithms trained on large datasets and end with a discussion of open problems.
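As a rough illustration of the metric, the sketch below computes reachability for a toy linear scoring model with softmax recommendation probabilities: the maximum probability of a target item over a small discrete set of allowable feature modifications. All names, shapes, and the choice of model here are illustrative assumptions, not the paper's actual formulation or API.

```python
import numpy as np

def reachability(score_fn, base_features, allowable_mods, target_item):
    """Max probability of recommending `target_item` over allowable
    modifications of the user's feature vector.
    (Illustrative sketch; the actual framework computes an upper bound
    with minimal assumptions about user behavior.)"""
    best = 0.0
    for mod in allowable_mods:
        features = base_features + mod
        scores = score_fn(features)            # one score per item
        probs = np.exp(scores - scores.max())
        probs /= probs.sum()                   # softmax -> recommendation probabilities
        best = max(best, probs[target_item])
    return best

# Toy setup: 3 items scored linearly from 2-dimensional user features.
W = np.array([[1.0, 0.0], [0.0, 1.0], [0.5, 0.5]])
score = lambda x: W @ x
mods = [np.zeros(2), np.array([1.0, 0.0]), np.array([0.0, 1.0])]
print(reachability(score, np.array([0.2, 0.1]), mods, target_item=1))
```

A low reachability value for some item, even under the user's best allowable modification, signals limited availability of that content to the user.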
Joint work with Mihaela Curmei, Ben Recht, and Sarah Rich.
Biography: Sarah is an incoming assistant professor in the Computer Science Department at Cornell. She recently completed her PhD in EECS at UC Berkeley, where she was advised by Ben Recht, and is currently a postdoc with Jamie Morgenstern at the University of Washington. Sarah is interested in the interplay between optimization, machine learning, and dynamics in real-world systems. Her research focuses on understanding the fundamentals of data-driven control and decision-making, broadly categorized into two thrusts: guaranteeing safety in feedback control and ensuring values in social-digital systems. This work is grounded in and inspired by collaborative projects in applications ranging from robotics to recommendation systems.