Media

Our group works on a lot of new problems, where part of the contribution is explaining what makes the problem important and exciting. To that end, we put a lot of care into presenting our research in a way that reaches broad audiences. This has also helped us connect with many experts from different disciplines, which has become an integral part of our work. Below you will find a sampling of videos, talks, and media coverage on various projects.

Selected Research Videos & Talks

(A longer list of publications with links to project websites can be found on our publications page.)

Noise-Coded Illumination, ACM TOG / SIGGRAPH 2025

We use coded noise to add an invisible watermark to lighting that helps detect fake or manipulated video.

Personal Time-Lapse, UIST 2024

One of our UIST 2024 papers, which presents an application for capturing long-term visualizations of the body, particularly to monitor healing and growth.

Press: Chronicle

ReCapture: AR-Guided Time-lapse Photography, UIST 2022

Our UIST 2022 paper, which uses AR guidance to help users capture long-term visualizations of the body, such as for monitoring healing and growth.

Press: Technology.org | The Chronicle | New Atlas | Cornell CIS

Image-Space Modal Bases / Interactive Dynamic Video

This work recovers an image-space modal basis from video of an object, which can then be used to simulate physical deformations of that object. The work is quite a bit older, but has seen renewed interest since the Best Paper Award at CVPR 2024 went to a paper from Google Research that used it to train a network that predicts these simulations from a single image. Plus, it's a fun video.

Media Coverage

Noise-Coded Illumination

This project received a lot of coverage, so the list below is probably not exhaustive.

Some others: The Chronicle, Quantum Zeitgeist, Interesting Engineering, inkl, WebProNews, ITC, New Atlas, TechEBlog, The Hindu, Mid-Day


Abe / The Group

There has also been a lot of press on older projects that Abe has worked on. A sampling can be found here.