By Patricia Waldron
An app developed by Cornell researchers uses augmented reality to help users repeatedly capture images with a phone or tablet from the same location to make time-lapse videos – without leaving a camera on site.
Time-lapse photography, which involves combining photos taken over long periods of time, provides a powerful way to visualize phenomena such as the changing seasons or the movement of the sun. Traditionally, photographers would leave a camera on a tripod at the site for the duration of the event, but researchers in Abe Davis’ group have developed a more convenient method. Their iOS app, ReCapture, is now freely available in the Apple App Store.
Ruyu Yan ’23, a computer science major in Cornell Engineering who was the lead developer of the app, presented the work, “ReCapture: AR-Guided Time-lapse Photography,” at the 2022 Association for Computing Machinery (ACM) Symposium on User Interface Software and Technology (UIST) on Nov. 1.
ReCapture works by repeatedly guiding users back to the same location so they can record new images from a precise viewpoint over time. Depending on how much information is available, the app has three capture modes that cover a range of scenarios. One works best for landscapes, one helps capture close-up scenes, and a third collects a range of images that can be used to reconstruct the scene in 3D offline.
Each capture mode uses different information about the scene. The simplest mode uses an overlay of previous shots to help the user line up new photos. For close-up scenes, which tend to be more difficult to capture, the application tries to figure out where the camera is in 3D space and uses arrows to tell the user how to move and tilt their phone toward the correct location.
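To give a rough sense of the kind of AR guidance described above, here is a minimal Swift sketch of how an app might compare the phone's current camera pose with a previously saved one and tell the user which way to move. This is an illustration only, not code from ReCapture; the type and function names are hypothetical, and in a real ARKit app the pose matrices would come from `ARFrame.camera.transform`.

```swift
import simd

/// Hypothetical guidance result: how far the user should move the phone
/// to return to a previously saved viewpoint.
struct ViewpointGuidance {
    let offset: SIMD3<Float>   // meters to move along x, y, z in world space
    let isAligned: Bool        // true once the phone is close enough
}

/// Compare the current camera pose against a saved reference pose and
/// return a small correction vector. Both poses are 4x4 world-from-camera
/// transforms, as ARKit provides via `ARFrame.camera.transform`.
func guidance(current: simd_float4x4,
              reference: simd_float4x4,
              tolerance: Float = 0.05) -> ViewpointGuidance {
    // The camera's position sits in the fourth column of each transform.
    let currentPos = SIMD3(current.columns.3.x,
                           current.columns.3.y,
                           current.columns.3.z)
    let referencePos = SIMD3(reference.columns.3.x,
                             reference.columns.3.y,
                             reference.columns.3.z)

    // Vector from where the phone is now to where it was last time;
    // the UI could render this as on-screen arrows.
    let offset = referencePos - currentPos
    return ViewpointGuidance(offset: offset,
                             isAligned: simd_length(offset) < tolerance)
}
```

In practice, an app like this would also compare orientation, not just position, and update the guidance on every frame as the user moves.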
The researchers believe this is the first application designed for creating time-lapse videos from handheld devices.
The work grew out of Yan’s summer research with Davis through the Computer Science Undergraduate Research Program (CSURP). Yan had mentioned an interest in geocaching, an activity in which participants use GPS to locate a box of trinkets called a cache, hidden by other geocaching enthusiasts. Meanwhile, Davis had been envisioning a project that would help field researchers repeatedly find and re-photograph precise locations at their field sites to track any changes. Together, they came up with the idea of “geocaching with pictures,” which ultimately evolved into ReCapture.
“Geocaching may be something that people are doing for fun, but if you're a scientist and you're doing field work, then there's a similar kind of problem at play,” said Davis, assistant professor of computer science in the Cornell Ann S. Bowers College of Computing and Information Science.
Jiatian Sun, a doctoral student in the field of computer science, and Longxiulin Deng ’23, a computer science major in Cornell Engineering, also assisted with the study.
Yan said the hardest part was developing the app interface to guide users through the process because “what works intuitively for me may not work intuitively for others.” She sought feedback from 20 beta testers and also worked with the XR Collaboratory at Cornell Tech, which advises researchers on augmented reality, virtual reality, and mixed reality applications.
Additionally, she had to figure out how to manage the mountains of data associated with the photos. “The app used to crash a lot,” she said. This was a problem because if the app was too slow or constantly crashing, people wouldn’t collect enough footage, leading to jerky, poor-quality videos.
In future versions of the app, Davis thinks they may be able to smooth out gaps and abrupt transitions in the footage using recent machine learning techniques, which would yield higher-quality videos.
Besides making GIFs and videos, the app may also have valuable scientific applications, as Davis had envisioned. The team has shared the app with field researchers in other departments at Cornell, and colleagues in the School of Integrative Plant Science have already begun using it to collect data.
So far, Davis is ReCapture’s number one user. This may not be surprising, since he is helping to develop the app, but he also admits to spending more time using it than any other app on his phone – usually during his commute.
“You notice these little things that you would normally just walk past,” he said. “When you're able to compare an image of the way things are right now compared to yesterday, these differences tell a story.”
This work was partially funded through a gift from Meta.
Patricia Waldron is a writer for the Cornell Ann S. Bowers College of Computing and Information Science.