Photometric Ambient Occlusion

Daniel Hauagge, Scott Wehrwein, Kavita Bala, Noah Snavely

We present a method for computing Ambient Occlusion (AO) for a stack of images of a scene taken from a fixed viewpoint. Ambient occlusion, a concept common in computer graphics, characterizes the local visibility at a point: it approximates how much light can reach that point from different directions without being blocked by other geometry. While AO has received surprisingly little attention in vision, we show that it can be approximated using simple, per-pixel statistics over image stacks, based on a simplified image formation model. We use our derived AO measure to compute reflectance and illumination for objects without relying on additional smoothness priors, and demonstrate state-of-the-art performance on the MIT Intrinsic Images benchmark. We also demonstrate our method on several synthetic and real scenes, including 3D printed objects with known ground truth geometry.
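To illustrate the core idea that per-pixel statistics over an image stack carry visibility information, here is a minimal sketch (not the paper's actual estimator, and the function name and the mean-over-peak statistic are our own illustrative choices): a heavily occluded point receives light in fewer frames of the stack, so its mean brightness relative to its peak brightness tends to be lower.

```python
import numpy as np

def photometric_ao_proxy(stack):
    """Rough per-pixel AO proxy from a stack of grayscale images.

    stack: array of shape (N, H, W) -- N images of a fixed scene
    under varying illumination.

    Illustrative only (NOT the paper's estimator): the ratio of a
    pixel's mean brightness to its peak brightness across the stack
    is used as a proxy for local visibility, with 1.0 meaning the
    point was lit in every frame and lower values suggesting that
    surrounding geometry blocked the light in some frames.
    """
    stack = np.asarray(stack, dtype=np.float64)
    peak = stack.max(axis=0)            # brightest observation per pixel
    mean = stack.mean(axis=0)           # average observation per pixel
    eps = 1e-8                          # avoid division by zero in shadows
    return np.clip(mean / (peak + eps), 0.0, 1.0)
```

For example, a pixel that is lit in every frame of a 4-image stack gets a proxy value near 1.0, while one lit in only half the frames gets a value near 0.5.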



Full Paper PAMI2015
PDF (17MB)
Full Paper CVPR2013
PDF (23MB)
Supplemental Material
Full results on the MIT Intrinsic Images benchmark, and comparisons on the Lightwell dataset.
Presentation CVPR2013
Slides used in the oral presentation, in various formats (PDF, Keynote, QuickTime).
ZIP (116MB)
Mathematical Derivation
Mathematica notebook with derivation of formulas in the paper.
NB (0.1MB)
Source Code
MATLAB code used to generate results in the paper.
ZIP (2.3MB)
Datasets
Full set of images and 3D models for the Tentacle and Lightwell datasets.
ZIP (325MB)

BibTeX entry

@inproceedings{hauagge2013photometric,
   title     = {Photometric Ambient Occlusion},
   author    = {Daniel Hauagge and Scott Wehrwein and Kavita Bala and Noah Snavely},
   booktitle = {Proceedings of CVPR},
   year      = {2013}
}

Acknowledgments. This work was supported in part by the NSF (IIS-0963657, IIS-1149393, and IIS-1111534) and the Intel Science and Technology Center for Visual Computing. We also thank the following people for their help and advice: Wenzel Jakob, Sean Bell, Pramook Khungurn, Steve Marschner, and Albert Liu.