Band-Sifting Decomposition for Image Based Material Editing

Ivaylo Boyadzhiev (Cornell University), Kavita Bala (Cornell University), Sylvain Paris (Adobe), Edward H. Adelson (MIT)
ACM Transactions on Graphics 2015
[Teaser figure] Input image (with mask) and band-sifting edits: wet/oily skin (more/less), smooth/shiny glow (more), blemishes (more).
[Pipeline figure] Step 1: sift high frequencies. Step 2: sift high amplitudes. Step 3: sift positive coefficients. Step 4: multiply the selected coefficients by 2. Output: wet/oily skin.
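The four sifting steps above can be sketched in a few lines of NumPy/SciPy. This is a minimal single-scale illustration, not the paper's full multi-scale pipeline: the subband split is approximated with a single Gaussian high-pass filter, and the amplitude threshold (here the median absolute coefficient) and log-domain processing are assumptions for the sake of a runnable example.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def band_sift(img, sigma=2.0, gain=2.0, amp_thresh=None):
    """Single-scale band-sifting sketch (illustrative parameters).

    Mirrors the pipeline figure: (1) sift high frequencies with a
    high-pass filter, (2) keep only high-amplitude coefficients,
    (3) keep only positive ones, (4) multiply the selected
    coefficients by `gain`.
    """
    log_img = np.log1p(img.astype(np.float64))         # work in the log domain
    low = gaussian_filter(log_img, sigma)              # low-frequency residual
    high = log_img - low                               # Step 1: high frequencies
    if amp_thresh is None:
        amp_thresh = np.median(np.abs(high))           # assumed amplitude split
    selected = (np.abs(high) > amp_thresh) & (high > 0)  # Steps 2-3: high, positive
    high_out = np.where(selected, gain * high, high)   # Step 4: boost selection
    return np.expm1(low + high_out)                    # back to linear intensities
```

With `gain > 1` this exaggerates bright, high-frequency detail (the "wet/oily" look); a `gain < 1` on the same selection would attenuate it instead.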

Abstract: Photographers often "prep" their subjects to achieve various effects; for example, toning down overly shiny skin, covering blotches, etc. Making such adjustments digitally, after a shoot, is possible but difficult without good tools and good skills; making them to video footage is harder still. We describe and study a set of 2D image operations, based on multi-scale image analysis, that are easy to apply and that consistently modify perceived material properties. These operators first build a subband decomposition of the image and then selectively modify the coefficients within the subbands. We call this selection process band sifting.

We show that different siftings of the coefficients can be used to modify the appearance of properties such as gloss, smoothness, pigmentation, or weathering. The band-sifting operators have particularly striking effects when applied to faces; they can provide ``knobs'' to make a face look wetter or drier, younger or older, and with heavy or light variation in pigmentation. Through user studies, we identify a set of operators that yield consistent subjective effects for a variety of materials and scenes. We demonstrate that these operators are also useful for processing video sequences.





We thank the anonymous reviewers for their constructive comments. We acknowledge funding from NSF (grant CGF 1161645) and Adobe. We also thank all the participants of our user study.

Page maintained by Ivaylo Boyadzhiev
Last update: September 2015