
CS6644 Programming Assignment 1: Photometric Stereo
Fall 2014
Due date: 11:59pm on 9/30/2014


Overview
In this assignment you'll be implementing a basic version of Photometric Stereo. Given images with different light source directions and corresponding images of a chrome ball to calibrate light directions, you'll recover surface normals, grayscale albedo, and color albedo. Then you'll implement the uncalibrated version, using the SVD to recover normals and grayscale albedo without knowledge of the light directions.
Data
Download the data here. We provide 12 images of each of 6 test objects, as well as corresponding images of a chrome sphere for light direction calibration. Each object has a mask file which specifies which pixels are on the object. Each object also comes with a .txt file specifying the number of images for that object and listing the filenames of the images to simplify file processing.
Implementation
We recommend using MATLAB to implement this assignment, since it has many of the useful tools you'll need conveniently built in. That said, you may use another language at your own peril.

Calibration: The data we've provided is already quite nicely calibrated. The camera response is linear, the focal length is long (providing a good approximation of an orthographic projection), and the light intensities are all equal. All that remains is to compute the light direction for each image using the chrome sphere object. For each image of the chrome sphere, you need to come up with a 3-vector specifying the light direction. This requires the following steps:
- Compute the centroid of the ball mask to find the center point of the ball.
- Compute the radius of the ball.
- For each image, find the centroid of the highlight.
- Compute the surface normal at the highlight based on its distance from the center of the sphere.
- Reflect the viewing direction vector about the surface normal to find the light direction. See the lecture slides for details on this step.
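The steps above can be sketched as follows. This is an illustration in NumPy rather than MATLAB; the function name, the coordinate conventions, and the assumption of an orthographic camera looking along +z (viewing direction V = [0, 0, 1]) are all assumptions for the sketch, and depending on your image coordinate system you may need to flip the sign of the y component.

```python
import numpy as np

def sphere_light_direction(center, radius, highlight):
    """Estimate a light direction from a chrome-sphere highlight.

    center, highlight: (x, y) pixel coordinates; radius: sphere radius
    in pixels. Assumes orthographic viewing direction V = [0, 0, 1].
    """
    cx, cy = center
    hx, hy = highlight
    # Surface normal at the highlight: (x, y) offset from the sphere
    # center, with z recovered from the unit-sphere equation.
    dx, dy = (hx - cx) / radius, (hy - cy) / radius
    nz = np.sqrt(max(0.0, 1.0 - dx**2 - dy**2))
    n = np.array([dx, dy, nz])
    # Mirror reflection of the view direction: L = 2 (N . V) N - V
    v = np.array([0.0, 0.0, 1.0])
    light = 2.0 * n.dot(v) * n - v
    return light / np.linalg.norm(light)
```

For example, a highlight exactly at the sphere's center implies the normal points straight at the camera, so the recovered light direction is [0, 0, 1].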

Surface Normals: Now that we have light directions and images, we can solve for the surface normal and albedo at each pixel. As discussed in lecture, we formulate and solve this as a least squares problem. See the lecture slides for details; make use of MATLAB's built-in least squares solving capabilities via the / or \ operators. You should produce a grayscale albedo map and a color-coded normal map with the red, green, and blue channels representing the x, y, and z components, respectively, of the surface normal vectors.
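As a per-pixel sketch (NumPy here stands in for MATLAB's backslash; the function name is illustrative): stacking the k light directions into a k-by-3 matrix L and the k observed intensities into a vector I, solving L g = I in the least squares sense gives g = albedo * normal, from which the albedo is |g| and the normal is g / |g|.

```python
import numpy as np

def solve_normals(L, I):
    """Least-squares photometric stereo at one pixel.

    L: (k, 3) light directions; I: (k,) observed intensities.
    Solves L g = I, the NumPy analogue of MATLAB's g = L \\ I.
    """
    g, *_ = np.linalg.lstsq(L, I, rcond=None)
    albedo = np.linalg.norm(g)
    normal = g / albedo if albedo > 0 else np.array([0.0, 0.0, 1.0])
    return albedo, normal
```

In practice you would loop this (or vectorize it) over every pixel inside the object mask.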

Color albedo: Now that we have normals and a grayscale albedo, we can take these normals as known and solve for the albedo in each channel. For each channel, find the a that minimizes sum_i (I_i - a (L_i . N))^2. Hint: write the sum of squares explicitly, differentiate with respect to a, and set the derivative to zero. Also see the lecture slides.
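Following the hint, setting the derivative to zero yields the closed form a = sum_i I_i (L_i . N) / sum_i (L_i . N)^2, which might be sketched per pixel as (function name illustrative, NumPy standing in for MATLAB):

```python
import numpy as np

def channel_albedo(I, L, n):
    """Closed-form per-channel albedo at one pixel.

    Minimizing sum_i (I_i - a * (L_i . n))^2 over a gives
    a = sum_i I_i (L_i . n) / sum_i (L_i . n)^2.
    """
    s = L @ n                      # shading term L_i . n for each image
    return float(I @ s) / float(s @ s)
```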
Uncalibrated lights: Implement the uncalibrated photometric stereo method [Hayakawa 1994] discussed in lecture, using the SVD to decompose the measurement matrix. For these simple test objects, you can manually impose constraints on albedo or light intensity in order to resolve the ambiguity in the decomposition. Keep in mind that the simple disambiguation strategy discussed in class and shown in the Hayakawa paper only resolves the ambiguity up to an unknown rigid rotation of the surface normals.
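The SVD step might look like the following sketch (NumPy for illustration; how the singular values are split between the two factors, and the function name, are choices for this sketch). Note this only recovers the factorization up to an unknown invertible 3x3 matrix A, since (L A)(A^-1 S) produces the same measurements; the albedo or light-intensity constraints are what pin A down.

```python
import numpy as np

def uncalibrated_factorization(M):
    """Rank-3 factorization of the measurement matrix (Hayakawa-style).

    M: (k, p) matrix with one image per row, p masked pixels per row.
    Returns L_hat (k, 3) and S_hat (3, p) with M ~= L_hat @ S_hat,
    determined only up to an invertible 3x3 ambiguity.
    """
    U, s, Vt = np.linalg.svd(M, full_matrices=False)
    sqrt_s = np.sqrt(s[:3])
    L_hat = U[:, :3] * sqrt_s            # pseudo light matrix
    S_hat = sqrt_s[:, None] * Vt[:3]     # pseudo (albedo-scaled) normals
    return L_hat, S_hat
```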

Extra credit: Choose and implement one or more of the following add-ons.

Depth from normals: Assuming the surface is smooth, we can use the normal information to impose constraints on the depth values based on the curvature of the surface. Specifically,
 -nx = nz z(x+1,y) - nz z(x,y)
 -ny = nz z(x,y+1) - nz z(x,y)
For pixels on the boundary of the object, where nz is zero, we can use a different constraint that requires the surface normal to be orthogonal to the viewing direction:
 nx z(x,y+1) - nx z(x,y) = ny z(x+1,y) - ny z(x,y)
Form these constraints into a matrix equation of the form Mz = b, where M is 2*nPixels-by-nPixels and b is 2*nPixels-by-1, then use backslash to solve for z. Notice that M is very large, but also very sparse; use a sparse matrix to keep the memory requirements and solve time reasonable.
Note that this naive method may not work perfectly, even for well-calibrated data. If you're feeling ambitious, there are more robust methods. See for example this iterative method due to Horn and Brooks.
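A minimal sketch of the interior constraints, in NumPy/SciPy for illustration: this version integrates a normal map over a full rectangular grid, with no object mask and no boundary constraint (both of which you would need for the real data), and solves the sparse system with an iterative least-squares solver rather than MATLAB's backslash.

```python
import numpy as np
from scipy.sparse import lil_matrix
from scipy.sparse.linalg import lsqr

def depth_from_normals(normals):
    """Integrate an (h, w, 3) normal map into depths z (sketch only).

    Builds the interior constraints
        nz * (z(x+1, y) - z(x, y)) = -nx
        nz * (z(x, y+1) - z(x, y)) = -ny
    as a sparse 2*nPixels-by-nPixels system and solves it in the
    least-squares sense. z is recovered only up to a constant offset.
    """
    h, w, _ = normals.shape
    n = h * w
    idx = lambda y, x: y * w + x         # flatten (y, x) to a pixel index
    M = lil_matrix((2 * n, n))
    b = np.zeros(2 * n)
    row = 0
    for y in range(h):
        for x in range(w):
            nx, ny, nz = normals[y, x]
            if x + 1 < w:                # horizontal constraint
                M[row, idx(y, x + 1)] = nz
                M[row, idx(y, x)] = -nz
                b[row] = -nx
            row += 1
            if y + 1 < h:                # vertical constraint
                M[row, idx(y + 1, x)] = nz
                M[row, idx(y, x)] = -nz
                b[row] = -ny
            row += 1
    return lsqr(M.tocsr(), b)[0].reshape(h, w)
```

For instance, for a planar surface with constant normal (-1, 0, 1)/sqrt(2), the recovered depth increases by 1 per pixel in x.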
Shadows: Come up with a fancier objective function for the normal computation that handles shadows and/or highlights better. For example, a very simple thing to try would be to weight the contribution of each measurement by its intensity value, thereby downweighting the importance of shadowed (dark) pixel observations. Show comparisons between your objective function and the stock approach, and include a discussion of the differences you found.
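The intensity-weighting idea above might be sketched as follows (one possible heuristic, not the required approach; NumPy for illustration). Scaling row i of the system by I_i turns the objective into sum_i I_i^2 (I_i - L_i . g)^2, so dark observations count for less:

```python
import numpy as np

def solve_normals_weighted(L, I):
    """Intensity-weighted least squares at one pixel (shadow heuristic).

    Each row of L g = I is scaled by its own intensity I_i, so dark
    (possibly shadowed) observations contribute less to the fit.
    """
    W = np.diag(I)                       # weights: the intensities themselves
    g, *_ = np.linalg.lstsq(W @ L, W @ I, rcond=None)
    albedo = np.linalg.norm(g)
    normal = g / albedo if albedo > 0 else np.array([0.0, 0.0, 1.0])
    return albedo, normal
```

On shadow-free, consistent data this reduces to the unweighted solution; the difference only appears when some observations violate the Lambertian model.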
Data: Run your code on more interesting data; this could mean objects with (more) shadows, interreflections, subsurface scattering, specularities, etc. Here are a few ideas for other datasets, or you could capture your own:
http://vision.seas.harvard.edu/qsfs/Data.html
http://vision.ucsd.edu/~nalldrin/research/cvpr08/datasets/
http://vision.ucsd.edu/~leekc/ExtYaleDatabase/ExtYaleB.html
http://www.cs.cornell.edu/projects/photoao/
What to hand in

- A zip file with your code, including a readme.txt describing how to run it on the unmodified data directory structure.
- A zip file with a web page (HTML plus images; no need to make it fancy) or a PDF document describing your implementation, any challenges you encountered, and the design decisions you made. For each of the 6 datasets, show results and comment on any obvious failure modes. You should include a sample input image, the grayscale albedo and calibrated surface normals, the color albedo, and the uncalibrated surface normals. Also describe and show results for any extra credit you did.
Collaboration Policy
This assignment is to be done individually. Discussion of ideas and approaches is fine, but you must write your own code.
Acknowledgements
This assignment was adapted from a similar project assigned in UW CSE455.