Image Restoration using Online Photo Collections


Kevin Dale (Harvard University), Micah K. Johnson (MIT), Kalyan Sunkavalli (Harvard University), Wojciech Matusik (Adobe Systems, Inc.), Hanspeter Pfister (Harvard University)
ICCV 2009


Given an input image, we query a large collection of photographs to retrieve the k most similar images; these k images define the input's visual context. The visual context provides a prior on colors for local color transfer. The input and color-matched images are then used to estimate a global restoration that is applied to the input image to yield the final result.




We present an image restoration method that leverages a large database of images gathered from the web. Given an input image, we execute an efficient visual search to find the closest images in the database; these images define the input's visual context. We use the visual context as an image-specific prior and show its value in a variety of image restoration operations, including white balance correction, exposure correction, and contrast enhancement. We evaluate our approach using a database of 1 million images downloaded from Flickr and demonstrate the effect of database size on performance. Our results show that priors based on the visual context consistently outperform generic or even domain-specific priors for these operations.
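The pipeline described above can be sketched in two steps: retrieve the k nearest database images in some descriptor space, then use their color statistics as a prior to correct the input. The sketch below is illustrative only and is not the paper's method: it uses plain L2 nearest-neighbor search over precomputed descriptors (the paper uses an efficient visual search over 1M images) and a simple Reinhard-style per-channel mean/variance transfer in place of the paper's local color transfer and fitted global restoration. All function names and array shapes are assumptions.

```python
import numpy as np

def retrieve_visual_context(query_desc, db_descs, k=3):
    """Return indices of the k database images whose descriptors are
    closest (L2 distance) to the query descriptor.
    query_desc: (d,) array; db_descs: (n, d) array. (Illustrative
    stand-in for the paper's large-scale visual search.)"""
    dists = np.linalg.norm(db_descs - query_desc, axis=1)
    return np.argsort(dists)[:k]

def global_color_transfer(image, context_images):
    """Shift the input's per-channel mean/std toward the pooled
    statistics of the retrieved context images (Reinhard-style
    transfer; a simplified stand-in for the paper's local color
    transfer followed by a global restoration fit)."""
    ctx = np.concatenate([im.reshape(-1, 3) for im in context_images])
    src = image.reshape(-1, 3).astype(np.float64)
    src_mu, src_sd = src.mean(axis=0), src.std(axis=0) + 1e-8
    ctx_mu, ctx_sd = ctx.mean(axis=0), ctx.std(axis=0) + 1e-8
    out = (src - src_mu) / src_sd * ctx_sd + ctx_mu
    return out.reshape(image.shape).clip(0.0, 255.0)
```

In this toy form, the "prior" is just the context images' color distribution; the paper instead restricts the restoration to a parametric global operator estimated from the input and its color-matched context, which is what makes the correction robust to content differences between the input and the retrieved images.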



Kevin Dale, Micah K. Johnson, Kalyan Sunkavalli, Wojciech Matusik, Hanspeter Pfister. Image Restoration using Online Photo Collections. ICCV 2009.