Surface dependent representations for illumination insensitive image comparison

Margarita Osadchy, David W. Jacobs, Michael Lindenbaum

Research output: Contribution to journal › Article › peer-review

Abstract

We consider the problem of matching images to tell whether they come from the same scene viewed under different lighting conditions. We show that the surface characteristics determine the type of image comparison method that should be used. Previous work has shown the effectiveness of comparing the image gradient direction for surfaces with material properties that change rapidly in one direction. We show analytically that two other widely used methods, normalized correlation of small windows and comparison of multiscale oriented filters, essentially compute the same thing. Then, we show that for surfaces whose properties change more slowly, comparison of the output of whitening filters is most effective. This suggests that a combination of these strategies should be employed to compare general objects. We discuss indications that Gabor jets use such a mixed strategy effectively, and we propose a new mixed strategy. We validate our results on synthetic and real images.
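
The abstract contrasts two families of comparison strategies. The following plain-Python sketch (NumPy and SciPy) is a hedged illustration, not the authors' implementation: a gradient-direction distance suited to rapidly varying surfaces, and a normalized whitening-filter distance suited to slowly varying ones, with a Laplacian as an assumed stand-in for a proper whitening filter. All function names and parameters are illustrative.

    import numpy as np
    from scipy import ndimage

    def gradient_direction_distance(img_a, img_b):
        """Mean angular difference between image gradient directions.

        Under a positive affine lighting change (gain * image + offset),
        gradient directions are unchanged, so this distance is near zero
        for images of the same scene; the mod-pi wrap also tolerates a
        sign flip (negative gain). Suited to rapidly varying surfaces."""
        gy_a, gx_a = np.gradient(img_a.astype(float))
        gy_b, gx_b = np.gradient(img_b.astype(float))
        diff = np.abs(np.arctan2(gy_a, gx_a) - np.arctan2(gy_b, gx_b)) % np.pi
        return np.minimum(diff, np.pi - diff).mean()

    def whitened_distance(img_a, img_b, eps=1e-12):
        """Distance between energy-normalized whitening-filter outputs.

        A Laplacian stands in for a whitening filter (it roughly flattens
        the 1/f^2 power spectrum typical of natural images); normalizing
        each output removes overall lighting gain. Suited to surfaces
        whose properties change slowly."""
        w_a = ndimage.laplace(img_a.astype(float))
        w_b = ndimage.laplace(img_b.astype(float))
        w_a /= np.linalg.norm(w_a) + eps
        w_b /= np.linalg.norm(w_b) + eps
        return np.linalg.norm(w_a - w_b)

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        # Smooth synthetic "scene" and the same scene under a toy
        # affine lighting change, plus an unrelated scene.
        scene = ndimage.gaussian_filter(rng.standard_normal((64, 64)), 3.0)
        relit = 1.8 * scene + 0.2
        other = ndimage.gaussian_filter(rng.standard_normal((64, 64)), 3.0)
        print(gradient_direction_distance(scene, relit))  # near 0 (same scene)
        print(gradient_direction_distance(scene, other))  # clearly larger
        print(whitened_distance(scene, relit))            # near 0 (same scene)
        print(whitened_distance(scene, other))            # clearly larger

On this toy affine lighting change both distances are near zero for the same scene and clearly larger for a different one; a mixed strategy of the kind the abstract proposes could weight the two distances by how rapidly the surface properties vary.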

Original language: English
Pages (from-to): 98-111
Number of pages: 14
Journal: IEEE Transactions on Pattern Analysis and Machine Intelligence
Volume: 29
Issue number: 1
DOIs
State: Published - Jan 2007

Bibliographical note

Funding Information:
This work was supported by the Israel Science Foundation and by the MUSCLE NoE. The authors would like to thank Robert Adler, Irad Yavne, and the anonymous reviewers for their advice.

Keywords

  • Gaussian random surface
  • Illumination
  • Image comparison
  • Whitening

ASJC Scopus subject areas

  • Software
  • Computer Vision and Pattern Recognition
  • Computational Theory and Mathematics
  • Artificial Intelligence
  • Applied Mathematics
