Shape reconstruction of 3D bilaterally symmetric surfaces

Ilan Shimshoni, Yael Moses, Michael Lindenbaum

Research output: Chapter in Book/Report/Conference proceeding › Conference contribution › peer-review

Abstract

The paper presents a new approach for shape recovery that integrates geometric and photometric information. We consider 3D objects that are symmetric with respect to a plane (e.g., faces) and their reconstruction from a single image. Neither the viewpoint nor the illumination is necessarily frontal. In principle, no correspondence between symmetric points is required, but knowledge of a few corresponding pairs accelerates the process. The basic idea is that an image taken from a general, non-frontal viewpoint, under non-frontal illumination, can be regarded as a pair of images of half of the object, taken from two different viewing positions and under two different lighting directions. We show that integrating the photometric and geometric information yields the unknown lighting and viewing parameters, as well as a dense correspondence between pairs of symmetric points. As a result, a dense shape recovery of the object is computed. The method has been implemented and tested on simulated and real data.
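To make the geometric observation concrete, here is a minimal Python sketch (not from the paper; the symmetry plane, camera centre, and light direction are hypothetical illustrative values) showing how reflecting the camera and the light source across the symmetry plane produces the "virtual" second view and second light source:

```python
import numpy as np

def reflect_across_plane(v, n, d=0.0):
    """Reflect v across the plane n . x + d = 0 (n is a unit normal).
    With d = 0 this also reflects direction vectors."""
    return v - 2.0 * (np.dot(n, v) + d) * n

# Assume the object's symmetry plane is x = 0: n = (1, 0, 0), d = 0.
n = np.array([1.0, 0.0, 0.0])

# A non-frontal camera centre and light direction (illustrative values).
camera_centre = np.array([0.5, 0.2, 5.0])
light_dir = np.array([0.3, 0.4, 1.0])
light_dir /= np.linalg.norm(light_dir)

# Mirror them across the symmetry plane: one image of the full object then
# behaves like two images of half the object, taken from camera_centre and
# virtual_camera under light_dir and virtual_light respectively.
virtual_camera = reflect_across_plane(camera_centre, n)
virtual_light = reflect_across_plane(light_dir, n)

print(virtual_camera)  # [-0.5  0.2  5. ]
print(virtual_light)
```

Under this view, the unknowns the paper recovers (the viewing and lighting parameters, i.e., the pose of the symmetry plane relative to the camera and light) determine the virtual pair, after which dense symmetric correspondences can be triangulated as in standard stereo.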

Original language: English
Title of host publication: Proceedings - International Conference on Image Analysis and Processing, ICIAP 1999
Pages: 76-83
Number of pages: 8
DOIs
State: Published - 1999
Externally published: Yes
Event: 10th International Conference on Image Analysis and Processing, ICIAP 1999 - Venice, Italy
Duration: 27 Sep 1999 → 29 Sep 1999

Publication series

Name: Proceedings - International Conference on Image Analysis and Processing, ICIAP 1999

Conference

Conference: 10th International Conference on Image Analysis and Processing, ICIAP 1999
Country/Territory: Italy
City: Venice
Period: 27/09/99 → 29/09/99

ASJC Scopus subject areas

  • Computer Vision and Pattern Recognition
