DocumentCode
2472908
Title
Recovering Euclidean deformable models from stereo-motion
Author
Lladó, Xavier ; Del Bue, Alessio ; Agapito, Lourdes
Author_Institution
Univ. of Girona, Girona, Spain
fYear
2008
fDate
8-11 Dec. 2008
Firstpage
1
Lastpage
4
Abstract
In this paper we present a novel structure from motion (SfM) approach able to infer 3D deformable models from uncalibrated stereo images. Using a stereo setup dramatically improves 3D model estimation when the observed shape is mostly deforming without undergoing strong rigid motion. Our approach first calibrates the stereo system automatically and then computes a single metric rigid structure for each frame. These 3D shapes are then aligned to a reference view using a RANSAC procedure in order to compute the mean shape of the object and to select the subset of points that have remained rigid throughout the sequence without deforming. The selected rigid points are used to compute frame-wise shape registration and to extract the motion parameters robustly from frame to frame. Finally, all of this information is used in a global optimization stage with bundle adjustment, which refines the frame-wise initial solution and recovers the non-rigid 3D model. Results on synthetic and real data demonstrate the performance of the proposed method even when there is no rigid motion in the original sequence.
Keywords
feature extraction; image motion analysis; image registration; stereo image processing; 3D model estimation; Euclidean deformable models; RANSAC method; framewise shape registration; global optimization stage; motion parameter extraction; structure from motion approach; uncalibrated stereo images; Cameras; Data mining; Deformable models; Image reconstruction; Image sequences; Motion estimation; Nonlinear distortion; Robustness; Shape; Stereo image processing;
fLanguage
English
Publisher
ieee
Conference_Titel
19th International Conference on Pattern Recognition (ICPR 2008)
Conference_Location
Tampa, FL
ISSN
1051-4651
Print_ISBN
978-1-4244-2174-9
Electronic_ISBN
1051-4651
Type
conf
DOI
10.1109/ICPR.2008.4761003
Filename
4761003
Link To Document