DocumentCode :
1800923
Title :
Integrating Motion and Illumination Models for 3D Tracking
Author :
Roy-Chowdhury, Amit K. ; Xu, Yilei
Author_Institution :
University of California, Riverside, CA
fYear :
2005
fDate :
17-18 Nov. 2005
Firstpage :
123
Lastpage :
134
Abstract :
One of the persistent challenges in computer vision is tracking objects under varying lighting conditions. In this paper we present a method for estimating the 3D motion of a rigid object from a monocular video sequence captured under arbitrarily changing illumination. This is achieved by alternately estimating motion and illumination parameters with a generative model that integrates the effects of motion, illumination, and structure within a unified mathematical framework. Motion is represented as translation and rotation of the object centroid, and illumination is represented in a spherical-harmonics linear basis. The method assumes no model for how the illumination varies: lighting can change slowly or drastically. For the multi-camera tracking scenario, we propose a new photometric constraint that holds over the overlapping field of view of two cameras. It is similar in nature to the well-known epipolar constraint, except that it relates photometric parameters, and it provides an additional constraint for illumination-invariant multi-camera tracking. We demonstrate the effectiveness of our tracking algorithm on single- and multi-camera video sequences under severe changes in lighting conditions.
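The sketch below is a minimal illustration, not the authors' implementation, of the alternation the abstract describes: image intensity is generated by a known textured 3D model (here a Lambertian sphere with a synthetic albedo map), illumination is a linear spherical-harmonics model (truncated to first order, 4 coefficients, for brevity), and motion and lighting are estimated in turn. The albedo map, the orthographic camera, the rotation-only motion, and all function names below are assumptions made to keep the example short and runnable.

```python
# Toy alternating estimation of rigid motion and spherical-harmonic lighting.
# Assumptions: orthographic camera, unit-sphere object with a known synthetic
# albedo map, first-order SH lighting, rotation-only motion.

import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

# Camera pixels that see the front of a unit sphere (orthographic projection).
xs, ys = np.meshgrid(np.linspace(-0.9, 0.9, 40), np.linspace(-0.9, 0.9, 40))
mask = xs**2 + ys**2 < 0.81
x, y = xs[mask], ys[mask]
normals = np.stack([x, y, np.sqrt(1.0 - x**2 - y**2)], axis=1)  # world-frame normals

def albedo(obj_pts):
    """Synthetic texture attached to the object, defined in object coordinates."""
    u, v, w = obj_pts.T
    return 0.6 + 0.3 * np.sin(4 * u) * np.cos(3 * v) + 0.1 * np.sin(5 * w)

def render(rotvec, light):
    """Generative model: texture moves with the object, shading follows the SH lighting."""
    obj_pts = Rotation.from_rotvec(rotvec).inv().apply(normals)  # pixel -> object surface
    shading = np.hstack([np.ones((len(normals), 1)), normals]) @ light
    return albedo(obj_pts) * shading

def estimate_light(rotvec, image):
    """Motion fixed: the SH coefficients enter linearly, so solve least squares."""
    obj_pts = Rotation.from_rotvec(rotvec).inv().apply(normals)
    design = albedo(obj_pts)[:, None] * np.hstack([np.ones((len(normals), 1)), normals])
    light, *_ = np.linalg.lstsq(design, image, rcond=None)
    return light

def estimate_motion(light, image, rotvec0):
    """Lighting fixed: refine the rotation by minimizing the photometric residual."""
    cost = lambda r: np.sum((render(r, light) - image) ** 2)
    return minimize(cost, rotvec0, method="Nelder-Mead").x

# Synthetic frame generated with an unknown pose and lighting, plus noise.
rng = np.random.default_rng(0)
true_rot = np.array([0.10, -0.15, 0.05])
true_light = np.array([0.5, 0.4, 0.2, 0.6])
frame = render(true_rot, true_light) + 0.01 * rng.normal(size=len(normals))

rot, light = np.zeros(3), np.array([1.0, 0.0, 0.0, 0.0])
for _ in range(8):                      # alternate illumination and motion updates
    light = estimate_light(rot, frame)
    rot = estimate_motion(light, frame, rot)

print("estimated rotation (rad):", np.round(rot, 3))
print("estimated SH lighting   :", np.round(light, 3))
```

In this toy setup the lighting step is an ordinary linear least-squares fit because, with the pose held fixed, the rendered intensities are linear in the spherical-harmonic coefficients; the motion step is a small nonlinear refinement. The paper's full framework additionally models translation, uses the higher-order spherical-harmonics basis, and works directly on real video frames.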
Keywords :
Computer vision; Image motion analysis; Lighting; Mathematical model; Motion estimation; Nonlinear optics; Optical variables control; Photometry; Tracking; Video sequences;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
Computer Vision for Interactive and Intelligent Environment, 2005
Print_ISBN :
0-7695-2524-5
Type :
conf
DOI :
10.1109/CVIIE.2005.11
Filename :
1623774