Author :
Ying Wu ; Jialue Fan
Author_Institution :
Northwestern Univ., Evanston, IL, USA
Abstract :
Matching based on local brightness is quite limited, because small changes in local appearance invalidate brightness constancy. The root of this limitation is that brightness matching disregards information from the spatial context. This paper leaps from brightness constancy to context constancy, and thus from optical flow to contextual flow. It presents a new approach that incorporates contexts to constrain motion estimation for target tracking. In this approach, the individual spatial context of a given pixel is represented by the posterior density of the associated feature class over its contextual domain. Each individual context gives a linear contextual flow constraint on the motion, so that the motion can be estimated from an over-determined contextual system. Based on this contextual flow model, the paper presents a new and powerful target tracking method that integrates the processes of salient contextual point selection, robust contextual matching, and dynamic context selection. Extensive experimental results show the effectiveness of the proposed approach.
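The abstract's core computational idea — each spatial context contributing one linear constraint, with the motion recovered from an over-determined system — can be sketched as a least-squares solve. This is only a minimal illustration under stated assumptions, not the authors' implementation: the constraint matrix `A` and vector `b` are hypothetical stand-ins for the per-context spatial gradients and temporal differences that the paper derives from contextual posterior densities.

```python
import numpy as np

def estimate_motion(A, b):
    """Solve the over-determined contextual system A @ [u, v] ~= b.

    Each row of A is assumed to encode the spatial gradient of one
    contextual feature's posterior density, and the matching entry of b
    its temporal difference; with more constraints (rows) than the two
    motion unknowns, least squares gives the motion estimate, analogous
    to Lucas-Kanade but with contextual rather than brightness cues.
    """
    motion, *_ = np.linalg.lstsq(A, b, rcond=None)
    return motion

# Toy example: 5 hypothetical contextual constraints, true motion (2, -1).
rng = np.random.default_rng(0)
A = rng.standard_normal((5, 2))
b = A @ np.array([2.0, -1.0])   # noise-free constraints for illustration
u, v = estimate_motion(A, b)
```

With noise-free constraints the least-squares solution recovers the true motion exactly; in practice the over-determination is what lends the estimate robustness to individual noisy contexts.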
Keywords :
image matching; motion estimation; brightness constancy; context constancy; dynamic context selection; linear contextual flow constraint; optical flow; robust contextual matching; salient contextual point selection; target tracking; Brightness; Context modeling; Image motion analysis; Motion analysis; Motion estimation; Optical noise; Optical sensors; Pattern matching; Robustness; Target tracking;
Conference_Titel :
Computer Vision and Pattern Recognition, 2009. CVPR 2009. IEEE Conference on
Conference_Location :
Miami, FL, USA
Print_ISBN :
978-1-4244-3992-8
DOI :
10.1109/CVPR.2009.5206719