DocumentCode :
3754627
Title :
Equivalent projection based distortion invariant visual tracking for omnidirectional vision
Author :
Yazhe Tang;Shaorong Xie;Feng Lin;Jianyu Yang;Youfu Li
Author_Institution :
Temasek Laboratories, National University of Singapore, Singapore
fYear :
2015
Firstpage :
584
Lastpage :
589
Abstract :
Catadioptric omnidirectional images suffer from severe distortions caused by the quadric mirrors involved. For this reason, most visual features developed on the basis of the perspective projection model struggle to achieve satisfactory performance when applied directly to omnidirectional images. To accurately account for the deformed target neighborhood, this paper employs an equivalent projection approach to effectively model the distortion of the omnidirectional camera. On the basis of equivalent projection, the paper presents a distortion-invariant multi-feature fusion method for robust feature representation in omnidirectional images. Using a Gaussian Mixture Model (GMM), multiple features are integrated into a single probabilistic framework; in other words, the GMM transforms the feature matching problem into a multi-channel clustering problem. A fragment-based tracking framework then handles partial occlusion robustly by means of an adaptive weighting mechanism. Finally, a series of experiments is presented to validate the performance of the proposed algorithm.
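To illustrate the idea of treating feature matching as multi-channel clustering under a GMM, the following minimal sketch fits a Gaussian Mixture to stacked per-pixel features of a target patch and scores candidate patches by likelihood. This is not the authors' implementation: the feature channels (color plus gradient magnitude), the component count, and the use of scikit-learn's GaussianMixture are all assumptions made for illustration; the paper's equivalent-projection distortion handling and adaptive fragment weighting are omitted.

import numpy as np
from sklearn.mixture import GaussianMixture


def build_patch_gmm(patch_rgb, n_components=3):
    """Fit a GMM to per-pixel features of a target patch (illustrative only).

    patch_rgb: (H, W, 3) float array in [0, 1].
    Returns the fitted GaussianMixture model.
    """
    gray = patch_rgb.mean(axis=2)
    gy, gx = np.gradient(gray)
    grad_mag = np.sqrt(gx ** 2 + gy ** 2)
    # Stack color channels and gradient magnitude into one feature vector per pixel,
    # so that several feature channels live in a single probabilistic model.
    feats = np.concatenate(
        [patch_rgb.reshape(-1, 3), grad_mag.reshape(-1, 1)], axis=1
    )
    gmm = GaussianMixture(n_components=n_components, covariance_type="full")
    gmm.fit(feats)
    return gmm


def patch_likelihood(gmm, candidate_rgb):
    """Score a candidate patch by its mean log-likelihood under the target GMM."""
    gray = candidate_rgb.mean(axis=2)
    gy, gx = np.gradient(gray)
    grad_mag = np.sqrt(gx ** 2 + gy ** 2)
    feats = np.concatenate(
        [candidate_rgb.reshape(-1, 3), grad_mag.reshape(-1, 1)], axis=1
    )
    return gmm.score(feats)  # higher means a better match


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    target = rng.random((32, 32, 3))
    candidate = np.clip(target + 0.05 * rng.standard_normal(target.shape), 0, 1)
    model = build_patch_gmm(target)
    print("candidate log-likelihood:", patch_likelihood(model, candidate))

In a fragment-based setting, one such model could be fitted per fragment and the per-fragment likelihoods combined with adaptive weights, as the abstract describes at a high level.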
Keywords :
"Conferences","Robots","Biomimetics"
Publisher :
ieee
Conference_Title :
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)
Type :
conf
DOI :
10.1109/ROBIO.2015.7418831
Filename :
7418831