DocumentCode :
3863356
Title :
Inertial Guided Visual Sample Consensus based wearable orientation estimation for body motion tracking
Author :
Yinlong Zhang;Jindong Tan;Wei Liang;Yang Li
Author_Institution :
Key Laboratory of Networked Control Systems, Shenyang Institute of Automation, Chinese Academy of Sciences, Shenyang, 110016, China
fYear :
2015
Firstpage :
2271
Lastpage :
2276
Abstract :
This paper presents a novel orientation estimation scheme based on an Inertial Guided Visual SAmple Consensus (IGVSAC) strategy for human body motion tracking. Unlike traditional vision-based orientation estimation methods, in which outliers among image-pair putative correspondences are removed by hypothesize-and-verify models such as the costly RANSAC, our approach exploits motion priors (i.e., rotation and translation) deduced from the quick-response Inertial Measurement Unit (IMU) as the initial body pose to assist the visual sensor in removing hidden outliers, effectively overcoming the major drawback of such sample-and-consensus models. In addition, the IGVSAC algorithm maintains estimation accuracy even when a large proportion of the correspondences are outliers. In turn, the orientation estimated by the visual sensor corrects the IMU estimates through a feedback mechanism, which addresses the IMU's inherent long-term drift. Extensive experiments verify the effectiveness and robustness of the IGVSAC algorithm. Comparisons with the highly accurate VICON optical motion tracking system show that the proposed orientation estimation system is well suited to capturing human body joint motion.
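The abstract describes using an IMU-predicted relative pose as a geometric prior to screen putative correspondences instead of running a full hypothesize-and-verify loop. The sketch below is not the authors' implementation; it only illustrates that core idea under simple assumptions (normalized image coordinates, an IMU-predicted rotation R_imu and translation direction t_imu), with hypothetical function names, by scoring each correspondence against the essential matrix implied by the inertial prior.

```python
# Hypothetical sketch of IMU-guided outlier rejection (not the paper's code).
import numpy as np

def skew(t):
    """Skew-symmetric matrix so that skew(t) @ v == np.cross(t, v)."""
    return np.array([[0.0, -t[2], t[1]],
                     [t[2], 0.0, -t[0]],
                     [-t[1], t[0], 0.0]])

def imu_guided_inliers(pts1, pts2, R_imu, t_imu, thresh=1e-3):
    """Flag putative correspondences consistent with the IMU-predicted motion.

    pts1, pts2 : (N, 2) normalized image coordinates in frame 1 / frame 2.
    R_imu      : (3, 3) rotation predicted by integrating gyroscope data.
    t_imu      : (3,)   translation direction predicted from the IMU (assumed).
    Returns a boolean inlier mask based on the Sampson distance to the
    essential matrix E = [t]_x R implied by the inertial prior.
    """
    E = skew(t_imu / np.linalg.norm(t_imu)) @ R_imu
    x1 = np.hstack([pts1, np.ones((len(pts1), 1))])   # homogeneous coords
    x2 = np.hstack([pts2, np.ones((len(pts2), 1))])
    Ex1 = x1 @ E.T                                    # rows are E @ x1_i
    Etx2 = x2 @ E                                     # rows are E^T @ x2_i
    num = np.sum(x2 * Ex1, axis=1) ** 2               # (x2_i^T E x1_i)^2
    den = Ex1[:, 0]**2 + Ex1[:, 1]**2 + Etx2[:, 0]**2 + Etx2[:, 1]**2
    sampson = num / den
    return sampson < thresh
```

In this reading, the surviving inliers would then feed a standard relative-pose estimate, and the resulting visual orientation could be fed back to bound the IMU's drift, as the abstract outlines.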
Keywords :
"Visualization","Cameras","Estimation","Tracking","Robustness","Linear programming","Robot sensing systems"
Publisher :
ieee
Conference_Titel :
2015 IEEE International Conference on Robotics and Biomimetics (ROBIO)
Type :
conf
DOI :
10.1109/ROBIO.2015.7419112
Filename :
7419112