Title :
Bayesian 3D independent motion segmentation with IMU-aided RGB-D sensor
Author :
Lobo, Jorge ; Ferreira, João Filipe ; Trindade, Pedro ; Dias, Jorge
Author_Institution :
Dept. of Electr. & Comput. Eng., Univ. of Coimbra, Coimbra, Portugal
Abstract :
In this paper we propose a two-tiered hierarchical Bayesian model to estimate the location of objects moving independently from the observer. Biological vision systems are very successful in motion segmentation, since they efficiently combine flow analysis with accumulated prior knowledge of the 3D structure of the scene. Artificial perception systems may also build 3D structure maps and use optical flow to provide cues for ego- and independent motion segmentation. Using inertial and magnetic sensors together with an image and depth sensor (RGB-D), we propose a method to obtain registered 3D maps, which are subsequently used in a probabilistic model (the bottom tier of the hierarchy) that performs background subtraction across several frames to provide a prior on moving objects. The egomotion of the RGB-D sensor is estimated starting with the angular pose obtained from the filtered accelerometer and magnetic data. The translation is derived from matched points across the images and the corresponding 3D points in the rotation-compensated depth maps. A gyro-aided Lucas-Kanade tracker is used to obtain matched points across the images; the tracked points can also be used to refine the initial sensor-based rotation estimate. Once the camera egomotion has been determined, the optical flow estimated under a static-scene assumption can be compared with the observed optical flow via a probabilistic model (the top tier of the hierarchy), using the result of the background subtraction process as a prior, in order to identify volumes with independent motion in the corresponding 3D point cloud. To deal with the computational load, CUDA-based solutions on GPUs were used. Experimental results are presented showing the validity of the proposed approach.
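To illustrate the flow-comparison step described above, the following is a minimal sketch in Python/NumPy (not the authors' CUDA implementation): it predicts the optical flow of each tracked point under the estimated camera egomotion assuming a static scene, compares it with the observed flow from the tracker, and flags points whose residual is large. All function names, the pinhole projection model, the frame convention for (R, t), and the fixed pixel threshold are illustrative assumptions; the paper instead evaluates this evidence within the top-tier probabilistic model, using the background-subtraction result as a prior.

import numpy as np

def backproject(u, v, z, K):
    """Pixel (u, v) with depth z -> 3D point in the first camera frame."""
    fx, fy, cx, cy = K[0, 0], K[1, 1], K[0, 2], K[1, 2]
    return np.array([(u - cx) * z / fx, (v - cy) * z / fy, z])

def project(P, K):
    """3D point in the camera frame -> pixel coordinates."""
    p = K @ P
    return p[:2] / p[2]

def predicted_flow(u, v, z, K, R, t):
    """Flow expected at (u, v) if the scene is static and the camera moved by
    rotation R and translation t between the two frames (assumed convention:
    R, t map first-frame coordinates into the second camera frame)."""
    P0 = backproject(u, v, z, K)
    P1 = R @ P0 + t
    return project(P1, K) - np.array([u, v])

def independent_motion_flags(tracks, depth, K, R, t, thresh_px=2.0):
    """tracks: list of ((u0, v0), (u1, v1)) matches from the (gyro-aided)
    Lucas-Kanade tracker; depth: depth map of the first frame.
    Returns one boolean per track: True if the observed flow deviates from
    the egomotion-predicted flow by more than thresh_px pixels."""
    flags = []
    for (u0, v0), (u1, v1) in tracks:
        z = depth[int(v0), int(u0)]
        if z <= 0:                      # no valid depth measurement
            flags.append(False)
            continue
        f_pred = predicted_flow(u0, v0, z, K, R, t)
        f_obs = np.array([u1 - u0, v1 - v0])
        flags.append(np.linalg.norm(f_obs - f_pred) > thresh_px)
    return np.array(flags)

In the paper, this hard threshold is replaced by a likelihood that, combined with the background-subtraction prior, yields a posterior over independently moving volumes in the 3D point cloud.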
Keywords :
Bayes methods; accelerometers; graphics processing units; image motion analysis; image segmentation; image sensors; image sequences; inertial systems; parallel architectures; pose estimation; 3D point cloud; 3D scene structure; 3D structure maps; Bayesian 3D independent motion segmentation; CUDA-based solutions; GPU; IMU-aided RGB-D sensor; accelerometers; angular pose estimation; artificial perception systems; background subtraction; biological vision systems; depth sensor; ego-motion segmentation; flow analysis; gyro-aided Lucas-Kanade tracker; image sensor; inertial sensors; initial sensor-based rotation estimation; magnetic data; magnetic sensors; object location estimation; optical flow; probabilistic model; rotation-compensated depth maps; two-tiered hierarchical Bayesian model; Cameras; Computer vision; Motion segmentation; Observers; Optical imaging; Optical sensors; Robot sensing systems;
Conference_Title :
Multisensor Fusion and Integration for Intelligent Systems (MFI), 2012 IEEE Conference on
Conference_Location :
Hamburg
Print_ISBN :
978-1-4673-2510-3
Electronic_ISBN :
978-1-4673-2511-0
DOI :
10.1109/MFI.2012.6343023