DocumentCode
2540540
Title
Underwater environment reconstruction using stereo and inertial data
Author
Hogue, Andrew ; German, Andrew ; Jenkin, Michael
Author_Institution
Univ. of Ontario Inst. of Technol., Oshawa, ON, Canada
fYear
2007
fDate
7-10 Oct. 2007
Firstpage
2372
Lastpage
2377
Abstract
The underwater environment presents many challenges for robotic sensing, including highly variable lighting, the presence of dynamic objects, and the six degree of freedom (6DOF) 3D environment. Yet in spite of these challenges, the aquatic environment presents many real and practical applications for robotic sensors. A common requirement of many of these tasks is the need to construct accurate 3D representations of structures in the environment. To address this requirement, we have developed a stereo vision-inertial sensing device that we have successfully deployed to reconstruct complex 3D structures in both the aquatic and terrestrial domains. The sensor temporally combines 3D information, obtained using stereo vision algorithms, with orientation data from a 3DOF inertial sensor. The resulting point cloud model is then converted to a volumetric representation, and a textured polygonal mesh is extracted for later processing. Recent underwater reconstructions of wrecks and coral obtained with the sensor are presented.
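The pipeline described in the abstract (stereo-derived 3D points, oriented into a world frame using the 3DOF inertial estimate, then accumulated into a volumetric representation) can be illustrated with a minimal sketch. This is not the authors' implementation: the function names, roll/pitch/yaw interface, grid resolution, and the use of a plain occupancy grid are illustrative assumptions; the final mesh-extraction step (e.g. via marching cubes) is omitted.

# Minimal sketch (assumed interface, not the paper's code): register a
# stereo-derived point cloud into a world-aligned voxel grid using the
# orientation reported by a 3DOF inertial sensor.
import numpy as np

def rotation_from_rpy(roll, pitch, yaw):
    """World-from-camera rotation matrix from roll/pitch/yaw in radians (ZYX order)."""
    cr, sr = np.cos(roll), np.sin(roll)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    return Rz @ Ry @ Rx

def voxelize(points_world, voxel_size=0.05):
    """Bin 3D points (N x 3, metres) into a sparse set of occupied voxel indices."""
    idx = np.floor(points_world / voxel_size).astype(np.int64)
    return {tuple(v) for v in idx}

# Example: points from one stereo frame (camera coordinates, metres),
# levelled into the world frame using the inertial roll/pitch/yaw estimate.
points_cam = np.array([[0.10, 0.20, 1.5],
                       [0.10, 0.25, 1.5],
                       [-0.30, 0.00, 2.0]])
R_wc = rotation_from_rpy(roll=0.02, pitch=-0.05, yaw=1.2)
points_world = points_cam @ R_wc.T
occupied = voxelize(points_world, voxel_size=0.05)
print(f"{len(occupied)} occupied voxels")

Accumulating the occupied-voxel sets over successive frames yields the volumetric model from which a textured polygonal mesh would then be extracted, per the abstract.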
Keywords
feature extraction; image reconstruction; image representation; robot vision; stereo image processing; 3DOF inertial sensor; point cloud model; robotic sensors; stereo vision-inertial sensing device; textured polygonal mesh extraction; underwater environment reconstruction; volumetric representation; Inspection; Layout; Monitoring; Pollution measurement; Robot sensing systems; Robot vision systems; Size measurement; Stereo vision; Temperature sensors; Video recording
fLanguage
English
Publisher
ieee
Conference_Title
IEEE International Conference on Systems, Man and Cybernetics (ISIC), 2007
Conference_Location
Montreal, Que.
Print_ISBN
978-1-4244-0990-7
Electronic_ISBN
978-1-4244-0991-4
Type
conf
DOI
10.1109/ICSMC.2007.4413666
Filename
4413666