DocumentCode :
647426
Title :
3D High Dynamic Range dense visual SLAM and its application to real-time object re-lighting
Author :
Meilland, Maxime ; Barat, Christian ; Comport, Andrew
Author_Institution :
I3S, Univ. of Nice Sophia-Antipolis, Nice, France
fYear :
2013
fDate :
1-4 Oct. 2013
Firstpage :
143
Lastpage :
152
Abstract :
Acquiring High Dynamic Range (HDR) light-fields from several images with different exposures (sensor integration periods) has been widely considered for static camera positions. In this paper a new approach is proposed that enables 3D HDR environment maps to be acquired directly from a dynamic set of images in real time. In particular, a method is proposed to use an RGB-D camera as a dynamic light-field sensor, based on a dense real-time 3D tracking and mapping approach, that avoids the need for a light-probe or the observation of reflective surfaces. The 6-DOF pose and dense scene structure are estimated simultaneously with the observed dynamic range so as to compute the radiance map of the scene and fuse a stream of low dynamic range (LDR) images into an HDR image. This is then used to create an arbitrary number of virtual omni-directional light-probes placed at the positions where virtual augmented objects are to be rendered. In addition, a solution is provided for the problem of automatic shutter variations in visual SLAM. Augmented reality results are provided which demonstrate real-time 3D HDR mapping, virtual light-probe synthesis and light source detection for rendering reflective objects with shadows seamlessly into the real video stream in real time.
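The LDR-to-HDR fusion step mentioned in the abstract can be illustrated with a minimal sketch. This assumes a linear camera response, perfectly aligned frames, and a simple hat-shaped pixel weighting; the function name and weighting are illustrative assumptions, not the authors' implementation, which additionally estimates camera pose, dense structure, and automatic shutter variation:

```python
import numpy as np

def fuse_ldr_to_hdr(images, exposure_times):
    """Fuse a stack of aligned LDR frames (float arrays in [0, 1]) into one
    HDR radiance map by exposure-weighted averaging.

    Assumes a linear camera response; a real pipeline would first recover
    the response curve and register the frames."""
    acc = np.zeros_like(images[0], dtype=np.float64)
    weight_sum = np.zeros_like(acc)
    for img, t in zip(images, exposure_times):
        # Hat weighting: trust mid-range pixels, down-weight
        # under- and over-exposed ones.
        w = 1.0 - np.abs(2.0 * img - 1.0)
        acc += w * (img / t)  # divide by exposure time to get radiance
        weight_sum += w
    return acc / np.maximum(weight_sum, 1e-8)

# Example: three synthetic exposures of a constant-radiance scene.
radiance = 0.3
times = [0.5, 1.0, 2.0]
ldrs = [np.clip(radiance * t * np.ones((2, 2)), 0.0, 1.0) for t in times]
hdr = fuse_ldr_to_hdr(ldrs, times)  # recovers ~0.3 at every pixel
```

With a static camera this is the classical multi-exposure radiance recovery; the paper's contribution is doing it from a moving RGB-D stream in real time.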
Keywords :
SLAM (robots); augmented reality; object tracking; rendering (computer graphics); 3D HDR environment maps; 3D HDR mapping; 3D high dynamic range dense visual SLAM; HDR light-field; RGB-D camera; augmented reality; automatic shutter variation; dense real-time 3D tracking; dense scene structure; dynamic light-field sensor; light source detection; low dynamic range image; pose structure; radiance map; real video stream; real-time object relighting; reflective surface; rendering; sensor integration period; static camera position; virtual augmented object; virtual light-probe synthesis; virtual omni-directional light-probe; Cameras; Light sources; Lighting; Real-time systems; Rendering (computer graphics); Solid modeling; Three-dimensional displays; MR/AR for entertainment; Real-time rendering; photo-realistic rendering; vision-based registration and tracking;
fLanguage :
English
Publisher :
IEEE
Conference_Titel :
2013 IEEE International Symposium on Mixed and Augmented Reality (ISMAR)
Conference_Location :
Adelaide, SA
Type :
conf
DOI :
10.1109/ISMAR.2013.6671774
Filename :
6671774