DocumentCode
254575
Title
Dense View Interpolation on Mobile Devices Using Focal Stacks
Author
Sakurikar, Parikshit; Narayanan, P. J.
Author_Institution
Center for Visual Information Technology, International Institute of Information Technology - Hyderabad, Hyderabad, India
fYear
2014
fDate
23-28 June 2014
Firstpage
138
Lastpage
143
Abstract
Light field rendering is a widely used technique for generating views of a scene from novel viewpoints. Interpolative methods for light field rendering require a dense description of the scene in the form of closely spaced images. In this work, we present a simple method for dense view interpolation over general static scenes using commonly available mobile devices. We capture an approximate focal stack of the scene from adjacent camera locations and interpolate intermediate images by shifting each focal region according to its disparity. We do not rely on focus-distance control to capture the focal stacks, and we describe an automatic method for estimating the focal textures and the blur and disparity parameters required for view interpolation.
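As a rough illustration of the interpolation step described in the abstract, the sketch below (not the authors' implementation) composites per-layer focal-region textures after shifting each one by a fraction of its estimated disparity. The inputs (layer_textures, layer_masks, layer_disparities) are hypothetical names standing in for the output of the focal-stack decomposition and parameter-estimation steps the paper describes.

import numpy as np

def shift_horizontal(img, shift):
    # Shift an image horizontally by a possibly fractional number of pixels,
    # linearly interpolating between the two nearest integer shifts.
    lo = int(np.floor(shift))
    frac = shift - lo
    a = np.roll(img, lo, axis=1)
    b = np.roll(img, lo + 1, axis=1)
    return (1 - frac) * a + frac * b

def interpolate_view(layer_textures, layer_masks, layer_disparities, alpha):
    # Synthesize a view at fractional baseline position alpha in [0, 1]
    # between the two captured camera locations.
    #   layer_textures    : list of HxWx3 float arrays, one per focal region
    #   layer_masks       : list of HxW float arrays (1 where the layer is valid)
    #   layer_disparities : per-layer horizontal disparity (pixels) between
    #                       the two captured positions
    h, w, _ = layer_textures[0].shape
    out = np.zeros((h, w, 3), dtype=np.float64)

    # Composite far to near so nearer layers overwrite farther ones
    # (smaller disparity is assumed to mean a more distant layer).
    for i in np.argsort(layer_disparities):
        shift = alpha * layer_disparities[i]
        tex = shift_horizontal(layer_textures[i], shift)
        msk = shift_horizontal(layer_masks[i][..., None], shift)
        out = out * (1 - msk) + tex * msk
    return out

For example, alpha = 0.5 would produce a view roughly midway between the two captured camera positions.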
Keywords
cameras; image texture; interpolation; mobile computing; mobile handsets; rendering (computer graphics); adjacent camera locations; automatic focal texture estimation method; closely spaced images; dense scene description; dense view interpolation; disparity parameters; focal region; focal stacks; focus distance control; interpolative methods; light field rendering; mobile devices; static scenes; Cameras; Equations; Estimation; Indexes; Interpolation; Kernel; Mobile handsets; Mobile computational photography
fLanguage
English
Publisher
IEEE
Conference_Title
2014 IEEE Conference on Computer Vision and Pattern Recognition Workshops (CVPRW)
Conference_Location
Columbus, OH, USA
Type
conf
DOI
10.1109/CVPRW.2014.26
Filename
6909971