Title :
Depth from dynamic (de)focused projection
Author :
Lertrusdachakul, Intuon ; Fougerolle, Yohan D. ; Laligant, Olivier
Author_Institution :
Le2i Lab., Le Creusot, France
Abstract :
This paper presents a novel 3D reconstruction approach based on dynamically (de)focused light. The method combines the depth from focus (DFF) and depth from defocus (DFD) techniques. To overcome the surface-reflectivity drawbacks of traditional methods, optimized illumination patterns are projected onto the object to enforce a strong dominant texture and to cover the object surface as completely as possible. The image acquisition system is constructed so that the whole object remains sharp in all captured images; therefore, only the projected patterns undergo defocus deformation according to the object depth. The light pattern is projected onto the object at several focus ranges, as in the DFF approach, while the blur levels used to generate the depth map are computed from a point spread function (PSF) model, as in the DFD method. The final depth is then assigned to specific pixel coordinates using pre-defined overlapping pixel weights. With this approach, the final reconstruction is expected to outperform one obtained by DFD alone, because at least one focused or near-focused image within the depth of field enters the computation. The method is also less computationally expensive than DFF, which requires numerous input images. Experimental results on real images demonstrate the effectiveness of our method: it provides reliable depth estimation at a moderate computational cost.
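The hybrid idea in the abstract, namely taking a stack of images focused at known ranges and assigning each pixel a depth by weighting the candidate depths with a per-pixel focus/blur measure, can be sketched as follows. This is an illustrative toy reconstruction, not the authors' implementation: it assumes a simple Laplacian-based focus measure and a synthetic box-blur focal stack, and the names `focus_measure`, `box_blur`, and `depth_from_stack` are hypothetical.

```python
import numpy as np

def focus_measure(img):
    """Per-pixel squared 5-point Laplacian response (a common sharpness cue)."""
    lap = (-4.0 * img
           + np.roll(img, 1, 0) + np.roll(img, -1, 0)
           + np.roll(img, 1, 1) + np.roll(img, -1, 1))
    return lap ** 2

def box_blur(img, iterations):
    """Crude defocus stand-in: repeated 5-point averaging (wraps at borders)."""
    out = img.copy()
    for _ in range(iterations):
        out = (out + np.roll(out, 1, 0) + np.roll(out, -1, 0)
                   + np.roll(out, 1, 1) + np.roll(out, -1, 1)) / 5.0
    return out

def depth_from_stack(stack, focus_depths):
    """Assign depth per pixel as a weighted average of the slice focus depths,
    weighting each slice by its local sharpness (overlapping pixel weights).

    stack: (N, H, W) array; slice i is focused at focus_depths[i].
    """
    weights = np.array([focus_measure(s) for s in stack])   # (N, H, W)
    weights += 1e-12                                        # textureless guard
    depths = np.asarray(focus_depths)[:, None, None]
    return (weights * depths).sum(axis=0) / weights.sum(axis=0)

# Synthetic demo: a textured plane at true depth 1.0, imaged with three
# focus settings; slices focused farther from the plane are blurred more.
rng = np.random.default_rng(0)
texture = rng.random((32, 32))
focus_depths = [1.0, 2.0, 3.0]
stack = np.array([box_blur(texture, 4 * int(abs(1.0 - f)))
                  for f in focus_depths])
depth = depth_from_stack(stack, focus_depths)
```

Because the slice focused at the true depth dominates the weights, the recovered depth map concentrates near 1.0; projecting a strong pattern (as the paper does) plays the role of the random texture here, guaranteeing sharpness cues even on untextured surfaces.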
Keywords :
image reconstruction; image resolution; DFD; PSF; depth from techniques; dynamic defocused light; image acquisition system; novel 3D reconstruction approach; optimized illumination patterns; pixel coordinates; point spread function strategy; pre-defined overlapping pixel weights; Cameras; Lenses; Lighting; Optical imaging; Optical sensors; Shape; Three dimensional displays; 3D reconstruction; active illumination pattern; blur estimation; depth from defocus; focus; range sensor;
Conference_Title :
2010 25th International Conference of Image and Vision Computing New Zealand (IVCNZ)
Conference_Location :
Queenstown
Print_ISBN :
978-1-4244-9629-7
DOI :
10.1109/IVCNZ.2010.6148811