DocumentCode
3372078
Title
Global depth from defocus with fixed camera parameters
Author
Wei, Yangjie ; Dong, Zaili ; Wu, Chengdong
Author_Institution
Grad. Sch. of Chinese Acad. of Sci., Chinese Acad. of Sci., Shenyang, China
fYear
2009
fDate
9-12 Aug. 2009
Firstpage
1887
Lastpage
1892
Abstract
Reconstructing depth from 2D images is an important research issue in computer vision, and depth from defocus (DFD) is an effective approach that uses the degree of blur in images of a scene with limited depth of field as the basis for computing depth. Although many DFD methods now exist, they all require changing camera parameters, such as the focal length or the aperture radius of the lens, in order to obtain differently blurred images. For cameras with a high level of magnification, changing camera parameters is undesirable. Therefore, a novel DFD method is proposed in this paper. First, two differently blurred images are captured by changing the depth. Second, a blurred imaging model is constructed from the relative blurring and the diffusion equation, and the relation between depth and blur is discussed from two aspects. Finally, the problem of computing depth is transformed into an optimization problem. Because the proposed method does not require changing camera parameters, the process is very simple and can be used in some special applications. Simulation results show that the method can recover depth with high precision.
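The following is a minimal sketch, not the authors' implementation, of the core idea the abstract describes: under the diffusion-equation view, the more defocused of two images taken at different depths (with fixed camera parameters) is approximately the less defocused one convolved with a Gaussian, and the relative blur can be recovered by optimization. Function and parameter names (estimate_relative_blur, max_sigma) are assumptions; mapping the recovered relative blur to absolute depth would require the paper's own imaging model.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from scipy.optimize import minimize_scalar

def estimate_relative_blur(img_sharper, img_blurrier, max_sigma=10.0):
    """Estimate the relative blur (Gaussian sigma) between two images of the
    same scene captured at two different depths with fixed camera parameters.

    Diffusion view: the more defocused image is (approximately) the less
    defocused one diffused for some extra time, i.e. convolved with a
    Gaussian whose sigma encodes the relative blur.  We recover that sigma
    by minimizing the squared intensity difference (an optimization problem,
    as in the abstract)."""
    def cost(sigma):
        return float(np.mean((gaussian_filter(img_sharper, sigma) - img_blurrier) ** 2))
    result = minimize_scalar(cost, bounds=(1e-3, max_sigma), method="bounded")
    return result.x

if __name__ == "__main__":
    # Synthetic check: mimic two captures of the same scene with different
    # amounts of defocus (sigma = 1.0 and 2.0).  Gaussian blurs compose as
    # sigma_total^2 = sigma_1^2 + sigma_rel^2, so the expected relative blur
    # is sqrt(2.0**2 - 1.0**2) ~= 1.73.
    rng = np.random.default_rng(0)
    scene = rng.random((128, 128))
    img_near = gaussian_filter(scene, 1.0)
    img_far = gaussian_filter(scene, 2.0)
    print(estimate_relative_blur(img_near, img_far))  # ~1.73
```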
Keywords
cameras; computer vision; image reconstruction; optimisation; 2D images; blurred imaging model; computer vision; depth from defocus; depth reconstruction; diffusion equation; fixed camera parameters; optimization issue; relative blurring; Cameras; Computer vision; Design for disassembly; Equations; Focusing; Image reconstruction; Laboratories; Layout; Robot vision systems; Robotics and automation; 3D Reconstruction; Depth from Defocus; Diffusion Equation;
fLanguage
English
Publisher
ieee
Conference_Title
Mechatronics and Automation, 2009. ICMA 2009. International Conference on
Conference_Location
Changchun
Print_ISBN
978-1-4244-2692-8
Electronic_ISBN
978-1-4244-2693-5
Type
conf
DOI
10.1109/ICMA.2009.5246652
Filename
5246652