Author/Authors :
Ruigang Yang, Greg Welch, Gary Bishop
Abstract :
We present a novel use of commodity graphics hardware that effectively combines a plane-sweeping algorithm
with view synthesis for real-time, online 3D scene acquisition and view synthesis. Using real-time imagery from
a few calibrated cameras, our method can generate new images from nearby viewpoints, estimate a dense depth
map from the current viewpoint, or create a textured triangular mesh. We can do each of these without prior
geometric information and without user interaction, in real time and online. The heart of our method is to use
programmable Pixel Shader technology to square intensity differences between reference image pixels, and then
to choose final colors (or depths) that correspond to the minimum difference, i.e. the most consistent color. In this
paper we describe the method, place it in the context of related work in computer graphics and computer vision,
and present some results.
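The consistency test the abstract describes — square the intensity differences between reference images warped onto candidate depth planes, then keep the color and depth with the minimum difference — can be illustrated with a small CPU sketch. This is not the authors' GPU Pixel Shader implementation; it is a NumPy approximation that assumes the camera images have already been projected onto each candidate plane (the warping step is omitted), with hypothetical array shapes chosen for illustration.

```python
import numpy as np

def plane_sweep_consensus(warped, depths):
    """Pick, per pixel, the depth plane where the cameras agree most.

    warped : array (P, C, H, W) -- intensities of C calibrated cameras
             projected onto each of P candidate depth planes (the
             projection/warping step is omitted in this sketch).
    depths : array (P,) -- depth value assigned to each plane.
    Returns (color, depth) maps, each of shape (H, W).
    """
    # Per-plane mean color across cameras: the "consensus" candidate.
    mean = warped.mean(axis=1)                          # (P, H, W)
    # Sum of squared differences from the mean -- low where cameras agree.
    ssd = ((warped - mean[:, None]) ** 2).sum(axis=1)   # (P, H, W)
    # For each pixel, the most consistent plane wins.
    best = ssd.argmin(axis=0)                           # (H, W)
    h, w = best.shape
    rows = np.arange(h)[:, None]
    cols = np.arange(w)[None, :]
    color = mean[best, rows, cols]   # consensus color at the winning plane
    depth = depths[best]             # dense depth map from plane indices
    return color, depth
```

On the GPU, the paper maps this same computation onto programmable pixel shaders, so the squared differences and the running minimum are evaluated per fragment as each plane is swept.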