
View-dependent Pixel Coloring: A physically-based approach for two-dimensional view synthesis

Posted on: 2004-11-28
Degree: Ph.D
Type: Thesis
University: The University of North Carolina at Chapel Hill
Candidate: Yang, Ruigang
GTID: 2468390011965888
Subject: Computer Science
Abstract/Summary:
The basic goal of traditional computer graphics is to generate 2D images of a synthetic scene represented by a 3D analytical model. For real scenes, however, one usually does not have a 3D model. If one instead has access to 2D images of the scene gathered from a few cameras, view synthesis techniques can generate 2D images from viewing angles between and around the cameras.

In this dissertation I introduce a fully automatic, physically-based framework for view synthesis that I call View-dependent Pixel Coloring (VDPC). VDPC uses a hybrid approach that estimates the most likely color for every picture element of an image from the desired view, while simultaneously estimating a view-dependent 3D model of the scene. By taking into account a variety of factors, including object occlusions, surface geometry and materials, and lighting, VDPC produces superior results under very challenging conditions, in particular in the presence of textureless regions and specular highlights, conditions that cause conventional approaches to fail.

In addition, VDPC can be implemented on commodity graphics hardware under certain simplifying assumptions. The basic idea is to use texture-mapping functions to warp the input images to the desired viewpoint, and to use programmable pixel rendering functions to decide the most consistent color for each pixel in the output image. By exploiting the speed and the tremendous parallelism inherent in today's graphics boards, one can achieve real-time, online view synthesis of a dynamic scene.
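The warp-and-select idea above can be illustrated with a toy plane-sweep sketch. This is not the dissertation's GPU implementation; it is a minimal NumPy illustration that assumes each candidate depth plane can be approximated by a per-camera horizontal pixel shift, and that color consistency is measured by per-pixel variance across the warped images (the function name `vdpc_sketch` and the shift-based warping are illustrative assumptions, not the actual method):

```python
import numpy as np

def vdpc_sketch(images, shifts_per_depth):
    """Toy plane-sweep color consistency.

    For each candidate depth plane (approximated here by one horizontal
    pixel shift per camera), "warp" every input image toward the desired
    view, then keep, per pixel, the mean color at whichever depth the
    cameras agree most (lowest color variance).
    """
    h, w, c = images[0].shape
    best_var = np.full((h, w), np.inf)   # best (lowest) disagreement so far
    out = np.zeros((h, w, c))            # synthesized output image
    for shifts in shifts_per_depth:      # one shift per camera, per depth
        warped = np.stack([np.roll(img, s, axis=1)
                           for img, s in zip(images, shifts)])
        var = warped.var(axis=0).sum(axis=-1)  # per-pixel color disagreement
        mean = warped.mean(axis=0)             # consensus color
        mask = var < best_var
        best_var[mask] = var[mask]
        out[mask] = mean[mask]
    return out

# Two "cameras" viewing a uniform scene; any depth plane is consistent,
# so the output simply reproduces the shared color.
imgs = [np.ones((4, 8, 3)), np.ones((4, 8, 3))]
result = vdpc_sketch(imgs, [[0, 0], [0, 1]])
```

A GPU version would replace `np.roll` with a projective texture-mapping warp and compute the variance test in a pixel shader, which is what makes the real-time performance described above possible.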
Keywords/Search Tags:View synthesis, 2D images, Scene, Pixel, Graphics, VDPC