Multiview Geometry for Texture Mapping 2D Images onto 3D Range Data
Google TechTalks, June 29, 2006
George Wolberg
http://www-cs.engr.ccny.cuny.edu/~wolberg/

ABSTRACT

The photorealistic modeling of large-scale scenes, such as urban structures, requires a fusion of range sensing technology and traditional digital photography. In this talk, we describe a system that integrates multiview geometry and automated 3D registration techniques for texture mapping 2D images onto 3D range data.

The 3D range scans and the 2D photographs are used to generate a pair of 3D models of the scene. The first model is a dense 3D point cloud, produced by a 3D-to-3D registration method that matches 3D lines in the range images. The second model is a sparse 3D point cloud, produced by applying a multiview geometry (structure-from-motion) algorithm directly to a sequence of 2D photographs. The two models are then registered with each other; this alignment is necessary to enable the photographs to be optimally texture mapped onto the dense model.

The contribution of this work is that it merges the benefits of multiview geometry with automated registration of 3D range scans to produce photorealistic models with minimal human interaction. We present results from experiments in large-scale urban scenes.

Joint work with Prof. Ioannis Stamos, Lingyun Liu, Gene Yu, and Siavash Zokai.
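The sparse structure-from-motion cloud and the dense range-scan cloud live in different coordinate frames and, since monocular structure-from-motion is reconstructed only up to scale, at different scales, so aligning them requires at least a similarity transform. The abstract does not specify how this transform is estimated; as an illustrative sketch only, assuming a set of point correspondences between the two clouds is available, Umeyama's closed-form least-squares solution recovers the scale, rotation, and translation:

```python
import numpy as np

def similarity_transform(src, dst):
    """Least-squares similarity transform mapping src onto dst.

    src, dst: (N, 3) arrays of corresponding 3D points.
    Returns scale s, rotation R (3x3), translation t such that
    dst ~ s * R @ src + t (Umeyama's closed-form solution).
    """
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_s, dst - mu_d

    # Cross-covariance between the centered point sets.
    cov = dst_c.T @ src_c / len(src)
    U, S, Vt = np.linalg.svd(cov)

    # Guard against a reflection in the recovered rotation.
    d = np.sign(np.linalg.det(U @ Vt))
    D = np.diag([1.0, 1.0, d])
    R = U @ D @ Vt

    # Scale from the source variance; translation aligns the centroids.
    var_s = (src_c ** 2).sum() / len(src)
    s = np.trace(np.diag(S) @ D) / var_s
    t = mu_d - s * R @ mu_s
    return s, R, t
```

In practice the correspondences would come from feature matching between the two models, and an ICP-style refinement could follow this closed-form initialization; neither step is specified in the abstract.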