In this talk we first give a short overview of multi-projector-camera techniques that allow images to be displayed on arbitrary surfaces. Real-time image correction techniques that perform per-pixel geometric warping, local radiometric compensation, and multi-focal projection are briefly outlined. After this introduction, we describe a new reverse radiosity technique in more detail. It compensates for global illumination effects such as the scattering of light from surface to surface. Simple virtual reality displays, such as CAVEs, two-sided workbenches, domes, or cylinders, suffer from these interreflections, which cause the displayed content to appear partially inconsistent and washed out. Our technique makes it possible to reduce, or even eliminate, this effect. We finally describe a technique that automatically measures the light transport in arbitrarily complex scenes. This enables an automatic estimation of local and global surface parameters. In the future, knowledge of the entire light transport will enable us to compensate for global illumination effects in real-world environments. Our vision is to enable monoscopic and stereoscopic augmented reality and virtual reality visualizations on real surfaces in everyday environments, without the need for dedicated projection canvases or screen configurations. The topics of this talk include computer graphics, real-time GPU techniques, computer vision, and projection technology.
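The core idea behind compensating interreflections can be sketched as follows. This is a minimal illustration, not the speakers' implementation: it assumes a small, hypothetical light-transport matrix T that maps projector input to observed radiance (diagonal entries model direct reflection, off-diagonal entries model surface-to-surface scattering), and inverts it to find the projector image that yields a desired appearance.

```python
import numpy as np

# Hypothetical 3-patch scene. T is an assumed light-transport matrix:
# the diagonal carries direct projector-to-surface reflectance, the
# off-diagonal entries model scattering between neighboring patches.
T = np.array([
    [0.80, 0.10, 0.05],
    [0.10, 0.75, 0.10],
    [0.05, 0.10, 0.85],
])

# Target radiance we want the camera to observe at each patch.
desired = np.array([0.5, 0.7, 0.3])

# Compensation: solve T @ p = desired for the projector input p,
# then clip to the projector's physical output range [0, 1].
p = np.clip(np.linalg.solve(T, desired), 0.0, 1.0)

# What the surface would actually show under this compensated input.
achieved = T @ p
```

In this toy case the solution stays inside the projector's gamut, so the desired appearance is reproduced exactly; in practice, clipping and measurement noise limit how much scattering can be canceled.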
Last modified: Thursday, 28-Jul-2005 17:23:30 NZST