We propose to develop real-time 3D reconstruction, transmission, and immersive display systems for "virtual" teleportation. These systems would greatly improve the realism of remote collaboration, dissolving the boundaries between local and remote physical worlds and between virtual and physical environments. Research in 3D reconstruction of remote environments has shown that it is possible to recover both object appearance and auditory stimuli. Past methods for modeling object appearance, however, consume enormous computational and communication resources and require far too many sensors to be economically feasible. These traits make real-time applications nearly impossible without fundamental algorithmic improvements. We therefore focus our attention on techniques for modeling and rendering object appearance. Recent advances at the Virtualized Reality™ laboratory at Carnegie Mellon University (CMU) demonstrate that real-time 3D shape reconstruction is possible. CMU and others have also shown that video-based view generation algorithms produce high-quality results, even in the presence of small geometric errors. We propose to combine these components with computationally efficient algorithms for video manipulation, exploiting multi-processors and SIMD instructions, to construct real-time, remote immersive collaboration systems.
Keywords: Virtual Immersion; Teleportation; Remote Collaboration; 3D