SBIR-STTR Award

Voxel-Based Immersive Environments
Award last edited on: 2/28/07

Sponsored Program
SBIR
Awarding Agency
DOD : DARPA
Total Award Amount
$1,148,461
Award Phase
2
Solicitation Topic Code
SB992-044
Principal Investigator
Peter Rander

Company Information

Zaxel Systems Inc

1366 Bordeaux Drive
Sunnyvale, CA 94089
   (408) 541-9488
   info@zaxel.com
   www.zaxel.com
Location: Multiple
Congr. District: 17
County: Santa Clara

Phase I

Contract Number: DAAH0100CR058
Start Date: 11/9/99    Completed: 7/10/00
Phase I year
1999
Phase I Amount
$98,461
We propose to develop real-time 3D reconstruction, transmission, and immersive display systems for "virtual" teleportation. Development of these systems would greatly improve the realism of remote collaboration, dissolving the boundaries between the local and remote physical worlds and between the virtual and physical environments. Research in 3D reconstruction has shown that it is possible to recover both object appearance and auditory stimuli from remote environments. Past methods for modeling object appearance, however, consume enormous computational and communication resources, and require far too many sensors to be economically feasible. These traits make real-time applications nearly impossible without fundamental algorithmic improvements. We therefore focus our attention on techniques for modeling and rendering object appearance. Recent advances at the Virtualized Reality™ laboratory at Carnegie Mellon University (CMU) demonstrate that real-time 3D shape reconstruction is possible. CMU and others have also shown that video-based view generation algorithms produce high-quality results, even in the presence of small geometric errors. We propose to combine these components with computationally efficient algorithms for video manipulation using multiprocessors and SIMD instructions to construct real-time, remote immersive collaboration systems.

Keywords:
Virtual Immersion; Teleportation; Remote Collaboration; 3D
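
The Phase I abstract does not spell out the reconstruction algorithm, but a voxel-based approach consistent with the award title is shape-from-silhouette carving, in which voxels that project outside any camera's silhouette are removed. The sketch below is a minimal illustration of that idea in Python; the function name, the pinhole projection model, and the NumPy voxel grid are assumptions made for illustration, not details taken from the proposal.

import numpy as np

def carve_voxels(silhouettes, projections, grid_min, grid_max, resolution):
    """Keep only voxels whose projection falls inside every silhouette.

    silhouettes -- list of HxW boolean masks, one per camera (illustrative)
    projections -- list of 3x4 camera projection matrices (assumed pinhole model)
    grid_min, grid_max -- 3-vectors bounding the working volume
    resolution -- number of voxels along each axis
    """
    # Build a regular voxel grid in homogeneous coordinates.
    axes = [np.linspace(grid_min[i], grid_max[i], resolution) for i in range(3)]
    xs, ys, zs = np.meshgrid(*axes, indexing="ij")
    points = np.stack([xs, ys, zs, np.ones_like(xs)], axis=-1).reshape(-1, 4)

    occupied = np.ones(points.shape[0], dtype=bool)
    for mask, P in zip(silhouettes, projections):
        h, w = mask.shape
        proj = points @ P.T                      # project all voxel centres at once
        uv = proj[:, :2] / proj[:, 2:3]          # perspective divide
        u = np.rint(uv[:, 0]).astype(int)
        v = np.rint(uv[:, 1]).astype(int)
        inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
        hit = np.zeros(points.shape[0], dtype=bool)
        hit[inside] = mask[v[inside], u[inside]]
        occupied &= hit                          # carve away voxels this view rejects
    return occupied.reshape(resolution, resolution, resolution)

Because each voxel is tested independently, this style of reconstruction parallelizes naturally across the multiprocessors and SIMD instructions the abstract cites as the route to real-time performance.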

Phase II

Contract Number: DAAH0100CR215
Start Date: 8/24/00    Completed: 11/29/02
Phase II year
2000
Phase II Amount
$1,050,000
We propose to develop a real-time 3D reconstruction, transmission, and immersive rendering system for "virtual" teleportation. Development of this system would greatly improve the realism of remote collaboration, dissolving the boundaries between the local and remote physical worlds and between the virtual and physical environments. Our Phase I effort demonstrated the ability to reconstruct and immersively render real scenes at interactive frame rates in a proof-of-concept system. In addition, our research uncovered an innovative way to simultaneously improve the quality and speed of both reconstruction and rendering for video-only immersive environments. Based on these inventions, we now propose to develop a high-quality prototype to attain very realistic immersive environments over a wide range of communication channels, both for one-to-one communication and for broadcasting (one-to-many communication). Phase II will build on the Phase I success by incorporating audio into the immersive environment and by adding the ability to broadcast an immersive environment to many people simultaneously, across a mixture of communications channels ranging from high-bandwidth LANs to 56 kbit/s connections over the Internet.

Keywords:
Virtual Immersion; Teleportation; Remote Collaboration; 3D Pixelization / Modeling; Real-Time System
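
The Phase II abstract calls for broadcasting one immersive environment to many viewers over channels ranging from high-bandwidth LANs down to 56 kbit/s Internet links, but it does not say how the stream adapts to each channel. The sketch below illustrates one plausible scheme, per-client level-of-detail selection; the tier names, voxel resolutions, and bitrate thresholds are invented for illustration and are not taken from the proposal.

from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    voxel_resolution: int     # voxels per axis in the transmitted model (illustrative)
    texture_kbps: int         # compressed texture/video budget (illustrative)
    min_channel_kbps: int     # slowest channel that can sustain this tier

# Ordered from richest to leanest representation; all figures are assumptions.
TIERS = [
    Tier("lan-full",  128, 4000, 5000),
    Tier("broadband",  64,  600,  800),
    Tier("modem",      32,   40,   56),
]

def pick_tier(channel_kbps: int) -> Tier:
    """Return the richest tier the client's measured bandwidth can sustain."""
    for tier in TIERS:
        if channel_kbps >= tier.min_channel_kbps:
            return tier
    return TIERS[-1]          # fall back to the leanest stream

if __name__ == "__main__":
    for kbps in (10000, 1500, 56):
        tier = pick_tier(kbps)
        print(f"{kbps:>6} kbit/s -> {tier.name} "
              f"({tier.voxel_resolution}^3 voxels, {tier.texture_kbps} kbit/s texture)")

An ordered tier table like this lets a single reconstruction serve a mixed audience: each client simply receives the richest representation its measured bandwidth can sustain.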