One of the biggest challenges in interactive applications is drawing the user into the experience, suspending their disbelief so that they are completely immersed in the virtual world created for them. After receiving the extremely powerful SDK and hardware accelerators from Ageia (now NVIDIA), we aimed to use this physics technology to bring the user one step closer to total immersion.
Since we wanted to create the most immersive and believable experience possible, we initially developed for the MIDEN. The resulting application allows users to push virtual objects out of the way simply by walking through the MIDEN, or by using specially outfitted gloves to interact with the virtual world. Rigid objects, as well as articulated objects such as human figures, are fully represented with collision geometry and physical properties.
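The core interaction, treating the tracked user as a kinematic collider that pushes dynamic rigid bodies aside, can be sketched independently of any particular physics SDK. The names and the simple impulse model below are illustrative assumptions, not the project's actual implementation:

```python
import math
from dataclasses import dataclass

@dataclass
class Ball:
    """A dynamic rigid body, simplified to a 2D circle."""
    x: float
    y: float
    vx: float = 0.0
    vy: float = 0.0
    r: float = 0.5

def push(user_x, user_y, user_r, balls, strength=2.0):
    """Resolve overlaps between a kinematic 'user' sphere (driven by
    tracking data, not by the simulation) and dynamic balls, pushing
    each overlapping ball out along the contact normal."""
    for b in balls:
        dx, dy = b.x - user_x, b.y - user_y
        dist = math.hypot(dx, dy)
        overlap = user_r + b.r - dist
        if overlap > 0 and dist > 1e-9:
            nx, ny = dx / dist, dy / dist
            # Separate the ball from the user...
            b.x += nx * overlap
            b.y += ny * overlap
            # ...and impart velocity proportional to the penetration,
            # so deeper intrusions (faster walking) push harder.
            b.vx += nx * strength * overlap
            b.vy += ny * strength * overlap

balls = [Ball(1.0, 0.0)]
push(0.0, 0.0, 1.0, balls)  # the user sphere overlaps the ball, shoving it right
```

In a full engine the same idea is expressed by flagging the user's avatar as a kinematic actor, so the solver moves dynamic objects out of its way while tracking data, not forces, dictates the avatar's motion.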
This project has since been ported to the latest version of our in-house engine and supports many additional features, such as cloth, fluids, and two-way collision response. It remains an active area of research in the Lab, now focused on integrating low-cost technologies such as the Kinect.