The MIDEN (Michigan Immersive Digital Experience Nexus), formerly known as the CAVE, is currently our most advanced audio-visual system for virtual reality. It provides its users with the convincing illusion of being fully immersed in a computer-generated three-dimensional world. This world is presented in life-size stereoscopic projections on four surfaces that together fill the visual field, as well as 4.1 surround sound with attenuation and Doppler effects. The immersive experience is enhanced by wireless position tracking that continually fine-tunes the audio-visual projections and facilitates unrestricted navigation (look-around, walk-around, fly-around) and interaction with virtual objects. Head-tracking allows the system to project stable illusions of 3D objects much closer to the user than would otherwise be possible. Motion parallax as well as binocular parallax enhances the sense of 3D immersion.
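The distance attenuation and Doppler effects mentioned above can be sketched numerically. This is a minimal illustration, not the MIDEN's actual audio pipeline: the inverse-distance rolloff model and the function names are assumptions for the example.

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at room temperature

def attenuated_gain(distance, ref_distance=1.0):
    """Inverse-distance gain rolloff (one common model; real audio
    engines offer several).  Gain is 1.0 at the reference distance
    and halves each time the distance doubles."""
    return ref_distance / max(distance, ref_distance)

def doppler_frequency(f_emitted, v_source_toward_listener):
    """Observed frequency for a source moving toward a stationary
    listener: f' = f * c / (c - v_s).  Positive velocity means the
    source is approaching, so the pitch rises."""
    return f_emitted * SPEED_OF_SOUND / (SPEED_OF_SOUND - v_source_toward_listener)
```

For example, a virtual engine 2 m away plays at half gain, and one approaching at 34.3 m/s (10% of the speed of sound) sounds about 11% higher in pitch.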
A CAVE-like system was first installed at the University of Michigan in 1997. During 2005-2006, almost all components were replaced with state-of-the-art technologies. Since CAVE is a trademarked brand (like Kleenex and Xerox), and we use neither its hardware nor its software, we have rebranded our installation as the MIDEN.
The MIDEN is a unique resource available to the UM community for research, teaching, and service, to facilitate the exploration, understanding, and evaluation of any real or abstract environment.
Typical applications include:
- architectural walk-throughs,
- evaluation of engineering designs (virtual prototyping),
- training for dangerous situations and other scenarios,
- human modeling (human factors and ergonomics),
- virtual reconstruction of archeological sites,
- medical and biological visualization,
- artistic expression of ideas,
- and more …
The 3D Lab offers a wide range of services surrounding the usage of the MIDEN. This includes software support for the development of MIDEN applications, introductory seminars and workshops, training for MIDEN users, demonstration examples, and individual consulting.
How it Works
The CAVE concept, upon which MIDEN is based, was first developed at the University of Illinois at Chicago. It provides the illusion of immersion by projecting stereo images on the inner surfaces of a room-sized cube. Our system uses the left, front, and right walls, and the floor, as projection screens. The resulting effect is so compelling that, after a short while, the boundaries of the physical space are perceptually superseded by the projected virtual scene.
To create the stereo effect, the images for the left and the right eye are projected in a rapid, alternating sequence. The user wears special glasses that alternately block the right and the left eye in synchronization with the projected sequence. An optical motion tracking system traces reflectors attached to the glasses of the primary viewer and continuously measures the position and orientation of his or her head. These measurements are processed by rendering algorithms that calculate and adjust the projected images in real-time as the primary viewer moves about. For secondary (untracked) viewers, the virtual environment may appear slightly distorted depending on their position in the MIDEN relative to the primary viewer.
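The per-eye image adjustment described above amounts to recomputing an off-axis projection for each wall every frame from the tracked eye position. The sketch below follows the well-known generalized perspective projection formulation (Kooima); the corner-naming convention and function signature are assumptions for illustration, not the MIDEN's actual rendering code.

```python
import numpy as np

def offaxis_frustum(pa, pb, pc, pe, near):
    """Off-axis frustum extents (left, right, bottom, top) at the near
    plane for one projection screen and one tracked eye.

    pa, pb, pc: screen's lower-left, lower-right, upper-left corners
                in world coordinates.
    pe:         tracked eye position in world coordinates.
    """
    pa, pb, pc, pe = (np.asarray(p, dtype=float) for p in (pa, pb, pc, pe))
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal, toward eye
    va, vb, vc = pa - pe, pb - pe, pc - pe            # eye-to-corner vectors
    d = -np.dot(va, vn)                               # eye-to-screen distance
    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d
    return left, right, bottom, top
```

With the eye centered in front of a 2 m front wall, the frustum is symmetric; as the tracked head moves sideways, the extents skew so the projected scene stays geometrically correct for that eye.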
A small wireless handheld gamepad, which is also motion-tracked, facilitates navigation and interaction in the virtual scene. The user can, for example, point the gamepad in the direction of a desired move, rotate the environment, or control a virtual beam to select and interact with a virtual object.
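Point-to-fly navigation of this kind reduces to advancing the viewpoint along the wand's pointing direction each frame. The following is a minimal sketch under assumed conventions (yaw about +Y, pitch about +X, -Z forward); the actual MIDEN software is not shown here.

```python
import math

def wand_forward(yaw_deg, pitch_deg):
    """Unit forward vector from the tracked wand's yaw and pitch,
    using the assumed convention: yaw 0 / pitch 0 points along -Z."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    return (-math.sin(yaw) * math.cos(pitch),
            math.sin(pitch),
            -math.cos(yaw) * math.cos(pitch))

def fly_step(pos, yaw_deg, pitch_deg, speed, dt):
    """Advance the viewpoint one frame along the wand's pointing
    direction at the given speed (m/s) over a frame time dt (s)."""
    fx, fy, fz = wand_forward(yaw_deg, pitch_deg)
    x, y, z = pos
    return (x + fx * speed * dt,
            y + fy * speed * dt,
            z + fz * speed * dt)
```

Rotating the environment or casting a selection beam works the same way: both reuse the wand's tracked orientation, applied to the scene transform or to a ray origin instead of the viewpoint.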
The MIDEN operates in “see-through” mode, meaning that users can see physical objects, like their own hands or equipment brought into the MIDEN. Thus, applications can integrate physical objects with the virtual environment. For example, a jet ski mockup can be placed inside the MIDEN to accommodate the driver and provide the steering and throttle controls for a simulated ride over a virtual lake.
Behind the scenes is a cluster of powerful graphics computers that calculates the images and controls all other aspects of a MIDEN application in real-time. For optimal perception and smooth navigation, these computers have to re-calculate all images 20 to 30 times per second. A lower frame rate makes the immersive experience jerky and stressful for the viewers.
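The 20-30 updates per second translate into a per-frame time budget of roughly 33-50 ms for all rendering and simulation work. The loop below is a generic real-time sketch of that budget idea (the function names are illustrative, not part of the MIDEN software).

```python
import time

TARGET_FPS = 30
FRAME_BUDGET = 1.0 / TARGET_FPS   # ~33 ms per frame at 30 fps

def run_frames(render, n_frames):
    """Minimal real-time loop: render, then sleep off whatever budget
    remains.  Returns the worst frame time seen, so frames that blow
    the budget (and would feel jerky) are easy to spot."""
    worst = 0.0
    for _ in range(n_frames):
        start = time.perf_counter()
        render()                  # all per-frame work goes here
        elapsed = time.perf_counter() - start
        worst = max(worst, elapsed)
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)
    return worst
```

In a stereo system the budget is effectively tighter still, since left- and right-eye images must both be produced within each frame interval.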
Our current hardware configuration and capabilities are described below.