Overview
The MIDEN (Michigan Immersive Digital Experience Nexus), formerly known as the CAVE, is currently our most advanced audio-visual system for virtual reality. It gives its users the convincing illusion of being fully immersed in a computer-generated three-dimensional world. This world is presented in life-size stereoscopic projections on four surfaces that together fill the visual field, as well as 4.1 surround sound with distance attenuation and Doppler effects. The immersive experience is enhanced by wireless position tracking that continually fine-tunes the audio-visual projections and enables unrestricted navigation (look-around, walk-around, fly-around) and interaction with virtual objects. Head tracking allows the system to project stable illusions of 3D objects much closer to the user than would otherwise be possible. Both motion parallax and binocular parallax enhance the sense of 3D immersion.
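The distance attenuation and Doppler effects mentioned above follow standard acoustics. A minimal sketch, assuming an inverse-distance attenuation model and a stationary listener (the function names and constants here are illustrative, not the MIDEN's actual audio code):

```python
import math

SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 °C

def distance_gain(distance, ref_distance=1.0):
    """Inverse-distance attenuation: gain halves with each doubling
    of distance beyond the reference distance."""
    return ref_distance / max(distance, ref_distance)

def doppler_frequency(f_source, v_toward_listener):
    """Perceived frequency of a source moving toward (+) or away
    from (-) a stationary listener."""
    return f_source * SPEED_OF_SOUND / (SPEED_OF_SOUND - v_toward_listener)

# A 440 Hz source approaching the listener at 34.3 m/s is heard
# about a whole tone higher:
print(round(doppler_frequency(440.0, 34.3), 1))  # 488.9
```

The same per-frame head and source positions that drive the graphics can drive both functions, which is what keeps the soundscape consistent with the visuals as the viewer moves.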
A CAVE-like system was first installed at the University of Michigan in 1997. During 2005-2006, almost all components were replaced with state-of-the-art technologies. Since CAVE is a trademarked brand (like Kleenex and Xerox), and we use neither its hardware nor its software, we have rebranded our installation as the MIDEN.
The MIDEN is a unique resource available to the UM community for research, teaching, and service, to facilitate the exploration, understanding, and evaluation of any real or abstract environment.
Typical applications include:
- architectural walk-throughs,
- evaluation of engineering designs (virtual prototyping),
- training for dangerous situations and other scenarios,
- human modeling (human factors and ergonomics),
- virtual reconstruction of archeological sites,
- medical and biological visualization,
- artistic expression of ideas,
- and more …
The 3D Lab offers a wide range of services surrounding the usage of the MIDEN. This includes software support for the development of MIDEN applications, introductory seminars and workshops, training for MIDEN users, demonstration examples, and individual consulting.
How it Works
The CAVE concept, upon which MIDEN is based, was first developed at the University of Illinois at Chicago. It provides the illusion of immersion by projecting stereo images on the inner surfaces of a room-sized cube. Our system uses the left, front, and right walls, and the floor, as projection screens. The resulting effect is so compelling that, after a short while, the boundaries of the physical space are perceptually superseded by the projected virtual scene.
To create the stereo effect, the images for the left and the right eye are projected in a rapid, alternating sequence. The user wears special glasses that alternately block the right and the left eye in synchronization with the projected sequence. An optical motion-tracking system tracks reflectors attached to the glasses of the primary viewer and continuously measures the position and orientation of his or her head. These measurements feed rendering algorithms that recalculate and adjust the projected images in real time as the primary viewer moves about. For secondary (untracked) viewers, the virtual environment may appear slightly distorted depending on their position in the MIDEN relative to the primary viewer.
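The per-eye adjustment described above is typically computed as an off-axis (asymmetric-frustum) projection from the tracked eye position relative to each screen. The following is a simplified sketch under stated assumptions — a single wall screen centered at the origin in the z = 0 plane, an OpenGL-style matrix convention, and an illustrative interpupillary distance — not the MIDEN's actual rendering code:

```python
import numpy as np

def off_axis_projection(eye, screen_w, screen_h, near, far):
    """Asymmetric-frustum projection for a wall screen centered at the
    origin in the z = 0 plane, viewed from eye = (x, y, z) with z > 0.
    Returns a 4x4 OpenGL-style projection matrix."""
    ex, ey, ez = eye
    # Project the screen edges onto the near plane to get frustum extents.
    scale = near / ez
    left   = (-screen_w / 2 - ex) * scale
    right  = ( screen_w / 2 - ex) * scale
    bottom = (-screen_h / 2 - ey) * scale
    top    = ( screen_h / 2 - ey) * scale
    return np.array([
        [2 * near / (right - left), 0, (right + left) / (right - left), 0],
        [0, 2 * near / (top - bottom), (top + bottom) / (top - bottom), 0],
        [0, 0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0, 0, -1, 0],
    ])

# One matrix per eye: offset the tracked head position by half the
# interpupillary distance (~0.064 m is a common assumption).
head = np.array([0.2, 0.1, 1.5])   # meters, from the tracker (illustrative)
ipd = 0.064
left_eye  = off_axis_projection(head - [ipd / 2, 0, 0], 3.048, 3.048, 0.1, 100.0)
right_eye = off_axis_projection(head + [ipd / 2, 0, 0], 3.048, 3.048, 0.1, 100.0)
```

Because each eye gets its own slightly different frustum, alternating the two rendered images in sync with the shutter glasses produces the binocular depth cue; re-deriving the frustum from the tracked head every frame produces the motion-parallax cue.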
A small wireless handheld gamepad, which is also motion-tracked, facilitates navigation and interaction in the virtual scene. The user can, for example, point the gamepad in the direction of a desired move, rotate the environment, or control a virtual beam to select and interact with a virtual object.
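Pointing-directed movement of this kind reduces to advancing the viewpoint along the forward vector of the wand's tracked orientation each frame. A minimal sketch, assuming yaw/pitch angles in degrees and an OpenGL-style convention where -z is "straight ahead" (the function and parameter names are illustrative, not part of the MIDEN's software):

```python
import math

def pointed_move(position, yaw_deg, pitch_deg, speed, dt):
    """Advance the viewpoint along the direction a tracked wand points.
    yaw/pitch are the wand's tracked orientation in degrees; speed is
    meters per second; dt is the frame time in seconds."""
    yaw, pitch = math.radians(yaw_deg), math.radians(pitch_deg)
    # Forward vector: yaw rotates about the vertical (+y) axis,
    # pitch tilts up/down; -z is straight ahead.
    direction = (
        -math.sin(yaw) * math.cos(pitch),
        math.sin(pitch),
        -math.cos(yaw) * math.cos(pitch),
    )
    return tuple(p + d * speed * dt for p, d in zip(position, direction))

# Pointing straight ahead for one second at 1 m/s moves 1 m along -z.
print(pointed_move((0.0, 0.0, 0.0), 0.0, 0.0, 1.0, 1.0))
```

Rotating the environment or casting a selection beam works the same way: the beam is simply a ray from the wand's tracked position along this forward vector, intersected with the virtual scene.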
The MIDEN operates in “see-through” mode, meaning that users can see physical objects, such as their own hands or equipment brought into the MIDEN. Thus, applications can integrate physical objects with the virtual environment. For example, a jet ski mockup can be placed inside the MIDEN to accommodate the driver and provide the steering and throttle controls for a simulated ride over a virtual lake.
Behind the scenes, a cluster of powerful graphics computers calculates the images and controls all other aspects of a MIDEN application in real time. For optimal perception and smooth navigation, these computers must recalculate all images 20 to 30 times per second; a lower frame rate makes the immersive experience jerky and stressful for viewers.
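The 20-30 updates per second quoted above translate directly into a fixed time budget per frame, within which every screen image (for both eyes) must be finished. A quick illustration of the arithmetic:

```python
def frame_budget_ms(target_fps):
    """Milliseconds available to compute one frame at a given rate."""
    return 1000.0 / target_fps

# At 20-30 frames per second, the renderers have roughly 33-50 ms
# per frame; missing that budget is what viewers perceive as jerkiness.
for fps in (20, 30):
    print(f"{fps} fps -> {frame_budget_ms(fps):.1f} ms per frame")
```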
Specifications
The following lists our current hardware configuration and capabilities.
Physical Configuration
- Size: 10′ x 10′ x 10′ (3.048 x 3.048 x 3.048 m)
- Screens (4): Left, front, and right walls with rear projection; floor with down projection

Display
- Projectors (4): Christie Mirage S+4K projectors and mirrors
- Resolution: 1024 x 1024 pixels per screen
- Stereoscopy: Active, left-right frame-sequential at 120 Hz
- Eyewear: NVIDIA NVision Pro shutter glasses with RF (radio-frequency) synchronization

Computers
- Render Computers (5): Hewlett-Packard HP Z820, Intel Xeon E5-2630, 2.6 GHz, 32 GB, NVIDIA Quadro K6000, G-Sync
- Tracking Computer (1): Dell, Intel Pentium 4, 2 cores, 3.4 GHz, 2 GB

Input, Tracking, and Audio
- Audio: Klipsch speakers, quadraphonic plus subwoofer, 100 watts per channel
- Input Devices: Logitech Rumblepad, keyboard, mouse, Kinect, Emotiv EPOC
- Tracking System: Vicon MX13, 8 cameras, 1.3 megapixels per camera

Supported Software
- In-House: Jugular