Using the MIDEN for Hospital Room Visualization

How can doctors and nurses walk around a hospital room that hasn’t been built yet? It may seem like an impossible riddle, but the Duderstadt Center is making it possible!

Working with the University of Michigan Hospital and a team of architects, healthcare professionals are able to preview full-scale redesigns of hospital rooms using the MIDEN. The MIDEN, or Michigan Immersive Digital Experience Nexus, is an advanced audio-visual system for virtual reality. It gives its users the convincing illusion of being fully immersed in a computer-generated, three-dimensional world. This world is presented in life-size stereoscopic projections on four surfaces that together fill the visual field, accompanied by 4.1 surround sound with attenuation and Doppler effect.

Architects and nursing staff are using the MIDEN to preview patient room upgrades in the Trauma Burn Unit of the University Hospital. Of particular interest is the placement of an adjustable wall-mounted workstation monitor and keyboard. The MIDEN offers full-scale immersive visualization of clearances and sight-lines for the workstation with respect to the walls, cabinets, and patient bed. The design is being revised based on these visualizations before any actual construction occurs, avoiding time-consuming and costly renovations later.

Pushing Interactive Boundaries in a Tea House

A person exploring the tea house within the MIDEN.

A tea house environment was designed to explore the limits of what is possible in interactivity and texturing. Moving freely around the environment, users could pick up, open, and rearrange objects. The scene was explored in the MIDEN, with real-world physics applied to the objects.

Dialogue of the Senses: Different Eyes

Three guests experiencing Alex Surdu’s exhibit

“Dialogue of the Senses” was the theme for an exhibit of student work from the Department of Performing Arts Technology, School of Music, Theatre & Dance (May 2013). Alex Surdu titled his piece “Different Eyes / I’m Standing in a Vroom.” He designed it for exhibition in the MIDEN, for aural as well as visual immersion. In Alex’s words:

With each passing day, we find ourselves gaining perspective from the places we go, the people we meet, and the world that we experience. We are ultimately, however, alone in our individual universes of experience. With this piece, I attempt to bridge this gap by immersing participants in an abstract virtual universe that utilizes isochronic pulses to stimulate different states of consciousness. If art was a device created by man to communicate perspective, then works of this nature are the next logical step in realizing art’s purpose: providing not just something to look at, but a way with which to look at it.

New Discoveries Exploring Renal Gene Clusters

Sometimes a mess of data is just a mess of data. But sometimes, as Dr. Suresh Bhavnani discovered, it is an opportunity for a new type of visualization. Ted Hall, advanced visualization specialist at the University of Michigan’s Duderstadt Center, set up an immersive stereoscopic projection of Bhavnani’s data in the MIDEN (Michigan Immersive Digital Experience Nexus), a small room that surrounds the user with 3D images. An antenna headset and a game console controller give Bhavnani a position in space relative to his data, from which he can virtually navigate the web of relationships between genes and diseases. This allowed him to see new patterns and identify unexpected regularities in gene function that are very difficult to untangle in 2D.

Concentrate Media: On the Cutting Edge of 3D

Patrick Dunn, Concentrate Media:

“Because there are different paths one can take, it helps to go to one location where there are multiple individuals who are well-versed in those different paths,” … “It really helps people to find their direction.”

The accessibility of U-M’s facility makes it a particularly rare gem. The lab provides unique ease of access to technology that’s on the rise but still fairly exotic to the general public, like 3D printers. And in the case of the MIDEN, it’s one of only a couple of similar publicly accessible facilities nationwide.

“Generally these technologies are locked behind doors because they’re very expensive, they require expertise, and they can be very delicate,” … “Here, people say, ‘We want to use the MIDEN,’ and we say ‘Okay, we’ll help you do what you want to do.'”

Visit the story at Concentrate Media

Virtual Reality 3-D Brain Helps Scientists Understand Migraine Pain

Dr. Alex DaSilva Photo Credit: Scott Soderberg, Michigan Photography

From U-M News:

ANN ARBOR—Wielding a joystick and wearing special glasses, pain researcher Alexandre DaSilva rotates and slices apart a large, colorful, 3-D brain floating in space before him.

Despite the white lab coat, it appears DaSilva’s playing the world’s most advanced virtual video game. The University of Michigan dentistry professor is actually hoping to better understand how our brains make their own pain-killing chemicals during a migraine attack.

The 3-D brain is a novel way to examine data from images taken during a patient’s actual migraine attack, says DaSilva, who heads the Headache and Orofacial Pain Effort at the U-M School of Dentistry and the Molecular and Behavioral Neuroscience Institute.

Different colors in the 3-D brain, derived from PET (positron emission tomography) imaging, give clues about the chemical processes happening during a patient’s migraine attack.

“This high level of immersion (in 3-D) effectively places our investigators inside the actual patient’s brain image,” DaSilva said.

Through some innovative work done by Dr. Alexandre DaSilva and his team in the School of Dentistry, the Duderstadt Center was presented with exciting new data that shows activation in the brain *during* a migraine attack; most imaging data is captured before or after an attack. Sean Petty and Ted Hall worked closely with Dr. DaSilva to interpret the data and to add new tools to Jugular, our in-house 3D engine, for exploring volumetric data such as fMRI and CT scans. Dr. DaSilva can now explore the brain data by walking around it and interactively cutting through it.
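The core of an interactive cutting tool like this is resampling the volumetric dataset along an arbitrary plane. The sketch below is a minimal illustration of that idea in plain Python, not Jugular’s actual code: it assumes the volume is a flat row-major array of scalar intensities and uses trilinear interpolation to extract an oblique slice.

```python
# Hypothetical sketch of a volumetric cutting plane (not Jugular's API).
# The volume is a flat [nx*ny*nz] row-major array of scalar intensities,
# as might come from an fMRI or CT dataset.

def trilinear_sample(volume, nx, ny, nz, x, y, z):
    """Sample the volume at a fractional (x, y, z); 0.0 outside."""
    if not (0 <= x <= nx - 1 and 0 <= y <= ny - 1 and 0 <= z <= nz - 1):
        return 0.0
    x0, y0, z0 = int(x), int(y), int(z)
    x1, y1, z1 = min(x0 + 1, nx - 1), min(y0 + 1, ny - 1), min(z0 + 1, nz - 1)
    fx, fy, fz = x - x0, y - y0, z - z0

    def v(i, j, k):
        return volume[(k * ny + j) * nx + i]

    # Interpolate along x, then y, then z.
    c00 = v(x0, y0, z0) * (1 - fx) + v(x1, y0, z0) * fx
    c10 = v(x0, y1, z0) * (1 - fx) + v(x1, y1, z0) * fx
    c01 = v(x0, y0, z1) * (1 - fx) + v(x1, y0, z1) * fx
    c11 = v(x0, y1, z1) * (1 - fx) + v(x1, y1, z1) * fx
    c0 = c00 * (1 - fy) + c10 * fy
    c1 = c01 * (1 - fy) + c11 * fy
    return c0 * (1 - fz) + c1 * fz

def slice_plane(volume, dims, origin, u_axis, v_axis, width, height):
    """Resample the volume on the plane spanned by u_axis and v_axis."""
    nx, ny, nz = dims
    image = []
    for j in range(height):
        row = []
        for i in range(width):
            x = origin[0] + i * u_axis[0] + j * v_axis[0]
            y = origin[1] + i * u_axis[1] + j * v_axis[1]
            z = origin[2] + i * u_axis[2] + j * v_axis[2]
            row.append(trilinear_sample(volume, nx, ny, nz, x, y, z))
        image.append(row)
    return image
```

In an immersive setting, the plane’s origin and axes would be driven by the tracked wand each frame, so the cut follows the user’s hand through the data.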

Test Driving with FAAC and Graphics Performance Discussion

FAAC Incorporated provides system engineering and software products, including driving simulators for commercial and private training. FAAC reached out to the Duderstadt Center to share information and to compare their system’s performance to the MIDEN’s capabilities. The Duderstadt Center had developed an “urban neighborhood” model as a stress test: how many triangles and vertices can the models contain while still maintaining a comfortable interactive frame rate in the MIDEN? The demo showed the MIDEN’s capabilities and potential. Duderstadt Center staff then visited FAAC’s facility and saw their first 6-DOF full-motion system housed in a mobile trailer.
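The logic of such a stress test can be sketched simply: keep adding copies of the model until the measured frame time exceeds the budget for a comfortable rate, then narrow in on the boundary. The code below is an illustration under assumed names, not the actual benchmark; `render_frame_ms` stands in for a real measurement of the render loop.

```python
# Hypothetical stress-test driver (assumed names, not the actual demo):
# find the largest instance count that keeps the frame time within the
# budget for a comfortable interactive rate.

TARGET_FPS = 30.0
FRAME_BUDGET_MS = 1000.0 / TARGET_FPS

def max_instances(render_frame_ms, tris_per_instance, limit=1_000_000):
    """render_frame_ms(n) measures one frame with n model instances.
    Doubles n until the budget is blown, then binary-searches the
    boundary. Returns (instances, total_triangles)."""
    lo, hi = 0, 1
    while hi <= limit and render_frame_ms(hi) <= FRAME_BUDGET_MS:
        lo, hi = hi, hi * 2
    while lo + 1 < hi:
        mid = (lo + hi) // 2
        if render_frame_ms(mid) <= FRAME_BUDGET_MS:
            lo = mid
        else:
            hi = mid
    return lo, lo * tris_per_instance
```

With a real renderer, each probe would draw several frames and average the timing, since a single frame time is noisy.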

Wayfinding in Assisted Living Homes

Wayfinding in Assisted Living Homes

Rebecca Davis, professor and researcher at Grand Valley State University, received a research grant from the National Institutes of Health to study how patients with Alzheimer’s disease navigate their living space. Assisted living homes can be drab or nondescript, with long hallways adding to the confusion and frustration of those living in them. To research this problem and possible solutions, Davis recruited 40 people in the early stages of Alzheimer’s and 40 without the disease to virtually walk through a simulation of an actual assisted living home in the MIDEN. Staff and students at the Duderstadt Center modeled a 3D environment that re-creates details such as the complicated lighting and maze-like hallways, creating a natural and immersive experience. This allows users to fully experience how the color schemes, lighting, and wall detail can affect the experience of living in the home. Various “visual cues” are placed at key locations throughout the space to determine whether they help subjects remember which paths lead where they need to go. Davis currently utilizes two environments in her study, one with visual cues and one without. Subjects are shown the path they must take to reach a destination and then given an opportunity to travel there themselves, if they can remember how.

Kinect in Virtual Reality – M.I.D.E.N. Test

The Kinect exploded onto the gaming and natural user interface scene. People had it hacked within a few days, and a collective desire to see how a depth-sensing camera could be used was born. Caught up in the same energy, the Duderstadt Center started playing with the hacks coming out and seeing how they could be used with other technology. After some initial tests, and the release of the official SDK from Microsoft, we dove into deeper development with the device.

In an effort to improve interactivity in the MIDEN, the Kinect has been applied as a way of representing the physical body in a virtual space. By analyzing the data received from the Kinect, the Duderstadt Center’s rendering engine can create a digital model of the body. This body represents an avatar that corresponds to the user’s location in space, allowing them to interact with virtual objects. Because the MIDEN offers the user perspective and depth perception, interaction feels more natural than maneuvering an avatar on a screen; the user can reach out and directly “touch” objects.
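A key step in this kind of integration is mapping the joint positions the Kinect reports in its own camera space into the MIDEN’s world space, so the avatar lines up with the user’s real body. The sketch below is illustrative, not the Duderstadt Center’s engine code: it assumes calibration can be reduced to a yaw rotation plus a translation measured once for the sensor’s mounting position.

```python
# Hypothetical sketch (not the actual rendering-engine code): transform
# Kinect skeleton joints from the sensor's camera space (meters) into
# world space. Calibration is assumed to be a yaw rotation about the
# vertical axis plus a fixed translation for the sensor's mount point.
import math

def make_calibration(yaw_degrees, offset):
    """Return a function mapping a camera-space point to world space."""
    yaw = math.radians(yaw_degrees)
    c, s = math.cos(yaw), math.sin(yaw)
    ox, oy, oz = offset

    def to_world(p):
        x, y, z = p
        # Rotate about the vertical (y) axis, then translate.
        return (c * x + s * z + ox, y + oy, -s * x + c * z + oz)

    return to_world

def avatar_joints(camera_joints, to_world):
    """Transform every tracked joint into world space for the avatar."""
    return {name: to_world(p) for name, p in camera_joints.items()}
```

Each frame, the transformed joints would drive the avatar’s skeleton, so a reach of the user’s real hand moves the virtual hand to the matching spot in the scene.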

Migraine Brain – Quick Mapping of Brain Data

Through some innovative work done by Dr. Alexandre DaSilva and his team in the School of Dentistry, the Duderstadt Center was presented with exciting new data related to migraines and their effect on the brain. We had to quickly turn the data into an image suitable for a pending journal submission. While we can’t go into details about the research at this time, we created a quick model of the data and brought it into the MIDEN for further exploration. The model was created by taking cross-sections of the MRI dataset and projecting them onto the surface of a brain mesh. The resulting model and textures were exported and then brought into the MIDEN.
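The projection step amounts to looking each mesh vertex up in the slice stack: the vertex’s position along the stacking axis picks a cross-section, and its remaining coordinates sample that slice. The sketch below is a minimal, assumed version of that idea (nearest-neighbor sampling, invented function names), not the production pipeline.

```python
# Hypothetical sketch (assumed names, not the actual pipeline): bake a
# per-vertex intensity onto a mesh by sampling a stack of MRI
# cross-sections with nearest-neighbor lookup.

def bake_vertex_intensities(vertices, slices, spacing):
    """vertices: (x, y, z) points in the same space as the slice stack.
    slices: list of 2-D row-major images, one per cross-section.
    spacing: distance between consecutive slices along z."""
    baked = []
    for x, y, z in vertices:
        # z selects the nearest cross-section; clamp to the stack.
        k = min(max(int(round(z / spacing)), 0), len(slices) - 1)
        img = slices[k]
        # (x, y) selects the nearest pixel within that slice.
        row = min(max(int(round(y)), 0), len(img) - 1)
        col = min(max(int(round(x)), 0), len(img[0]) - 1)
        baked.append(img[row][col])
    return baked
```

The baked values would then be written out as vertex colors or a texture alongside the mesh, which is what gets loaded into the MIDEN.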