Using Mobile VR to Assess Claustrophobia During an MRI

New Methods for Exposure Therapy

Stephanie O’Malley


Dr. Richard Brown and his colleague Dr. Jadranka Stojanovska had an idea for how VR could be used in a clinical setting. Recognizing that many patients undergoing MRI scans experience claustrophobia, they wanted to use VR simulations to introduce prospective patients to what being inside an MRI machine feels like.

Duderstadt Center programmer Sean Petty and director Dan Fessahazion alongside Dr. Richard Brown

Claustrophobia in this situation is a surprisingly common problem. While 360° videos exist that convey what an MRI might look like, they fail to address the major factor contributing to claustrophobia: the perceived confinement within the bore. 360° videos tend to distort the environment, making it seem farther away than it would be in reality, and therefore fail to induce the feelings of claustrophobia that an actual MRI bore produces. With funding from the Patient Education Award Committee, Dr. Brown approached the Duderstadt Center to see if a better solution could be produced.

VR MRI: Character customization
A patient enters feet-first into the bore of the MRI machine.

To simulate the experience of an MRI accurately, a CGI MRI machine was constructed and brought into the Unity game engine. A customizable avatar representing the viewer's body was also added to give viewers a sense of self. When a VR headset is worn, the viewer sees their avatar body and the true proportions of the MRI machine as they are slowly transported into the bore. Verbal instructions mimic what would be said over the course of a real MRI, with the intimidating boom of the machine sounding as the simulated scan proceeds.

Two modes are provided within the app, feet first and head first, to accommodate the most common scanning procedures that have been shown to induce claustrophobia.

To make the simulation accessible to patients, the MRI app was developed with mobile VR in mind, allowing anyone (patients or clinicians) with a VR-capable phone to download the app and use it with a budget-friendly headset such as Google Daydream or Cardboard.

Dr. Brown's VR simulator was recently featured as the cover story of the September issue of the journal Tomography.

Learning Jaw Surgery with Virtual Reality

Jaw surgery can be complex, and many factors contribute to how a procedure is done. From routine corrective surgery to reconstructive surgery, the traditional means of teaching these scenarios have gone unchanged for years. In an age of ubiquitous computing and increasingly popular virtual reality, students still find themselves moving paper cut-outs of their patients around a tabletop to explore different surgical approaches.

Dr. Hera Kim-Berman was inspired to change this. Working with the Duderstadt Center's 3D artist and programmers, she developed a more immersive and comprehensive learning experience. Hera provided the Duderstadt Center with patient DICOM data. These data sets originally consisted of a series of two-dimensional CT images, which were converted into 3D models and then segmented just as they would be during a surgical procedure. The segments were then joined to a model of the patient's skin, allowing the movement of the various bones to drive real-time changes to the person's facial structure, now visible from any angle.
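The conversion from stacked 2D slices to a surface model can be illustrated with open-source tools. The sketch below is only a rough analogue of the pipeline described here (the Duderstadt Center used dedicated 3D software for this work); it assumes a CT series in which bone can be isolated with a simple intensity threshold, and the directory path, threshold value, and function name are placeholders.

```python
import numpy as np
import pydicom                      # reads DICOM slice files
from pathlib import Path
from skimage import measure         # marching cubes isosurface extraction

def dicom_series_to_mesh(dicom_dir, threshold=300):
    """Stack a DICOM series into a volume and extract a bone isosurface.

    A threshold around 300 is a common starting point for bone in CT data,
    but the right value depends on the scanner and the anatomy of interest.
    """
    slices = [pydicom.dcmread(p) for p in Path(dicom_dir).glob("*.dcm")]
    # order the slices along the scan axis before stacking
    slices.sort(key=lambda s: float(s.ImagePositionPatient[2]))
    volume = np.stack([s.pixel_array for s in slices]).astype(np.int16)
    # marching cubes turns the scalar volume into a triangle mesh
    verts, faces, normals, _ = measure.marching_cubes(volume, level=threshold)
    return verts, faces, normals
```

The resulting triangle mesh would still need to be cleaned, segmented, and rigged in a modeling package before it could be used interactively in a game engine.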

This was done for several common practice scenarios (such as correcting an extreme overbite or underbite, or a jaw misalignment), which were then brought into the Oculus Rift, where hand-tracking controls were developed to allow students to "grab" the bones and adjust them in 3D.

Before re-positioning the jaw segments, the jaw has a shallow profile.

After re-positioning of the jaw segments, the jaw is more pronounced.

As a result, students are now able to gain a more thorough understanding of the spatial movement of the bones, and more complex scenarios, such as extensive reconstructive surgery, can be practiced well in advance of seeing a patient for a scheduled surgery.

Mammoth Calf Lyuba On Display

Mammoth Calf Lyuba, a Collaborative Exploration of Data

On Nov. 17-19, the Duderstadt Center's visualization expert, Ted Hall, will be in Austin, Texas, representing the center at SC15, a supercomputing conference. The technology on display will allow people in Austin to be projected into the MIDEN, the University of Michigan's immersive virtual reality cave, letting visitors in both Ann Arbor and Austin explore the body of a mummified mammoth together.

The mummified remains of Lyuba.

The mammoth in question is a calf called Lyuba, found in Siberia in 2007 after being preserved underground for roughly 42,000 years. This specimen is considered the best-preserved mammoth mummy in the world and is currently on display in the Shemanovskiy Museum and Exhibition Center in Salekhard, Russia.

University of Michigan professor Daniel Fisher and his colleagues at the U-M Museum of Paleontology arranged to have the mummy scanned using X-ray computed tomography in Ford Motor Company's Nondestructive Evaluation Laboratory. Adam Rountrey then applied a color map to the density data to reveal the internal anatomical structures.
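As a simple illustration of that color-mapping step, the sketch below assigns RGBA colors to scalar density values using an off-the-shelf matplotlib colormap; the colormap choice, window bounds, and function name are placeholders, not the mapping Rountrey actually used.

```python
import numpy as np
import matplotlib.pyplot as plt

def colorize_density(density, vmin, vmax, cmap_name="bone"):
    """Map scalar CT density values to RGBA colors.

    vmin/vmax window the densities of interest; values outside are clamped.
    """
    normalized = np.clip((density - vmin) / (vmax - vmin), 0.0, 1.0)
    return plt.get_cmap(cmap_name)(normalized)   # (..., 4) float RGBA array
```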

Lyuba with her skeleton visible.

The Duderstadt Center received this data as an image stack for interactive volumetric visualization. The stack comprises 1,132 JPEG image slices at 762×700 pixels per slice, and each resulting voxel is 1 mm on a side.
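Loading such a stack into memory for visualization is straightforward; the sketch below (with a placeholder directory name) stacks the JPEG slices into a single 3D array whose axes match the slice count and per-slice resolution quoted above.

```python
import numpy as np
from pathlib import Path
from PIL import Image

# Stack one grayscale JPEG per axial slice into a single 3D volume.
# "lyuba_slices" is a placeholder directory name.
slice_paths = sorted(Path("lyuba_slices").glob("*.jpg"))
volume = np.stack([np.asarray(Image.open(p).convert("L")) for p in slice_paths])

print(volume.shape)   # roughly (1132, 700, 762): 1,132 slices of 762×700 pixels
```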

When this data is brought into the Duderstadt Center's Jugular software, the user can interactively slice through the mammoth's volume by manipulating a series of hexagonal cutting planes, revealing the internal structure. In the MIDEN, the user can explore the mammoth in the same way while it appears to stand before them in three virtual dimensions. The MIDEN's Virtual Cadaver used a similar process.
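Jugular's interactive cutting planes are implemented inside the engine itself, but the underlying idea, resampling a volume along an arbitrary plane, can be sketched in a few lines; the function name and plane parameters here are illustrative.

```python
import numpy as np
from scipy.ndimage import map_coordinates

def extract_slice(volume, origin, u, v, size=(256, 256), spacing=1.0):
    """Resample an arbitrary planar cross-section from a 3D volume.

    volume : (Z, Y, X) array of voxel intensities
    origin : (z, y, x) point on the cutting plane, in voxel units
    u, v   : orthonormal in-plane direction vectors, in (z, y, x) order
    """
    origin, u, v = (np.asarray(a, dtype=float) for a in (origin, u, v))
    h, w = size
    i = (np.arange(h) - h / 2) * spacing
    j = (np.arange(w) - w / 2) * spacing
    jj, ii = np.meshgrid(j, i)
    # voxel-space coordinates of every sample point on the plane: (3, h, w)
    pts = (origin[:, None, None]
           + ii[None] * u[:, None, None]
           + jj[None] * v[:, None, None])
    # trilinear interpolation; samples outside the volume come back as 0
    return map_coordinates(volume, pts, order=1, mode="constant", cval=0.0)
```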

For the demo at SC15, users in Texas can occupy the same virtual space as another user in Ann Arbor's MIDEN. Via a Kinect sensor in Austin, a 3D mesh of the remote user is projected into the MIDEN alongside Lyuba, allowing simultaneous interaction with and exploration of the data.

Showings will take place in the MIDEN

Sean Petty and Ted Hall simultaneously explore the Lyuba data set, with Ted's form projected into the virtual space of the MIDEN via a Kinect sensor.

More about the Lyuba specimen:
Fisher, Daniel C.; Shirley, Ethan A.; Whalen, Christopher D.; Calamari, Zachary T.; Rountrey, Adam N.; Tikhonov, Alexei N.; Buigues, Bernard; Lacombat, Frédéric; Grigoriev, Semyon; Lazarev, Piotr A. (2014). "X-ray Computed Tomography of Two Mammoth Calf Mummies." Journal of Paleontology 88(4):664-675. DOI: http://dx.doi.org/10.1666/13-092
https://en.wikipedia.org/wiki/Lyuba
http://www.dallasnews.com/lifestyles/travel/headlines/20100418-42-000-year-old-baby-mammoth-4566.ece

Surgical Planning for Dentistry: Digital Manipulation of the Jaw

CT data was brought into ZBrush and TopoGun to be segmented and retopologized. Influence was then added to the skin mesh, allowing it to deform as the bones were manipulated.

Hera Kim-Berman is a Clinical Assistant Professor at the University of Michigan School of Dentistry. She recently approached the Duderstadt Center with an idea that would allow surgeons to prototype jaw surgery using patient data extracted from CT scans. Hera's concept involved digitally manipulating portions of the skull in virtual reality, just as surgeons would when physically working with a patient, allowing them to preview different scenarios and evaluate how effective a procedure might be before committing to surgery.

Before re-positioning the jaw segments, the jaw has a shallow profile.

After Hera provided the Duderstadt Center with CT scan data, Shawn O'Grady extracted 3D meshes of the patient's skull and skin using Magics. From there, Stephanie O'Malley worked with the models to make them interactive and suitable for real-time platforms. This involved bringing the skull into software such as ZBrush and creating cuts in the mesh corresponding to areas identified by Hera as places where the skull would be segmented during surgery. The mesh was also optimized to run at a higher frame rate on real-time platforms. The skin mesh was likewise altered, undergoing a process called "retopologizing" that allows it to deform more smoothly. From there, the segmented pieces of the skull were reassembled and assigned influence over areas of the skin in a process called "rigging." This allowed areas of the skin to move with selected bones as they were separated and shifted by a surgeon in 3D space.
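The rigging step boils down to giving every skin vertex a set of bone weights and blending the motion of the bones that influence it. The actual rig was built in the modeling tools described above, but the underlying math, commonly known as linear blend skinning, can be sketched as follows (the array shapes and function name are illustrative).

```python
import numpy as np

def linear_blend_skinning(rest_vertices, weights, bone_transforms):
    """Deform skin vertices from the transforms of the bones that influence them.

    rest_vertices   : (N, 3) skin vertex positions in the rest pose
    weights         : (N, B) per-vertex influence weights, each row summing to 1
    bone_transforms : (B, 4, 4) homogeneous transform of each bone segment
                      relative to its rest pose
    """
    n = len(rest_vertices)
    homogeneous = np.hstack([rest_vertices, np.ones((n, 1))])          # (N, 4)
    # position of every vertex as if it followed each bone rigidly: (B, N, 4)
    per_bone = np.einsum("bij,nj->bni", bone_transforms, homogeneous)
    # blend the rigid results using the painted influence weights: (N, 4)
    blended = np.einsum("nb,bni->ni", weights, per_bone)
    return blended[:, :3]
```

When a surgeon grabs and shifts a jaw segment, its transform changes, and the weighted blend moves the overlying skin with it in real time.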

After re-positioning of the jaw segments, the jaw is more pronounced.

Once a working model was achieved, it was passed to Ted Hall and student programmer Zachary Kiekover to be implemented in the Duderstadt Center's Jugular engine, allowing the demo to run at large scale and in stereoscopic 3D not only within the virtual reality MIDEN but also on smaller head-mounted displays like the Oculus Rift. More intuitive user controls were also added, allowing easier selection of the various bones using a game controller or motion-tracked hand gestures via the Leap Motion. This meant surgeons could not only view the procedure from all angles in stereoscopic 3D, but also physically grab the bones they wanted to manipulate and transpose them in 3D space.

Zachary demonstrates the ability to manipulate the model using the Leap Motion.

Exploring Human Anatomy with the Anatomage Table

The Anatomage table is a technologically advanced anatomy visualization system that allows users to explore the complex anatomy of the human body in digital form, eliminating the need for a human cadaver. The table presents a human figure at 1:1 scale and uses data from the Visible Human Project, with the additional capability of loading real patient data (CT, MRI, etc.), making it a great resource for research, collaborative discovery, and the study of surgical procedures. Funding to obtain the table was a collaborative effort between the schools of Dentistry, Movement Science, and Nursing, although utilization is expected to expand to include Biology. Currently on display in the Duderstadt Center for exploration, the Anatomage table will relocate to its more permanent home inside the Taubman Health Sciences Library in early July.

The Anatomage table allows users to explore the complex anatomy of the human body.