Planting Disabled Futures – A call for artists to collaborate

Planting Disabled Futures

Open Call for Artist Collaborators



Petra Kuppers is a disability culture activist and a community performance artist. She creates participatory community performance environments that think/feel into public space, tenderness, site-specific art, access, and experimentation. Petra grounds herself in disability culture methods, and uses ecosomatics, performance, and speculative writing to engage audiences toward more socially just and enjoyable futures.


Her latest project, Planting Disabled Futures, is funded by a Just Tech fellowship.

In the Planting Disabled Futures project, Petra aims to use live performance approaches and virtual reality (and other) technologies to share energy, liveliness, ongoingness, crip joy and experiences of pain. 

In the development of the Virtual Reality (VR) components of the project, we will ask: How can VR allow us to celebrate difference, rather than engage in hyper-mobile fantasies of overcoming and of disembodied life? How can our disabled bodymindspirits develop non-extractive intimacies, in energetic touch, using VR as a tool toward connecting with plants, with the world, even in pain, in climate emergency, in our ongoing COVID world?

A watercolor mock-up of the Crip Cave, with Moira Williams’ Stim Tent, two VR stations, a potential sound bed, and a table for drawing/writing.

Petra envisions a sensory art installation equipped with a VR experience, a stimming tent, a soundbed, and a drawing and writing table. The VR experience would be supplemented by actors providing opportunities to engage with unique taste, touch, and smell sensations as the environment is navigated.

A cyanotype (blue) and watercolor mock-up of what the VR app might look like: a violet leaf with sensation hubs, little white ink portals, that might lead to an audio dream journey

The VR experience involved in the Crip Cave is expected to be a tree-like environment that allows participants to select either a visual or an auditory experience. Participants can travel down to the roots and experience earth critters, or up to the branches and into the leafy canopy. In both locations, “sensory hubs” would take participants on a journey to other worlds – worlds potentially populated with content produced by fellow artists.

A cyanotype/watercolor mock-up of little critters that might accompany you on your journey through the environment.

Artist collaborators are welcome to contribute their talents by generating 3D worlds in Unreal Engine, reciting poetry, animating, or composing music to create a dream journey in virtual reality. Artists generating digital content they would like considered for inclusion in this unique art installation can reach out to: [email protected]


To learn more about Planting Disabled Futures, visit:
https://www.petrakuppers.com/planting-disabled-futures

Engineering Grants for XR

The Enhancing Engineering Education Grants Program is designed to support innovative strategies for engaging and supporting all learners in Michigan Engineering undergraduate courses. This program springs from a collaboration among ADUE, ADGPE, CAEN, CRLT-Engin, and Nexus. Proposals will be invited across the range of innovations in engineering education, including instructional practices, course design and content, and instructional technology.

As part of the initial Enhancing Education using Technology (EET) proposal to the College to support the instructional needs of faculty, grants were offered to support the implementation of innovative ideas that instructors lacked funding to pursue. The first year of the grants program was FY23, and all grant money was awarded to faculty. It included three major grants of $50K each on the topics of XR, DEI, and Tandem, along with additional smaller grants. At the completion of this first year, the team drew on that year’s lessons to propose improvements and changes to the program.

For AY 2024-2025, there are three grants available to support instructional faculty members:

Education Innovation Grants

Grants of up to $10K are available to COE faculty & staff

About the Grant

Grants of up to $10K are available to individual or small groups of Michigan Engineering instructional faculty and staff members seeking to implement innovative teaching methods and/or tools.

Group 2 applications are now being accepted. This call for proposals is open to all eligible applicants and does not necessitate a previous Group 1 proposal or submission.


Proposal Evaluation Criteria
  • Applies a novel method or tool to achieve objectives
  • Reflects innovation in teaching methods or approaches
  • For online courses, utilizes the Quality Matters framework and works with Nexus to do so
  • Involves partnering with Nexus or CRLT-Engin to co-teach a new faculty development workshop
  • Builds upon evidence-based best practices for enhancing student learning
  • Promotes equitable instruction for all learners
  • Implements practices or tools that have the potential for great impact (either a large impact on a small population, or something that could be applied to a larger population at Michigan Engineering)
  • Addresses work for which other funding opportunities do not exist
  • Achieves synergy with goals, strengths, and ongoing work of the College of Engineering, especially as it relates to Michigan Engineering’s Strategic Vision

Group 2 applications close Wednesday, May 1, 2024

Project Statement:

  • Clearly describe the proposed project
  • Explain the value of the project
  • Identify the specific innovation and its relation to evidence-based practices
  • Explain how the project supports equitable instruction and enhanced student learning
  • Discuss the project’s potential for application in broader contexts

Project Evaluation Plan:
Explain how the success of this project will be evaluated, documented, and disseminated. Approaches might include midterm course assessments, focus groups, and surveys, among others.

Budget Request:

  • Graduate or undergraduate student salaries
  • Materials and supplies
  • Project evaluation expenses
  • Travel and registration fees for teaching-related conferences, seminars or workshops
  • Faculty member summer salary (up to $2K of the project’s total budget)

Timeline:
Submissions will be accepted until Wednesday, May 1, 2024, with funding decisions announced in late May.

Strategic Technology Grants

COE Projects Focused on XR, online/hybrid learning, and/or generative artificial intelligence

About the Grant

Grants of up to $50,000 are available to teams of at least three Michigan Engineering instructional faculty and staff members to implement innovative teaching methods and/or tools that require an investment of time/resources and collaboration for deployment that is larger than what is available via Education Innovation Grants. Projects should focus on strategic themes of XR, online/hybrid learning and/or generative artificial intelligence.


Proposal Evaluation Criteria
  • Applies a novel method, modality or tool to achieve objectives
  • Reflects innovation in teaching methods or approaches
  • Builds upon evidence-based best practices for enhancing student learning
  • Promotes equitable instruction for all learners
  • If online, leverages the Quality Matters rubric and best practices with online course design and development
  • Implements practices or tools that have the potential for great impact (either a large impact on a small population, or something that could be applied to a larger population at Michigan Engineering)
  • Achieves synergy with goals, strengths, and ongoing work of ADUE, ADGPE, CAEN, CRLT-Engin, Nexus, and/or the broader College of Engineering, especially as it relates to Michigan Engineering’s Strategic Vision

Applications closed March 4, 2024

Identify your proposal’s strategic theme:

  • Online/hybrid learning
  • Generative artificial intelligence
  • XR

Project Statement:

  • Clearly describe the proposed project
  • Explain the value of the project
  • Identify the specific innovation and its relation to evidence-based practices
  • Explain how the project supports equitable instruction and enhanced student learning
  • If online, describe how the course design and development effort will leverage the Quality Matters rubric
  • Discuss the project’s potential for great impact
  • Describe your goals for collaboration with at least one E3 grant sponsor (ADUE, ADGPE, CAEN, CRLT-Engin, and/or Nexus)

Project Evaluation Plan:
Explain how the success of this project will be evaluated, documented, and disseminated.

Budget Request:

  • Graduate or undergraduate student salaries
  • Instructional software and classroom technology
  • Materials and supplies
  • Project evaluation expenses
  • Travel and registration fees for teaching-related conferences, seminars or workshops
  • Faculty member summer salary (up to $2K of the project’s total budget)

Team Roster:
Provide a list of all team members, with descriptions of their respective roles and very brief bios.

Timeline:

Submissions are due on Monday, March 4, 2024, with funding decisions announced in April.

Software Pilot Grants

Grant funding of up to $10K for COE faculty & staff seeking to pilot instructional software

About the Grant

Grants of up to $10K are available to instructional faculty and staff members seeking to pilot innovative and results-oriented instructional software that has the potential to improve teaching and learning in Michigan Engineering. Proposals may be submitted by individuals requesting software for a specific class or a team of faculty members requesting software for a group of classes.

In the spirit of innovation, all ideas are welcome. Proposals that call for the use of collaborative teaching and learning strategies are encouraged. Priority will be given to projects that, if proven successful, can be replicated throughout the College.

Please note that there are many routes for procuring software licenses at the University of Michigan. We encourage you to reach out to our team at [email protected] to help determine if this grant program is appropriate for your request before submitting a proposal.


REQUIRED DELIVERABLES
  • Presentation to the Michigan Engineering faculty community of a case study describing your application of the software and how it impacted your students’ learning objectives
  • Engagement with CAEN on evaluation of software for possible college adoption
  • Acting as a faculty advocate for this software and sharing how you are using it in your class

Applications for Fall 2024 close April 1, 2024

Course Information:
Logistical course details, including how frequently the course is taught, an enrollment summary, etc.

Learning Gaps:
Describe the learning gap(s) you have identified in your lesson/module/unit/course.

Teaching Intervention (Pedagogical Support):
Explain the teaching and technology intervention(s) that will close the stated learning gaps. Identify the evidence-based practices that support the efficacy of the proposed software solution.

Comparative Tool Evaluation:

  • Identify 3-4 comparable software tools (including your proposed tool) that could fill the established learning gaps.
  • List the criteria you will use to evaluate the 3-4 comparable tools to inform your decision making.

Project Evaluation Plan:

  • Explain how the success of this software will be evaluated, documented, and disseminated. Approaches might include midterm course assessments, focus groups, and surveys, among others.
  • Explain how you will evaluate whether this software met the needs of you and your students. How will you determine whether it has improved the educational experience?

Budget Request:
Provide the number of licenses, estimated cost per license, and estimated total cost for this software.

Timeline:
To use the software for instruction in the Fall 2024 term, proposals must be submitted by April 1, 2024.

New MIDEN – Unveiling of the upgraded MIDEN for Fall ‘23


New dual view capabilities

Maredith Byrd


We have upgraded the MIDEN! The four new Christie Digital M 4K25 RGB laser projectors are much brighter and higher resolution than the projectors they replace, and their laser light sources last far longer. We used to limit how often and how long the MIDEN ran because the previous lamps had a lifespan of just 1,250 hours; the new projectors are rated for 25,000 hours at 100% brightness and 50,000 hours at 50% brightness. Each 10′ x 10′ screen is now rendered at 2160×2160, double the previous resolution. The upgrade also adds dual-view capability: two people can experience the MIDEN at once, each seeing the same virtual content aligned to their own unique perspective while simultaneously interacting with it.

In a typical setup, 3D stereoscopic content (like what you would experience in a 3D movie) is projected onto three walls and the floor and stitched seamlessly together. Users wear a set of motion-tracked glasses that allow their perspective to be updated depending on where they are standing or looking, and use a motion-tracked video game controller to navigate beyond the confines of the 10’x10’ room. To the user wearing the 3D glasses, the projected content appears entirely to scale and has realistic depth – they can look underneath tables that appear to be situated in front of them, despite the table being projected onto one of the walls.
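
How does each wall’s image stay aligned to the tracked viewer? CAVE-style displays typically compute an off-axis (asymmetric) view frustum from the tracked eye position and the physical corners of each screen. The sketch below is a minimal Python illustration of that idea, not the MIDEN’s actual rendering code; the screen coordinates and head position are invented for the example.

    # Off-axis frustum for one physical screen, given a tracked eye position.
    # Illustrative sketch only (not the MIDEN's engine); values are hypothetical.
    import numpy as np

    def off_axis_frustum(eye, lower_left, lower_right, upper_left, near=0.1):
        """Return (left, right, bottom, top) frustum extents at the near plane."""
        r = (lower_right - lower_left) / np.linalg.norm(lower_right - lower_left)  # screen right axis
        u = (upper_left - lower_left) / np.linalg.norm(upper_left - lower_left)    # screen up axis
        n = np.cross(r, u)                       # screen normal, pointing toward the viewer

        va = lower_left - eye                    # eye -> lower-left corner
        vb = lower_right - eye                   # eye -> lower-right corner
        vc = upper_left - eye                    # eye -> upper-left corner
        d = -np.dot(va, n)                       # distance from the eye to the screen plane

        left = np.dot(r, va) * near / d
        right = np.dot(r, vb) * near / d
        bottom = np.dot(u, va) * near / d
        top = np.dot(u, vc) * near / d
        return left, right, bottom, top

    # Hypothetical 10' x 10' (about 3 m x 3 m) front wall and a tracked head position.
    ll = np.array([-1.5, 0.0, -1.5])
    lr = np.array([ 1.5, 0.0, -1.5])
    ul = np.array([-1.5, 3.0, -1.5])
    print(off_axis_frustum(eye=np.array([0.3, 1.7, 0.0]), lower_left=ll, lower_right=lr, upper_left=ul))

As the viewer moves, these frustum extents change every frame, which is what keeps a projected table or doorway looking fixed in place from that viewer’s perspective.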

The MIDEN supports 3D models exported from the most popular modeling software: Blender, 3ds Max, Maya, SketchUp, Rhino, Revit, etc. Models can be exported in OBJ, FBX, STL, or VRML format and then imported into our “Jugular” software. The MIDEN can also display Unreal Engine scenes, using the nDisplay plugin to split the scene into four cameras corresponding to the four projectors in the MIDEN.
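
As an example of the export step, the snippet below is a small sketch using Blender’s Python API (bpy) to write a scene out to FBX for import into Jugular; the file path is hypothetical and the exporter options vary by Blender version.

    # Sketch: export the current Blender scene to FBX for later import into Jugular.
    # Run inside Blender's scripting tab, or: blender --background myscene.blend --python export_fbx.py
    import bpy

    bpy.ops.export_scene.fbx(
        filepath="/tmp/my_scene.fbx",   # destination file (hypothetical path)
        use_selection=False,            # export the whole scene, not just selected objects
    )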

MIDEN users experience immersion in a virtual environment without it blocking their view of themselves or their surroundings, as a VR headset does. Since “CAVE” is a trademarked term for this kind of VR display, ours is called the MIDEN, which stands for Michigan Immersive Digital Experience Nexus. The MIDEN also takes traditional “CAVE” technology further: it is driven by our in-house rendering engine, which affords more flexibility than a typical “CAVE” setup.

The MIDEN is also more accessible than VR headsets: it takes less time to set up and begin using. The controller is a standard Xbox-style gamepad, familiar to most gamers. And because immersion in the MIDEN does not hide the real world, users do not have to worry about trip hazards or becoming disoriented; they see their real bodies rather than the virtual avatars a headset typically substitutes, which also tends to result in less motion sickness.

It can be used for architectural review, data analysis, art installations, learning 3D modeling, and much more. From conveying the true scale of a structure in relation to the body to delivering sensory experiences with unique visuals and spatialized audio, the MIDEN can take these projects to a new level.

The MIDEN is available by request to anyone for a project, class exercise, or tour; contact [email protected] to arrange a session. Use of the MIDEN does require staff to run it, and we recommend that anyone looking to view custom content in the MIDEN arrange a few sessions ahead of their event to test that content and ensure the scene is configured properly.

Two individuals in the MIDEN point to the same virtual image with different views.
This is how the MIDEN configures itself.

Scientific Visualization of Pain


XR at the Headache & Orofacial Pain Effort (HOPE) Lab

Dr. Alexandre DaSilva is an Associate Professor in the School of Dentistry, an Adjunct Associate Professor of Psychology in the College of Literature, Science & Arts, and a neuroscientist in the Molecular and Behavioral Neuroscience Institute.  Dr. DaSilva and his associates study pain – not only its cause, but also its diagnosis and treatment – in his Headache & Orofacial Pain Effort (HOPE) Lab, located in the 300 N. Ingalls Building.

Dr. Alex DaSilva slices through a PET scan of a “migraine brain” in the MIDEN, to find areas of heightened μ-opioid activity.

Virtual and augmented reality have been important tools in this endeavor, and Dr. DaSilva has brought several projects to the Digital Media Commons (DMC) in the Duderstadt Center over the years.

In one line of research, Dr. DaSilva has obtained positron emission tomography (PET) scans of patients in the throes of migraine headaches.  The raw data obtained from these scans are three-dimensional arrays of numbers that encode the activation levels of dopamine or μ-opioid in small “finite element” volumes of the brain.  In that raw form, they’re incomprehensible.  But we bring the data to life through DMC-developed software that maps the numbers into a blue-to-red color gradient and renders the elements in stereoscopic 3D virtual reality (VR) – in the Michigan Immersive Digital Experience Nexus (MIDEN), or in head-mounted displays such as the Oculus Rift.  In VR, the user can effortlessly slide section planes through the volumes of data, at any angle or offset, to hunt for the red areas where the dopamine or μ-opioid signals are strongest.  Understanding how migraine headaches affect the brain may help in devising more focused and effective treatments.
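
For readers curious what that mapping looks like in code, here is a minimal sketch (not the DMC software) that colors one section plane of a synthetic activation volume with a blue-to-red gradient using NumPy and Matplotlib; the volume and slice offset are made up.

    # Sketch: map a 3D activation volume to a blue-to-red gradient and view one section plane.
    # Synthetic data stands in for a PET-derived dopamine / mu-opioid activation volume.
    import numpy as np
    import matplotlib.pyplot as plt

    volume = np.random.rand(64, 64, 64)        # hypothetical 64x64x64 activation values

    z = 32                                     # offset of an axial section plane
    section = volume[:, :, z]                  # extract that 2D slice

    plt.imshow(section, cmap="coolwarm",       # blue (low) -> red (high)
               vmin=volume.min(), vmax=volume.max())
    plt.colorbar(label="activation")
    plt.title(f"Section plane at z = {z}")
    plt.show()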

Dr. Alex DaSilva’s associate, Hassan Jassar, demonstrates the real-time fNIRS-to-AR brain activation visualization, as seen through a HoloLens, as well as the tablet-based app for painting pain sensations on an image of a head. [Photo credit: Hour Detroit magazine, March 28, 2017. https://www.hourdetroit.com/health/virtual-reality-check/]

In another line of research, Dr. DaSilva employs functional near-infrared spectroscopy (fNIRS) to directly observe brain activity associated with pain in “real time”, as the patient experiences it.  As Wikipedia describes it: “Using fNIRS, brain activity is measured by using near-infrared light to estimate cortical hemodynamic activity which occur in response to neural activity.”  [https://en.wikipedia.org/wiki/Functional_near-infrared_spectroscopy]  The study participant wears an elastic skullcap fitted with dozens of fNIRS sensors wired to a control box, which digitizes the signal inputs and sends the numeric data to a personal computer running a MATLAB script.  From there, a two-part software development by the DMC enables neuroscientists to visualize the data in augmented reality (AR).  The first part is a MATLAB function that opens a Wi-Fi connection to a Microsoft HoloLens and streams the numeric data out to it.  The second part is a HoloLens app that receives that data stream and renders it as blobs of light that change hue and size to represent the ± polarity and intensity of each signal.  The translucent nature of HoloLens AR rendering allows the neuroscientist to overlay this real-time data visualization on the actual patient.  Being able to directly observe neural activity associated with pain may enable a more objective scale, versus asking a patient to verbally rate their pain, for example “on a scale of 1 to 5”.  Moreover, it may be especially helpful for diagnosing or empathizing with patients who are unable to express their sensations verbally at all, whether due to simple language barriers or due to other complicating factors such as autism, dementia, or stroke.
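
The actual pipeline is a MATLAB function streaming to a custom HoloLens app; purely to illustrate the streaming pattern it describes, the sketch below shows a sender side in Python, with a hypothetical headset address, port, channel count, and update rate.

    # Illustrative sketch of the sender side of a real-time sensor stream
    # (the DMC implementation is a MATLAB function; this only shows the pattern:
    #  pack each frame of channel values and send it over Wi-Fi to the headset app).
    import random
    import socket
    import struct
    import time

    HOLOLENS_ADDR = ("192.168.1.50", 9000)   # hypothetical IP address/port of the HoloLens app
    NUM_CHANNELS = 48                        # hypothetical number of fNIRS channels

    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)  # UDP keeps per-frame latency low

    while True:
        frame = [random.uniform(-1.0, 1.0) for _ in range(NUM_CHANNELS)]  # stand-in for real readings
        payload = struct.pack(f"<{NUM_CHANNELS}f", *frame)                # little-endian 32-bit floats
        sock.sendto(payload, HOLOLENS_ADDR)
        time.sleep(0.1)                                                   # ~10 Hz update rate (assumed)

On the receiving end, each float maps to one blob of light whose hue and size track the sign and magnitude of that channel’s signal.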

Yet another DMC software development, the “PainTrek” mobile application, also started by Dr. DaSilva, allows patients to “paint their pain” on an image of a manikin head that can be rotated freely on the screen, as a more convenient and intuitive reporting mechanism than filling out a common questionnaire.

PainTrek app allows users to “paint” regions of the body experiencing pain to indicate and track pain intensity.

Architectural Lighting Scenarios Envisioned in the MIDEN


ARCH 535 & ARCH 545, Winter 2022

Mojtaba Navvab, Ted Hall


Prof. Mojtaba Navvab teaches environmental technology in the Taubman College of Architecture and Urban Planning, with particular interests in lighting and acoustics.  He is a regular user of the Duderstadt Center’s MIDEN (Michigan Immersive Digital Experience Nexus) – in teaching as well as sponsored research.

On April 7, 2022, he brought a combined class of ARCH 535 and ARCH 545 students to the MIDEN to see, and in some cases hear, their projects in full-scale virtual reality.

Recreating the sight and sound of the 18-story atrium space of the Hyatt Regency Louisville, where the Kentucky All State Choir gathers to sing the National Anthem.

Arch 535: To understand environmental technology design techniques through case studies and compliance with building standards.  VR applications are used to view the design solutions.

Arch 545: To apply the theory, principles, and lighting design techniques using a virtual reality laboratory.

“The objectives are to bring whatever you imagine to reality in a multimodal perception; in the MIDEN environment, whatever you create becomes a reality.  This aims toward simulation, visualization, and perception of light and sound in a virtual environment.”

Recreating and experiencing one of the artworks by James Turrell.

“Human visual perception is psychophysical because any attempt to understand it necessarily draws upon the disciplines of physics, physiology, and psychology.  A ‘Perceptionist’ is a person concerned with the total visual environment as interpreted in the human mind.”

“Imagine if you witnessed or viewed a concert hall or a choir performance in a cathedral.  You could describe the stimulus generated by the architectural space by considering each of the senses independently as a set of unimodal stimuli.  For example, your eyes would be stimulated with patterns of light energy bouncing off the simulated interior surfaces or luminous environment while you listen to an orchestra playing or choir singing with a correct auralized room acoustics.”

A few selected images photographed in the MIDEN are included in this article.  For the user wearing the stereoscopic glasses, the double images resolve into an immersive 3D visual experience that they can step into, with 270° of peripheral vision.

Students explore a daylight design solution for a library.

Multi-Sensing the Universe


Envisioning a Toroidal Universe

Robert Alexander teamed with Danielle Battaglia, a senior in Art & Design, to compose and integrate audio effects into her conceptual formal model of the Toroidal Universe.  Danielle combined Plato’s notion of the universe as a dodecahedron with modern notions of black holes, worm holes, and child universes.  Their multi-sensory multiverse came together in the MIDEN and was exhibited there as part of the Art & Design senior integrative art exhibition.

Interested in using the MIDEN to do something similar? Contact us.

Revolutionizing 3D Rotational Angiograms with Microsoft HoloLens

Angiography with HoloLens augmented reality


A NEW WAY TO VISUALIZE THE HEART

Stephanie O’Malley


Just prior to the release of the Microsoft HoloLens 2, the Visualization Studio was approached by Dr. Arash Salavitabar of U-M C.S. Mott Children’s Hospital with an innovative idea: to use XR to improve evaluation of patient scans stemming from 3D rotational angiography.

Rotational angiography is an X-ray-based medical imaging technique that allows clinicians to acquire CT-like 3D volumes during hybrid surgery or a catheter intervention. The technique is performed by injecting contrast into the pulmonary artery and then rapidly rotating a cardiac C-arm. Clinicians can then view the resulting data on a computer monitor, manipulating images of the patient’s vasculature. This is used to evaluate how a procedure should move forward and to aid in communicating that plan to the patient’s family.

With augmented reality devices like the HoloLens 2, new possibilities for displaying and manipulating patient data have emerged, along with the potential for collaborative interactions with patient data among clinicians.

What if, instead of viewing a patient’s vasculature as a series of 2D images displayed on a computer monitor, you and your fellow doctors could view it more like a tangible 3D object placed on the table in front of you? What if you could share in the interaction with this 3D model — rotating and scaling the model, viewing cross sections, or taking measurements, to plan a procedure and explain it to the patient’s family?

This has now been made possible with a Faith’s Angels grant awarded to Dr. Salavitabar, intended to explore innovative ways of addressing congenital heart disease. The funding for this grant was generously provided by a family impacted by congenital heart disease, who unfortunately had lost a child to the disease at a very young age.

The Visualization Studio consulted with Dr. Salavitabar on essential features and priorities to realize his vision, using the latest version of the Visualization Studio’s Jugular software.

This video was spliced from two separate streams recorded concurrently from two collaborating HoloLens users. Each user has a view of the other, as well as their own individual perspectives of the shared holographic model.

JUGULAR

The angiography system in the Mott clinic produces digital surface models of the vasculature in STL format.

That format is typically used for 3D printing, but the process of queuing and printing a physical 3D model often takes at least several hours or even days, and the model is ultimately physical waste that must be properly disposed of after its brief use.

Jugular offers the alternative of viewing a virtual 3D model in devices such as the Microsoft HoloLens, loaded from the same STL format, with a lead time of under an hour.  Most of that time is spent by the angiography software producing the STL file.  Once the file is ready, it takes only minutes to upload and view on a HoloLens.  Jugular’s network module allows several HoloLens users to share a virtual scene over Wi-Fi.  The HoloLens provides a “spatial anchor” capability that ties hologram locations to a physical space.  Users can collaboratively view, walk around, and manipulate shared holograms relative to their shared physical space.  The holograms can be moved, scaled, sliced, and marked using hand gestures and voice commands.
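
As a rough illustration of what loading and manipulating such a surface model involves (outside of Jugular itself), the sketch below uses the open-source trimesh library in Python; the file name and transform values are hypothetical.

    # Sketch: load an STL surface model and apply the kinds of operations a HoloLens
    # user performs with gestures: scaling, repositioning, and cutting a section plane.
    # (Illustration only; Jugular is a separate, in-house application.)
    import trimesh

    mesh = trimesh.load("vasculature.stl")     # hypothetical STL exported by the angiography system

    mesh.apply_scale(2.0)                      # enlarge the hologram 2x
    mesh.apply_translation([0.0, 1.2, 0.5])    # move it to table height in front of the viewers

    # a section through the model, analogous to slicing the hologram with a hand gesture
    section = mesh.section(plane_origin=mesh.centroid, plane_normal=[0, 0, 1])
    print(mesh.bounds, section is not None)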

This innovation is not confined to medical purposes.  Jugular is a general-purpose extended-reality program with applications in a broad range of fields.  The developers analyze specific project requirements in terms of general XR capabilities.  Project-specific requirements are usually met through easily-editable configuration files rather than “hard coding.”

Novels in VR – Experiencing Uncle Tom’s Cabin


A Unique Perspective

Stephanie O’Malley


This past semester, English Professor Sara Blair taught a course at the University titled “The Novel and Virtual Realities.” The purpose of this course was to expose students to different methods of analyzing novels and ways of understanding them from different perspectives by utilizing platforms like VR and AR.

Designed as a hybrid course, her class was split between a traditional classroom environment and an XR lab, providing a comparison between traditional learning methods and more hands-on, experiential lessons using immersive, interactive VR and AR simulations.

As part of her class curriculum, students were exposed to a variety of experiential XR content. Using the Visualization Studio’s Oculus Rifts, her class was able to view Dr. Courtney Cogburn’s “1000 Cut Journey” installation – a VR experience that puts viewers in the shoes of a Black American man growing up in the time of segregation, allowing them to see firsthand how racism affects every facet of his life. They also had the opportunity to view Asad J. Malik’s “Terminal 3” using augmented reality devices like the Microsoft HoloLens. Students engaging with Terminal 3 see how Muslim identities in the U.S. are approached through the lens of an airport interrogation.

Wanting to create a similar experience for her students at the University of Michigan, Sara approached the Duderstadt Center about the possibility of turning another novel into a VR experience: Uncle Tom’s Cabin.

She wanted her students to understand the novel from the perspective of its lead character, Eliza, during the pivotal moment where, as a slave, she is trying to escape her captors and reach freedom. But she also wanted to give her students the perspective of the slave owner and the other slaves tasked with her pursuit, as well as the perspective of an innocent bystander watching the scene unfold.

Adapted for VR by the Duderstadt Center: Uncle Tom’s Cabin

Using Unreal Engine, the Duderstadt Center was able to make this a reality. An expansive winter environment was created based on imagery detailed in the novel, and CGI characters for Eliza and her captors were produced and then paired with motion capture data to drive their movements. When students put on the Oculus Rift headset, they can choose to experience the moment of escape through Eliza’s perspective, her captors’, or a bystander’s. To better evaluate which components contributed to students’ feelings during the simulation, versions of these scenarios were provided with and without sound. With sound enabled as Eliza, you hear footsteps in the snow gaining on you, the crack of the ice beneath your feet as you leap across a tumultuous river, and the barking of a vicious dog on your heels – all adding to the tension of the moment. While viewers are able to freely look around the environment, they are passive observers: they have no control over the choices Eliza makes or where she can go.

Adapted for VR by the Duderstadt Center: Uncle Tom’s Cabin – Freedom for Eliza lies on the other side of the frozen Ohio river.

The scene ends with Eliza reaching freedom on the opposite side of the Ohio River and leaving her pursuers behind. What followed the students’ experience with the VR version of the novel was a deep class discussion on how the scene felt in VR versus how it felt reading the same passage in the book. Some students wondered what it might feel like to instead be able to control the situation and decide where Eliza goes, or, as a bystander, to move freely through the environment as the scene plays out, deciding which party (Eliza or her pursuers) was of most interest to follow in that moment.

While Sara’s class has concluded for the semester, you can still try this experience for yourself – Uncle Tom’s Cabin is available to demo on all Visualization Studio workstations equipped with an Oculus Rift.

Using Mobile VR to Assess Claustrophobia During an MRI


new methods for exposure therapy

Stephanie O’Malley


Dr. Richard Brown and his colleague Dr. Jadranka Stojanovska had an idea for how VR could be used in a clinical setting. Having recognized that patients undergoing MRI scans often experience claustrophobia, they wanted to use VR simulations to introduce prospective patients to what being inside an MRI machine might feel like.

Duderstadt Center programmer Sean Petty and director Dan Fessahazion alongside Dr. Richard Brown

Claustrophobia in this situation is a surprisingly common problem. While there are 360° videos that convey what an MRI might look like, these fail to address the major factor contributing to claustrophobia: the perceived confined space within the bore. 360° videos tend to skew the environment, making it seem farther away than it would be in reality and thereby failing to induce the feelings of claustrophobia that the actual MRI bore would produce. With funding from the Patient Education Award Committee, Dr. Brown approached the Duderstadt Center to see if a better solution could be produced.

VR MRI: Character customization
A patient enters feet-first into the bore of the MRI machine.

In order to simulate the effects of an MRI accurately, a CGI MRI machine was constructed and brought into the Unity game engine. A customizable avatar representing the viewer’s body was also added to give viewers a sense of self. When a VR headset is worn, the viewer’s perspective allows them to see their avatar body and the real proportions of the MRI machine as they are slowly transported into the bore. Verbal instructions mimic what would be said throughout the course of a real MRI, with the intimidating boom of the machine occurring as the simulated scan proceeds.

Two modes are provided within the app: feet first or head first, to accommodate the most common scanning procedures that have been shown to induce claustrophobia.

In order to make this accessible to patients, the MRI app was developed with mobile VR in mind, allowing anyone (patients or clinicians) with a VR-capable phone to download the app and use it with a budget-friendly headset like Google Daydream or Cardboard.

Dr. Brown’s VR simulator was recently featured as the cover story in the September edition of Tomography magazine.

Students Learn 3D Modeling for Virtual Reality


making tiny worlds

Stephanie O’Malley


ArtDes240 is a course offered by the Stamps School of Art & Design, taught by Stephanie O’Malley, that teaches students 3D modeling and animation.  As one of only a few 3D digital classes offered at the University of Michigan, AD240 sees student interest from several schools across campus, with students looking to gain a better understanding of 3D art as it pertains to the video game industry.

The students in AD240 are given a crash course in 3D modeling in 3D Studio Max and level creation within the Unreal Editor. It is then within Unreal that all of their objects are positioned, terrain is sculpted, and atmospheric effects such as time of day, weather, or fog are added.

“Candyland” – Elise Haadsma & Heidi Liu, developed using 3D Studio Max and Unreal Engine

With just five weeks to model an entire environment, bring it into Unreal, package it as an executable, and test it in the MIDEN (or on the Oculus Rift), students produced truly impressive projects. Art & Design students Elise Haadsma and Heidi Liu took inspiration from the classic board game “Candyland” to create a life-size game-board environment in Unreal, consisting of a lollipop forest, mountains of Hershey’s Kisses, even a gingerbread house and a chocolate river.

Lindsay Balaka, from the School of Music, Theatre & Dance, chose to create her scene using the Duderstadt Center’s in-house rendering software “Jugular” instead of Unreal Engine. Her creation, “Galaxy Cakes”, is a highly stylized cupcake shop (reminiscent of an episode of the 1960s cartoon The Jetsons), complete with spatial audio emanating from the corner jukebox.

Lindsay Balaka’s “Galaxy Cakes” environment
An abandoned school, created by Vicki Liu in 3D Studio Max and Unreal Engine

Vicki Liu, also of Art & Design, created a realistic horror scene using Unreal. After navigating down a poorly lit hallway of an abandoned nursery school, you find yourself in a run-down classroom inhabited by some kind of madman. A tally of days passed has been scratched into the walls, an eerie message has been scrawled onto the chalkboard, and furniture haphazardly barricades the windows.

While the goal of the final project was to create a traversable environment for virtual reality, some students took it a step further.

Art & Design student Gus Schissler created an environment composed of neurons in Unreal, intended for viewing within the Oculus Rift. He then integrated data from an Epoch neuroheadset (a device capable of reading brain waves) to allow the viewer to interact with the environment by thought. The viewer’s mood, as picked up by the Epoch, not only changed the way the environment looked by adjusting the intensity of the light emitted by the neurons, but also allowed the viewer to think specific commands (push, pull, etc.) in order to navigate past various obstacles in the environment.

Students spent the last two weeks of the semester scheduling time with Ted Hall and Sean Petty to test their scenes and ensure everything ran and looked correct on the day of their presentations. This was a class that not only introduced students to the design process, but also gave them hands-on experience with emerging technologies as virtual reality continues to expand in the game and film industries.

Student Gus Schissler demonstrates his neuron environment for the Oculus Rift, which uses inputs from an Epoch neuroheadset for interaction.