Scientific Visualization of Pain

XR at the Headache & Orofacial Pain Effort (HOPE) Lab

Dr. Alexandre DaSilva is an Associate Professor in the School of Dentistry, an Adjunct Associate Professor of Psychology in the College of Literature, Science & Arts, and a neuroscientist in the Molecular and Behavioral Neuroscience Institute.  Dr. DaSilva and his associates study pain – not only its cause, but also its diagnosis and treatment – in his Headache & Orofacial Pain Effort (HOPE) Lab, located in the 300 N. Ingalls Building.

Dr. Alex DaSilva slices through a PET scan of a “migraine brain” in the MIDEN, to find areas of heightened μ-opioid activity.

Virtual and augmented reality have been important tools in this endeavor, and Dr. DaSilva has brought several projects to the Digital Media Commons (DMC) in the Duderstadt Center over the years.

In one line of research, Dr. DaSilva has obtained positron emission tomography (PET) scans of patients in the throes of migraine headaches.  The raw data obtained from these scans are three-dimensional arrays of numbers that encode the activation levels of dopamine or μ-opioid in small “finite element” volumes of the brain.  As such, they’re incomprehensible.  But, we bring the data to life through DMC-developed software that maps the numbers into a blue-to-red color gradient and renders the elements in stereoscopic 3D virtual reality (VR) – in the Michigan Immersive Digital Experience Nexus (MIDEN), or in head-mounted displays such as the Oculus Rift.  In VR, the user can effortlessly slide section planes through the volumes of data, at any angle or offset, to hunt for the red areas where the dopamine or μ-opioid signals are strongest.  Understanding how migraine headaches affect the brain may help in devising more focused and effective treatments.
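The blue-to-red mapping described above can be sketched in a few lines. The following Python function is an illustrative stand-in, not the DMC's actual software: it normalizes a 3D array of activation values and assigns each voxel a color on a blue-to-red ramp (blue for the weakest signals, red for the strongest):

```python
import numpy as np

def scalar_to_blue_red(volume, vmin=None, vmax=None):
    """Map a 3D array of activation values to RGB colors on a
    blue-to-red gradient (blue = low, red = high)."""
    v = volume.astype(float)
    vmin = v.min() if vmin is None else vmin
    vmax = v.max() if vmax is None else vmax
    scale = (vmax - vmin) or 1.0            # avoid divide-by-zero on flat data
    t = np.clip((v - vmin) / scale, 0.0, 1.0)  # normalize to [0, 1]
    rgb = np.empty(v.shape + (3,))
    rgb[..., 0] = t          # red channel grows with activation
    rgb[..., 1] = 0.0        # no green: a pure blue-to-red ramp
    rgb[..., 2] = 1.0 - t    # blue channel fades as activation rises
    return rgb
```

A VR renderer would then texture each "finite element" voxel (or a slice through the volume) with these colors, so red regions pop out where dopamine or μ-opioid signals peak.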

Dr. Alex DaSilva’s associate, Hassan Jassar, demonstrates the real-time fNIRS-to-AR brain activation visualization, as seen through a HoloLens, as well as the tablet-based app for painting pain sensations on an image of a head. [Photo credit: Hour Detroit magazine, March 28, 2017.]

In another line of research, Dr. DaSilva employs functional near-infrared spectroscopy (fNIRS) to directly observe brain activity associated with pain in “real time”, as the patient experiences it.  As Wikipedia describes it: “Using fNIRS, brain activity is measured by using near-infrared light to estimate cortical hemodynamic activity which occurs in response to neural activity.”  The study participant wears an elastic skullcap fitted with dozens of fNIRS sensors wired to a control box, which digitizes the signal inputs and sends the numeric data to a personal computer running a MATLAB script.  From there, a two-part software development by the DMC enables neuroscientists to visualize the data in augmented reality (AR).  The first part is a MATLAB function that opens a Wi-Fi connection to a Microsoft HoloLens and streams the numeric data out to it.  The second part is a HoloLens app that receives that data stream and renders it as blobs of light that change hue and size to represent the ± polarity and intensity of each signal.  The translucent nature of HoloLens AR rendering allows the neuroscientist to overlay this real-time data visualization on the actual patient.  Being able to directly observe neural activity associated with pain may enable a more objective scale than asking a patient to verbally rate their pain, for example “on a scale of 1 to 5”.  Moreover, it may be especially helpful for diagnosing or empathizing with patients who are unable to express their sensations verbally at all, whether due to simple language barriers or due to other complicating factors such as autism, dementia, or stroke.
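The two-part pipeline above can be sketched in miniature. The Python below is a hedged illustration, not the DMC's MATLAB function or HoloLens app: one function streams a frame of channel values over a socket (standing in for the MATLAB-to-HoloLens Wi-Fi link), and another maps a single signal to a (hue, size) pair the way the blob rendering encodes ± polarity and intensity:

```python
import socket
import struct

def stream_samples(samples, host, port):
    """Send one frame of fNIRS channel values as little-endian floats over TCP,
    prefixed by a channel count.  (Stand-in for the MATLAB streaming link.)"""
    payload = struct.pack(f"<{len(samples)}f", *samples)
    with socket.create_connection((host, port)) as sock:
        sock.sendall(struct.pack("<I", len(samples)) + payload)

def signal_to_blob(value, max_abs=1.0):
    """Map one signal to a (hue, size) pair: hue encodes +/- polarity,
    size encodes intensity, echoing the HoloLens blob rendering."""
    intensity = min(abs(value) / max_abs, 1.0)
    hue = 0.0 if value >= 0 else 240.0  # red hue for positive, blue for negative
    size = 0.2 + 0.8 * intensity        # blob radius scales with signal strength
    return hue, size
```

On the receiving side, a HoloLens app (typically Unity/C#) would read the same framed floats and update one blob per channel every frame.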

Yet another DMC software development, the “PainTrek” mobile application, also initiated by Dr. DaSilva, allows patients to “paint their pain” on an image of a manikin head that can be rotated freely on the screen – a more convenient and intuitive reporting mechanism than filling out a conventional questionnaire.

PainTrek app allows users to “paint” regions of the body experiencing pain to indicate and track pain intensity.

Revolutionizing 3D Rotational Angiograms with Microsoft HoloLens

Angiography with HoloLens augmented reality

Stephanie O’Malley

Just prior to the release of the Microsoft HoloLens 2, the Visualization Studio was approached by Dr. Arash Salavitabar of the U-M C.S. Mott Children’s Hospital with an innovative idea: to use XR to improve evaluation of patient scans stemming from 3D rotational angiography.

Rotational angiography is an X-ray-based medical imaging technique that allows clinicians to acquire CT-like 3D volumes during hybrid surgery or during a catheter intervention.  The technique is performed by injecting contrast into the pulmonary artery and then rapidly rotating a cardiac C-arm.  Clinicians can then view the resulting data on a computer monitor, manipulating images of the patient’s vasculature.  This is used to evaluate how a procedure should move forward and to aid in communicating the plan to the patient’s family.

With augmented reality devices like the HoloLens 2, new possibilities for displaying and manipulating patient data have emerged, along with the potential for collaborative interactions with patient data among clinicians.

What if, instead of viewing a patient’s vasculature as a series of 2D images displayed on a computer monitor, you and your fellow doctors could view it more like a tangible 3D object placed on the table in front of you? What if you could share in the interaction with this 3D model — rotating and scaling the model, viewing cross sections, or taking measurements, to plan a procedure and explain it to the patient’s family?

This has now been made possible with a Faith’s Angels grant awarded to Dr. Salavitabar, intended to explore innovative ways of addressing congenital heart disease. The funding for this grant was generously provided by a family impacted by congenital heart disease, who unfortunately had lost a child to the disease at a very young age.

The Visualization Studio consulted with Dr. Salavitabar on essential features and priorities to realize his vision, using the latest version of the Visualization Studio’s Jugular software.

This video was spliced from two separate streams recorded concurrently from two collaborating HoloLens users. Each user has a view of the other, as well as their own individual perspectives of the shared holographic model.


The angiography system in the Mott clinic produces digital surface models of the vasculature in STL format.

That format is typically used for 3D printing, but queuing and printing a physical 3D model often takes hours or even days, and the model ultimately becomes physical waste that must be properly disposed of after its brief use.

Jugular offers the alternative of viewing a virtual 3D model in devices such as the Microsoft HoloLens, loaded from the same STL format, with a lead time under an hour.  That lead time is determined mostly by how long the angiography software takes to produce the STL file; once the file is ready, it takes only minutes to upload and view on a HoloLens.  Jugular’s network module allows several HoloLens users to share a virtual scene over Wi-Fi.  The HoloLens provides a “spatial anchor” capability that ties hologram locations to a physical space, so users can collaboratively view, walk around, and manipulate shared holograms relative to their shared physical surroundings.  The holograms can be moved, scaled, sliced, and marked using hand gestures and voice commands.
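To make the STL hand-off concrete, here is a minimal Python reader for the binary STL layout (80-byte header, a triangle count, then 50 bytes per triangle). It is a sketch of the kind of loader a viewer would need, not Jugular's actual implementation:

```python
import struct

def read_binary_stl(path):
    """Read a binary STL file into a list of triangles, each a tuple of
    three (x, y, z) vertex tuples.  Normals and attribute bytes are skipped."""
    with open(path, "rb") as f:
        f.read(80)                                 # 80-byte header (ignored)
        (count,) = struct.unpack("<I", f.read(4))  # number of triangles
        triangles = []
        for _ in range(count):
            # Each record: normal (3 floats), 3 vertices (9 floats), attrs (uint16)
            data = struct.unpack("<12fH", f.read(50))
            v = data[3:12]                         # drop the normal
            triangles.append((v[0:3], v[3:6], v[6:9]))
        return triangles
```

A renderer would upload these triangles to the GPU as a mesh; the same data can also drive slicing and measurement tools.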

This innovation is not confined to medical purposes.  Jugular is a general-purpose extended-reality program with applications in a broad range of fields.  The developers analyze specific project requirements in terms of general XR capabilities, and project-specific requirements are usually met through easily editable configuration files rather than “hard coding.”