Virtual Reality 3-D Brain Helps Scientists Understand Migraine Pain

Dr. Alex DaSilva (Photo Credit: Scott Soderberg, Michigan Photography)

From U-M News:

ANN ARBOR—Wielding a joystick and wearing special glasses, pain researcher Alexandre DaSilva rotates and slices apart a large, colorful, 3-D brain floating in space before him.

Despite the white lab coat, it appears DaSilva’s playing the world’s most advanced virtual video game.  The University of Michigan dentistry professor is actually hoping to better understand how our brains make their own pain-killing chemicals during a migraine attack.

The 3-D brain is a novel way to examine data from images taken during a patient’s actual migraine attack, says DaSilva, who heads the Headache and Orofacial Pain Effort at the U-M School of Dentistry and the Molecular and Behavioral Neuroscience Institute.

Different colors in the 3-D brain, drawn from a PET scan (positron emission tomography, a type of medical imaging), give clues about chemical processes happening during a patient’s migraine attack.

“This high level of immersion (in 3-D) effectively places our investigators inside the actual patient’s brain image,” DaSilva said.

Through some innovative work done by Dr. Alexandre DaSilva and his team in the School of Dentistry, the Duderstadt Center was presented with some exciting new data that shows activation in the brain *during* a migraine attack; most existing data is captured before or after an attack. Sean Petty and Ted Hall worked closely with Dr. DaSilva to interpret the data and to add new tools to Jugular, our in-house 3D engine, for exploring volumetric data such as fMRI and CT scans. Dr. DaSilva can now explore the brain data by walking around it and interactively cutting through it.
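Jugular’s internals aren’t shown here, so the snippet below is only an illustrative sketch of the basic operation behind “cutting through” a volume: pulling an axis-aligned cross-section out of a volumetric dataset with NumPy. The array shape, values, and function name are all assumptions for the example.

```python
# Minimal sketch (not Jugular code): extract an axis-aligned cross-section
# from a volumetric dataset such as a PET/fMRI volume loaded as a NumPy array.
import numpy as np

def cross_section(volume: np.ndarray, axis: int, index: int) -> np.ndarray:
    """Return a 2-D slice of a 3-D volume.

    volume -- 3-D array of scalar intensities (e.g. tracer binding values)
    axis   -- 0, 1, or 2: which axis to cut across
    index  -- position of the cutting plane along that axis
    """
    return np.take(volume, index, axis=axis)

# Example: a placeholder 128^3 volume, cut halfway along the third axis.
volume = np.random.rand(128, 128, 128)
slice_image = cross_section(volume, axis=2, index=64)
print(slice_image.shape)  # (128, 128)
```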

Test Driving with FAAC and Graphics Performance Discussion

FAAC Incorporated provides system engineering and software products, including driving simulators for commercial and private training. FAAC reached out to the Duderstadt Center to share information and to compare their system’s performance to the MIDEN’s capabilities. The Duderstadt Center had developed an “urban neighborhood” model as a stress test: how large, in triangles and vertices, can the models become while still maintaining a comfortable interactive frame rate in the MIDEN? The demo showed the MIDEN’s system capabilities and potential. The Duderstadt Center then visited FAAC’s facility and saw the first 6-DOF full-motion system housed in a mobile trailer.

Article: Measurable Domain for Colour Differences within a Virtual Environment

Light & Engineering (vol. 20, no. 3, 2012) | Светотехника (2 • 2012)

Professor Moji Navvab has published another article on his lighting analysis of virtual reality: “Область Поддающихся Измерению Цветовых Различий в Виртуальной Среде” (“Measurable Domain for Colour Differences within a Virtual Environment”), in Светотехника (Light & Engineering).

Duderstadt Center takes 1st and 2nd Place in Mobile Apps Challenge

In December of 2012, the University of Michigan held a mobile app competition to showcase new apps developed within the university and to encourage the developer community to create innovative mobile designs. U-M students, faculty, and staff submitted a variety of apps from many different disciplines and genres. The event was sponsored and judged by individuals from Computer Science and Engineering, Google, Information and Technology Services, and Technology Transfer.

1st Place – PainTrek
Ever have a headache or facial pain that seemingly comes and goes without warning? Ever been diagnosed with migraines, TMD or facial neuralgias but feel that the medication or your ability to explain your pain is limited? PainTrek is a novel app that was developed to make it easier to track, analyze, and talk about pain.

2nd Place – PictureIt: The Epistles of St. Paul
The app gives you a feel for what it was like to read an ancient Greek book on papyrus, where the text is written without word division, punctuation, headings, or chapter and verse numbers. To aid readers without knowledge of ancient Greek, the translation mode gives a literal translation of the Greek text preserved on these pages (with the addition of chapter and verse numbers), with explanatory notes showing where this text differs from the Standard text.

UROP Summer Symposium 2012 (Kinect, Virtual Reality, and iOS)

Rachael Miller and Rob Soltesz presented their summer work on Kinect development, natural user interfaces, and capturing emotive qualities of users at the 2012 UROP Symposium for MSTEM. Rachael won a Blue Ribbon at the event for her poster, and they are both, as far as I know, the first to have successfully used multiple Kinects in an immersive virtual reality space for virtual physical presence.

Rachael focused on creating a natural user interface for immersive 3D environments by combining multiple Kinects for a more robust skeleton. This stable and predictable skeleton allowed her to wrap virtual (invisible) objects around the user’s limbs and torso, effectively allowing people to interact with virtual objects without markers or special tracking devices. Beyond simple interaction with virtual objects, she then developed several gestures to be used for navigation in virtual reality.
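As a hypothetical illustration of the merging idea (not the Kinect SDK or Rachael’s actual implementation), the sketch below combines two reported skeletons by confidence-weighted averaging of their joints; the joint names, data layout, and confidence values are invented for the example.

```python
# Hypothetical sketch of merging skeletons from two Kinect sensors: each sensor
# reports joint positions with confidence values, and the merged skeleton uses
# the confidence-weighted average of each joint, falling back to whichever
# sensor sees it. Data layout here is illustrative, not the actual SDK's.

def merge_skeletons(skel_a, skel_b):
    """skel_a / skel_b: dict mapping joint name -> ((x, y, z), confidence)."""
    merged = {}
    for joint in set(skel_a) | set(skel_b):
        samples = [s[joint] for s in (skel_a, skel_b) if joint in s]
        total = sum(conf for _, conf in samples)
        if total == 0:
            continue  # neither sensor trusts this joint; drop it
        merged[joint] = tuple(
            sum(pos[i] * conf for pos, conf in samples) / total for i in range(3)
        )
    return merged

# Example with made-up joint data (metres, confidence 0..1):
skel_a = {"hand_right": ((0.31, 1.02, 1.95), 0.9), "elbow_right": ((0.25, 1.20, 1.90), 0.4)}
skel_b = {"hand_right": ((0.33, 1.00, 1.97), 0.7)}
print(merge_skeletons(skel_a, skel_b))
```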

Rob worked with Rachael on aspects of her project but also looked into using the Kinect’s multiple microphones and internal voice recognition capabilities to extract emotive qualities from the user inside a virtual reality space.

Andrew Janke also presented, at a second UROP symposium, his work on iOS connectivity for a variety of applications. Getting data off of an iOS device is not always trivial; formatting that data into a PDF and then sending it via email to a specific individual can be a challenge. Andrew developed a process that allows arbitrary iOS applications to send data over simple sockets, after which it can be formatted and sent via email. This functionality was required by a few of our applications in development and proved to be extremely useful.
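The sketch below is an assumed, simplified version of such a workflow, not Andrew’s actual code: a small Python server accepts raw bytes from a device over a plain TCP socket and forwards them as an email attachment. The host, port, addresses, and local mail relay are placeholders, and the PDF formatting step is omitted.

```python
# Rough sketch of a socket-to-email relay (assumed workflow, not the real one).
import socket
import smtplib
from email.message import EmailMessage

HOST, PORT = "0.0.0.0", 9000          # where the iOS app connects (assumed)
RECIPIENT = "clinician@example.edu"   # placeholder address

def receive_payload() -> bytes:
    """Accept one connection and read everything the client sends."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.bind((HOST, PORT))
        server.listen(1)
        conn, _ = server.accept()
        with conn:
            chunks = []
            while data := conn.recv(4096):
                chunks.append(data)
            return b"".join(chunks)

def email_payload(payload: bytes) -> None:
    """Attach the received bytes to an email and hand it to a local mail relay."""
    msg = EmailMessage()
    msg["Subject"] = "Data exported from iOS app"
    msg["From"] = "noreply@example.edu"
    msg["To"] = RECIPIENT
    msg.set_content("Exported data attached.")
    msg.add_attachment(payload, maintype="application", subtype="octet-stream",
                       filename="export.bin")
    with smtplib.SMTP("localhost") as smtp:  # assumes a mail relay on localhost
        smtp.send_message(msg)

if __name__ == "__main__":
    email_payload(receive_payload())
```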

All students did a great job over the summer and we’re excited to be a part of the UROP program at the University of Michigan.

Migraine Brain – Quick Mapping of Brain Data

Through some innovative work done by Dr. Alexandre DaSilva and his team in the School of Dentistry, the Duderstadt Center was presented with some exciting new data related to migraines and their effect on the brain. We had to quickly turn the data into an image suitable for a pending journal submission. While we can’t go into details at this time about the research being done, we created a quick model of the data and brought it into the MIDEN for further exploration. The model was created by taking cross-sections of the MRI dataset and projecting them onto the surface of a brain mesh. The resulting model and textures were exported and then brought into the MIDEN.
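As a rough, assumed sketch of the mapping idea (not the actual pipeline), the snippet below samples a volume at each vertex of a surface mesh so the surface can be colored by the underlying voxel values; the array sizes and the nearest-voxel lookup are simplifications, and real data would also need a registration step to align mesh and volume coordinates.

```python
# Illustrative only: color a brain surface mesh by sampling a volumetric dataset
# at each vertex position (assumed to already be in voxel coordinates).
import numpy as np

def sample_volume_at_vertices(volume: np.ndarray, vertices: np.ndarray) -> np.ndarray:
    """volume: (X, Y, Z) scalar array; vertices: (N, 3) positions in voxel coords."""
    idx = np.clip(np.round(vertices).astype(int), 0,
                  np.array(volume.shape) - 1)        # nearest-voxel lookup
    return volume[idx[:, 0], idx[:, 1], idx[:, 2]]   # one scalar per vertex

volume = np.random.rand(181, 217, 181)                # placeholder MRI-sized volume
vertices = np.random.rand(5000, 3) * [180, 216, 180]  # placeholder mesh vertices
vertex_values = sample_volume_at_vertices(volume, vertices)
print(vertex_values.shape)  # (5000,) -- one value per vertex, usable as a color map
```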

Generative Components and Genetic Algorithms

Genetic algorithms aim to mimic natural selection in the design process. A set of parameters or “genes” characterize a “species” of artifact. Individuals within the species express different values for those genes. A fitness function evaluates each individual’s health. The algorithm works by assigning random gene values for several individuals, evaluating them, discarding the weakest ones, breeding the strongest ones by interchanging genes, and repeating for successive generations. Genetic algorithms sometimes yield surprising designs that a strictly deductive deterministic design process might not discover.

This project uses Bentley Generative Components to script parametric designs for several classes of structures, including folded plates, branching columns, and geodesic domes. Bentley STAAD structural analysis serves as the fitness function.
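As a minimal sketch of the loop described above (not the project’s actual scripts), the snippet below runs the evaluate / discard / breed cycle with a placeholder fitness function standing in for the STAAD analysis; the gene count, population size, and fitness expression are all invented for the example.

```python
# Minimal genetic-algorithm loop: random individuals, evaluate, keep the
# strongest, breed by interchanging genes, repeat for successive generations.
import random

GENES = 8            # parameters defining one individual (e.g. member dimensions)
POP_SIZE = 20
GENERATIONS = 50

def fitness(genes):
    # Placeholder: the real fitness function runs a structural analysis.
    return -sum((g - 0.5) ** 2 for g in genes)

def breed(parent_a, parent_b):
    # Interchange genes: each gene is taken from one parent or the other.
    return [random.choice(pair) for pair in zip(parent_a, parent_b)]

population = [[random.random() for _ in range(GENES)] for _ in range(POP_SIZE)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)      # evaluate each individual
    survivors = population[: POP_SIZE // 2]         # discard the weakest half
    children = [breed(random.choice(survivors), random.choice(survivors))
                for _ in range(POP_SIZE - len(survivors))]
    population = survivors + children               # next generation

print("best fitness:", fitness(max(population, key=fitness)))
```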

Monica Ponce de Leon (Dean of Architecture and Urban and Regional Planning) is the principal investigator. Peter von Bülow (Associate Professor of Architecture) develops the genetic algorithms. Ted Hall worked with recent Architecture graduates Jason Dembski and Kevin Deng to script the structures and visualize them at full scale in 3D in the MIDEN.

Virtual Jet Ski Driving Simulator

The Virtual Jet Ski Driving Simulator allows a user to drive a jet ski (or personal watercraft) through a lake environment presented in the immersive virtual reality MIDEN system. The user sits on a jet ski mockup and controls the ride via handlebar and throttle. While the mockup itself is stationary, the environment changes dynamically in response to handlebar and throttle operation, creating a very convincing feeling of driving a jet ski. The virtual reality system provides head-referenced stereo viewing and a realistic, full-scale representation of the environment.
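Purely as an illustration of that control loop (the original simulator’s code isn’t shown here), the sketch below applies a simple kinematic update in which throttle drives speed and handlebar angle steers the heading while the mockup itself stays put; all constants are made-up tuning values.

```python
# Illustrative sketch: update a virtual watercraft's state from handlebar and
# throttle input each frame, while the physical mockup never moves.
import math

def update(state, handlebar, throttle, dt):
    """state: dict with x, y (m), heading (rad), speed (m/s).
    handlebar: steering angle in radians; throttle: 0..1; dt: timestep in s."""
    accel, drag, turn_rate = 4.0, 0.5, 1.2          # made-up tuning constants
    state["speed"] += (accel * throttle - drag * state["speed"]) * dt
    state["heading"] += turn_rate * handlebar * state["speed"] * dt / 10.0
    state["x"] += state["speed"] * math.cos(state["heading"]) * dt
    state["y"] += state["speed"] * math.sin(state["heading"]) * dt
    return state

state = {"x": 0.0, "y": 0.0, "heading": 0.0, "speed": 0.0}
for _ in range(600):                                 # 10 s of simulation at 60 Hz
    state = update(state, handlebar=0.2, throttle=0.8, dt=1 / 60)
print(round(state["x"], 1), round(state["y"], 1))
```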

The simulator was developed to study human risk factors related to the operation of a personal watercraft (PWC). In recreational boating, PWCs are involved in accidents in disproportional numbers. Using the simulator, accident scenarios can be simulated and the reaction of PWC operators in specific situations can be studied. The simulator provides a cost-effective analysis tool for regulators and equipment designers as well as a training device for PWC operators, enforcers, and educators.

The simulator was developed for the U.S. Coast Guard (USCG) by the University of Michigan Virtual Reality Laboratory and the Research Triangle Institute. It is now in the process of being revived with help from the Undergraduate Research Opportunity Program (UROP).

Aerial City

The way we stage, organize, and construct our environment will determine what roles we play in society. If the arrangement is meant to maximize revenue and returns to bankers, then society becomes an investment game. The focus becomes dollars, or any number of sophisticated, market-oriented strategic economic scenarios that essentially ignore the spiritual fulfillment of the humans living in cities. The core purpose of Aerial City is to live above ground, in flexible, self-sufficient cities enabled by our technology, in order to coexist with other living species on Earth.

Sahba La’al, a local architect and active member of the University of Michigan School of Architecture community, has been working with the University of Michigan Duderstadt Center to visualize various concepts related to the Aerial City project, including the creation of unique designs in various real-world locations.

The project has been featured at several international conferences and exhibitions and is under continued design and refinement.