Robert Alexander is a Design Science Ph.D. graduate and a member of the Solar and Heliospheric Research Group. Working with NASA, Robert uses data audification to teach us something new about the Sun’s solar wind, coupling mixed media with unique interaction methods to pull viewers into the experience. The Duderstadt Center worked with Robert to put his research into video form:
Rachael Miller and Carlos Garcia discuss how their individual experiences with the Digital Media Commons (DMC) shaped their projects and ambitions. Rachael, an undergraduate in computer science, was able to expand her horizons by working in the Duderstadt Center on projects which dealt with virtual reality. She gained vital knowledge about motion capture by working in the MIDEN with the Kinect, and continues to apply her new skills to projects and internships today.
Carlos Garcia worked to combine technology and art in the form of projection mapping for his senior thesis Out of the Box. To approach the project, he began by searching for resources and found the DMC to be the perfect fit. By establishing connections to staff in the 3D Lab, Groundworks, the Video Studio and many others, he was able to complete his project and go on to teach others the process as well. For a more behind-the-scenes look at both Carlos Garcia and Rachael Miller’s projects and process, please watch the video above!
User Story: Robert Alexander and Sonification of Data
Robert Alexander, a graduate student at the University of Michigan, represents what students can do in the Digital Media Commons (DMC), a service of the Library, if they take the time to embrace their ideas and use the resources available to them. In the video above, he talks about the projects, culture, and resources available through the Library. In particular, he mentions time spent pursuing the sonification of data for NASA research, art installations, and musical performances.
The Virtual Cadaver is a visualization of data provided by the Visible Human Project of the National Library of Medicine. This project aimed to create a digital image dataset of complete human male and female cadavers.
The male dataset originates from Joseph Paul Jernigan, a 38-year-old convicted Texas murderer who donated his body for scientific research and was executed by lethal injection in 1993. The female cadaver remains anonymous, and has been described as a 59-year-old Maryland housewife who died of a heart attack. Her specimen exhibits several pathologies, including cardiovascular disease and diverticulitis.
Both cadavers were encased in a gelatin-and-water mixture and frozen to produce the fine slices that comprise the data. The male dataset consists of 1,871 slices produced at 1-millimeter intervals; the female dataset comprises 5,164 slices at 0.33-millimeter intervals.
The Duderstadt Center was directed to the female dataset in December of 2013. To load the data into the virtual-reality MIDEN (a fully immersive, multi-screen, head-tracked CAVE environment) and a variety of other display environments, the images were pre-processed into JPEGs at 1024×608 pixels. Every tenth slice is loaded, so the figure is formed from 517 slices at 3.3 mm spacing. A generic image-stack loader was written so that a 3D volume model can be produced from any stack of images, not just the Visible Human data. It can thus be configured to load a denser sample of slices over a shorter range should a subset of the model need to be viewed in higher detail.
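The slice-sampling logic described above can be sketched in a few lines. This is a hypothetical Python reconstruction for illustration only, not the Duderstadt Center’s actual loader (which lives inside its own rendering engine); the function name and defaults are assumptions:

```python
import numpy as np

def load_volume(slices, stride=10, slice_interval_mm=0.33):
    """Assemble a 3D volume from a stack of 2D slice images.

    slices: sequence of equally sized 2D arrays (one per cross-section).
    stride: keep every `stride`-th slice (10 -> every tenth).
    slice_interval_mm: physical spacing between consecutive source slices.
    Returns the volume array and the effective spacing of the kept slices.
    """
    kept = slices[::stride]
    volume = np.stack(kept, axis=0)          # shape: (n_kept, height, width)
    spacing_mm = stride * slice_interval_mm  # e.g. 10 * 0.33 mm = 3.3 mm
    return volume, spacing_mm

# Example: 5,164 synthetic slices (tiny 8x8 stand-ins for the 1024x608 JPEGs)
stack = [np.zeros((8, 8), dtype=np.uint8) for _ in range(5164)]
vol, spacing = load_volume(stack)
print(vol.shape)  # (517, 8, 8) — every tenth of 5,164 slices
```

Because the loader is generic over any image stack, the same function could be called with `stride=1` over a narrow slice range to view a subset of the model in full detail.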
Users can navigate around their data in passive cinema-style stereoscopic projection. In the case of the Virtual Cadaver, the body appears just as it would to a surgeon, revealing the various bones, organs and tissues. Using a game controller, users can arbitrarily position sectional planes to view a cross-section of the subject. This allows for cuts to be made that would otherwise be very difficult to produce in a traditional anatomy lab. The system can accommodate markerless motion-tracking through devices like the Microsoft Kinect and can also allow for multiple simultaneous users interacting with a shared scene from remote locations across a network.
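Conceptually, an arbitrary sectional plane works by discarding every voxel on one side of the plane. A minimal sketch of that idea, assuming a NumPy volume and a plane given by a point and a normal (the function name and coordinate convention are illustrative assumptions, not the MIDEN’s actual GPU implementation):

```python
import numpy as np

def clip_volume(volume, point, normal):
    """Zero out every voxel on the positive side of a cutting plane.

    volume: 3D array of intensity values (z, y, x order assumed).
    point:  any point on the plane, in voxel coordinates.
    normal: plane normal; voxels where (voxel - point) . normal > 0
            are cut away, exposing the cross-section beneath.
    """
    coords = np.stack(np.indices(volume.shape), axis=-1).astype(float)
    signed = (coords - np.asarray(point, float)) @ np.asarray(normal, float)
    clipped = volume.copy()
    clipped[signed > 0] = 0
    return clipped

# Cut a 4x4x4 block of ones in half along its first axis
block = np.ones((4, 4, 4))
half = clip_volume(block, point=(1.5, 0, 0), normal=(1, 0, 0))
print(half.sum())  # 32.0 — the two far slices were removed
```

Because the plane is just a point and a normal, a game controller can reposition and tilt it freely, which is what makes cuts possible that would be very difficult to produce in a traditional anatomy lab.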
Over the years the Duderstadt Center has provided visualization services for a variety of NASA proposals. Submitting a proposal requires a packet of information and visual aids that follow a strict format and series of guidelines.
Most recently, the Duderstadt Center assisted with the Mars Radar and Radiometry Subsurface Investigation (MARRSI) proposal, submitted in December 2013 and currently awaiting a response. This proposal aims to implement new ways of tracing evidence of water in the Martian soil by utilizing the antenna of the existing Mars rovers. The antenna would detect signals from Earth reflected off the surface of Mars, thereby probing the soil for indications of water. The Duderstadt Center worked with the professor involved, as well as NASA’s Jet Propulsion Laboratory, to design a proposal cover, diagrams, and CDs for submission that adhere to the requested format.
The Duderstadt Center was also involved in the Trace Gas Microwave Radiometer (TGMR) proposal, which centered on detecting the processes that produce and destroy methane gas on the surface of Mars. Together, these two proposals seek evidence of methane and water on Mars, which may lead to discovering signs of bacterial life there.
In the past, the Duderstadt Center designed mission logos and a cover for the Armada proposal. This proposal concerned documenting atmospheric events on Earth using cube satellites.
Technology Interventions for Health, $5M Center Award from Department of Education (UMHS, CoE, SI, Library)
Recently, the University of Michigan received a prestigious 5 million dollar Center Grant, awarded by the National Institute on Disability and Rehabilitation Research (NIDRR), part of the Department of Education.
The funds from this award will primarily be used to pursue several development, research, and training projects and studies involving technology interventions for self-management of health behaviors. The newly formed center, led by Michelle Meade (PI, Rehab Medicine), will be an interdisciplinary endeavor involving clinicians, researchers, and engineers from multiple departments on campus. This will allow UM researchers to continue studying how technology (including applications for smartphones and tablets, and video games) can benefit individuals with spinal cord or neuro-developmental disabilities.
For the past three years, the Duderstadt Center has been developing SCI Hard, a transformative game that builds skills and promotes the abilities of individuals with spinal cord injuries (SCI). Through game-play, SCI Hard teaches players how to manage their health and interact more readily in home, health-care, and community environments. Combining practical teaching methods with the element of play, SCI Hard aims to give autonomy and confidence back to individuals who find their world drastically altered after a spinal cord injury, specifically young men (ages 15-25) with a recent SCI.
Players navigate the game by wheelchair, enabling them to face real-world challenges: juggling doctors’ appointments, attending therapy sessions to build muscle, and learning to drive a wheelchair-accessible vehicle. Even banal tasks such as waiting in line at the DMV are covered in a way that exposes the new obstacles individuals with an SCI may face. SCI Hard tackles this difficult subject matter with optimism and an earnest sense of humor. (The player’s quest is ultimately to stop the evil Dr. Schrync from taking over the world with zombie animals.)
Funds from this grant will be used to study how playing games like SCI Hard can directly benefit the health or alter the behaviors of individuals with an SCI, an effort that has been supported and well received by the accessibility-advocacy, gamification, and health-science communities. Receiving the Center Grant allows the Duderstadt Center to continue developing SCI Hard and other projects, adding Android support, more health and configuration options, voice acting throughout for greater immersion, and leaderboards to help track progress.
To learn more about how the grant will be used and what University of Michigan departments are involved, read The Record’s write up on this great accomplishment. For a sneak-peek at SCI Hard and what it entails, check out the video below.
Sometimes a mess of data is just a mess of data. But sometimes, as Dr. Suresh Bhavnani discovered, it is an opportunity for a new type of visualization. Ted Hall, advanced visualization specialist at the University of Michigan’s Duderstadt Center, set up an immersive stereoscopic projection of Bhavnani’s data in the MIDEN (Michigan Immersive Digital Experience Nexus), a small room that surrounds the user with 3D images. An antenna headset and a game console controller give Bhavnani a position in space relative to his data, from which he can virtually navigate the web of relationships between genes and diseases. This allowed him to see new patterns and identify unexpected regularities in gene function that are very difficult to untangle in 2D.
StateTech Magazine Cites Duderstadt Center for NUI
A journalist for StateTech Magazine interviewed Ted Hall (Advanced Visualization Specialist) and Rachael Miller (undergraduate student in Computer Science) regarding the Duderstadt Center’s work with natural user interfaces (NUIs) and their possible applications for state and local government. An article derived from the interview appears in the Spring 2013 issue (page 16) and is also posted online here.
ANN ARBOR—Wielding a joystick and wearing special glasses, pain researcher Alexandre DaSilva rotates and slices apart a large, colorful, 3-D brain floating in space before him.
Despite the white lab coat, it appears DaSilva’s playing the world’s most advanced virtual video game. The University of Michigan dentistry professor is actually hoping to better understand how our brains make their own pain-killing chemicals during a migraine attack.
The 3-D brain is a novel way to examine data from images taken during a patient’s actual migraine attack, says DaSilva, who heads the Headache and Orofacial Pain Effort at the U-M School of Dentistry and the Molecular and Behavioral Neuroscience Institute.
Different colors in the 3-D brain give clues about chemical processes happening during a patient’s migraine attack using a PET scan, or positron emission tomography, a type of medical imaging.
“This high level of immersion (in 3-D) effectively places our investigators inside the actual patient’s brain image,” DaSilva said.
Through some innovative work done by Dr. Alexandre DaSilva and his team in the School of Dentistry, the Duderstadt Center was presented with exciting new data that shows activation in the brain *during* a migraine attack; most such data is captured before or after an attack. Sean Petty and Ted Hall worked closely with Dr. DaSilva to interpret the data and add new tools to Jugular, our in-house 3D engine, for exploring volumetric data such as fMRI and CT scans. Dr. DaSilva can now explore the brain data by walking around it and interactively cutting through it.
Test Driving with FAAC and Graphics Performance Discussion
FAAC Incorporated provides system-engineering and software products, including driving simulators for commercial and private training. FAAC reached out to the Duderstadt Center to share information and to compare their system’s performance to the MIDEN’s capabilities. The Duderstadt Center had developed an “urban neighborhood” model as a stress test: how many triangles and vertices can the models contain while still maintaining a comfortable interactive frame rate in the MIDEN? The demo showed the MIDEN’s system capabilities and potential. The Duderstadt Center then visited FAAC’s facility and saw the first 6-DOF full-motion system housed in a mobile trailer.