S.C.I Hard Available in App Store

Those with spinal cord injuries (SCI) encounter a drastically different world when they are released from the hospital. With varying degrees of disability, mobility, and function, the world around them becomes a collection of physical and mental challenges, a complete departure from their previous lifestyles. Whether they are on crutches or in manual or power wheelchairs, they must learn mobility, scheduling, and social tasks all over again.

Players in S.C.I Hard must navigate a chaotic club scene to wrangle escaped tarsier monkeys

S.C.I Hard is a mobile game developed by the Duderstadt Center and designed by Dr. Michelle Meade for the Center for Technology & Independence (TIKTOC RERC) with funding from a NIDRR Field Initiated Development Grant.

Its purpose is to assist persons with spinal cord injury in developing and applying the skills necessary to keep their bodies healthy while managing the many aspects of SCI care, serving as a fun and engaging manual for individuals with spinal cord injuries learning independence. Tasks such as scheduling, mobility, and social interaction are all integrated subtly into the game. Players engage in goofy quests, from befriending roid-raging Girl Scouts in the park to collecting tarsier monkeys running rampant at a night club. The goal of S.C.I Hard was to be different from most medically oriented games, so players don’t feel like they’re being lectured or bombarded with boring medical jargon, and instead learn the important concepts of their condition in a more light-hearted and engaging way.

Players shop for a handicap-accessible vehicle to take their road test as they learn independence

With more than 30 different scenarios and mini-games, a full cast of odd characters to talk with, and dozens of collectible items and weapons, only you can save the town from impending doom. S.C.I Hard puts you, the player, in the chair of someone with a spinal cord injury, introducing you to new challenges and obstacles, all while you try to save the world from legions of mutated animals. Join the fight and kick a** while sitting down!

S.C.I Hard is now available for free on Apple and Android devices through their respective app stores, but playing requires participation in the subsequent study or feedback group:

Apple Devices: https://itunes.apple.com/us/app/sci-hard/id1050205395?mt=8

Android Devices: https://play.google.com/store/apps/details?id=edu.umich.mobile.SciHard&hl=en

To learn more about or to participate in the study involving S.C.I Hard, visit:
http://cthi.medicine.umich.edu/projects/tiktoc-rerc/projects/r2

Michigan Alumnus: Libraries with No Limits

The Duderstadt Center’s MIDEN is featured on the cover of the Michigan Alumnus with the caption “Libraries of the Future”. This tribute to Michigan’s high-tech libraries is continued on page 36 with an article that explores the new additions to our libraries that enhance student and instructor experiences. The article introduces new visualization stations in the Duderstadt Center (dubbed “VizHubs”) that are similar to the type of collaborative work spaces found at Google and Apple.

Read the full article here.

Robert Alexander’s “Audification Explained” Featured on BBC World Service

Sonification is the conversion of data sets into sound; audification, its most direct form, plays the data samples back as an audio waveform. Robert Alexander II is a sonification specialist working with NASA who uses satellite recordings of the sun’s emissions to discover new solar phenomena. The Duderstadt Center worked with Robert to produce a short video explaining the concept of data audification.
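
To make the idea concrete, here is a minimal sketch of audification, assuming the measurements arrive as a simple 1-D series; the function and file names are illustrative and are not from the NASA pipeline:

```python
import wave
import numpy as np

def audify(data, out_path="audified.wav", rate=44100):
    """Audification: play a 1-D data series back directly as sound by
    treating each measurement as one audio sample."""
    samples = np.asarray(data, dtype=np.float64)
    samples -= samples.mean()                    # remove any DC offset
    peak = np.max(np.abs(samples))
    if peak > 0:
        samples /= peak                          # normalize to [-1, 1]
    pcm = (samples * 32767).astype(np.int16)     # 16-bit PCM
    with wave.open(out_path, "wb") as wav:
        wav.setnchannels(1)                      # mono
        wav.setsampwidth(2)                      # 2 bytes = 16 bits
        wav.setframerate(rate)
        wav.writeframes(pcm.tobytes())
```

Played back at 44.1 kHz, hours of satellite measurements compress into a few seconds of sound, which is what lets periodic solar behavior emerge to the ear.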

Recently Robert was featured in a BBC World Service clip along with his video about making music from the sun: http://www.bbc.co.uk/programmes/p03crzsv

Lia Min: RAW, April 7 – 8

Research fellow Lia Min will be exhibiting “RAW” in the 3D Lab’s MIDEN April 7 and 8 from 4–6 pm. All are welcome to attend. Lia Min’s exhibit is an intersection of art and science, assembled through her training as a neuroscientist. Her data set, commonly referred to as a “Brainbow,” focuses on a lobe of a fruit fly brain at the base of an antenna. The visualization scales microns to centimeters, a 10,000× enlargement, giving the specimen an overall visual volume of about 1.8 × 1.8 × 0.4 meters.

Mammoth Calf Lyuba On Display

Mammoth Calf Lyuba, a Collaborative Exploration of Data

On Nov. 17–19, the Duderstadt Center’s visualization expert, Ted Hall, will be in Austin, Texas representing the Duderstadt Center at SC15, a supercomputing conference. The technology on display will allow people in Austin to be projected into the MIDEN, the University of Michigan’s immersive virtual reality cave, so that visitors in both Ann Arbor and Austin can explore the body of a mummified mammoth.

The mummified remains of Lyuba.

The mammoth in question is a calf called Lyuba, found in Siberia in 2007 after being preserved underground for 50,000 years. This specimen is considered the best preserved mammoth mummy in the world, and is currently on display in the Shemanovskiy Museum and Exhibition Center in Salekhard, Russia.

University of Michigan Professor Daniel Fisher and his colleagues at the University of Michigan Museum of Paleontology arranged to have the mummy scanned using X-Ray computed tomography in Ford Motor Company’s Nondestructive Evaluation Laboratory. Adam Rountrey then applied a color map to the density data to reveal the internal anatomical structures.

Lyuba with her skeleton visible.

The Duderstadt Center received this data as an image stack for interactive volumetric visualization. The stack comprises 1,132 JPEG image slices at 762×700 pixels per slice, with each resulting voxel 1 mm on a side.
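
For illustration, a stack like this can be assembled into a single 3-D volume in a few lines of Python; the directory name and file pattern below are hypothetical, not the Duderstadt Center’s actual pipeline:

```python
from pathlib import Path

import numpy as np
from PIL import Image

# Stack 1,132 JPEG slices (762x700 pixels each) into one volume array.
# "lyuba_slices" and the *.jpg pattern are placeholder names.
slice_paths = sorted(Path("lyuba_slices").glob("*.jpg"))
volume = np.stack([np.asarray(Image.open(p).convert("L")) for p in slice_paths])

print(volume.shape)  # (1132, 700, 762) -- one element per 1 mm voxel
```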

When this data is brought into the Duderstadt Center’s Jugular software, the user can interactively slice through the mammoth’s volume by manipulating a series of hexagonal planes, revealing the internal structure. In the MIDEN, the user can explore the mammoth the same way while it appears to exist in front of them in three virtual dimensions. The MIDEN’s Virtual Cadaver used a similar process.
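
Conceptually, each slicing plane resamples the voxel grid along an arbitrary cross-section. A minimal sketch of that operation using scipy, not the Jugular implementation itself:

```python
import numpy as np
from scipy.ndimage import map_coordinates

def plane_slice(volume, origin, u, v, size=512):
    """Resample `volume` on the plane through `origin` spanned by the
    orthogonal unit vectors u and v (all in voxel coordinates)."""
    s = np.linspace(-size / 2, size / 2, size)
    gu, gv = np.meshgrid(s, s)
    # Every output pixel maps to a 3-D point on the plane.
    coords = (origin[:, None, None]
              + gu * u[:, None, None]
              + gv * v[:, None, None])
    return map_coordinates(volume, coords, order=1)  # trilinear sampling

# e.g., a 45-degree cut through the middle of the stack:
# img = plane_slice(volume, np.array([566., 350., 381.]),
#                   np.array([0., 0.7071, 0.7071]),
#                   np.array([0., -0.7071, 0.7071]))
```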

For the demo at SC15, a user in Texas can occupy the same virtual space as a user in Ann Arbor’s MIDEN. Via a Kinect sensor in Austin, a 3D mesh of that user will be projected into the MIDEN alongside Lyuba, allowing simultaneous interaction with and exploration of the data.

Showings will take place in the MIDEN

Sean Petty and Ted Hall simultaneously explore the Lyuba data set, with Ted’s form being projected into the virtual space of the MIDEN via Kinect sensor.

More about the Lyuba specimen:
Fisher, Daniel C.; Shirley, Ethan A.; Whalen, Christopher D.; Calamari, Zachary T.; Rountrey, Adam N.; Tikhonov, Alexei N.; Buigues, Bernard; Lacombat, Frédéric; Grigoriev, Semyon; Lazarev, Piotr A. (July 2014). “X-ray Computed Tomography of Two Mammoth Calf Mummies.” Journal of Paleontology 88(4): 664–675. DOI: http://dx.doi.org/10.1666/13-092
https://en.wikipedia.org/wiki/Lyuba
http://www.dallasnews.com/lifestyles/travel/headlines/20100418-42-000-year-old-baby-mammoth-4566.ece

Surgical Planning for Dentistry: Digital Manipulation of the Jaw

CT data was brought into ZBrush and Topogun to be segmented and re-topologized. Influence was then added to the skin mesh, allowing it to deform as the bones were manipulated.

Hera Kim-Berman is a Clinical Assistant Professor with the University of Michigan School of Dentistry. She recently approached the Duderstadt Center with an idea that would allow surgeons to prototype jaw surgery using patient-specific data extracted from CT scans. Hera’s concept involved the ability to digitally manipulate portions of the skull in virtual reality, just as surgeons would when physically working with a patient, allowing them to preview different scenarios and evaluate how effective a procedure might be prior to engaging in surgery.

Before re-positioning the jaw segments, the jaw has a shallow profile.

After providing the Duderstadt Center with CT scan data, Shawn O’Grady was able to extract 3D meshes of the patient’s skull and skin using Magics. From there, Stephanie O’Malley worked with the models to make them interactive and suitable for real-time platforms. This involved bringing the skull into software such as ZBrush and creating slices in the mesh corresponding to areas identified by Hera as places where the skull would potentially be segmented during surgery. The mesh was also optimized to perform at a higher frame rate when incorporated into real-time platforms, and the skin mesh was “re-topologized,” a process that allows it to deform more smoothly. From there, the segmented pieces of the skull were re-assembled and assigned influence over areas of the skin in a process called “rigging.” This allowed areas of the skin to move with selected bones as they were separated and shifted by a surgeon in 3D space.
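
Rigging of this kind is commonly implemented as linear blend skinning: each skin vertex moves with a weighted blend of the bones that influence it. A minimal sketch of the idea, not the production rig used here:

```python
import numpy as np

def skin_vertices(rest_verts, bone_transforms, weights):
    """Linear blend skinning.

    rest_verts:      (V, 3) skin vertex positions in the rest pose
    bone_transforms: (B, 4, 4) current transform of each bone segment
    weights:         (V, B) per-vertex influence weights, rows sum to 1
    Returns (V, 3) deformed vertex positions.
    """
    homo = np.hstack([rest_verts, np.ones((len(rest_verts), 1))])  # (V, 4)
    # Position of every vertex under every bone's transform: (B, V, 4)
    per_bone = np.einsum("bij,vj->bvi", bone_transforms, homo)
    # Blend the candidate positions by influence weight: (V, 4)
    blended = np.einsum("vb,bvi->vi", weights, per_bone)
    return blended[:, :3]
```

When a surgeon drags one bone segment, only that bone’s transform changes, and the weighted blend is what makes the overlying skin follow smoothly.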

After re-positioning of the jaw segments, the jaw is more pronounced.

Once a working model was achieved, it was passed off to Ted Hall and student programmer Zachary Kiekover to be implemented in the Duderstadt Center’s Jugular engine, allowing the demo to run at large scale and in stereoscopic 3D not only within the virtual reality MIDEN but also on smaller head-mounted displays like the Oculus Rift. Additionally, more intuitive user controls were added that allowed easier selection of the various bones using a game controller or motion-tracked hand gestures via the Leap Motion. This meant surgeons could not only view the procedure from all angles in stereoscopic 3D, but also physically grab the bones they wanted to manipulate and transpose them in 3D space.
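
The grab interaction itself reduces to a small state machine: when the tracked hand closes near a bone, the bone attaches to the hand and follows it until release. A hypothetical sketch of that logic; none of these names come from Jugular or the Leap Motion SDK:

```python
import numpy as np

class GrabTool:
    """Attach the nearest bone to the hand while a pinch is held."""

    def __init__(self, bones, grab_radius=0.05):
        self.bones = bones            # dict: name -> 3-D position (np.array)
        self.grab_radius = grab_radius
        self.held = None              # name of the bone being held
        self.offset = np.zeros(3)     # bone position relative to the hand

    def update(self, hand_pos, pinching):
        if pinching and self.held is None and self.bones:
            # Grab the nearest bone, if one is within reach.
            name, pos = min(self.bones.items(),
                            key=lambda kv: np.linalg.norm(kv[1] - hand_pos))
            if np.linalg.norm(pos - hand_pos) < self.grab_radius:
                self.held, self.offset = name, pos - hand_pos
        elif pinching and self.held is not None:
            # The held bone follows the hand, preserving the grab offset.
            self.bones[self.held] = hand_pos + self.offset
        else:
            self.held = None          # pinch released: drop the bone
```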

Zachary demonstrates the ability to manipulate the model using the Leap Motion.

Tour the Michigan Ion Beam Laboratory in 3D

3D Model of the Michigan Ion Beam Laboratory

The Michigan Ion Beam Laboratory (MIBL) was established in 1986 as part of the Department of Nuclear Engineering and Radiological Sciences in the College of Engineering. Located on the University of Michigan’s North Campus, the MIBL provides unique and extensive facilities to support research and development. Recently, Professor Gary Was, Director of the MIBL, reached out to the Duderstadt Center for assistance with developing content for the MIBL website to better introduce users to the capabilities of the lab as construction of a new particle accelerator neared completion.

Gary’s group provided the Duderstadt Center with a scale model of the Ion Beam Laboratory generated in Inventor and a detailed synopsis of the various components and executable experiments. From there, Stephanie O’Malley of the Duderstadt Center optimized and beautified the provided model, adding corresponding materials, labels, and lighting. A series of fly-throughs, zoom-ins, and experiment animations was generated from this model to introduce visitors to the various capabilities of the lab.

These interactive animations were then integrated into the MIBL’s WordPress platform by student programmer Yun-Tzu Chang. Visitors to the MIBL website are now able to compare the simplified digital replica of the space with actual photos of the equipment, as well as run various experiments to better understand how each component functions. To learn more about the Michigan Ion Beam Laboratory and to explore the space yourself, visit their website at mibl.engin.umich.edu.

Virtual Cadaver Featured in Proto Magazine

Proto Magazine features articles on biomedicine and health care, targeting physicians, researchers and policy makers.

Proto is a natural science magazine produced by Massachusetts General Hospital in collaboration with Time Inc. Content Solutions. Launched in 2005, the magazine covers topics in biomedicine and health care, targeting physicians, researchers, and policy makers. In June, Proto featured an article, “Mortal Remains,” that discusses alternatives to using real cadavers in the study of medicine.

Preserving human remains for use as a cadaver over a school semester carries tremendous costs. The article in Proto magazine discusses options for revolutionizing this area of study, from older techniques like 17th-century anatomically correct wax models and plastination (removing the body’s fluids and replacing them with a polymer) to new technology utilizing the Visible Human data, with a specific mention of the Duderstadt Center’s Virtual Cadaver.

To learn more, the full article from Proto Magazine can be found here.

Sean Petty manipulates cross-sections of the Virtual Cadaver from within the 3D Lab’s virtual reality environment, the MIDEN.

Exploring Human Anatomy with the Anatomage Table

The Anatomage table is a technologically advanced anatomy visualization system that allows users to explore the complex anatomy of the human body in digital form, eliminating the need for a human cadaver. The table presents a human figure at 1:1 scale and utilizes data from the Visible Human effort, with the additional capability of loading real patient data (CT, MRI, etc.), making it a great resource for research, collaborative discovery, and the study of surgical procedures. Funding to obtain the table was a collaborative effort between the schools of Dentistry, Movement Science, and Nursing, although utilization is expected to expand to include Biology. Currently on display in the Duderstadt Center for exploration, the Anatomage table will relocate to its more permanent home inside the Taubman Health Library in early July.

The Anatomage table allows users to explore the complex anatomy of the human body.

Museum of Natural History – Planetarium Shows

Shredding Stars: Stars are consumed by a black hole
The Museum of Natural History will soon host several animations produced by the Duderstadt Center covering an array of space-related subjects. From explanations of the behavior of black holes to demonstrations of the life cycle of stars, Stephanie O’Malley, digital artist at the Duderstadt Center, created the animations in collaboration with Matthew Linke, the planetarium director; Lydia Bieri, professor in mathematics; and Kayhan Gultekin, an assistant researcher in astronomy.
Kicked-out black holes: The gravitational pull between black holes can cause them to merge. The spin of the merger can then kick the merged black hole out of its orbit.
The Museum of Natural History houses a vast collection of natural history objects, ranging from local bird species to larger mammals to the skeletons of mammoths. The museum is located on campus and provides educational opportunities and exhibits open to both the campus and the wider community. The planetarium is located on the top floor of the museum. Since 1958, the planetarium has put on informative shows about astronomy for visitors. A full-dome screen immerses guests in the night sky, and throughout the year staff put on seasonal star talks using the dome to visualize what the sky looks like at that time of year.
 
The collaboration between visualization artists and scientists produced well-researched visualizations on an array of astronomy topics. These animations are unique in that much of what is visualized stems from raw data; nobody has ever photographed these events actually occurring in space, and in some cases they remain largely hypothetical. The animations are scheduled to be projected on the museum’s full-dome screen and used as a tool in classes to better acquaint students with concepts discussed in class. They are also being featured for a short time in a separate exhibit outside the planetarium space.
 
Those familiar with Saturday Morning Physics lessons may recognize some of the animations, as they were shown recently during Lydia Bieri’s spot discussing gravitational lensing and gravitational waves (click here for the link to the video).
Gravitational Lensing: A gravitational lens is a distribution of matter (such as a cluster of galaxies) between a distant source and an observer that bends the light from the source as it travels toward the observer.

The animations were each funded by National Science Foundation grants. They were created in After Effects and 3D Studio Max using a special plugin (the Domemaster 3D camera shader) for full-dome planetarium warping; this is what gives single frames of an animation the correct distortion to be projected onto the planetarium’s curved ceiling. Frames were then rendered at resolutions from 1200 pixels up to 4K to accommodate even very large planetariums looking to feature these animations.
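
The warping such a shader performs is essentially an equidistant fisheye mapping: in a square domemaster frame, a pixel’s distance from the image center maps linearly to its angle from the dome’s apex. A rough sketch of that mapping, assuming a 180° dome; this is not the plugin’s code:

```python
import numpy as np

def domemaster_direction(x, y, size, fov=np.pi):
    """Map pixel (x, y) of a size-by-size domemaster frame to the 3-D
    view direction it shows on the dome."""
    u = 2.0 * x / size - 1.0          # normalize to [-1, 1]
    v = 2.0 * y / size - 1.0
    r = np.hypot(u, v)
    if r > 1.0:
        return None                   # outside the dome's circular image
    theta = r * fov / 2.0             # angle from the dome apex (zenith)
    phi = np.arctan2(v, u)            # azimuth around the dome
    return np.array([np.sin(theta) * np.cos(phi),
                     np.sin(theta) * np.sin(phi),
                     np.cos(theta)])  # +z points at the apex
```

Rendering a frame amounts to evaluating this direction for every pixel and sampling the scene along it, which is what gives the flat frame the correct distortion for the curved ceiling.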