S.C.I Hard Available in App Store

Those with spinal cord injuries (SCI) encounter a drastically different world when they are released from the hospital. With varying degrees of disability, mobility, and function, the world around them becomes a collection of physical and mental challenges, a complete departure from their previous lifestyles. Whether they use crutches or manual or automatic wheelchairs, they need to learn mobility, scheduling, and social tasks all over again.

Players in S.C.I Hard must navigate a chaotic club scene to wrangle escaped tarsier monkeys

S.C.I Hard is a mobile game developed by the Duderstadt Center and designed by Dr. Michelle Meade for the Center for Technology & Independence (TIKTOC RERC) with funding from a NIDRR Field Initiated Development Grant.

Its purpose is to assist persons with spinal cord injury in developing and applying the skills necessary to keep their bodies healthy while managing the many aspects of SCI care, serving as a fun and engaging manual for individuals with spinal cord injuries learning independence. Tasks such as scheduling, mobility, and social interaction are all integrated subtly into the game. Players engage in goofy quests, from befriending roid-raging Girl Scouts in the park to collecting tarsier monkeys running rampant at a night club. The goal of S.C.I Hard was to be different from most medically oriented games: players don’t feel like they’re being lectured or bombarded with boring medical jargon, and instead learn the important concepts of their condition in a more light-hearted and engaging way.

Players shop for a handicap-accessible vehicle to take their road test as they learn independence

With more than 30 different scenarios and mini-games, a full cast of odd characters to talk with, and dozens of collectible items and weapons, only you can save the town from impending doom. S.C.I Hard puts you, the player, in the chair of someone with a spinal cord injury, introducing you to new challenges and obstacles all while you try to save the world from legions of mutated animals. Join the fight and kick a** while sitting down!

S.C.I Hard is now available for free on Apple and Android devices through their respective app stores, but playing requires participation in the subsequent study or feedback group:

Apple Devices: https://itunes.apple.com/us/app/sci-hard/id1050205395?mt=8

Android Devices: https://play.google.com/store/apps/details?id=edu.umich.mobile.SciHard&hl=en

To learn more about the subsequent study or to participate in the study involving S.C.I Hard, visit:
http://cthi.medicine.umich.edu/projects/tiktoc-rerc/projects/r2

Virtual Cadaver Featured in Proto Magazine

Proto Magazine features articles on biomedicine and health care, targeting physicians, researchers and policy makers.

Proto is a natural science magazine produced by Massachusetts General Hospital in collaboration with Time Inc. Content Solutions. Launched in 2005, the magazine covers topics in the field of biomedicine and health care, targeting physicians, researchers, and policy makers. In June, Proto featured an article, “Mortal Remains,” that discusses alternatives to using real cadavers in the study of medicine.

Preserving human remains for use as a cadaver during a school semester carries tremendous costs. The article in Proto magazine discusses options for revolutionizing this area of study, from older techniques like 17th-century anatomically correct wax models and plastination (the process of removing fluids from the body and replacing them with a polymer) to new technology built on the Visible Human data, with a specific mention of the Duderstadt Center’s Virtual Cadaver.

To learn more, the full article from Proto Magazine can be found here.

Sean Petty manipulates cross-sections of the Virtual Cadaver from within the 3D Lab’s virtual reality environment, the MIDEN.

Exploring Human Anatomy with the Anatomage Table

The Anatomage table is a technologically advanced anatomy visualization system that allows users to explore the complex anatomy of the human body in digital form, eliminating the need for a human cadaver. The table presents a human figure at 1:1 scale and utilizes data from the Visible Human effort, with the additional capability of loading real patient data (CT, MRI, etc.), making it a great resource for research, collaborative discovery, and the study of surgical procedures. Funding to obtain the table was a collaborative effort between the schools of Dentistry, Movement Science, and Nursing, although utilization is expected to expand to include Biology. Currently on display in the Duderstadt Center for exploration, the Anatomage table will be relocating to its more permanent home inside the Taubman Health Library in early July.

The Anatomage table allows users to explore the complex anatomy of the human body.

Virtual Cadaver – Supercomputing

The Virtual Cadaver is a visualization of data provided by the Visible Human Project of the National Library of Medicine. This project aimed to create a digital image dataset of complete human male and female cadavers.

Volumetric anatomy data from the Visible Human Project

The male dataset originates from Joseph Paul Jernigan, a 38-year-old convicted Texas murderer who was executed by lethal injection; he donated his body for scientific research in 1993. The female cadaver remains anonymous and has been described as a 59-year-old Maryland housewife who died of a heart attack. Her specimen contains several pathologies, including cardiovascular disease and diverticulitis.

Both cadavers were encased in a gelatin and water mixture and frozen to produce the fine slices that comprise the data. The male dataset consists of 1,871 slices produced at 1-millimeter intervals. The female dataset consists of 5,164 slices.

The Duderstadt Center was directed to the female dataset in December of 2013. To load the data into the virtual reality MIDEN (a fully immersive, multi-screen, head-tracked CAVE environment) and a variety of other display environments, the images were pre-processed into JPEGs at 1024×608 pixels. Every tenth slice is loaded, so the figure is formed from 517 slices at 3.3 mm spacing. A generic image-stack loader was written so that a 3D volume model can be produced from any stack of images, not just the Visible Human data. It can therefore be configured to load a denser sample of slices over a shorter range should a subset of the model need to be viewed in higher detail.
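For readers curious how an image-stack loader of this kind works, here is a minimal sketch in Python using NumPy and Pillow. The file pattern, slice step, and spacing values are illustrative assumptions, not the Duderstadt Center’s actual implementation:

```python
# Minimal sketch of a generic image-stack volume loader.
# File pattern, slice step, and spacing are hypothetical, not the actual
# Duderstadt Center code.
import glob

import numpy as np
from PIL import Image


def load_volume(pattern, step=10, spacing_mm=0.33):
    """Load every `step`-th slice matching `pattern` into a 3D volume."""
    paths = sorted(glob.glob(pattern))[::step]         # e.g. every tenth slice
    slices = [np.asarray(Image.open(p).convert("L")) for p in paths]
    volume = np.stack(slices, axis=0)                  # shape: (slices, height, width)
    return volume, spacing_mm * step                   # effective inter-slice spacing


# 5,164 female-dataset slices sampled every tenth slice -> ~517 slices at 3.3 mm
# volume, dz = load_volume("visible_human_female/*.jpg")
```

Sampling a denser range of slices is then just a matter of changing the step and the file pattern passed to the loader.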

Users can navigate around their data in passive cinema-style stereoscopic projection. In the case of the Virtual Cadaver, the body appears just as it would to a surgeon, revealing the various bones, organs and tissues. Using a game controller, users can arbitrarily position sectional planes to view a cross-section of the subject. This allows for cuts to be made that would otherwise be very difficult to produce in a traditional anatomy lab. The system can accommodate markerless motion-tracking through devices like the Microsoft Kinect and can also allow for multiple simultaneous users interacting with a shared scene from remote locations across a network.
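As an illustration of how an arbitrarily positioned sectional plane can be sampled from a slice volume, the sketch below uses SciPy’s trilinear interpolation. The function name, plane parameters, and coordinate conventions are assumptions for demonstration, not the MIDEN’s actual rendering code:

```python
# Minimal sketch of sampling an arbitrary sectional plane from a slice volume.
# Assumes a NumPy volume indexed (z, y, x); names and parameters are
# illustrative, not the MIDEN's rendering code.
import numpy as np
from scipy.ndimage import map_coordinates


def cross_section(volume, origin, u, v, size=(256, 256)):
    """Sample the plane through `origin` spanned by unit vectors u and v (voxel coords)."""
    rows, cols = size
    rr, cc = np.meshgrid(np.arange(rows) - rows / 2,
                         np.arange(cols) - cols / 2, indexing="ij")
    # Each output pixel maps to origin + rr*u + cc*v in volume coordinates.
    pts = origin[:, None, None] + rr * u[:, None, None] + cc * v[:, None, None]
    return map_coordinates(volume, pts, order=1, mode="nearest")  # trilinear sampling


# Example: an oblique cut through the middle of the volume
# plane = cross_section(volume,
#                       origin=np.array(volume.shape) / 2.0,
#                       u=np.array([0.0, 1.0, 0.0]),
#                       v=np.array([0.5, 0.0, 0.866]))
```

In an interactive setting, the game controller would simply update the plane’s origin and spanning vectors each frame before resampling the cut.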