Duderstadt Center takes 1st and 2nd Place in Mobile Apps Challenge

In December of 2012, the University of Michigan held a mobile app competition to showcase new apps developed within the university and to encourage the developer community to create innovative mobile designs. U-M students, faculty, and staff submitted a variety of apps spanning many disciplines and genres. The event was sponsored and judged by individuals from Computer Science and Engineering, Google, Information and Technology Services, and Technology Transfer.

1st Place – PainTrek
Ever have a headache or facial pain that seemingly comes and goes without warning? Ever been diagnosed with migraines, TMD, or facial neuralgia, but felt that medication falls short or that it's hard to put your pain into words? PainTrek is a novel app developed to make it easier to track, analyze, and talk about pain.

2nd Place – PictureIt: The Epistles of St. Paul
The app gives you a feel for what it was like to read an ancient Greek book on papyrus, where the text is written without word division, punctuation, headings, or chapter and verse numbers. To aid readers without knowledge of ancient Greek, a translation mode gives a literal translation of the Greek text preserved on these pages (with chapter and verse numbers added), along with explanatory notes showing where this text differs from the standard text.

UROP Summer Symposium 2012 (Kinect, Virtual Reality, and iOS)

Rachael Miller and Rob Soltesz presented their summer work on Kinect development, natural user interfaces, and capturing emotive qualities of users at the 2012 UROP Symposium for MSTEM. Rachael won a Blue Ribbon at the event for her poster, and, as far as we know, they are the first to have successfully used multiple Kinects in an immersive virtual reality space for virtual physical presence.

Rachael focused on creating a natural user interface for immersive 3D environments by combining multiple Kinects for a more robust skeleton. This stable and predictable skeleton allowed her to wrap virtual (invisible) objects around the user's limbs and torso, effectively letting people interact with virtual objects without markers or special tracking devices. Beyond simple interaction with virtual objects, she then developed several gestures for navigation in virtual reality.
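Fusing the skeletons reported by several sensors is commonly done by averaging each joint's position across sensors, weighted by per-joint tracking confidence, so that a joint occluded from one Kinect is still tracked by another. The sketch below illustrates that idea; the joint names, data layout, and weighting scheme are illustrative assumptions, not the project's actual code.

```python
# Illustrative sketch: fuse joint estimates from multiple Kinect sensors
# into one skeleton by confidence-weighted averaging. A joint missing
# from one sensor (occluded) is simply filled in by the others.

JOINTS = ["head", "torso", "left_hand", "right_hand"]

def fuse_skeletons(skeletons):
    """skeletons: list of dicts mapping joint name -> (x, y, z, confidence)."""
    fused = {}
    for joint in JOINTS:
        total_w = 0.0
        acc = [0.0, 0.0, 0.0]
        for skel in skeletons:
            if joint not in skel:
                continue  # this sensor lost track of the joint
            x, y, z, w = skel[joint]
            acc[0] += w * x
            acc[1] += w * y
            acc[2] += w * z
            total_w += w
        if total_w > 0:
            fused[joint] = tuple(c / total_w for c in acc)
    return fused
```

With a fused, stable skeleton in hand, invisible collision volumes can be attached to each joint so the user's real limbs push on virtual objects directly.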

Rob worked with Rachael on aspects of her project but also looked into using the Kinect’s multiple microphones and internal voice recognition capabilities to extract emotive qualities from the user inside a virtual reality space.

Andrew Janke also presented, at a second UROP symposium, his work on iOS connectivity for a variety of applications. Getting data off of an iOS device is not always trivial, and formatting that data into a PDF and then emailing it to a specific individual can be a challenge. Andrew developed a process that lets arbitrary iOS applications send their data over simple sockets to a receiver that formats it and sends it via email. This functionality was required by a few of our applications in development and proved to be extremely useful.
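A typical way to move such data over a plain socket is to frame each message with a length prefix so the receiver knows where one payload ends. The sketch below shows that pattern under assumed conventions (4-byte big-endian length plus a UTF-8 JSON body); it is not the project's actual protocol, and the PDF formatting and email steps are omitted.

```python
import json
import socket
import struct

# Illustrative sketch: length-prefixed framing for handing app data to a
# desktop relay over a socket. Wire format assumed here:
#   <4-byte big-endian length><UTF-8 JSON body>

def _recv_exact(sock: socket.socket, n: int) -> bytes:
    """Read exactly n bytes, looping until the socket delivers them."""
    data = b""
    while len(data) < n:
        chunk = sock.recv(n - len(data))
        if not chunk:
            raise ConnectionError("socket closed mid-message")
        data += chunk
    return data

def encode_message(payload: dict) -> bytes:
    """Frame a payload for sending."""
    body = json.dumps(payload).encode("utf-8")
    return struct.pack(">I", len(body)) + body

def decode_message(sock: socket.socket) -> dict:
    """Read one framed message from the socket and parse it."""
    (length,) = struct.unpack(">I", _recv_exact(sock, 4))
    return json.loads(_recv_exact(sock, length).decode("utf-8"))
```

On the receiving side, the decoded dictionary could then be laid out as a PDF and attached to an outgoing email.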

All students did a great job over the summer and we’re excited to be a part of the UROP program at the University of Michigan.

Migraine Brain – Quick Mapping of Brain Data

Through some innovative work done by Dr. Alexandre Dasilva and his team in the School of Dentistry, the Duderstadt Center was presented with some exciting new data related to migraines and their effect on the brain. We had to quickly turn the data into an image suitable for a pending journal submission. While we can't go into details about the research at this time, we created a quick model of the data and brought it into the MIDEN for further exploration. The model was created by taking cross-sections of the MRI dataset and projecting them onto the surface of a brain mesh. The resulting model and textures were exported and then brought into the MIDEN.
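Projecting a cross-section image onto a mesh surface amounts to computing texture coordinates for each vertex. A minimal sketch of a planar projection, assuming the slice image spans a known rectangle in the mesh's x-y plane (the function name and data layout are illustrative, not the actual pipeline):

```python
def planar_uvs(vertices, min_xy, max_xy):
    """Map each vertex's (x, y) into [0, 1] UV space so a slice image
    spanning min_xy..max_xy lands on the mesh surface as a texture."""
    (x0, y0), (x1, y1) = min_xy, max_xy
    return [((x - x0) / (x1 - x0), (y - y0) / (y1 - y0))
            for x, y, _z in vertices]
```

Repeating this per slice, along each slice's own projection axis, bakes the MRI cross-sections onto the brain mesh as ordinary textures that any real-time engine (such as the MIDEN's renderer) can display.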

Generative Components and Genetic Algorithms

Genetic algorithms aim to mimic natural selection in the design process. A set of parameters or “genes” characterize a “species” of artifact. Individuals within the species express different values for those genes. A fitness function evaluates each individual’s health. The algorithm works by assigning random gene values for several individuals, evaluating them, discarding the weakest ones, breeding the strongest ones by interchanging genes, and repeating for successive generations. Genetic algorithms sometimes yield surprising designs that a strictly deductive deterministic design process might not discover.
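The loop just described (random initialization, evaluation, culling, crossover, mutation, repeat) can be sketched in a few lines. The toy fitness function below merely stands in for a real structural analysis such as STAAD, and all names and parameters are illustrative.

```python
import random

# Minimal genetic-algorithm sketch of the loop described above.

def fitness(individual):
    # Toy stand-in for a structural evaluation: prefer gene values
    # near 0.5. Higher (less negative) is fitter.
    return -sum((g - 0.5) ** 2 for g in individual)

def evolve(genes=4, pop_size=20, generations=50, seed=0):
    rng = random.Random(seed)
    # Random gene values for the initial individuals.
    pop = [[rng.random() for _ in range(genes)] for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]      # discard the weakest half
        children = []
        while len(survivors) + len(children) < pop_size:
            a, b = rng.sample(survivors, 2)   # breed two strong parents
            cut = rng.randrange(1, genes)     # single-point crossover
            child = a[:cut] + b[cut:]
            i = rng.randrange(genes)          # mutate one gene slightly
            child[i] = min(1.0, max(0.0, child[i] + rng.gauss(0, 0.05)))
            children.append(child)
        pop = survivors + children
    return max(pop, key=fitness)
```

In the actual project, each "gene" would be a parameter of the Generative Components model, and evaluating an individual would mean running the generated structure through STAAD.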

This project uses Bentley Generative Components to script parametric designs for several classes of structures, including folded plates, branching columns, and geodesic domes. Bentley STAAD structural analysis serves as the fitness function.

Monica Ponce de Leon (Dean of Architecture and Urban and Regional Planning) is the principal investigator. Peter von Bülow (Associate Professor of Architecture) develops the genetic algorithms. Ted Hall worked with recent Architecture graduates Jason Dembski and Kevin Deng to script the structures and visualize them at full scale 3D in the MIDEN.

Virtual Jet Ski Driving Simulator

The Virtual Jet Ski Driving Simulator allows a user to drive a jet ski (or personal watercraft) through a lake environment presented in the immersive virtual reality MIDEN system. The user sits on a jet ski mockup and controls the ride via handlebar and throttle. While the mockup itself is stationary, the environment changes dynamically in response to handlebar and throttle input, creating a very convincing feeling of driving a jet ski. The virtual reality system provides head-referenced stereo viewing and a realistic, full-scale representation of the environment.

The simulator was developed to study human risk factors related to the operation of a personal watercraft (PWC). In recreational boating, PWCs are involved in accidents in disproportionate numbers. Using the simulator, accident scenarios can be simulated and the reactions of PWC operators in specific situations can be studied. The simulator provides a cost-effective analysis tool for regulators and equipment designers, as well as a training device for PWC operators, enforcers, and educators.

The simulator was developed for the U.S. Coast Guard (USCG) by the University of Michigan Virtual Reality Laboratory and the Research Triangle Institute. It is now being revived with help from the Undergraduate Research Opportunity Program (UROP).

Aerial City

The way we stage, organize, and construct our environment determines what roles we play in society. If the arrangement is designed to maximize revenue and return to bankers, society becomes an investment game: the focus becomes dollars, or any number of sophisticated, market-oriented economic strategies that essentially ignore the spiritual fulfillment of humans living in cities. The core purpose of Aerial City is to live above ground, in flexible, self-sufficient cities enabled by our technology, in order to coexist with other living species on Earth.

Local architect and active member of the University of Michigan School of Architecture community, Sahba La'al, has been working with the Duderstadt Center to visualize various concepts related to the Aerial City project, including the creation of unique designs in various real-world locations.

The project has been featured at several international conferences and exhibitions and is under continued design and refinement.

Medical Readiness Trainer

This ongoing project is an interdisciplinary effort at the University of Michigan involving the Medical Center, the Department of Emergency Medicine, the Digital Media Commons (formerly the Media Union), and the Virtual Reality Laboratory at the College of Engineering. The objective is the development of a "Virtual Reality-Enhanced Medical Readiness Trainer" (MRT) that integrates advanced technologies such as human patient simulators, immersive virtual reality MIDEN systems, next-generation Internet technology, and virtual video conferencing, all in the context of distributed and shared virtual environments for training emergency personnel in a variety of common as well as extreme situations.

One such example of the collaboration is the design and organization of an Operating Room (OR) in fully immersive virtual reality. This particular application allows a physician or hospital administrator to configure a virtual operating room for optimal efficiency and safety.

Another example of efforts originating from this collaboration is the development of the sick bay application for the Virtual Reality MIDEN. This application places the individual in a sick bay on turbulent waters. Our peripheral vision plays a large role in our orientation and balance, so performing medical procedures as the room appears to move and shift around you is a difficult task that is better prepared for in advance. The MIDEN's wide field of view and immersion allow for an effective (and nauseating) experience, perfect for training medical personnel on naval vessels.

Original Project Page: Medical Readiness Trainer