Kinect in Virtual Reality – M.I.D.E.N. Test

The Kinect exploded onto the gaming and natural user interface scene. People had it hacked within days, and a collective desire to see what a depth-sensing camera could do was born. Caught up in the same energy, the Duderstadt Center started playing with the hacks coming out and exploring how they could be used with other technology. After some initial tests, and the release of Microsoft’s official SDK, we dove into deeper development with the device.

In an effort to improve interactivity in the MIDEN, the Kinect has been used to represent the user’s physical body in virtual space. By analyzing the data received from the Kinect, the Duderstadt Center’s rendering engine creates a digital model of the body. This body serves as an avatar that corresponds to the user’s location in space, allowing the user to interact with virtual objects. Because the MIDEN offers the user perspective and depth perception, interaction feels more natural than maneuvering an avatar on a screen; the user can reach out and directly “touch” objects.
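In outline, the loop reads the tracked joints each frame, maps them into the room’s coordinate system, and poses the avatar that the engine then tests against virtual objects. The sketch below illustrates that idea only; `avatar`, `scene`, and the skeleton data are hypothetical placeholders, not the actual Kinect SDK or MIDEN engine APIs.

```python
import math

# Hypothetical sketch: `avatar` and `scene` stand in for the rendering engine's
# objects; the skeleton dict stands in for one frame of Kinect joint data.

def sensor_to_world(joint, transform):
    """Apply a calibrated 4x4 (row-major) sensor-to-room transform to one joint position."""
    x, y, z = joint
    return tuple(
        transform[r][0] * x + transform[r][1] * y + transform[r][2] * z + transform[r][3]
        for r in range(3)
    )

def update_avatar(avatar, skeleton, transform):
    """Mirror the user's pose by copying every tracked joint onto the avatar."""
    for name, position in skeleton.items():        # e.g. {"hand_right": (x, y, z), ...}
        avatar.set_joint(name, sensor_to_world(position, transform))

def touched_objects(avatar, scene, reach=0.1):
    """Return scene objects within `reach` meters of either hand, i.e. being 'touched'."""
    hits = []
    for obj in scene.objects:
        for hand in ("hand_left", "hand_right"):
            if math.dist(avatar.joint(hand), obj.position) <= reach:
                hits.append(obj)
                break
    return hits
```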

Migraine Brain – Quick Mapping of Brain Data

Through some innovative work by Dr. Alexandre Dasilva and his team in the School of Dentistry, the Duderstadt Center was presented with exciting new data on migraines and their effect on the brain. We had to quickly turn the data into an image suitable for a pending journal submission. While we can’t go into detail about the research at this time, we created a quick model of the data and brought it into the MIDEN for further exploration. The model was created by taking cross-sections of the MRI dataset and projecting them onto the surface of a brain mesh. The resulting model and textures were exported and then brought into the MIDEN.
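In essence, the projection samples the MRI volume at each point on the brain surface and bakes the result into vertex colors or a texture. A minimal sketch of that step, assuming the volume is already loaded as a NumPy array and the mesh vertices are in the scanner’s coordinate system (the actual tools and pipeline are not shown here):

```python
import numpy as np

def project_mri_onto_mesh(mri_volume, mesh_vertices, voxel_size, volume_origin):
    """Assign each mesh vertex the MRI intensity of the voxel it falls in.

    mri_volume:    3D array of intensities.
    mesh_vertices: (N, 3) vertex positions in scanner coordinates.
    voxel_size:    physical size of a voxel (scalar or per-axis).
    volume_origin: scanner-space position of voxel (0, 0, 0).
    """
    # Convert world-space vertex positions into voxel indices.
    idx = np.round((mesh_vertices - volume_origin) / voxel_size).astype(int)
    idx = np.clip(idx, 0, np.array(mri_volume.shape) - 1)
    intensities = mri_volume[idx[:, 0], idx[:, 1], idx[:, 2]]
    # Normalize to 0..1 so the values can be used directly as grayscale vertex colors.
    lo, hi = intensities.min(), intensities.max()
    return (intensities - lo) / max(hi - lo, 1e-9)
```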

Generative Components and Genetic Algorithms

Genetic algorithms aim to mimic natural selection in the design process. A set of parameters or “genes” characterizes a “species” of artifact. Individuals within the species express different values for those genes. A fitness function evaluates each individual’s health. The algorithm works by assigning random gene values to several individuals, evaluating them, discarding the weakest, breeding the strongest by interchanging genes, and repeating for successive generations. Genetic algorithms sometimes yield surprising designs that a strictly deductive, deterministic design process might not discover.

This project uses Bentley Generative Components to script parametric designs for several classes of structures, including folded plates, branching columns, and geodesic domes. Bentley STAAD structural analysis serves as the fitness function.
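As a rough illustration of that generate, evaluate, cull, and breed loop, the sketch below stubs out the Generative Components model and the STAAD analysis with a toy fitness function; it is not the project’s actual implementation:

```python
import random

def evaluate(genes):
    """Stand-in fitness. In the real workflow this would build the parametric
    Generative Components model from the genes and score it with a STAAD analysis."""
    return -sum((g - 0.5) ** 2 for g in genes)   # toy objective for illustration only

def crossover(parent_a, parent_b):
    """Breed two individuals by interchanging genes."""
    return [random.choice(pair) for pair in zip(parent_a, parent_b)]

def mutate(genes, rate=0.05):
    """Occasionally replace a gene with a fresh random value."""
    return [random.random() if random.random() < rate else g for g in genes]

def evolve(num_genes=8, population_size=30, generations=50, survivors=10):
    # Start with random gene values for several individuals.
    population = [[random.random() for _ in range(num_genes)]
                  for _ in range(population_size)]
    for _ in range(generations):
        # Evaluate every individual, keep the strongest, discard the weakest.
        population.sort(key=evaluate, reverse=True)
        del population[survivors:]
        # Refill the population by breeding the survivors.
        while len(population) < population_size:
            a, b = random.sample(population[:survivors], 2)
            population.append(mutate(crossover(a, b)))
    return max(population, key=evaluate)

best_individual = evolve()
```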

Monica Ponce de Leon (Dean of Architecture and Urban and Regional Planning) is the principal investigator. Peter von Bülow (Associate Professor of Architecture) develops the genetic algorithms. Ted Hall worked with recent Architecture graduates Jason Dembski and Kevin Deng to script the structures and visualize them in full-scale 3D in the MIDEN.

SCI-Hard Mobile Game

Those with spinal cord injuries (SCI), often males ages 15-25, encounter a drastically different world when they are released from the hospital. With varying degrees of disability, mobility, and function, the world around them becomes a collection of physical and mental challenges, a complete departure from their previous lifestyles. Whether they use crutches or manual or power wheelchairs, they need to learn mobility, scheduling, and social tasks all over again. Stairs may now be an insurmountable obstacle. The individual may receive glares from others on the street or be taunted by children. Daily activities often revolve around the scheduling of a colostomy bag. The list goes on.

This project began as the conceptualization of several ideas for a complete “manual” to be used by health care professionals working with individuals with SCI. It has since grown into a larger development effort, recently funded by the U.S. Department of Education: a game that teaches people with SCI the skills they now need to learn, in a fun, edgy way. Tasks such as scheduling, mobility, and social interaction all become elements of the game as players build up their character’s abilities and unlock new locations and mini-games.

Tech Demo – Realistic “Spooky” Cellar with Physical Interactions

Spooksville, a haunting and dimly lit basement environment, was originally designed by Andrew Hamilton, then optimized and developed by the Duderstadt Center and brought into the MIDEN as an experiment in immersive environments. A user in this environment can walk up rickety stairs, see the cobwebbed and otherwise grimy surfaces of a basement, and knock over old paint cans, sending them tumbling down the stairs in a lifelike manner.

The real-time interaction creates the feeling of truly being immersed: reach down to knock cans across the virtual floor, forgetting where the physical floor is, and you might bang the controller (now taped and re-taped); go up the stairs too quickly or step off the ledge and you might feel woozy. An earlier version featured localized spooky sounds right next to the leading viewer and floating apparitions just out of the corner of the user’s eyes. Enter at your own risk.

A Ferry called “Wahoo”

A student team from Naval Architecture and Marine Engineering designed a passenger ferry for Puget Sound as their final project. The vessel, named Wahoo, is 57 meters long and 18 meters wide, seats 350 passengers, and has a top speed of 45 knots. The students modeled the ferry in Rhinoceros and worked with the Duderstadt Center to print the model in plaster for presentation purposes. They also exported a VRML version for visualization in the MIDEN. Although the Wahoo is much larger than the MIDEN, the students were able to explore it in immersive stereo at full scale and directly observe and evaluate sizes and clearances.

The engine room was an especially detailed design. The students obtained the real marine engine model from MTU Detroit Diesel (in STP format) and placed three instances of it in their vessel.

Pisidian Antioch

From January 13 to February 24, 2006, the Kelsey Museum mounted an exhibition at the Duderstadt Center Gallery on the University of Michigan’s North Campus on the Roman site of Antioch of Pisidia in Asia Minor (Turkey), a Hellenistic city refounded by Augustus in 25 BC as a Roman colony. Located along a strategic overland artery between Syria and the western coast of Asia Minor, Pisidian Antioch served Rome’s military needs but also presented a striking symbol, from the Roman perspective, of the benefits that Roman civilization provided to local populations. The city is best known to the modern world as a destination on the first missionary journey of St. Paul and Barnabas in the 1st century AD, recounted in the Book of Acts.

The exhibition featured a physical model created with the Duderstadt Center’s rapid prototyping services. Digital reconstructions of the buildings and topography, created by Duderstadt Center staff working with talented students associated with the project, were displayed in the Virtual Reality MIDEN, conveying a sense of the site’s original monumentality and the character of its setting.

Back Pain Is Not A Game (Except When It Is)

BackQuack, a recently released video game, lets gamers play the best, or worst, doctor. By joining a “good” or “evil” clinic, players can win points and accolades, or prison time. The player can fill the role of doctor or patient, learning about our health care system along the way. Players in the “patient” role can even enter their own information to learn more about back pain specific to them. Developed by the Duderstadt Center with Dr. Andrew Haig and funding from the Center for Healthcare Research and Transformation, the game is part of a multimedia package that includes pamphlets, books, and events, all with the purpose of teaching people about the real causes of back pain and the best treatment practices.

Virtual Disaster Simulator

Mass-casualty scenarios are inherently dangerous, with many risks to those on site. Subjecting novices to them prematurely could create additional risks due to inexperience and poor decision making. Such scenarios are also often far too expensive to replicate at any complexity or scale that mimics the real world. For example, staging a training scenario that faithfully recreates the events of 9/11 would be a production rivaling a blockbuster Hollywood movie, as fire, smoke, debris, and victims are all staged and coordinated. Even then, the production would introduce additional dangers to trainees and eliminate the possibility of every trainee having *exactly* the same scenario unfold before them.

This project is an extension of earlier work done for the CDC and the Department of Homeland Security, in which first responders were trained for a specific disaster scenario to great effect. The focus of this revision was to target the needs of the Emergency Medicine residency program while also making significant advances in visual quality and immersion. Because trainees need to identify wounds quickly and accurately, advanced shaders were used to provide greater detail per surface, so burns and lacerations could be rendered convincingly. Skeletally animated characters were also introduced to allow full articulation.
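The engine’s implementation is not reproduced here, but the standard technique behind such fully articulated characters is linear-blend skinning, in which each vertex follows a weighted blend of the transforms of the bones that influence it. A minimal NumPy sketch of that idea (array shapes are assumptions for illustration):

```python
import numpy as np

def skin_vertices(rest_positions, bone_matrices, bone_indices, bone_weights):
    """Linear-blend skinning: deform a mesh by blending per-bone transforms.

    rest_positions: (N, 3) bind-pose vertex positions.
    bone_matrices:  (B, 4, 4) current bone transforms relative to the bind pose.
    bone_indices:   (N, K) indices of the K bones influencing each vertex.
    bone_weights:   (N, K) blend weights, summing to 1 per vertex.
    """
    n = len(rest_positions)
    homogeneous = np.concatenate([rest_positions, np.ones((n, 1))], axis=1)  # (N, 4)
    skinned = np.zeros_like(rest_positions, dtype=float)
    for k in range(bone_indices.shape[1]):
        mats = bone_matrices[bone_indices[:, k]]                    # (N, 4, 4)
        moved = np.einsum('nij,nj->ni', mats, homogeneous)[:, :3]   # transform each vertex
        skinned += bone_weights[:, k:k + 1] * moved                 # weighted blend
    return skinned
```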

The project was a great success: the related studies showed the simulator to be just as effective at training residents as standardized patients. It was also recently featured on the Big Ten Network as part of the “Blue in Brief” segments.

Autonomous Control of Helicopters

Under the guidance of Prof. Girard of Aerospace Engineering and with the help of the University of Michigan Duderstadt Center, two students developed novel algorithms that give miniature helicopters autonomous motion control. Using the Duderstadt Center’s motion capture system, they were able to determine each helicopter’s precise location in 3D space. With this information, their algorithm determined the proper throttle and heading for the helicopter to reach its goal. One classic example of their work involved two helicopters told to “defend” one of the creators (Zahid). Zahid wore a helmet with markers on it so the computer, and the helicopters, knew where he was. As he walked around the room, the two helicopters flew alongside him, protecting him from any potential aggressors.
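In outline, such a controller compares each helicopter’s tracked position with its goal and converts the error into throttle, yaw, and forward commands. The sketch below is a generic proportional controller written for illustration, not the students’ actual algorithm; the motion-capture feed and the radio link to the helicopter are assumed to exist elsewhere.

```python
import math

def control_step(heli_position, heli_heading, goal_position,
                 k_throttle=0.8, k_yaw=1.5, k_forward=0.6):
    """One proportional control step from a motion-capture fix to flight commands.

    Positions are (x, y, z) in meters with z up; heading is in radians.
    Gains are illustrative placeholders, not tuned values.
    """
    dx = goal_position[0] - heli_position[0]
    dy = goal_position[1] - heli_position[1]
    dz = goal_position[2] - heli_position[2]

    throttle = k_throttle * dz                      # climb or descend toward the goal altitude
    desired_heading = math.atan2(dy, dx)            # point toward the goal in the horizontal plane
    heading_error = math.atan2(math.sin(desired_heading - heli_heading),
                               math.cos(desired_heading - heli_heading))
    yaw = k_yaw * heading_error                     # turn to face the goal
    forward = k_forward * math.hypot(dx, dy)        # then fly toward it
    return throttle, yaw, forward

def escort_goal(helmet_position, offset=(1.0, 0.0, 1.2)):
    """'Defend' behavior: hold station at a fixed offset from the tracked helmet."""
    return tuple(p + o for p, o in zip(helmet_position, offset))
```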