Virtual Disaster Simulator

Mass casualty scenarios are inherently dangerous, posing many risks to those on site. Subjecting novices to such scenarios prematurely could introduce additional risk through inexperience and poor decision making. Moreover, such scenarios are often far too expensive to replicate at any complexity or scale that mimics the real world. For example, staging a training scenario that faithfully recreates the 9/11 events would be a production rivaling a blockbuster Hollywood movie, as fire, smoke, debris, and victims would all need to be staged and coordinated. Even then, the staging would introduce additional dangers to the trainees and eliminate the possibility of every trainee experiencing *exactly* the same scenario.

This project is an extension of earlier work done for the CDC and Department of Homeland Security, in which first responders were trained for a specific disaster scenario to great effect. The focus of this revision was to target the needs of the Emergency Medicine residency program while also making significant advances in visual quality and immersion. Because trainees must identify wounds quickly and accurately, advanced shaders were used to increase the detail rendered per surface, so burns and lacerations could be depicted convincingly. Skeletally animated characters were also introduced to allow for full articulation of the virtual victims.
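
As a rough illustration of the skeletal animation technique mentioned above, the sketch below implements linear blend skinning for a single vertex. The function, bone setup, and weights are hypothetical examples of the general technique, not code from the project itself.

```python
import numpy as np

def skin_vertex(rest_pos, bone_matrices, weights):
    """Linear blend skinning for one vertex (illustrative sketch).

    rest_pos      -- vertex position in the bind pose, shape (3,)
    bone_matrices -- list of 4x4 bone transforms
    weights       -- per-bone influence weights, summing to 1
    """
    v = np.append(rest_pos, 1.0)  # homogeneous coordinates
    # Blend the vertex transformed by each influencing bone.
    blended = sum(w * (m @ v) for w, m in zip(weights, bone_matrices))
    return blended[:3]

# Two bones: identity and a 1-unit translation along x; the vertex is
# weighted equally between them, so it moves half the distance.
identity = np.eye(4)
shifted = np.eye(4)
shifted[0, 3] = 1.0
print(skin_vertex(np.array([0.0, 1.0, 0.0]), [identity, shifted], [0.5, 0.5]))
```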

The project was a great success: related studies showed the simulator to be just as effective at training residents as the use of standardized patients.
Additionally, the project was recently featured on the Big Ten Network as part of the “Blue in Brief” segments.

Autonomous Control of Helicopters

Under the guidance of Prof. Girard from Aerospace Engineering, and with the help of the University of Michigan Duderstadt Center, two students developed novel algorithms for the autonomous motion control of miniature helicopters. Using the Duderstadt Center’s motion capture system, they were able to determine each helicopter’s precise location in 3D space. With this information, their algorithm determined the proper throttle and heading for the helicopter to reach its goal. One classic example of their work involved two helicopters that were instructed to “defend” one of the creators (Zahid). Zahid wore a helmet with markers on it so the computer, and thus the helicopters, knew where he was. As he walked around the room, the two helicopters flew alongside him, shielding him from any potential aggressors.
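
The students' actual control law is not reproduced here, but a minimal version of the approach described above, proportional control of throttle and heading from motion capture positions, might look like the following sketch. The gains, units, and function names are illustrative assumptions.

```python
import math

def control_step(pos, goal, yaw, k_throttle=0.4, max_throttle=1.0):
    """One iteration of a simple proportional controller (illustrative).

    pos, goal -- (x, y, z) positions from the motion capture system, meters
    yaw       -- current heading of the helicopter, radians
    Returns a (throttle, heading_error) command pair.
    """
    dx, dy, dz = (g - p for g, p in zip(goal, pos))
    distance = math.sqrt(dx * dx + dy * dy + dz * dz)

    # Throttle grows with distance to the goal, clamped to actuator range.
    throttle = min(k_throttle * distance, max_throttle)

    # Heading error, wrapped to (-pi, pi], steers the nose toward the goal.
    desired_yaw = math.atan2(dy, dx)
    heading_error = math.atan2(math.sin(desired_yaw - yaw),
                               math.cos(desired_yaw - yaw))
    return throttle, heading_error

# Helicopter at the origin, goal 2 m ahead and 1 m up.
print(control_step(pos=(0.0, 0.0, 0.0), goal=(2.0, 0.0, 1.0), yaw=0.0))
```

For the “defend” demonstration, the goal point would simply be an offset from the tracked helmet position, recomputed every motion capture frame.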

Virtual Jet Ski Driving Simulator

The Virtual Jet Ski Driving Simulator allows a user to drive a jet ski (or personal watercraft) through a lake environment presented in the immersive virtual reality MIDEN system. The user sits on a jet ski mockup and controls the ride via handlebar and throttle. While the mockup itself is stationary, the environment changes dynamically in response to handlebar and throttle input, creating a very convincing feeling of jet ski driving. The virtual reality system provides head-referenced stereo viewing and a realistic, full-scale representation of the environment.
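
A minimal sketch of how such a control loop can map handlebar and throttle input to motion through the virtual lake follows; the speed, yaw rate, and frame rate constants are assumptions for illustration, not the simulator's actual parameters.

```python
import math

def update_jetski(state, handlebar, throttle, dt):
    """Advance the simulated jet ski by one frame (illustrative sketch).

    state     -- dict with 'x', 'y' (meters) and 'heading' (radians)
    handlebar -- steering input in [-1, 1]
    throttle  -- throttle input in [0, 1]
    dt        -- frame time in seconds
    """
    MAX_SPEED = 15.0    # m/s, an assumed top speed
    MAX_YAW_RATE = 1.2  # rad/s at full steering lock

    speed = throttle * MAX_SPEED
    # A PWC steers by vectoring jet thrust, so turning authority
    # scales with speed: no throttle, no turning.
    state['heading'] += handlebar * MAX_YAW_RATE * (speed / MAX_SPEED) * dt
    state['x'] += speed * math.cos(state['heading']) * dt
    state['y'] += speed * math.sin(state['heading']) * dt
    return state

# One simulated second at 60 frames per second, gentle right turn.
state = {'x': 0.0, 'y': 0.0, 'heading': 0.0}
for _ in range(60):
    state = update_jetski(state, handlebar=0.3, throttle=0.8, dt=1 / 60)
print(state)
```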

The simulator was developed to study human risk factors related to the operation of a personal watercraft (PWC). In recreational boating, PWCs are involved in accidents in disproportionate numbers. Using the simulator, accident scenarios can be recreated and the reactions of PWC operators in specific situations can be studied. The simulator provides a cost-effective analysis tool for regulators and equipment designers as well as a training device for PWC operators, enforcers, and educators.

The simulator was developed for the U.S. Coast Guard (USCG) by the University of Michigan Virtual Reality Laboratory and the Research Triangle Institute. It is now being revived with help from the Undergraduate Research Opportunity Program (UROP).

Aerial City

The way we stage, organize, and construct our environment determines the roles we play in society. If the arrangement is designed to maximize revenue and returns to bankers, then society becomes an investment game: the focus turns to dollars, or to any number of sophisticated, market-oriented economic strategies that essentially ignore the spiritual fulfillment of the humans living in cities. The core purpose of Aerial City is to use our technology to live above ground, in flexible, self-sufficient cities, coexisting with the other living species on Earth.

Sahba La’al, a local architect and active member of the University of Michigan School of Architecture, has been working with the University of Michigan Duderstadt Center to visualize various concepts related to the Aerial City project, including unique designs situated in real-world locations.

The project has been featured at several international conferences and exhibitions and is under continued design and refinement.

The Museum of Life and Death

Andy Kirshner, a resident faculty member in the School of Music and Theater, used the University of Michigan Duderstadt Center’s motion capture service to record several movements for his production, The Museum of Life and Death, which is described as:

“Set in the post-human 26th-century, The Museum of Life and Death is a radical reconsideration of the medieval Play of Everyman. Framed as a kind of post-human Masterpiece Theatre, and hosted by a chipper cyborg named Virgil, The Museum mixes 3D animation, projected video, live action, buddhist sutras, and original music to consider essential questions of Life, Death — and extinction — in our own time.”

Performance Homepage

Virtual Prototyping of Classrooms – Business School

Designing architectural spaces presents unique challenges, especially when those spaces must also serve specific functions. The Ross School of Business recently constructed a new building intended to meet the needs of the school’s faculty and students. The construction plans included new U-shaped classrooms. Because the design was unlike those used in the past, and its effectiveness for daily classes was in question, the School of Business planned to construct physical test sites so faculty could experience the room before it was built. Such test sites resemble movie sets and cost hundreds of thousands of dollars; if changes were needed, a site would have to be rebuilt to the new plans.

Dean Graham Mercer approached the University of Michigan Duderstadt Center looking for a more cost-effective way to identify problems in the design early on. Through the use of the Virtual Reality MIDEN, which has the distinct ability to display virtual worlds at true 1-to-1 scale, faculty from the School of Business were able to experience the proposed classrooms prior to physical construction and offer suggestions with confidence. This process cost the school a fraction of the price of physical test sites and allowed rapid turnaround on any changes they requested.

The new classrooms can now be seen in the Ross School of Business on Central Campus.

Remote Dance Performances

Shortly after the University of Michigan acquired its first motion capture system, faculty and students began exploring its use for the performing arts. One such project involved two dancers who coordinated their performances remotely. With one dancer performing in the MIDEN and the other in the Video Studio, they effectively created a single, complete performance. The MIDEN performer wore a motion capture suit and had her point cloud (a visualization of just her joint positions) streamed to the Video Studio where the other dancer was performing.
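
Streaming a point cloud between venues comes down to serializing the joint positions every frame and sending them over the network. The sketch below shows one minimal way to do this over UDP; the joint count, port, and packet layout are illustrative assumptions rather than details of the original setup.

```python
import socket
import struct

JOINTS = 17  # assumed number of tracked joints
PACKET = struct.Struct('<' + 'fff' * JOINTS)  # x, y, z floats per joint

def send_pose(sock, address, joints):
    """Send one frame of joint positions as a flat binary packet."""
    flat = [coord for joint in joints for coord in joint]
    sock.sendto(PACKET.pack(*flat), address)

def receive_pose(sock):
    """Block until a frame arrives; return a list of (x, y, z) tuples."""
    data, _ = sock.recvfrom(PACKET.size)
    flat = PACKET.unpack(data)
    return [tuple(flat[i:i + 3]) for i in range(0, len(flat), 3)]

# Loopback demonstration: send one frame, receive it back.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(('127.0.0.1', 9000))
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
send_pose(sender, ('127.0.0.1', 9000),
          [(0.0, float(i) * 0.1, 1.5) for i in range(JOINTS)])
print(receive_pose(receiver)[:3])
```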

Another remote-performance project involved a faculty member from the School of Music who specialized in jazz composition. He conducted a group of performers remotely using methods similar to those of the dancers above. One unique challenge was capturing the expressiveness and articulation of the composer’s hands and face. To solve this, we placed additional markers on his face and hands so the remote musicians could read his facial expressions and hand poses.

Detroit Midfield Terminal

Over the past several years, Northwest Airlines designed and built a spectacular, state-of-the-art terminal at Wayne County’s Detroit Metropolitan Airport. The project included the construction of a new international/domestic terminal (the ‘Midfield Terminal’) with 97 gates, airfield connections via aprons and taxiways, a large parking structure with 11,500 spaces, a multi-level system of access roads to the new terminal, and a power plant. This 1.2 billion dollar expansion opened on February 24, 2002. The terminal was named after Wayne County Executive Edward McNamara.

In cooperation with Northwest Airlines, the Virtual Reality Laboratory (VRL) at the College of Engineering at the University of Michigan developed a virtual model for the Detroit Midfield Terminal Project to assist in design evaluation and to support a complex decision-making process.

During the design phase, a three-dimensional computer model was developed at the University of Michigan and continuously updated as the design progressed. Once the terminal existed digitally, functionality was added that allowed Northwest Airlines to test sight lines from proposed control towers, evaluate obstruction and planting strategies for nearby trees, and study traffic patterns for visitors to the terminal.
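
A sight-line test of this kind reduces to checking whether the segment from an observer to a target is interrupted by any intervening geometry. The sketch below applies the standard slab method for segment/box intersection; the tower, gate, and aircraft coordinates are invented for illustration.

```python
def line_of_sight(eye, target, obstacles):
    """True if the segment from eye to target misses every box obstacle.

    eye, target -- (x, y, z) points in meters
    obstacles   -- list of (min_corner, max_corner) axis-aligned boxes
    """
    direction = [t - e for t, e in zip(target, eye)]
    for lo, hi in obstacles:
        t_near, t_far = 0.0, 1.0
        hit = True
        for e, d, lo_c, hi_c in zip(eye, direction, lo, hi):
            if abs(d) < 1e-9:  # segment parallel to this slab
                if e < lo_c or e > hi_c:
                    hit = False
                    break
            else:
                t1, t2 = (lo_c - e) / d, (hi_c - e) / d
                if t1 > t2:
                    t1, t2 = t2, t1
                t_near, t_far = max(t_near, t1), min(t_far, t2)
                if t_near > t_far:  # slab intervals do not overlap
                    hit = False
                    break
        if hit:
            return False  # this obstacle blocks the view
    return True

# Tower eye point, a gate position, and one parked-aircraft bounding box
# sitting in between, so the check reports a blocked view (False).
tower = (0.0, 0.0, 30.0)
gate = (200.0, 50.0, 2.0)
aircraft = [((90.0, 20.0, 0.0), (110.0, 30.0, 15.0))]
print(line_of_sight(tower, gate, aircraft))
```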

Original Project Page: Detroit Midfield Terminal

Virtual Football Trainer

Imagine watching the U-M football team play in a sold-out Michigan Stadium on a sunny Saturday afternoon. But instead of cheering from the bleachers, you are transported right down to the field, next to the quarterback. You are in the middle of the action. You can move to any position and experience the game from a player’s point of view. You feel like a participant, no longer a spectator.

The technology of immersive virtual reality makes this scenario possible. You are fully surrounded by the virtual players on the field, presented at full scale and in stereo; it seems as if you could touch them. You can look and walk around, hover over the quarterback, or even fly to cover distances quickly.

The simulation of football plays in immersive virtual reality has a most promising application in training football players for specific aspects of a game. The University of Michigan Virtual Reality Laboratory has developed the concept for such a “Virtual Football Trainer” and implemented a demo version that illustrates its potential in an already convincing way.

The original idea for the Virtual Football Trainer was inspired by Lloyd Carr, head football coach at the University of Michigan. Generous funding for the development of the system came from the Graham Foundation, with equipment support from Silicon Graphics Inc. The Michigan football staff provided continuous guidance and valuable expertise for the design and implementation of the Virtual Football Trainer.

Original Project Page: Virtual Football Trainer

Medical Readiness Trainer

This ongoing project is an interdisciplinary effort at the University of Michigan involving the Medical Center, the Department of Emergency Medicine, the Digital Media Commons (formerly the Media Union), and the Virtual Reality Laboratory at the College of Engineering. The objective is the development of a “Virtual Reality-Enhanced Medical Readiness Trainer” (MRT) that integrates advanced technologies such as human patient simulators, the immersive virtual reality MIDEN, next-generation Internet technology, and virtual video conferencing within distributed, shared virtual environments for training emergency personnel in a variety of common and extreme situations.

One example of this collaboration is the design and organization of an Operating Room (OR) in fully immersive virtual reality. The application allows a physician or hospital administrator to configure a virtual operating room for optimal efficiency and safety.

Another effort originating from this collaboration is the sick bay application for the Virtual Reality MIDEN, which places the individual in a sick bay on turbulent waters. Peripheral vision plays a large role in orientation and balance, and performing medical procedures while the room appears to move and shift around you is a difficult task best prepared for in advance. The MIDEN’s wide field of view and immersion make for an effective (and nauseating) experience well suited to training medical personnel for naval vessels.
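
The apparent motion of such a room is commonly approximated by low-frequency sinusoidal roll, pitch, and heave applied to the rendered scene each frame. The sketch below shows one such approximation; the frequencies and amplitudes are invented for illustration and are not the application's actual parameters.

```python
import math

def sea_motion(t, sea_state=1.0):
    """Roll and pitch (radians) and heave (meters) of the room at time t.

    A sum of slow sinusoids is a common stand-in for wave-driven ship
    motion; the amplitudes and frequencies here are guesses scaled by
    a sea_state factor.
    """
    roll = sea_state * 0.10 * math.sin(2 * math.pi * 0.10 * t)
    pitch = sea_state * 0.05 * math.sin(2 * math.pi * 0.08 * t + 1.3)
    heave = sea_state * 0.50 * math.sin(2 * math.pi * 0.12 * t + 0.7)
    return roll, pitch, heave

# Sample the motion over the first few seconds of a session; a renderer
# would apply these values to the virtual room every frame.
for t in (0.0, 1.0, 2.0, 3.0):
    r, p, h = sea_motion(t)
    print(f"t={t:.0f}s roll={r:+.3f} pitch={p:+.3f} heave={h:+.2f}")
```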

Original Project Page: Medical Readiness Trainer