UROP Summer Symposium 2012 (Kinect, Virtual Reality, and iOS)

Rachael Miller and Rob Soltesz presented their summer work on Kinect development, natural user interfaces, and capturing emotive qualities of users at the 2012 UROP Symposium for MSTEM. Rachael won a Blue Ribbon at the event for her poster, and the two are, to my knowledge, the first to have successfully used multiple Kinects in an immersive virtual reality space for virtual physical presence.

Rachael focused on creating a natural user interface for immersive 3D environments by combining multiple Kinects to produce a more robust skeleton. This stable and predictable skeleton allowed her to wrap virtual (invisible) objects around the user’s limbs and torso, effectively letting people interact with virtual objects without markers or special tracking devices. Beyond simple interaction with virtual objects, she then developed several gestures for navigating virtual reality.
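One simple way to fuse joint data from multiple sensors, offered here only as an illustrative sketch (not Rachael's actual implementation; the data layout and confidence values are hypothetical), is a confidence-weighted average of each joint's tracked position across sensors:

```python
def merge_joint(estimates):
    """estimates: list of ((x, y, z), confidence) pairs, one per sensor."""
    total = sum(conf for _, conf in estimates)
    if total == 0:
        return None  # no sensor tracked this joint
    # Weight each sensor's position by how confident it is in the track.
    return tuple(
        sum(pos[i] * conf for pos, conf in estimates) / total
        for i in range(3)
    )

def merge_skeleton(skeletons):
    """skeletons: list of {joint_name: ((x, y, z), confidence)} per sensor."""
    joints = set().union(*(s.keys() for s in skeletons))
    return {
        name: merge_joint([s[name] for s in skeletons if name in s])
        for name in joints
    }
```

A joint occluded from one Kinect's viewpoint is often visible to another, which is what makes the merged skeleton more stable than any single sensor's estimate.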

Rob worked with Rachael on aspects of her project but also looked into using the Kinect’s multiple microphones and internal voice recognition capabilities to extract emotive qualities from the user inside a virtual reality space.

Andrew Janke also presented, at a second UROP symposium, his work on iOS connectivity for a variety of applications. Getting data off of an iOS device is not always trivial, and formatting that data into a PDF and then sending it via email to a specific individual can be a challenge. Andrew developed a process that allows arbitrary iOS applications to send data over simple sockets to a server, which then formats the data and sends it via email. A few of our applications in development required this functionality, and it proved extremely useful.
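As a rough illustration of the receiving end of such a pipeline, the sketch below runs a minimal TCP socket server that accepts one connection, reads the full payload, and hands it to a stub formatter. The host, port, and function names are hypothetical, and the PDF/email steps are stubbed out; this is not Andrew's actual implementation, just a sketch of the socket pattern described.

```python
import socket

HOST, PORT = "127.0.0.1", 5050  # illustrative address, not the real service

def handle_payload(raw: bytes) -> str:
    # Stand-in for "format the data": in the real pipeline this is where
    # PDF generation and emailing would happen.
    text = raw.decode("utf-8")
    return f"--- report ---\n{text}\n--- end ---"

def serve_once():
    # Accept a single connection from a device and read until it closes.
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((HOST, PORT))
        srv.listen(1)
        conn, _addr = srv.accept()
        with conn:
            chunks = []
            while (chunk := conn.recv(4096)):
                chunks.append(chunk)
        return handle_payload(b"".join(chunks))
```

On the device side, the app only needs to open a plain TCP connection and write its data, which keeps the iOS code trivial and pushes the formatting and delivery work onto the server.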

All students did a great job over the summer and we’re excited to be a part of the UROP program at the University of Michigan.

Generative Components and Genetic Algorithms

Genetic algorithms aim to mimic natural selection in the design process. A set of parameters or “genes” characterize a “species” of artifact. Individuals within the species express different values for those genes. A fitness function evaluates each individual’s fitness. The algorithm works by assigning random gene values to several individuals, evaluating them, discarding the weakest ones, breeding the strongest ones by interchanging genes, and repeating for successive generations. Genetic algorithms sometimes yield surprising designs that a strictly deterministic, deductive design process might not discover.
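The loop described above can be sketched in a few lines. This toy example uses a stand-in bit-string genome and a trivial fitness function (count of 1-bits); the project itself used STAAD structural analysis as the fitness function, so all names and parameters here are purely illustrative.

```python
import random

GENES = 16        # genes per individual
POP = 20          # population size
GENERATIONS = 40

def fitness(individual):
    # Stand-in fitness function: maximize the number of 1s.
    # The real project evaluated structural performance instead.
    return sum(individual)

def breed(a, b):
    # Breed two strong parents by interchanging genes (one-point crossover).
    cut = random.randrange(1, GENES)
    child = a[:cut] + b[cut:]
    # Occasional mutation keeps diversity in the gene pool.
    if random.random() < 0.1:
        i = random.randrange(GENES)
        child[i] ^= 1
    return child

def evolve():
    # 1. Assign random gene values to several individuals.
    pop = [[random.randint(0, 1) for _ in range(GENES)] for _ in range(POP)]
    for _ in range(GENERATIONS):
        # 2. Evaluate all individuals and discard the weakest half.
        pop.sort(key=fitness, reverse=True)
        survivors = pop[:POP // 2]
        # 3. Breed the strongest; repeat for successive generations.
        pop = survivors + [breed(*random.sample(survivors, 2))
                           for _ in range(POP - len(survivors))]
    return max(pop, key=fitness)

best = evolve()
```

Because the strongest individuals survive each generation unchanged, the best fitness never decreases, while crossover and mutation keep exploring designs a purely deductive process might miss.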

This project uses Bentley Generative Components to script parametric designs for several classes of structures, including folded plates, branching columns, and geodesic domes. Bentley STAAD structural analysis serves as the fitness function.

Monica Ponce de Leon (Dean of Architecture and Urban and Regional Planning) is the principal investigator. Peter von Bülow (Associate Professor of Architecture) develops the genetic algorithms. Ted Hall worked with recent Architecture graduates Jason Dembski and Kevin Deng to script the structures and visualize them at full scale in 3D in the MIDEN.

Tech Demo – Realistic “Spooky” Cellar with Physical Interactions

Spooksville, a haunting, dimly lit basement environment, was originally designed by Andrew Hamilton, optimized and further developed by the Duderstadt Center, and brought into the MIDEN as an experiment in immersive environments. The user can walk up rickety stairs, see the cobwebbed and otherwise grimy surfaces of a basement, and knock over old paint cans, sending them tumbling down the stairs in a lifelike manner.

The real-time interaction creates the feeling of truly being immersed. Try to knock cans across the virtual floor, forgetting where the physical floor is, and you might bang the controller (now taped and re-taped); go too quickly up the stairs or step off the ledge and you might feel woozy. An earlier version featured localized spooky sounds right next to the leading viewer and floating apparitions just out of the corner of the user’s eye. Enter at your own risk.

A Ferry called “Wahoo”

A student team from Naval Architecture and Marine Engineering designed a passenger ferry for Puget Sound as their final project. The vessel, named Wahoo, is 57 meters long and 18 meters wide, seats 350 passengers, and has a top speed of 45 knots. The students modeled the ferry in Rhinoceros and worked with the Duderstadt Center to print the model in plaster for presentation purposes. They also exported VRML for visualization in the MIDEN, allowing them to explore the ferry. Although Wahoo is much larger than the MIDEN, the students were able to see it in immersive stereo at full scale, directly observing and evaluating sizes and clearances.

The engine room was an especially detailed design. The students obtained a real marine engine model from MTU Detroit Diesel (in STP format) and placed three instances of it in their vessel.

Pisidian Antioch

From January 13 to February 24, 2006 at the Duderstadt Center on the University of Michigan North Campus, the Kelsey Museum mounted an exhibition on the Roman site of Antioch of Pisidia in Asia Minor (Turkey)—a Hellenistic city refounded by Augustus in 25 BC as a Roman colony. Located along a strategic overland artery between Syria and the western coast of Asia Minor, Pisidian Antioch served Rome’s military needs but also presented a striking symbol, from the Roman perspective, of the benefits that Roman civilization provided to local populations. The city is best known to the modern world as a destination on the first missionary journey of St. Paul and Barnabas in the 1st century AD, recounted in the Book of Acts.

Held in the Duderstadt Center Gallery, the exhibition featured a physical model created with the Duderstadt Center’s rapid prototyping services. Digital reconstructions of the buildings and topography, created by internal staff working with talented students associated with the project, were displayed in the virtual reality MIDEN, conveying a sense of the original monumentality of the site and the character of its setting.

Back Pain Is Not A Game (Except When It Is)

BackQuack, a recently released video game, lets gamers play the best (or worst) doctor. By joining a “good” or “evil” clinic, players can win points and accolades, or prison time. The player can fill the role of doctor or patient, learning about our health care system along the way. Players in the patient role can even enter their own information to learn more about back pain specific to them. Developed by the Duderstadt Center with Dr. Andrew Haig and funding from the Center for Healthcare Research and Transformation, the game is part of a multimedia package that includes pamphlets, books, and events, all intended to teach people the real causes of back pain and the best treatment practices.

Virtual Prototyping of Classrooms – Business School

Designing architectural spaces presents unique challenges, especially when those spaces must also serve specific functions. The Ross School of Business recently constructed a new building intended to meet the needs of the school’s faculty and students. The plans included new U-shaped classrooms. Because the design was unlike anything used in the past, and its effectiveness for daily classes was in question, the School of Business planned to construct test sites so faculty could experience the rooms before they were built. These test sites would have been akin to movie sets, costing hundreds of thousands of dollars, and if changes were needed, a site would have to be rebuilt to the new plans.

Dean Graham Mercer approached the University of Michigan Duderstadt Center looking for a more cost-effective way to identify problems in the design early on. Using the virtual reality MIDEN, which has the distinct ability to display virtual worlds at true 1-to-1 scale, faculty from the School of Business were able to experience the proposed classrooms prior to physical construction and offer suggestions with confidence. This process cost the school a fraction of the price of physical test sites and allowed rapid turnaround on any changes they needed.

The new classrooms can now be seen in the Ross School of Business on Central Campus.

Detroit Midfield Terminal

Photo credit: University of Michigan Virtual Reality Lab

Over several years, Northwest Airlines designed and built a spectacular, state-of-the-art terminal at Wayne County’s Detroit Metropolitan Airport. The project included the construction of a new international/domestic terminal (the ‘Midfield Terminal’) with 97 gates, airfield connections via aprons and taxiways, a large parking structure with 11,500 spaces, a multi-level system of access roads to the new terminal, and a power plant. The 1.2-billion-dollar expansion opened on February 24, 2002. The terminal was named after Wayne County Executive Edward McNamara.

In cooperation with Northwest Airlines, the Virtual Reality Laboratory (VRL) at the College of Engineering at the University of Michigan developed a virtual model for the Detroit Midfield Terminal Project to assist in design evaluation and to support a complex decision making process.

During the design phase, a three-dimensional computer model was developed at the University of Michigan and continuously updated as the design progressed. Once the terminal existed digitally, functionality was added that allowed Northwest Airlines to test lines of sight for proposed control towers, obstruction and planting strategies for nearby trees, and traffic patterns for visitors to the terminal.
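At its core, a line-of-sight test of the kind described reduces to a segment-versus-obstruction intersection check. The sketch below uses the standard slab method against axis-aligned boxes; the geometry, coordinates, and names are purely illustrative and unrelated to the actual terminal model.

```python
def ray_hits_box(origin, target, box_min, box_max):
    """Slab test: does the segment origin->target pass through the box?"""
    tmin, tmax = 0.0, 1.0  # parametric bounds of the segment
    for i in range(3):
        d = target[i] - origin[i]
        if abs(d) < 1e-12:
            # Segment is parallel to this slab: must start inside it.
            if not (box_min[i] <= origin[i] <= box_max[i]):
                return False
        else:
            t1 = (box_min[i] - origin[i]) / d
            t2 = (box_max[i] - origin[i]) / d
            lo, hi = min(t1, t2), max(t1, t2)
            tmin, tmax = max(tmin, lo), min(tmax, hi)
            if tmin > tmax:  # slabs no longer overlap: no intersection
                return False
    return True

def has_line_of_sight(viewpoint, target, obstructions):
    # Visible only if no obstruction box blocks the sight line.
    return not any(ray_hits_box(viewpoint, target, lo, hi)
                   for lo, hi in obstructions)
```

Sweeping such a test over gate positions and apron points from a candidate tower viewpoint gives a quick map of blind spots before anything is built.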

Original Project Page: Detroit Midfield Terminal

Virtual Football Trainer

Imagine watching the U-M football team play in a sold-out Michigan Stadium on a sunny Saturday afternoon. But instead of cheering from the bleachers, you are transported right down to the field, next to the quarterback. You are in the middle of the action. You can move to any position and experience the game from a player’s point of view. You feel like a participant, no longer a spectator.

The technology of immersive virtual reality makes this amazing scenario possible. Fully surrounded by the virtual players on the field, you see them at full scale and in stereo; it seems you could touch them. You can look and walk around, hover over the quarterback, or even fly to cover distances quickly.

Simulating football plays in immersive virtual reality has a most promising application: training football players for specific aspects of a game. The University of Michigan Virtual Reality Laboratory developed the concept for such a “Virtual Football Trainer” and implemented a demo version that convincingly illustrates its potential.

The original idea for the Virtual Football Trainer was inspired by Lloyd Carr, head football coach at the University of Michigan. Generous funding for the development of the system came from the Graham Foundation, with equipment support from Silicon Graphics Inc. The Michigan football staff provided continuous guidance and valuable expertise for the design and implementation of the Virtual Football Trainer.

Original Project Page: Virtual Football Trainer