Wayfinding in Assisted Living Homes

Rebecca Davis, professor and researcher at Grand Valley State University, received a research grant from the National Institutes of Health to study how patients with Alzheimer's disease navigate their living space. Assisted living homes can be drab and nondescript, with long hallways that add to the confusion and frustration of the people living in them. To investigate this problem and possible solutions, Davis recruited 40 people in the early stages of Alzheimer's and 40 without the disease to virtually walk through a simulation of an actual assisted living home in the MIDEN. Staff and students at the Duderstadt Center modeled a 3D environment that re-creates details such as the complicated lighting and maze-like hallways, creating a natural and immersive experience. This allows users to experience firsthand how color schemes, lighting, and wall detail can affect life in the home. Various “visual cues” are placed at key locations throughout the space to determine whether they help subjects remember which paths lead to their destinations. Davis currently uses two environments in her study, one with visual cues and one without. Subjects are shown the path they must take to reach a destination and are then given an opportunity to travel there themselves, if they can remember how.

PictureIt: Epistles of Paul Released on iTunes

Get the App!

Welcome to the world of second century C.E. Egypt. This app will allow you to leaf through pages of the world’s oldest existing manuscript of the letters of St. Paul (P.Mich.inv.6238, also known in NT scholarship as P46). Thirty leaves of this manuscript, written in about 200 C.E., were found in Egypt and purchased by the University of Michigan Papyrology Collection in 1931 and 1933 (another 56 leaves, not included in this app, are housed in the Chester Beatty Library, Dublin; 18 leaves are missing completely).

The app will give you a feel for what it was like to read an ancient Greek book on papyrus, where the text is written without word division, punctuation, headings, or chapter and verse numbers. To aid readers without knowledge of ancient Greek, a translation mode gives a literal translation of the Greek text preserved on these pages (with the addition of chapter and verse numbers), along with explanatory notes showing where this text differs from the standard text.

UROP Summer Symposium 2012 (Kinect, Virtual Reality, and iOS)

Rachael Miller and Rob Soltesz presented their summer work on Kinect development, natural user interfaces, and capturing emotive qualities of users at the 2012 UROP Symposium for MSTEM. Rachael won a Blue Ribbon at the event for her poster, and the two are, as far as I know, the first to have successfully used multiple Kinects in an immersive virtual reality space for virtual physical presence.

Rachael focused on creating a natural user interface for immersive 3D environments by combining multiple Kinects into a more robust skeleton. This stable and predictable skeleton allowed her to wrap virtual (invisible) objects around the user's limbs and torso, effectively letting people interact with virtual objects without markers or special tracking devices. Beyond simple interaction with virtual objects, she developed several gestures for navigation in virtual reality.
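
The post doesn't include the actual fusion code, but a minimal sketch of the idea might look like the following, assuming each Kinect reports named joints as 3D points with a confidence value, already transformed into a shared world coordinate frame (the function names and data layout here are hypothetical):

```python
# Hypothetical sketch: merge skeletons reported by several Kinect sensors
# into one more stable skeleton by averaging joints, weighted by each
# sensor's per-joint tracking confidence.

from typing import Dict, List, Tuple

# A skeleton is a mapping of joint name -> (x, y, z, confidence),
# already expressed in a common world coordinate frame.
Skeleton = Dict[str, Tuple[float, float, float, float]]

def fuse_skeletons(skeletons: List[Skeleton]) -> Dict[str, Tuple[float, float, float]]:
    """Average each joint across sensors, weighted by confidence."""
    fused = {}
    joint_names = set()
    for s in skeletons:
        joint_names.update(s.keys())
    for name in joint_names:
        wx = wy = wz = total = 0.0
        for s in skeletons:
            if name in s:
                x, y, z, conf = s[name]
                wx += conf * x
                wy += conf * y
                wz += conf * z
                total += conf
        if total > 0:
            fused[name] = (wx / total, wy / total, wz / total)
    return fused

# Example: two sensors see the right hand from slightly different angles.
kinect_a = {"hand_right": (0.52, 1.10, 0.30, 0.9)}
kinect_b = {"hand_right": (0.50, 1.12, 0.28, 0.6)}
print(fuse_skeletons([kinect_a, kinect_b]))
```

Weighting by confidence lets a sensor with a clear view of a limb dominate when another sensor's view is occluded, which is one reason multiple Kinects can yield a more robust skeleton than a single one.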

Rob worked with Rachael on aspects of her project but also looked into using the Kinect’s multiple microphones and internal voice recognition capabilities to extract emotive qualities from the user inside a virtual reality space.

Andrew Janke also presented at a second UROP symposium on his work on iOS connectivity for a variety of applications. Getting data off an iOS device is not always trivial. Formatting that data into a PDF and then sending it via email to a specific individual can be a challenge. Andrew developed a process that allows arbitrary iOS applications to send data over simple sockets, after which it can be formatted and sent via email. This functionality was required by a few of our applications in development and proved to be extremely useful.
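
The post doesn't show Andrew's implementation, so the following is only a sketch of what the receiving side of such a process might look like, assuming the iOS app writes a small JSON payload over a plain TCP socket; the host, port, and addresses are placeholders, and the PDF-formatting step is omitted:

```python
# Hypothetical sketch of the desktop side: accept a JSON payload over a
# plain TCP socket, then forward it by email. The PDF-formatting step is
# omitted; the host, port, and addresses are placeholders.

import json
import socket
import smtplib
from email.message import EmailMessage

HOST, PORT = "0.0.0.0", 9000          # listen on all interfaces
SMTP_SERVER = "smtp.example.edu"      # placeholder mail server
RECIPIENT = "researcher@example.edu"  # placeholder recipient

def receive_payload(conn: socket.socket) -> dict:
    """Read until the client closes the connection, then parse JSON."""
    chunks = []
    while True:
        data = conn.recv(4096)
        if not data:
            break
        chunks.append(data)
    return json.loads(b"".join(chunks).decode("utf-8"))

def email_payload(payload: dict) -> None:
    """Send the payload as a plain-text email."""
    msg = EmailMessage()
    msg["Subject"] = "Data export from iOS app"
    msg["From"] = "app@example.edu"
    msg["To"] = RECIPIENT
    msg.set_content(json.dumps(payload, indent=2))
    with smtplib.SMTP(SMTP_SERVER) as server:
        server.send_message(msg)

# Accept one connection, read its payload, and mail it on.
with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
    srv.bind((HOST, PORT))
    srv.listen(1)
    conn, addr = srv.accept()
    with conn:
        email_payload(receive_payload(conn))
```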

All students did a great job over the summer and we’re excited to be a part of the UROP program at the University of Michigan.

Kinect in Virtual Reality – M.I.D.E.N. Test

The Kinect exploded onto the gaming and natural user interface scene. People had it hacked within a few days, and a collective desire to see how a depth-sensing camera could be used was born. Caught up in the same energy, the Duderstadt Center started playing with the hacks coming out and seeing how they could be used with other technology. After some initial tests, and the release of the official SDK from Microsoft, we dove into deeper development with the device.

In an effort to improve interactivity in the MIDEN, the Kinect has been applied as a way of representing the physical body in a virtual space. By analyzing the data received from the Kinect, the Duderstadt Center’s rendering engine can create a digital model of the body. This body represents an avatar that corresponds to the user’s location in space, allowing them to interact with virtual objects. Because the MIDEN offers the user perspective and depth perception, interaction feels more natural than maneuvering an avatar on a screen; the user can reach out and directly “touch” objects.
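
One common step in this kind of pipeline is back-projecting the depth image into a 3D point cloud that a rendering engine can mesh or test collisions against. The sketch below illustrates that step under a simple pinhole camera model; the focal lengths and image size are illustrative assumptions, not the values actually used with the Kinect or in the MIDEN:

```python
# Hypothetical sketch: back-project a depth image into a 3D point cloud
# using a pinhole camera model. Intrinsics below are illustrative only.

import numpy as np

def depth_to_points(depth_m: np.ndarray, fx: float, fy: float,
                    cx: float, cy: float) -> np.ndarray:
    """Convert an HxW depth image (meters) to an Nx3 array of 3D points."""
    h, w = depth_m.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth_m
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    points = np.stack([x, y, z], axis=-1).reshape(-1, 3)
    return points[points[:, 2] > 0]          # drop invalid (zero) depths

# Illustrative intrinsics for a 640x480 depth sensor.
fake_depth = np.full((480, 640), 2.0)         # a flat wall 2 m away
cloud = depth_to_points(fake_depth, fx=580.0, fy=580.0, cx=320.0, cy=240.0)
print(cloud.shape)                            # (307200, 3)
```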

Migraine Brain – Quick Mapping of Brain Data

Through some innovative work done by Dr. Alexandre DaSilva and his team in the School of Dentistry, the Duderstadt Center was presented with some exciting new data related to migraines and their effect on the brain. We had to quickly turn the data into an image suitable for a pending journal submission. While we can't go into details at this time about the research being done, we created a quick model of the data and brought it into the MIDEN for further exploration. The model was created by taking cross-sections of the MRI dataset and projecting those onto the surface of a brain mesh. The resulting model and textures were then exported for display in the MIDEN.
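
The post doesn't detail the projection, but one way to sketch the general idea is to sample the MRI volume at each vertex of the brain mesh and use those values to drive vertex colors or a texture; the array shapes and the world-to-voxel transform below are assumptions made for illustration only:

```python
# Hypothetical sketch: color a brain mesh by sampling an MRI volume at each
# vertex. The affine transform from world space to voxel indices is assumed.

import numpy as np

def sample_volume_at_vertices(volume: np.ndarray,
                              world_to_voxel: np.ndarray,
                              vertices: np.ndarray) -> np.ndarray:
    """Nearest-neighbor sample of a 3D volume at Nx3 world-space vertices."""
    homo = np.hstack([vertices, np.ones((len(vertices), 1))])   # Nx4
    idx = (homo @ world_to_voxel.T)[:, :3]                      # Nx3 voxel coords
    idx = np.rint(idx).astype(int)
    idx = np.clip(idx, 0, np.array(volume.shape) - 1)           # stay in bounds
    return volume[idx[:, 0], idx[:, 1], idx[:, 2]]

# Toy data: a 64^3 volume and three mesh vertices (identity world->voxel).
mri = np.random.rand(64, 64, 64)
verts = np.array([[10.2, 20.7, 30.1], [5.0, 5.0, 5.0], [63.9, 0.0, 0.0]])
values = sample_volume_at_vertices(mri, np.eye(4), verts)
# Map the sampled scalars to grayscale vertex colors for the exported mesh.
colors = np.stack([values, values, values], axis=-1)
print(colors.shape)   # (3, 3)
```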

Generative Components and Genetic Algorithms

Genetic algorithms aim to mimic natural selection in the design process. A set of parameters, or “genes,” characterizes a “species” of artifact. Individuals within the species express different values for those genes. A fitness function evaluates each individual's health. The algorithm works by assigning random gene values to several individuals, evaluating them, discarding the weakest ones, breeding the strongest ones by interchanging genes, and repeating for successive generations. Genetic algorithms sometimes yield surprising designs that a strictly deductive, deterministic design process might not discover.
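
As a concrete illustration of that loop, here is a minimal genetic algorithm sketch; the toy fitness function merely stands in for the structural analysis (which in this project is performed by STAAD, as noted below), and all parameter values are arbitrary:

```python
# Minimal genetic algorithm sketch. The toy fitness function below stands in
# for the structural analysis; in the actual project, Bentley STAAD plays
# that role.

import random

GENES_PER_INDIVIDUAL = 6      # e.g. span, depth, panel counts, angles...
POPULATION_SIZE = 20
GENERATIONS = 50
SURVIVOR_FRACTION = 0.5
MUTATION_RATE = 0.1

def random_individual():
    return [random.uniform(0.0, 1.0) for _ in range(GENES_PER_INDIVIDUAL)]

def fitness(genes):
    """Toy stand-in: prefer individuals whose genes are close to 0.5."""
    return -sum((g - 0.5) ** 2 for g in genes)

def crossover(parent_a, parent_b):
    """Breed two parents by interchanging genes at a random cut point."""
    cut = random.randrange(1, GENES_PER_INDIVIDUAL)
    return parent_a[:cut] + parent_b[cut:]

def mutate(genes):
    """Occasionally replace a gene with a fresh random value."""
    return [random.uniform(0.0, 1.0) if random.random() < MUTATION_RATE else g
            for g in genes]

population = [random_individual() for _ in range(POPULATION_SIZE)]
for generation in range(GENERATIONS):
    # Evaluate everyone and keep only the strongest individuals.
    population.sort(key=fitness, reverse=True)
    survivors = population[: int(POPULATION_SIZE * SURVIVOR_FRACTION)]
    # Breed replacements from random pairs of survivors.
    children = []
    while len(survivors) + len(children) < POPULATION_SIZE:
        a, b = random.sample(survivors, 2)
        children.append(mutate(crossover(a, b)))
    population = survivors + children

print("best fitness:", fitness(max(population, key=fitness)))
```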

This project uses Bentley Generative Components to script parametric designs for several classes of structures, including folded plates, branching columns, and geodesic domes. Bentley STAAD structural analysis serves as the fitness function.

Monica Ponce de Leon (Dean of Architecture and Urban and Regional Planning) is the principal investigator. Peter von Bülow (Associate Professor of Architecture) develops the genetic algorithms. Ted Hall worked with recent Architecture graduates Jason Dembski and Kevin Deng to script the structures and visualize them at full scale in 3D in the MIDEN.

SCI-Hard Mobile Game

Those with spinal cord injuries (SCI), often males ages 15 to 25, encounter a drastically different world when they are released from the hospital. With varying degrees of disability, mobility, and function, the world around them becomes a collection of physical and mental challenges, a complete departure from their previous lifestyles. Whether they use crutches or manual or powered wheelchairs, they need to learn mobility, scheduling, and social tasks all over again. Stairs may now be an insurmountable obstacle. The individual may receive glaring looks from others on the street or be taunted by children. Daily activities often revolve around the scheduling of their colostomy bag. The list goes on.

This project began as the conceptualization of several ideas for a complete “manual” to be used by health care professionals working with individuals with SCI. It has since grown into a larger development effort, recently funded by the U.S. Department of Education. This extension of the project involves the development of a game that teaches those with SCI the skills they now need to learn in a fun, edgy way. Tasks such as scheduling, mobility, and social interaction all become elements of the game as players build up their character's abilities and unlock new locations and mini-games.

Tech Demo – Realistic “Spooky” Cellar with Physical Interactions

Spooksville, a haunting and dimly lit basement environment, was originally designed by Andrew Hamilton, then optimized and further developed by the Duderstadt Center and brought into the MIDEN as an experiment in immersive environments. The user can walk up rickety stairs, see the cobwebbed and otherwise grimy surfaces of the basement, and knock over old paint cans, sending them tumbling down the stairs in a lifelike manner.

The real-time interaction creates the feeling of truly being immersed: try to knock cans across the virtual floor and, forgetting where the physical floor is, you might bang the controller (now taped and re-taped); climb the stairs too quickly or step off the ledge and you might feel woozy. An earlier version featured localized spooky sounds right next to the leading viewer and floating apparitions just out of the corner of the user's eye. Enter at your own risk.

A Ferry called “Wahoo”

A student team from Naval Architecture and Marine Engineering designed a passenger ferry for Puget Sound as their final project. The vessel, named Wahoo, is 57 meters long and 18 meters wide, seats 350 passengers, and has a top speed of 45 knots. The students modeled the ferry in Rhinoceros and worked with the Duderstadt Center to print the model in plaster for presentation purposes. They also exported VRML for visualization in the MIDEN, allowing them to explore the ferry. Although the Wahoo is much larger than the MIDEN, the students were able to see it in immersive stereo at full scale, allowing them to directly observe and evaluate sizes and clearances.

The engine room was an especially detailed part of the design. The students obtained a real marine engine model from MTU Detroit Diesel (in STEP format) and placed three instances of it in their vessel.