Multi-Sensing the Universe

Envisioning a Toroidal Universe

Robert Alexander teamed with Danielle Battaglia, a senior in Art & Design, to compose and integrate audio effects into her conceptual formal model of the Toroidal Universe. Danielle combined Plato's notion of the universe as a dodecahedron with modern notions of black holes, wormholes, and child universes. Their multi-sensory multiverse came together in the MIDEN and was exhibited there as part of the Art & Design senior integrative art exhibition.

Interested in using the MIDEN to do something similar? Contact us.

Leisure & Luxury in the Age of Nero: VR Exhibit for the Kelsey Museum

The Kelsey Museum's most ambitious exhibition to date, Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii, features over 230 artifacts from the ancient world. These artifacts originate from the ancient villa of Oplontis, a site near Pompeii that was destroyed when Mt. Vesuvius erupted.

Real world location of the ancient villa of Oplontis

The traveling exhibit explores the lavish lifestyle and economic interests of ancient Rome's wealthiest. The site is still being excavated and is off limits to the general public, but as part of the Kelsey's exhibit, visitors can experience the location through virtual reality headsets like the Oculus Rift and on tablet devices.

The Duderstadt Center worked closely with curator Elaine Gazda as well as the Oplontis Project team from the University of Texas to optimize a virtual re-creation for the Oculus Rift and MIDEN and to generate panoramic viewers for tablet devices. The virtual environment maps high-resolution photos and scan data captured on location onto the surfaces of 3D models, giving a convincing sense of standing at the real-world site.

Visitors to the Kelsey can navigate Oplontis in virtual reality through the use of an Oculus Rift headset, or through panoramas presented on tablet devices.

Visitors to the Kelsey can traverse this re-creation using the Rift, or they can travel to the Duderstadt Center to experience it in the MIDEN. Viewers can not only experience the villa as it appears in the modern day; they can also toggle on an artist's re-creation of what the villa would have looked like before its destruction. In the re-created version of the scene, the ornate murals covering the walls of the villa are restored, and foliage and ornate statues populate the scene. Alongside the virtual reality experience, the Kelsey Museum will also house a physically reconstructed replica of one of the villa's rooms as part of the exhibit.

Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii is a traveling exhibit that will also be on display at the Museum of the Rockies at Montana State University in Bozeman, and at the Smith College Museum of Art in Northampton, Massachusetts.

On Display at the Kelsey Museum, Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii

Unreal Engine in Stereoscopic Virtual Reality

Until now, the Oculus Rift has been the go-to system for gamers seeking the ultimate immersive experience, offering stereo compatibility with game engines like Unreal and Unity 3D. Recently, the Duderstadt Center pushed this experience even further, with graphics programmer Sean Petty adapting the Unreal Engine to work within the Duderstadt Center's MIDEN, a fully immersive, stereoscopic 3D virtual reality environment.

Visitors entering the MIDEN are equipped with a pair of stereo glasses and a game controller, both outfitted with reflective markers that are then tracked by a series of Vicon cameras positioned around the top perimeter of the space. The existing capabilities of the MIDEN allow viewers not only to explore a space beyond the confines of the 10'x10' room, but also to interact with objects using the provided game controller.

The services of the Duderstadt Center are open to all departments within the University, making visualization services, professional studio spaces, and exploratory technology accessible to artists, engineers, architects, and more. The diverse atmosphere of the Digital Media Commons generates a multitude of cross-curricular collaborative projects each year, from live performances featuring orchestras manipulated via brain waves to explorations of the anatomy of a digital cadaver in virtual reality.

In the past, the Duderstadt Center's MIDEN has been used to prototype architectural spaces, host artistic installations, assess human behavior, and simulate training scenarios. Incorporating the Unreal Engine into a space like the MIDEN allows visitors to experience a level of realism never before achieved in this sort of environment, opening new doors not just for gamers, but for those seeking high-quality visualizations for research and exploration. Unreal Engine brings a wide range of materials and visual effects to any scene, from realistic water, foliage, and glass to effects like fire and transitions in the time of day.

Sean Petty, graphics programmer of the Duderstadt Center, explains his process for getting Unreal to operate from within the MIDEN:

The MIDEN requires us to render a different view of the scene to each of the four walls from the perspective of the user. In order to achieve this we must track the location and orientation of the user's eyes, which is accomplished by motion tracking a pair of glasses worn by the user. In the MIDEN, a dedicated computer performs the necessary calculations, so the first step to enabling MIDEN support in Unreal is to modify the engine to interface with this computer.
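
As a rough illustration of that interface, the sketch below assumes the tracking computer streams head poses over UDP in a simple fixed packet layout; the actual MIDEN/Vicon protocol, packet format, and port are not described in the article and are hypothetical here:

```cpp
// Minimal sketch: receive head poses streamed by the tracking PC over UDP.
// The packet layout (seven floats: position + quaternion) and the port
// number are hypothetical, for illustration only.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>

struct HeadPose {
    float pos[3];    // eye midpoint in room coordinates
    float quat[4];   // orientation quaternion (x, y, z, w)
};

int main()
{
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in addr{};
    addr.sin_family      = AF_INET;
    addr.sin_port        = htons(5005);        // hypothetical port
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    HeadPose pose{};
    while (true) {
        // One datagram per tracker sample; each pose would feed the renderer.
        if (recv(sock, &pose, sizeof(pose), 0) == ssize_t(sizeof(pose)))
            std::printf("head at (%.2f, %.2f, %.2f)\n",
                        pose.pos[0], pose.pos[1], pose.pos[2]);
    }
    close(sock);
}
```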

Visitors to the MIDEN are motion tracked within the space via reflective markers placed around a pair of stereo glasses and a hand held game controller. These markers are monitored by eight Vicon cameras located along the perimeter of the MIDEN.

Once the location of the user has been determined, we must project the user's view onto each of the four walls. When rendering a scene in a standard desktop environment, the camera is positioned at the center of the screen. A centered camera only requires a symmetric frustum projection, which is the native transformation supported by Unreal. In the MIDEN, the eye point may be anywhere within the space and will often not be centered on a screen. This requires an asymmetric frustum projection, functionality that had to be added to the engine.
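
The core of an off-axis projection is computing per-wall frustum bounds from the tracked eye position. A minimal sketch, assuming a wall-local coordinate frame with the screen on the plane z = 0 and the eye at positive z (the struct and function names are illustrative, not Unreal's API):

```cpp
// Sketch of an asymmetric (off-axis) frustum for one MIDEN wall, in a
// wall-local frame where the screen lies on the plane z = 0 spanning
// [x0, x1] x [y0, y1] and the tracked eye sits at (ex, ey, ez), ez > 0.
// These are the same six parameters glFrustum takes.
struct Frustum { float left, right, bottom, top, zNear, zFar; };

Frustum offAxisFrustum(float ex, float ey, float ez,
                       float x0, float x1, float y0, float y1,
                       float zNear, float zFar)
{
    // Scale the screen edges (relative to the eye) back to the near plane.
    // When the eye is not centered on the wall, left != -right and
    // bottom != -top: exactly the asymmetric case a symmetric
    // projection cannot express.
    const float s = zNear / ez;
    return { (x0 - ex) * s, (x1 - ex) * s,
             (y0 - ey) * s, (y1 - ey) * s,
             zNear, zFar };
}
```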

Images for each wall are projected by a corresponding projector located behind the walls of the MIDEN. The floor image is projected via a mirror located at the top of the space.

Unreal has native support for stereo by rendering the left and right views side by side in a single image. This setup is used for devices such as the Oculus Rift, where the images for the left and right eyes are displayed at the same time. The MIDEN instead uses a technology called "active stereo," in which the displayed image flickers rapidly back and forth between the left and right views. This required a modification to the engine so that the left and right views are rendered to two separate buffers rather than to two halves of a single image.
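
Outside of Unreal, the same two-buffer idea is easiest to see in raw OpenGL, where a quad-buffered ("active stereo") context exposes separate left and right back buffers. The sketch below illustrates the general technique, not the actual engine modification; the scene and present calls are placeholders:

```cpp
// Quad-buffered active stereo in raw OpenGL. Assumes a stereo-capable
// pixel format was requested when the context was created.
#include <GL/gl.h>

struct ViewMatrix;                      // placeholder for the app's camera type
void renderScene(const ViewMatrix&);    // placeholder: draws one eye's view
void swapBuffers();                     // placeholder: platform present call

void drawStereoFrame(const ViewMatrix& leftEye, const ViewMatrix& rightEye)
{
    // Left eye goes to the dedicated left back buffer...
    glDrawBuffer(GL_BACK_LEFT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    renderScene(leftEye);

    // ...and the right eye to the right back buffer. After presentation,
    // the driver and shutter glasses alternate between the two in sync.
    glDrawBuffer(GL_BACK_RIGHT);
    glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
    renderScene(rightEye);

    swapBuffers();
}
```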

Unreal Engine as seen from within the Duderstadt Center's virtual reality MIDEN. The MIDEN is a 10'x10' room composed of 5 walls utilizing stereoscopic projection. Visitors are tracked using Vicon cameras, allowing them to travel beyond the confines of the physical space.

The final step for displaying Unreal scenes in the MIDEN is to get the four rendering computers communicating with each other. This ensures that when the user moves, all the screens update appropriately to give a consistent view of the scene. The networking is accomplished using Unreal's built-in network replication functionality, which is designed for use in multiplayer games.
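
In Unreal, replicated properties are a natural fit for this: the machine that reads the tracker acts as the server and writes the head pose, and the engine pushes updates to the other rendering nodes. A minimal sketch under that assumption; the actor and property names are hypothetical, as the article only states that built-in replication is used:

```cpp
// Sketch: sharing the tracked head pose across the four rendering PCs
// via Unreal's replication system. Names here are illustrative.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Net/UnrealNetwork.h"
#include "MidenHeadActor.generated.h"

UCLASS()
class AMidenHeadActor : public AActor
{
    GENERATED_BODY()

public:
    AMidenHeadActor() { bReplicates = true; }

    // Written on the machine that reads the tracker (the "server");
    // the engine pushes updates to the other rendering machines.
    UPROPERTY(Replicated)
    FVector HeadPosition;

    UPROPERTY(Replicated)
    FRotator HeadOrientation;

    virtual void GetLifetimeReplicatedProps(
        TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(AMidenHeadActor, HeadPosition);
        DOREPLIFETIME(AMidenHeadActor, HeadOrientation);
    }
};
```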

With this latest development, researchers across all disciplines can now use this technology to reproduce lifelike environments for their studies, giving subjects a deeply immersive experience. The hope is that the higher level of immersion offered by the Unreal Engine will have a dramatic impact on studies involving human behavior and environmental effects.

In addition to incorporating Unreal, the MIDEN also continues to operate using an in-house engine developed by Ted Hall & Sean Petty, called “Jugular,” which provides support for a broad range of models, materials, and interactivity. While Unreal offers finer elements of photo-realism for mesh-based geometry, Jugular supports easier import of a wider range of file types from a variety of sources, including not only meshes but also solid volumes and informatics graphs.

Virtual Cadaver Featured in Proto Magazine

Proto Magazine features articles on biomedicine and health care, targeting physicians, researchers and policy makers.

Proto is a natural science magazine produced by Massachusetts General Hospital in collaboration with Time Inc. Content Solutions. Launched in 2005, the magazine covers topics in biomedicine and health care, targeting physicians, researchers, and policy makers. In June, Proto featured an article, "Mortal Remains," that discusses alternatives to using real cadavers in the study of medicine.

Preserving human remains for use as a cadaver over a school semester carries tremendous costs. The article in Proto discusses options for revolutionizing this area of study, from older techniques like 17th-century anatomically correct wax models and plastination (the process of removing a body's fluids and replacing them with a polymer) to new technology built on the Visible Human data, with a specific mention of the Duderstadt Center's Virtual Cadaver.

To learn more, the full article from Proto Magazine can be found here.

Sean Petty manipulates cross-sections of the Virtual Cadaver from within the 3D Lab’s virtual reality environment, the MIDEN.

Architectural Visualization of Renovated Space for the Department of Pathology

The University of Michigan Health System, Department of Pathology has recently started making preparations to move to the North Campus Research Complex.  Previously, the Department of Pathology had labs dispersed around the campus.  Now there is a proposed $160 million effort to centralize the labs of the Department of Pathology and other health system branches in a space that will be more flexible and adept at accommodating future research and developments in technology.

Tsoi/Kobus and Associates, an architecture firm based in Cambridge, Massachusetts, was chosen to design the new labs. The firm specializes in architecture and interior design for technology and science, university, and healthcare projects.

The Duderstadt Center played host to the design review led by Christine Baker of the UMH Facilities Projects and Corrie Pennington-Block of CW Mott Administration. The Department of Pathology staff were invited and asked to give feedback on the designs. These meetings continued for a week, April 20-24, 2015, with various participating sub-groups. Sessions comprised an introduction and orientation to the design using standard hard-copy architectural floor plans, with computer-generated walk-through videos and Google Earth maps shown on the large Tiled Display to assist in visualizing the space.

Pathology staff viewing Google Earth layout of the proposed site.

The designers wanted the staff's opinion on questions of space utilization and adjacencies. To assist in visualizing this, the designs were loaded into the MIDEN as FBX files, a format exported from Autodesk Revit. Through the use of the MIDEN, Pathology staff could walk through a full-scale virtual replication of the architects' floor plan, allowing participants to explore the proposed layout and give more in-depth feedback. Based in part on that feedback, the architects revised the design and ran another review session on May 14.

Experiencing the new labs in the MIDEN

Click here to read The Ann Arbor News article about the proposed lab space.

GIS Data As Seen In The (Immersive) Environment

Viewing Vector Data of Ann Arbor in the MIDEN.

Geographical Information Systems (GIS) are used for mapping and analysis of data pertaining to geographic locations. The location data may consist of vectors, rasters, or points.

Vector data are typically used to represent boundaries of discrete political entities, zoning, or land use categories.

Raster data are often used to represent geographic properties that vary continuously over a 2D area, such as terrain elevation. Each raster cell represents a small rectangular finite element of information projected onto a regular 2D grid, which makes it simple to construct a triangulated mesh from such data, as the sketch below illustrates.
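
A minimal sketch of that idea, turning a row-major grid of elevations into a triangle mesh with two triangles per cell (the types and names here are illustrative, not from any particular GIS loader):

```cpp
// Triangulate a raster heightfield: `heights` holds row-major elevations
// on a cols x rows grid with `cellSize` ground spacing between samples.
#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };

struct Mesh {
    std::vector<Vec3>     vertices;
    std::vector<uint32_t> indices;   // flat triangle list
};

Mesh rasterToMesh(const std::vector<float>& heights,
                  uint32_t cols, uint32_t rows, float cellSize)
{
    Mesh m;
    m.vertices.reserve(size_t(cols) * rows);
    for (uint32_t r = 0; r < rows; ++r)
        for (uint32_t c = 0; c < cols; ++c)
            m.vertices.push_back({c * cellSize, r * cellSize,
                                  heights[size_t(r) * cols + c]});

    // Two triangles per grid cell. The grid's regularity is what makes
    // this trivial; it is also why a heightfield cannot represent the
    // concave, undercut surfaces a point cloud can.
    for (uint32_t r = 0; r + 1 < rows; ++r) {
        for (uint32_t c = 0; c + 1 < cols; ++c) {
            uint32_t i = r * cols + c;
            m.indices.insert(m.indices.end(), {i, i + 1, i + cols});
            m.indices.insert(m.indices.end(), {i + 1, i + cols + 1, i + cols});
        }
    }
    return m;
}
```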

Unstructured point clouds are often acquired by LIDAR or other scanning techniques. Dense clouds of points can create a fuzzy visual impression of 3D surfaces of terrain, vegetation, and structures. Unlike raster data, point clouds can represent concave, undercut surfaces, but it’s harder to construct a triangulated mesh from such data.

The MIDEN demo collection includes test cases for generic loaders of all three of these types. The vector tests include boundaries of roads and wooded areas in Ann Arbor projected onto a 2D map, and national boundaries projected onto the surface of a globe. The raster test is a 3D terrain mesh for a section of the Grand Canyon. The point test is a LIDAR scan of a fault line in the Grand Tetons.

Exploring the globe with GIS data.

Massive Lighting in Sponza to Develop Global Illumination

A person in the MIDEN exploring Sponza.

Real light is a complicated phenomenon that not only acts upon objects but interacts with them: light bounces from one object to another, so the entire scene is implicated. In graphical applications, however, usually only one surface is lit at a time, without taking the other objects in the scene into consideration. Ray tracing is sometimes used in graphics to generate realistic lighting effects by tracing the path of light through a scene and the objects it encounters. While this produces accurate and realistic lighting, the technique is so slow that it is not practical for real-time applications like video games or simulations.

To create real-looking lighting effects in real time, graphics engineer Sean Petty and staff at the Duderstadt Center have been experimenting with a publicly available and commonly used scene called Sponza to develop global illumination techniques. The Sponza Atrium is a model of an actual building in Croatia with dramatic lighting, and experimenting in it has helped the lab develop more realistic global illumination. Spherical harmonic (SH) lighting produces a realistic rendering by using volumes to approximate how light should behave. While this method is not perfectly accurate in the way ray tracing is, algorithms determine which rays intersect which objects and compute the intensity of light arriving at and emitted from each point. This information is stored in a 3D volume covering the virtual environment, and the same algorithms can then be applied to other scenes. Realistic lighting is vital to a user becoming psychologically immersed in a scene.
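
For a concrete sense of how compact this representation is, here is a sketch of evaluating lighting stored as nine spherical-harmonic coefficients, the standard order-2 basis commonly used for light probes; this shows the general technique, not necessarily the lab's exact implementation:

```cpp
// Evaluate lighting stored as nine SH coefficients in the direction of a
// normalized surface normal (x, y, z). Constants are the standard
// real spherical-harmonic basis factors for bands 0-2.
float shEvaluate(const float sh[9], float x, float y, float z)
{
    const float b[9] = {
        0.282095f,                           // Y(0, 0)
        0.488603f * y,                       // Y(1,-1)
        0.488603f * z,                       // Y(1, 0)
        0.488603f * x,                       // Y(1, 1)
        1.092548f * x * y,                   // Y(2,-2)
        1.092548f * y * z,                   // Y(2,-1)
        0.315392f * (3.0f * z * z - 1.0f),   // Y(2, 0)
        1.092548f * x * z,                   // Y(2, 1)
        0.546274f * (x * x - y * y)          // Y(2, 2)
    };
    float result = 0.0f;
    for (int i = 0; i < 9; ++i)
        result += sh[i] * b[i];  // if sh[] was pre-convolved with a cosine
                                 // lobe, this yields diffuse irradiance
    return result;
}
```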

The Sponza Atrium is a model of an actual building in Croatia.

Out of Body in Molly Dierks' Art

Molly Dierks' art installation, "Postmodern Venus"

Molly Dierks, as an MFA candidate at the Penny W. Stamps School of Art & Design, used resources at the Duderstadt Center to create an installation piece called "Postmodern Venus." Shawn O'Grady scanned her body with the HandyScan laser scanner to create a 3D model, which was then textured to look like ancient marble and presented in the MIDEN as a life-size replica of herself.

“Postmodern Venus” plays with modern conceptions of objectivity and objectification by allowing the viewer to interact with the accurately scanned body of Molly Dierks, touching and moving through it. On her website she notes, “Experience with disability fuels my work, which probes the divide between our projected selves as they relate to the trappings of our real and perceived bodies. This work questions if there is a difference between what is real with relation to our bodies and our identities, and what is constructed, reflected or projected.” To read more about this and other work, visit Molly Dierks’ website: http://www.mollyvdierks.com/#Postmodern-Venus

A Configurable iOS Controller for a Virtual Reality Environment

James examining a volumetric brain in the MIDEN with an iPod controller

Traditionally, users navigate through 3D virtual environments via game controllers; however, game controllers are littered with ambiguously labeled buttons. And while excellent for gaming, this setup makes navigating through 3D space unnecessarily complicated for the average user. James Cheng, a sophomore in Computer Science and Engineering, has been working to resolve this headache by using touch screens, such as those found in mobile devices, instead of game controllers. Using the Jugular Engine in development at the Duderstadt Center, he has been developing a scalable UI system that can be used for a wide range of immersive simulations. Want to cut through a volumetric brain? Select the "slice" button and start dragging. Want to fly through an environment instead of walking? Switch to "fly" mode and take off. The system aims to be highly configurable, since every experience is different.
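
As a purely hypothetical sketch of what such a configurable command scheme could look like, with invented mode names and message format (the article does not describe the actual protocol between the device and the engine):

```cpp
// Hypothetical: dispatch commands sent by a touch-screen controller.
// The "mode:" / "drag:" message strings are invented for illustration.
#include <cstdio>
#include <string>

enum class InteractionMode { Walk, Fly, Slice };

InteractionMode currentMode = InteractionMode::Walk;

// Called for each command string received from the mobile device.
void handleCommand(const std::string& cmd)
{
    if (cmd == "mode:walk")       currentMode = InteractionMode::Walk;
    else if (cmd == "mode:fly")   currentMode = InteractionMode::Fly;
    else if (cmd == "mode:slice") currentMode = InteractionMode::Slice;
    else if (cmd.rfind("drag:", 0) == 0 &&
             currentMode == InteractionMode::Slice) {
        // In slice mode, a drag gesture moves the cutting plane through
        // the volume (e.g. "drag:0.25" for a normalized offset).
        std::printf("move slice plane to %s\n", cmd.substr(5).c_str());
    }
}
```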

Initial development is being done on the iOS platform due to its consistent hardware and options for scalable user interfaces. James aims to make immersive experiences more intuitive and to give developers more options for communicating with the user. You can now say goodbye to memorizing what buttons "X" and "Y" do in each simulation, and instead use clearly labeled, simulation-specific buttons.

Using the MIDEN for Hospital Room Visualization

How can doctors and nurses walk around a hospital room that hasn’t been built yet? It may seem like an impossible riddle, but the Duderstadt Center is making it possible!

Working with the University of Michigan Hospital and a team of architects, healthcare professionals are able to preview full-scale re-designs of hospital rooms using the MIDEN. The MIDEN, or Michigan Immersive Digital Experience Nexus, is an advanced audio-visual system for virtual reality. It provides its users with the convincing illusion of being fully immersed in a computer-generated, three-dimensional world. This world is presented in life-size stereoscopic projections on four surfaces that together fill the visual field, along with 4.1 surround sound with attenuation and Doppler effect.

Architects and nursing staff are using the MIDEN to preview patient room upgrades in the Trauma Burn Unit of the University Hospital. Of particular interest is the placement of an adjustable wall-mounted workstation monitor and keyboard. The MIDEN offers full-scale immersive visualization of clearances and sight-lines for the workstation with respect to the walls, cabinets, and patient bed. The design is being revised based on these visualizations before any actual construction occurs, avoiding time-consuming and costly renovations later.