Massive Lighting in Sponza to Develop Global Illumination

A person in the MIDEN exploring Sponza.

Real light is a complicated phenomenon: it does not simply strike an object, it interacts with it, bouncing from one surface to the next so that the entire scene contributes to what we see. In graphical applications, however, a surface is usually lit in isolation, without taking the other objects in the scene into account. Ray tracing is sometimes used to generate realistic lighting by tracing the paths light takes through a scene and the objects it encounters. While this produces accurate, realistic results, the technique is too slow to be practical for real-time applications like video games or simulations.

To create real-time, realistic lighting, graphics engineer Sean Petty and staff at the Duderstadt Center have been experimenting with Sponza, a publicly available and widely used test scene, to develop global illumination techniques. The Sponza Atrium is a model of an actual building in Croatia with dramatic lighting. The lighting experiments in Sponza have helped the lab develop more realistic global illumination. Spherical harmonic (SH) lighting produces a convincing rendering by using a 3D volume to approximate how light behaves. While the method is not perfectly accurate in the way ray tracing is, algorithms determine which rays intersect objects and calculate the intensity of light arriving at and leaving each surface. This information is stored in the 3D volume covering the virtual environment, and the same algorithms can then be applied to other scenes. Realistic lighting is vital to a user becoming psychologically immersed in a scene.

The Sponza Atrium is a model of an actual building in Croatia.
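
As a rough sketch of how spherical harmonic projection works in general (assuming uniform directional sampling and only the first two SH bands; the names here are illustrative, not the Duderstadt Center's actual code), the light arriving at a point in the volume can be reduced to a handful of coefficients:

```cpp
#include <vector>

// One radiance sample arriving at a probe point in the lighting volume.
struct RadianceSample {
    float x, y, z;   // unit direction the light arrives from
    float intensity; // scalar radiance carried along that direction
};

// Project incoming radiance onto the first four real SH basis functions (bands 0 and 1).
// Each point in the volume stores these coefficients; shading later reconstructs an
// approximate irradiance for any surface normal from them.
void projectToSH(const std::vector<RadianceSample>& samples, float sh[4]) {
    for (int i = 0; i < 4; ++i) sh[i] = 0.0f;
    if (samples.empty()) return;
    for (const RadianceSample& s : samples) {
        sh[0] += s.intensity * 0.282095f;        // Y_0^0 (constant band)
        sh[1] += s.intensity * 0.488603f * s.y;  // Y_1^-1
        sh[2] += s.intensity * 0.488603f * s.z;  // Y_1^0
        sh[3] += s.intensity * 0.488603f * s.x;  // Y_1^1
    }
    // Monte Carlo normalization, assuming directions sampled uniformly over the sphere.
    const float norm = 4.0f * 3.1415927f / static_cast<float>(samples.size());
    for (int i = 0; i < 4; ++i) sh[i] *= norm;
}
```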

A Configurable iOS Controller for a Virtual Reality Environment

James examining a volumetric brain in the MIDEN with an iPod controller

Traditionally, users navigate 3D virtual environments with game controllers; however, game controllers are littered with ambiguously labeled buttons. While excellent for gaming, this setup makes navigating 3D space unnecessarily complicated for the average user. James Cheng, a sophomore in Computer Science and Engineering, has been working to resolve this headache by using touch screens, such as those found in mobile devices, instead of game controllers. Using the Jugular Engine in development at the Duderstadt Center, he has been building a scalable UI system that can be used for a wide range of immersive simulations. Want to cut through a volumetric brain? Select the "slice" button and start dragging. Want to fly through an environment instead of walking? Switch to "Fly" mode and take off. The system aims to be highly configurable, since every experience is different.

Initial development is being done on the iOS platform because of its consistent hardware and its options for scalable user interfaces. James aims to make immersive experiences more intuitive and to give developers more options for communicating with the user. You can now say good-bye to memorizing what the "X" and "Y" buttons do in each simulation, and instead use clearly labeled, simulation-specific buttons.
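
As a rough illustration of the kind of mode-based dispatch such a configurable touch UI implies (hypothetical names, not the actual Jugular interface), a drag gesture can simply be routed to whichever tool is currently selected:

```cpp
#include <functional>
#include <map>

// Hypothetical interaction modes, each exposed as a clearly labeled on-screen button.
enum class InteractionMode { Walk, Fly, Slice };

// Routes touch-drag gestures to the handler registered for the active mode.
class TouchController {
public:
    void setMode(InteractionMode mode) { activeMode_ = mode; }

    void registerHandler(InteractionMode mode, std::function<void(float, float)> handler) {
        handlers_[mode] = std::move(handler);
    }

    void onDrag(float dx, float dy) const {
        auto it = handlers_.find(activeMode_);
        if (it != handlers_.end()) it->second(dx, dy);
    }

private:
    InteractionMode activeMode_ = InteractionMode::Walk;
    std::map<InteractionMode, std::function<void(float, float)>> handlers_;
};
```

A "Slice" handler, for example, could translate the drag into motion of a cutting plane through the volumetric brain, while a "Fly" handler would move the viewpoint instead.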

U-M Future of Visualization Committee Issues Report

The U-M Future of Visualization Committee* issued a report early this month focusing on the role visualization plays at the University of Michigan, as well as steps for addressing growing needs on campus. The report concluded that two "visualization hubs" should be created on campus to make visualization computing services more accessible to the campus research community. "The hubs envisioned by the committee would leverage existing resources and consist of advanced workstations, high bandwidth connectivity, and collaborative learning spaces, with a support model based on that of the Duderstadt Center and Flux. The hardware and software would be configured to allow departments or individuals to purchase their own resources in a way that would reduce fragmentation and allow for efficient support, training, and maintenance." (Text courtesy of Dan Miesler and ARC)

The following excerpts from the executive summary of the report highlight the importance and educational value of visualization services:

“The University of Michigan has seen incredible growth and change over the years. The growth will continue as we innovate and adapt. How we teach, conduct research, facilitate student learning, push technological boundaries, and collaborate with our peers will create demand for new tools and infrastructure. One such need is visualization because of the imperative role it plays in facilitating innovation. When one considers the vast quantities of data currently being generated from disparate domains, methods that facilitate discovery, exploration, and integration become necessary to ensure those data are understood and effectively used.

There is a great opportunity to change the way research and education has been done but to also allow for a seamless transition between the two through advancements in connectivity, mobility, and visualization. The opportunity here is tremendous, complex, and in no way trivial. Support for a responsive and organized visualization program and its cyberinfrastructure needs is necessary to leverage the opportunities currently present at the University of Michigan.”

A full copy of the report is available here.

*The committee was created by Dan Atkins with the charge of evaluating existing visualization technologies and methods on campus; developing an action plan for addressing deficiencies in visualization needs; establishing a group of visualization leaders; and communicating with the community on visualization topics. It is composed of faculty members and staff from ARC, University Libraries, Dentistry, LSA, the Medical School, ITS, Architecture and Urban Planning, Atmospheric, Oceanic and Space Sciences, and the College of Engineering. (Text courtesy of Dan Miesler and ARC)

Using the MIDEN for Hospital Room Visualization

How can doctors and nurses walk around a hospital room that hasn’t been built yet? It may seem like an impossible riddle, but the Duderstadt Center is making it possible!

Working with the University of Michigan Hospital and a team of architects, healthcare professionals are able to preview full-scale redesigns of hospital rooms in the MIDEN. The MIDEN (Michigan Immersive Digital Experience Nexus) is the Duderstadt Center's advanced audio-visual system for virtual reality. It gives users the convincing illusion of being fully immersed in a computer-generated, three-dimensional world. This world is presented in life-size stereoscopic projections on four surfaces that together fill the visual field, along with 4.1 surround sound with distance attenuation and Doppler effect.

Architects and nursing staff are using the MIDEN to preview patient room upgrades in the Trauma Burn Unit of the University Hospital. Of particular interest is the placement of an adjustable wall-mounted workstation monitor and keyboard. The MIDEN offers full-scale immersive visualization of clearances and sight-lines for the workstation with respect to the walls, cabinets, and patient bed. The design is being revised based on these visualizations before any actual construction occurs, avoiding time-consuming and costly renovations later.

3D Painter Provides Interactive Art Experience

Painting on a 3D Canvas in the MIDEN

An application developed by the Duderstadt Center, called 3D Painter, allows users to paint in three dimensions, rotating and flipping their strokes. You can switch between the walls and the floor as your canvas, change colors, and even vary the depth of a stroke, all with a simple LED wand. 3D Painter was created to showcase the creative potential of such applications and the capabilities of the MIDEN.

Pushing Interactive Boundaries in a Tea House

A person exploring the tea house within the MIDEN.

A tea house environment was designed to explore the limits of what is possible in interactivity and textures. Moving freely around the environment, the user could lift, open, and move objects. The scene was explored in the MIDEN, with real-world physics applied to the objects.

Sailboat Environment Creates Interactions Even Pirates Would Envy

The sailor sits at the table to greet the user in this tech demo

This Sailboat environment, similar to the Tea House, demonstrates the capabilities of real-time graphics in an immersive simulation. In this environment, the user can walk around the wave-tossed boat carrying a flashlight, demonstrating fast-changing and dramatic lighting effects. The demonstration aimed to push the system to its limits with dynamic lighting, real-time global illumination, animated characters, and a fully physical environment that can be manipulated and interacted with.

Virtual Reality 3-D Brain Helps Scientists Understand Migraine Pain

Dr. Alex DaSilva. Photo credit: Scott Soderberg, Michigan Photography

From U-M News:

ANN ARBOR—Wielding a joystick and wearing special glasses, pain researcher Alexandre DaSilva rotates and slices apart a large, colorful, 3-D brain floating in space before him.

Despite the white lab coat, it appears DaSilva’s playing the world’s most advanced virtual video game.  The University of Michigan dentistry professor is actually hoping to better understand how our brains make their own pain-killing chemicals during a migraine attack.

The 3-D brain is a novel way to examine data from images taken during a patient’s actual migraine attack, says DaSilva, who heads the Headache and Orofacial Pain Effort at the U-M School of Dentistry and the Molecular and Behavioral Neuroscience Institute.

Different colors in the 3-D brain give clues about chemical processes happening during a patient's migraine attack, captured with a PET scan, or positron emission tomography, a type of medical imaging.

“This high level of immersion (in 3-D) effectively places our investigators inside the actual patient’s brain image,” DaSilva said.

Through some innovative work done by Dr. Alexandre DaSilva and his team in the School of Dentistry, the Duderstadt Center was presented with exciting new data showing activation in the brain *during* a migraine attack; most imaging data is captured before or after an attack. Sean Petty and Ted Hall worked closely with Dr. DaSilva to interpret the data and to add new tools to Jugular, our in-house 3D engine, for exploring volumetric data such as fMRI and CT scans. Dr. DaSilva can now explore the brain data by simply walking around it and interactively cutting through it.
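
One common way to implement that kind of interactive cut is a clipping plane driven by the tracked wand; the sketch below uses hypothetical names and only illustrates the general technique, not the actual Jugular tools:

```cpp
#include <array>

// A cutting plane defined by a point on the plane and a unit normal,
// e.g. attached to the tracked wand so the user can sweep it through the data.
struct ClipPlane {
    std::array<float, 3> point;
    std::array<float, 3> normal;
};

// Returns true if a voxel center lies on the kept side of the plane.
// Voxels on the other side are skipped (or drawn transparent) while rendering,
// which is what visually "cuts" the brain open.
bool isVoxelVisible(const ClipPlane& plane, const std::array<float, 3>& voxelCenter) {
    float signedDistance = 0.0f;
    for (int i = 0; i < 3; ++i)
        signedDistance += (voxelCenter[i] - plane.point[i]) * plane.normal[i];
    return signedDistance >= 0.0f;
}
```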

Test Driving with FAAC and Graphics Performance Discussion

FAAC Incorporated provides systems engineering and software products, including driving simulators for commercial and private training. FAAC reached out to the Duderstadt Center to share information and to compare its systems' performance with the MIDEN's capabilities. The Duderstadt Center had developed an "urban neighborhood" model as a stress test: how large, and how dense in triangles and vertices, can the models be while still maintaining a comfortable interactive frame rate in the MIDEN? The demo showed the MIDEN's capabilities and potential. The Duderstadt Center later visited FAAC's facility and saw the first 6-DOF full-motion system housed in a mobile trailer.

PainTrek Released on iTunes!

Get the App!

Ever have a headache or facial pain that seemingly comes and goes without warning? Ever been diagnosed with migraines, TMD or facial neuralgias but feel that your ability to explain your pain is limited?

PainTrek is a novel app developed to make it easier to track, analyze, and talk about pain. Using an innovative "paint your pain" interface, users can enter the intensity and area of pain simply by dragging over a 3D head model. Pain information can be entered as often as desired, viewed over time, and even analyzed to provide a deeper understanding of your pain.

The PainTrek application measures pain area and progression using a unique and anatomically accurate 3D system. The 3D head model is divided into a square grid with vertical and horizontal coordinates anchored to anatomical landmarks. Each quadrangle frames a well-defined craniofacial area, so precise pain location and intensity can be indicated in real time and in a quantifiable way. This is combined with essential sensory and biopsychosocial questionnaires about previous and ongoing treatments and their rates of success or failure, with the resulting information integrated and displayed in an intuitive way.
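
As an illustration only (a hypothetical data layout, not PainTrek's actual code), each quadrangle on the head grid can simply accumulate time-stamped intensity values, which is enough to support the over-time and progression views described above:

```cpp
#include <ctime>
#include <map>
#include <utility>
#include <vector>

// One "paint your pain" stroke sample: a grid cell on the 3D head model plus an intensity.
struct PainEntry {
    int row = 0;            // vertical coordinate on the anatomical grid
    int column = 0;         // horizontal coordinate on the anatomical grid
    float intensity = 0.0f; // e.g. 0.0 (no pain) to 1.0 (worst pain)
    std::time_t when = 0;   // timestamp, so entries can be reviewed over time
};

// Average the recorded intensity per grid cell; a progression view could plot
// these averages across sessions to show how the painted area changes.
std::map<std::pair<int, int>, float>
averageIntensityByCell(const std::vector<PainEntry>& entries) {
    std::map<std::pair<int, int>, float> sum;
    std::map<std::pair<int, int>, int> count;
    for (const PainEntry& e : entries) {
        sum[{e.row, e.column}] += e.intensity;
        count[{e.row, e.column}] += 1;
    }
    for (auto& cell : sum) cell.second /= static_cast<float>(count[cell.first]);
    return sum;
}
```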