Massive Lighting in Sponza to Develop Global Illumination

A person in the MIDEN exploring Sponza.

Real light is a complicated phenomenon: it does not merely strike objects but interacts with them, bouncing from one surface to another so that every object in a scene contributes to the lighting of the rest. Most graphics applications, however, light each surface directly, without accounting for light reflected from the other objects in the scene. Ray tracing is sometimes used to generate realistic lighting effects by tracing the paths light takes through a scene and the objects it encounters along the way. While this produces accurate, realistic results, the technique is too slow to be practical for real-time applications like video games or simulations.
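
For readers unfamiliar with the technique, here is a minimal sketch of the geometric core of ray tracing: testing one ray against one sphere. The types and names below are illustrative, not code from the Duderstadt Center. The expense the article mentions comes from repeating tests like this enormously often: a renderer casts at least one ray per pixel, and each ray may spawn further rays at every bounce.

```cpp
#include <cmath>

struct Vec3   { double x, y, z; };
struct Ray    { Vec3 origin, dir; };   // dir is assumed normalized
struct Sphere { Vec3 center; double radius; };

// Returns the distance along the ray to the nearest hit, or -1 on a miss.
double intersect(const Ray& r, const Sphere& s) {
    Vec3 oc { r.origin.x - s.center.x,
              r.origin.y - s.center.y,
              r.origin.z - s.center.z };
    // Quadratic in t with a = 1 because dir is normalized.
    double b = 2.0 * (oc.x * r.dir.x + oc.y * r.dir.y + oc.z * r.dir.z);
    double c = oc.x * oc.x + oc.y * oc.y + oc.z * oc.z - s.radius * s.radius;
    double disc = b * b - 4.0 * c;
    if (disc < 0.0) return -1.0;                 // ray misses the sphere
    double t = (-b - std::sqrt(disc)) / 2.0;     // nearer of the two roots
    return t > 0.0 ? t : -1.0;                   // hit behind origin = miss
}
```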

To create realistic lighting effects in real time, graphics engineer Sean Petty and staff at the Duderstadt Center have been experimenting with Sponza, a publicly available scene widely used in global illumination work. The Sponza Atrium is a model of an actual building in Croatia with dramatic lighting, and experiments in it have helped the lab develop more realistic global illumination. Spherical harmonic (SH) lighting produces a convincing render by using volumes to approximate how light behaves. While this method is not perfectly accurate the way ray tracing is, algorithms determine which rays intersect an object and calculate the intensity of light arriving at it and emitted from it; this information is stored in a 3D volume covering the virtual environment. The same algorithms can then be applied to other scenes. Realistic lighting is vital to a user becoming psychologically immersed in a scene.
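
As a rough illustration of the idea, not the Duderstadt Center's actual implementation, the sketch below projects sampled incoming light into the first four spherical harmonic coefficients and then cheaply reconstructs the light arriving from any direction at render time. The two-band truncation and all names here are assumptions made for brevity.

```cpp
#include <cmath>
#include <utility>
#include <vector>

struct Dir { float x, y, z; };                  // unit direction

// Real-valued SH basis functions, bands 0 and 1 (4 coefficients).
inline void shBasis(const Dir& d, float out[4]) {
    out[0] = 0.282095f;                         // Y(0, 0)
    out[1] = 0.488603f * d.y;                   // Y(1,-1)
    out[2] = 0.488603f * d.z;                   // Y(1, 0)
    out[3] = 0.488603f * d.x;                   // Y(1, 1)
}

// Offline step: project sampled incoming radiance into SH coefficients.
// 'samples' pairs a direction with the light intensity seen from it.
std::vector<float> projectSH(const std::vector<std::pair<Dir, float>>& samples) {
    std::vector<float> coeff(4, 0.0f);
    const float weight = 4.0f * 3.14159265f / samples.size(); // uniform sphere
    for (const auto& s : samples) {
        float basis[4];
        shBasis(s.first, basis);
        for (int i = 0; i < 4; ++i)
            coeff[i] += s.second * basis[i] * weight;
    }
    return coeff;
}

// Render-time step: reconstruct the light arriving from direction 'd'.
// This is a handful of multiplies, which is why SH lighting is fast.
float evalSH(const std::vector<float>& coeff, const Dir& d) {
    float basis[4];
    shBasis(d, basis);
    return coeff[0] * basis[0] + coeff[1] * basis[1]
         + coeff[2] * basis[2] + coeff[3] * basis[3];
}
```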

The Sponza Atrium is a model of an actual building in Croatia.

A Configurable iOS Controller for a Virtual Reality Environment

James examining a volumetric brain in the MIDEN with an iPod controller

Traditionally, users navigate 3D virtual environments with game controllers; however, game controllers are littered with ambiguously labeled buttons. While excellent for gaming, this setup makes navigating 3D space unnecessarily complicated for the average user. James Cheng, a sophomore in Computer Science in Engineering, has been working to resolve this headache by replacing game controllers with the touch screens found in mobile devices. Using the Jugular Engine in development at the Duderstadt Center, he has been building a scalable UI system that can serve a wide range of immersive simulations. Want to cut through a volumetric brain? Select the "slice" button and start dragging. Want to fly through an environment instead of walking? Switch to "Fly" mode and take off. Since every experience is different, the system aims to be highly configurable, as the sketch below illustrates.
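
To make the mode idea concrete, here is a minimal sketch of how a single touch gesture might be interpreted differently depending on the active mode. The mode names come from the article; the types and handler functions are hypothetical, not the Jugular Engine's actual API.

```cpp
struct TouchDrag { float dx, dy; };     // normalized drag delta from the device

enum class Mode { Walk, Fly, Slice };

class TouchController {
public:
    // Called when the user taps a clearly labeled, simulation-specific button.
    void setMode(Mode m) { mode_ = m; }

    // One gesture, dispatched according to the active mode.
    void onDrag(const TouchDrag& drag) {
        switch (mode_) {
            case Mode::Walk:  turnCamera(drag);  break;
            case Mode::Fly:   flyCamera(drag);   break;
            case Mode::Slice: sliceVolume(drag); break;
        }
    }

private:
    Mode mode_ = Mode::Walk;
    void turnCamera(const TouchDrag&)  { /* rotate the view */ }
    void flyCamera(const TouchDrag&)   { /* translate along the view direction */ }
    void sliceVolume(const TouchDrag&) { /* move a clipping plane through the model */ }
};
```

A design like this keeps gesture interpretation in one place, so supporting a new simulation means adding a mode and a handler rather than forcing users to memorize another button layout.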

Initial development is being done on the iOS platform because of its consistent hardware and options for scalable user interfaces. James aims to make immersive experiences more intuitive and to give developers more options for communicating with the user. You can now say goodbye to memorizing what the "X" and "Y" buttons do in each simulation, and instead use clearly labeled, simulation-specific buttons.

Sailboat Environment Creates Interactions Even Pirates Would Envy

The sailor sits at the table to greet the user in this tech demo

This Sailboat environment, like the Tea House before it, demonstrates the capabilities of real-time graphics in an immersive simulation. The user can walk around the wave-tossed boat carrying a flashlight, showing off fast-changing, dramatic lighting effects. The demonstration was designed to push the system to its limits with dynamic lighting, real-time global illumination, animated characters, and a fully physical environment that can be manipulated and interacted with.

Article: Measurable Domain for Colour Differences within a Virtual Environment

Light & Engineering (vol. 20, no. 3, 2012) | Светотехника (2 • 2012)

Professor Moji Navvab has published another article on his lighting analysis of virtual reality: "Область Поддающихся Измерению Цветовых Различий в Виртуальной Среде" ("Measurable Domain for Colour Differences within a Virtual Environment"), in Светотехника (Light & Engineering).

Wayfinding in Assisted Living Homes

Rebecca Davis, a professor and researcher at Grand Valley State University, received a research grant from the National Institutes of Health to study how patients with Alzheimer's disease navigate their living space. Assisted living homes can be drab and nondescript, with long hallways that add to the confusion and frustration of those who live there. To research this problem and possible solutions, Davis recruited 40 people in the early stages of Alzheimer's and 40 without the disease to virtually walk through a simulation of an actual assisted living home in the MIDEN. Staff and students at the Duderstadt Center modeled a 3D environment that re-creates details such as the complicated lighting and maze-like hallways, producing a natural, immersive experience that lets users feel how color schemes, lighting, and wall detail affect life in the home. Various visual cues are placed at key locations throughout the space to determine whether they help subjects remember which paths lead where they need to go. Davis currently uses two environments in her study, one with visual cues and one without. Subjects are shown the path to a destination and then given the opportunity to travel there themselves, if they can remember how.