Unreal Engine in Stereoscopic Virtual Reality


Until now, the Oculus Rift has been the go-to system for gamers seeking the ultimate immersive experience, offering stereo compatibility with game engines like Unreal and Unity 3D. Recently, the Duderstadt Center pushed this experience even further, with Graphics Programmer Sean Petty adapting the Unreal Engine to work within the center's MIDEN, a fully immersive, stereoscopic 3D virtual reality environment.

Visitors entering the MIDEN are equipped with a pair of stereo glasses and a game controller, both outfitted with reflective markers that are tracked by a series of Vicon cameras positioned around the top perimeter of the space. The existing capabilities of the MIDEN allow viewers not only to explore a space beyond the confines of the 10′ x 10′ room, but also to interact with objects using the provided game controller.

The services of the Duderstadt Center are open to all departments within the University, making visualization services, professional studio spaces, and exploratory technology accessible to artists, engineers, architects, and more. The diverse atmosphere of the Digital Media Commons generates a multitude of cross-curricular collaborative projects each year, from live performances featuring orchestras manipulated via brain waves to explorations of the anatomy of a digital cadaver in virtual reality.

In the past, the Duderstadt Center's MIDEN has been used to prototype architectural spaces, host artistic installations, assess human behavior, and simulate training scenarios. Incorporating the Unreal Engine into a space like the MIDEN allows visitors to experience a level of realism never before achieved in this sort of environment, opening new doors not just for gamers but for those seeking high-quality visualizations for research and exploration. Unreal Engine brings a wide range of materials and visual effects to any scene, from realistic water, foliage, and glass to effects like fire and transitions in the time of day.

Sean Petty, graphics programmer at the Duderstadt Center, explains his process for getting Unreal to operate within the MIDEN:

The MIDEN requires us to render a different view of the scene to each of the four walls from the perspective of the user. To achieve this we must track the location and orientation of the user's eyes, which is accomplished by motion-tracking a pair of glasses worn by the user. In the MIDEN, a dedicated computer performs the necessary tracking calculations, so the first step to enabling MIDEN support in Unreal is to modify the engine to interface with this computer.
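As a rough illustration of what such an interface might involve, the sketch below receives head-pose updates over UDP and unpacks them into a position and orientation. The packet layout, port number, and HeadPose struct are hypothetical stand-ins for illustration, not the actual protocol used between the MIDEN's tracking computer and the engine.

```cpp
// Hypothetical sketch: polling a tracking computer for head-pose packets.
// Assumes each UDP datagram carries seven floats: position (x, y, z) followed
// by an orientation quaternion (x, y, z, w). The real MIDEN protocol may differ.
#include <arpa/inet.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <unistd.h>
#include <cstdio>
#include <cstring>

struct HeadPose {
    float position[3];     // tracked eye midpoint, in meters
    float orientation[4];  // quaternion (x, y, z, w)
};

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    sockaddr_in addr{};
    addr.sin_family = AF_INET;
    addr.sin_addr.s_addr = INADDR_ANY;
    addr.sin_port = htons(5000);  // hypothetical port for the tracking stream

    bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr));

    HeadPose pose{};
    float packet[7];
    while (true) {
        ssize_t n = recv(sock, packet, sizeof(packet), 0);
        if (n == static_cast<ssize_t>(sizeof(packet))) {
            std::memcpy(pose.position, packet, 3 * sizeof(float));
            std::memcpy(pose.orientation, packet + 3, 4 * sizeof(float));
            std::printf("eye at (%.2f, %.2f, %.2f)\n",
                        pose.position[0], pose.position[1], pose.position[2]);
        }
    }
    close(sock);
}
```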

Visitors to the MIDEN are motion-tracked within the space via reflective markers placed around a pair of stereo glasses and a handheld game controller. These markers are monitored by eight Vicon cameras located along the perimeter of the MIDEN.

Once the location of the user has been determined, we must project the user's view onto each of the four walls. When rendering a scene in a standard desktop environment, the camera is positioned at the center of the screen. A centered camera only requires a symmetric frustum projection, which is the native transformation supported by Unreal. In the MIDEN, the camera may be anywhere within the space and will often not be centered on a screen. This requires an asymmetric frustum projection, functionality that had to be added to the engine.
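To make the idea concrete, the sketch below builds an asymmetric (off-axis) frustum matrix for a single wall from the tracked eye position. It is a minimal illustration of the general technique, not the Duderstadt Center's actual engine modification, and it assumes the eye position has already been expressed in that wall's coordinate frame.

```cpp
// Sketch: asymmetric (off-axis) frustum for one MIDEN wall, assuming the eye
// position is given in that wall's frame (x right, y up, z toward the viewer,
// wall plane at z = 0). Matrix is column-major, OpenGL glFrustum-style.
#include <array>

using Mat4 = std::array<float, 16>;

// eyeX/eyeY/eyeZ: tracked eye position relative to the wall's center (meters).
// wallHalfWidth/wallHalfHeight: physical half-extents of the wall (meters).
// zNear/zFar: clip planes of the virtual camera.
Mat4 OffAxisProjection(float eyeX, float eyeY, float eyeZ,
                       float wallHalfWidth, float wallHalfHeight,
                       float zNear, float zFar)
{
    const float d = eyeZ;  // distance from the eye to the wall plane

    // Frustum extents on the near plane, scaled down from the wall extents.
    const float l = (-wallHalfWidth  - eyeX) * zNear / d;
    const float r = ( wallHalfWidth  - eyeX) * zNear / d;
    const float b = (-wallHalfHeight - eyeY) * zNear / d;
    const float t = ( wallHalfHeight - eyeY) * zNear / d;

    // Standard asymmetric frustum projection matrix.
    Mat4 m{};
    m[0]  = 2.0f * zNear / (r - l);
    m[5]  = 2.0f * zNear / (t - b);
    m[8]  = (r + l) / (r - l);
    m[9]  = (t + b) / (t - b);
    m[10] = -(zFar + zNear) / (zFar - zNear);
    m[11] = -1.0f;
    m[14] = -2.0f * zFar * zNear / (zFar - zNear);
    return m;
}
```

When the eye sits exactly in front of the wall's center, l equals -r and b equals -t, and the matrix reduces to the ordinary symmetric frustum used by a desktop camera.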

The image for each wall is projected by a corresponding projector located behind the walls of the MIDEN. The floor is projected using a mirror located at the top of the space.

Unreal has native support for stereo by rendering the left and right views next to each other in a single image. This setup is used for devices such as the Oculus Rift, where the images for the left and right eyes are displayed at the same time. The MIDEN uses a technology called "active stereo," where the displayed image flickers rapidly back and forth between the left and right images. This requires a modification to the engine so the left and right images are rendered to two separate buffers rather than to two sides of a single image.
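A simplified sketch of that per-eye setup follows: it offsets the tracked head position by half the interpupillary distance to get left- and right-eye positions, each of which would get its own off-axis projection (reusing the OffAxisProjection helper sketched above) and its own render buffer. The eye-separation value and the render calls named in the comments are placeholders, not the engine's actual interfaces.

```cpp
// Sketch: per-eye view parameters for active (frame-sequential) stereo.
// Each eye gets its own position and asymmetric frustum, and the two views
// are rendered into separate buffers that the display alternates between.
#include <array>

using Vec3 = std::array<float, 3>;

struct EyeViews {
    Vec3 leftEye;
    Vec3 rightEye;
};

// headPos: tracked position of the glasses, in the wall's coordinate frame.
// rightDir: unit vector pointing to the viewer's right, in the same frame.
// ipd: interpupillary distance in meters (0.064 is a typical placeholder).
EyeViews ComputeEyePositions(const Vec3& headPos, const Vec3& rightDir,
                             float ipd = 0.064f)
{
    EyeViews views;
    for (int i = 0; i < 3; ++i) {
        views.leftEye[i]  = headPos[i] - rightDir[i] * ipd * 0.5f;
        views.rightEye[i] = headPos[i] + rightDir[i] * ipd * 0.5f;
    }
    return views;
}

// Usage sketch (hypothetical render calls):
//   EyeViews eyes = ComputeEyePositions(headPos, rightDir);
//   Mat4 projL = OffAxisProjection(eyes.leftEye[0],  eyes.leftEye[1],  eyes.leftEye[2],  halfW, halfH, zNear, zFar);
//   Mat4 projR = OffAxisProjection(eyes.rightEye[0], eyes.rightEye[1], eyes.rightEye[2], halfW, halfH, zNear, zFar);
//   RenderScene(leftBuffer,  projL);
//   RenderScene(rightBuffer, projR);
```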

Unreal Engine as seen from within the Duderstadt Center's virtual reality MIDEN. The MIDEN is a 10′ x 10′ room composed of five stereoscopic projection surfaces. Visitors are tracked using Vicon cameras, allowing them to travel beyond the confines of the physical space.

The final step for displaying Unreal scenes in the MIDEN is to get the four rendering computers communicating with each other. This ensures that when the user moves, all the screens are updated appropriately to give a consistent view of the scene. The networking is accomplished using Unreal's built-in network replication functionality, which is designed for use in multiplayer games.
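A rough sketch of how such replication could be set up with Unreal's standard machinery is shown below: an actor replicates a head-pose transform from the server to the rendering clients using UPROPERTY(Replicated) and GetLifetimeReplicatedProps. The class and property names are hypothetical, not the Duderstadt Center's actual implementation.

```cpp
// HeadTrackingActor.h -- hypothetical sketch of replicating the tracked head
// pose to every rendering machine using Unreal's built-in replication.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "HeadTrackingActor.generated.h"

UCLASS()
class AHeadTrackingActor : public AActor
{
    GENERATED_BODY()

public:
    AHeadTrackingActor()
    {
        bReplicates = true;  // push state from the server to all clients
    }

    // Tracked pose of the glasses; replicated so every wall's renderer
    // computes its view from the same head position.
    UPROPERTY(Replicated)
    FTransform HeadPose;

    virtual void GetLifetimeReplicatedProps(
        TArray<FLifetimeProperty>& OutLifetimeProps) const override;
};

// HeadTrackingActor.cpp
#include "HeadTrackingActor.h"
#include "Net/UnrealNetwork.h"

void AHeadTrackingActor::GetLifetimeReplicatedProps(
    TArray<FLifetimeProperty>& OutLifetimeProps) const
{
    Super::GetLifetimeReplicatedProps(OutLifetimeProps);
    DOREPLIFETIME(AHeadTrackingActor, HeadPose);
}
```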

With this latest development, researchers across all disciplines can now use this technology to reproduce lifelike environments for their studies, giving subjects a deeply immersive experience. It is hoped that the higher level of immersion offered by the Unreal Engine will have a dramatic impact on studies involving human behavior and environmental effects.

In addition to incorporating Unreal, the MIDEN continues to operate using "Jugular," an in-house engine developed by Ted Hall and Sean Petty that provides support for a broad range of models, materials, and interactivity. While Unreal offers finer photo-realism for mesh-based geometry, Jugular supports easier import of a wider range of file types from a variety of sources, including not only meshes but also solid volumes and informatics graphs.

Photogrammetry for the Stearns Collection


Photogrammetry results from the Stearns Collection: a drum captured at each stage of the process, from the original digital photographs taken inside Stearns, to the point cloud generated from them, to the 3D mesh developed from the point cloud, and finally the fully textured 3D model.

Donated in 1899 by the wealthy Detroit drug manufacturer Frederick Stearns, the Stearns Collection is a university collection of over 2,500 historical and contemporary musical instruments from all over the world, many of them particularly fragile or one of a kind. In 1966 the collection grew to include the only complete Javanese gamelan in the world, and as home to such masterpieces, the Stearns Collection has become internationally recognized as unique. In 1974, due to concerns about preservation and display, much of the collection was relocated out of public view. Once housed in Hill Auditorium, the majority of the collection now sits in storage inside an old factory near downtown Ann Arbor.

The current location of the Stearns Collection. Photo Credit: www.dailymail.co.uk

Current preservation efforts have involved photographing the collection and making the nearly 13,000 resulting images available online. Over the past year, however, the Duderstadt Center has been working with Chris Dempsey, curator of the Stearns Collection, and Jennifer Brown, a University Library Associate in Learning & Teaching, on a new process for preservation: using photogrammetry to document the collection. Photogrammetry relies on many digital photographs of an artifact to reconstruct the physical object as a digital 3D model. While traditional methods of obtaining 3D models often rely on markers placed on the object, photogrammetry is largely non-invasive, requiring minimal, and sometimes no, direct handling of an artifact. Models resulting from this process, when captured properly, are typically very precise and allow the viewer to rotate the object 360 degrees, zoom in and out, measure, and otherwise analyze the object, in many cases as though it were actually in front of them.

Equipped with a high-resolution digital SLR camera, Jennifer traveled to the warehouse where much of the Stearns Collection is now held to document some of the instruments that are not currently on display and are largely inaccessible to the general public. Feeding the resulting images into experimental photogrammetry software developed for research purposes (VisualSFM and CMVS), she processed the photos of the various instruments into high-resolution 3D models that could eventually be placed on the web for more accessible public viewing and student interaction.

Sonar Visualized in EECS


Original point cloud data brought into Autodesk ReCap

Professor Kamal Sarabandi of Electrical Engineering and Computer Science and student Samuel Cook were investigating the accuracy of sonar equipment and came to the 3D Lab for assistance with visualizing their data. Their goal was to generate an accurate, to-scale 3D model of the EECS atrium that could be used to align their data to a physical space.

Gaps in point cloud data indicate an obstruction encountered by the sonar.

The Duderstadt Center's Stephanie O'Malley and student consultant Maggie Miller used precise measurements and photo references provided by Sam to re-create the atrium in 3D Studio Max. The point cloud data produced by the sonar was then exported as a *.PTS file and brought into Autodesk ReCap to quickly determine whether everything appeared correct. When viewing point cloud data from the sonar, any significant gaps in the cloud indicate an obstruction, such as furniture, plants, or people.
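For readers curious about that intermediate format, the sketch below reads a *.PTS file, assuming the common Leica-style layout of a point count on the first line followed by one "x y z intensity r g b" record per line. The file name is a placeholder, and the exact columns exported by the sonar pipeline may differ.

```cpp
// Sketch: reading a *.PTS point cloud. Assumes the common Leica-style layout:
// first line = number of points, then "x y z intensity r g b" per line.
#include <fstream>
#include <iostream>
#include <string>
#include <vector>

struct Point {
    double x, y, z;
    int intensity;
    int r, g, b;
};

std::vector<Point> LoadPts(const std::string& path)
{
    std::ifstream in(path);
    std::size_t count = 0;
    in >> count;  // header: total number of points

    std::vector<Point> points;
    points.reserve(count);

    Point p;
    while (in >> p.x >> p.y >> p.z >> p.intensity >> p.r >> p.g >> p.b) {
        points.push_back(p);
    }
    return points;
}

int main()
{
    // "atrium_scan.pts" is a placeholder file name.
    std::vector<Point> cloud = LoadPts("atrium_scan.pts");
    std::cout << "Loaded " << cloud.size() << " points\n";
}
```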

Using the origin of the sonar device, positioned on the second-floor balcony, the data was aligned to the scene and colored appropriately. When the images produced by the sonar were aligned with the re-created EECS atrium, the researchers could see the sonar picking up large objects such as benches or posts, because those areas did not produce data points. Professor Sarabandi's research encompasses a wide range of topics in applied electromagnetics, and the visualization efforts of the Duderstadt Center furthered that research by helping to improve the accuracy of the sonar.
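That alignment amounts to a rigid transform from the sonar's local frame into the atrium model's frame. The sketch below applies a rotation about the vertical axis followed by a translation to the device's origin; the specific offset and heading are placeholder values for illustration, not the measured position of the device on the balcony.

```cpp
// Sketch: aligning sonar points to the atrium model with a rigid transform.
// The device origin and heading below are illustrative placeholders.
#include <cmath>
#include <vector>

struct Vec3 { double x, y, z; };

// Rotate a point about the vertical (z) axis by yaw radians, then translate
// it by the device origin expressed in the atrium model's coordinates.
Vec3 SonarToAtrium(const Vec3& p, const Vec3& deviceOrigin, double yaw)
{
    const double c = std::cos(yaw);
    const double s = std::sin(yaw);
    return {
        c * p.x - s * p.y + deviceOrigin.x,
        s * p.x + c * p.y + deviceOrigin.y,
        p.z + deviceOrigin.z
    };
}

int main()
{
    const Vec3 deviceOrigin{4.0, 2.5, 6.0};  // placeholder balcony position (m)
    const double yaw = 1.57;                 // placeholder heading (radians)

    std::vector<Vec3> sonarPoints{{1.0, 0.0, -3.0}, {0.5, 2.0, -3.1}};
    for (Vec3& p : sonarPoints) {
        p = SonarToAtrium(p, deviceOrigin, yaw);
    }
}
```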

Sonar data aligned to a model of the EECS atrium