Multi-Sensing the Universe

Envisioning a Toroidal Universe

Robert Alexander teamed with Danielle Battaglia, a senior in Art & Design, to compose and integrate audio effects into her conceptual formal model of the Toroidal Universe. Danielle combined Plato’s notion of the universe as a dodecahedron with modern notions of black holes, wormholes, and child universes. Their multi-sensory multiverse came together in the MIDEN and was exhibited there as part of the Art & Design senior integrative art exhibition.

Interested in using the MIDEN to do something similar? Contact us.

Behind the Scenes: Re-creating Citizen Kane in VR

inside a classic

Stephanie O’Malley


Students in Matthew Solomon’s classes are used to critically analyzing film. Now they get the chance to be the director for arguably one of the most influential films ever produced: Citizen Kane.

Using an application developed at the Duderstadt Center with grant funding provided by LSA Technology Services, students step into the role of the film’s director and record a prominent scene from the movie using a virtual camera. The film set, which no longer exists, has been meticulously re-created in black-and-white CGI using reference photographs of the original set, with a CGI Orson Welles acting out the scene on repeat. His performance was captured from motion capture actor Matthew Henerson, carefully chosen for his likeness to Welles, and the Orson avatar was generated from a photogrammetry scan of Henerson.

Top down view of the CGI re-creation of the film set for Citizen Kane

To arrive at a best estimate of the set’s scale for 3D modeling, the team analyzed the original film footage: doorways were measured, actor heights compared, and footsteps counted. With feedback from Citizen Kane expert Harlan Lebo, fine details, down to the topics of the books on the bookshelves, could be determined.

Archival photograph provided by Vincent Longo of the original film set

Motion capture actor Matthew Henerson was flown in to play the role of the digital Orson Welles. In a carefully choreographed session directed by Solomon’s PhD student Vincent Longo, the iconic scene from Citizen Kane was re-enacted while the original footage played on an 80″ TV in the background, ensuring every step aligned perfectly with the original.

Actor Matthew Henerson in full mocap attire amidst the makeshift set for Citizen Kane – Props constructed using PVC. Photo provided by Shawn Jackson.

The boundaries of the set were taped on the floor so the data could be aligned to the digitally re-created set. Eight Vicon motion capture cameras, the same kind used throughout Hollywood for films like Lord of the Rings and Planet of the Apes, formed a circle around the makeshift set. These cameras track an actor’s motion using infrared light reflected off small markers affixed to the motion capture suit. Any props used during the recording were constructed from cardboard and PVC (to be 3D modeled later) so as not to obstruct his movements. The 3 minutes of footage being re-created took 3 days to capture, comprising over 100 individual mocap takes and several hours of footage, which were then compared for accuracy and stitched together to complete the full route Orson travels through the environment.
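Conceptually, the stitching step resembles a crossfade between overlapping takes. Below is a minimal sketch of that idea, assuming each take has been reduced to an array of root positions already aligned to the taped-out set; it is an illustration, not the studio’s actual tooling.

```python
# Minimal sketch: crossfade the end of one mocap take into the start of
# the next. Assumes each take is a (frames, 3) array of root positions.
import numpy as np

def stitch_takes(take_a: np.ndarray, take_b: np.ndarray, blend: int = 30) -> np.ndarray:
    """Blend the last `blend` frames of take_a into take_b."""
    # Offset take_b so its first frame coincides with take_a's last frame.
    take_b = take_b + (take_a[-1] - take_b[0])
    weights = np.linspace(0.0, 1.0, blend)[:, None]
    overlap = (1 - weights) * take_a[-blend:] + weights * take_b[:blend]
    return np.concatenate([take_a[:-blend], overlap, take_b[blend:]])
```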

Matthew Henerson
Orson Welles

Matthew Henerson then swapped his motion capture suit for an actual suit, similar to the one Welles wears in the film, and underwent 3D scanning using the Duderstadt Center’s photogrammetry resources.

Actor Matthew Henerson wears asymmetrical markers to assist the scanning process

Photogrammetry is a method of scanning existing objects or people, commonly used in Hollywood and throughout the video game industry to create CGI likenesses of famous actors. The technology has been used in films like Star Wars (an actress similar in appearance to Carrie Fisher was scanned and then further sculpted to create a more youthful Princess Leia), with entire studios now devoted to photogrammetry scanning. The process relies on several digital cameras surrounding the subject and taking simultaneous photographs.

Matthew Henerson being processed for Photogrammetry

The photos are submitted to software that analyzes them on a per-pixel basis, looking for similar features across multiple photos. When a feature is recognized, it is triangulated using the focal length of the camera and its position relative to other identified features, allowing millions of tracking points to be generated. From these, an accurate 3D model can be produced, with the original digital photos mapped to its surface to preserve photo-realistic color. These models can be further manipulated: sometimes they are sculpted by an artist, or, with the addition of a digital “skeleton,” they can be driven by motion data to become a fully articulated digital character.
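As a rough two-view illustration of the matching-and-triangulation idea, a sketch using OpenCV might look like the following. Real photogrammetry pipelines use dozens of calibrated cameras and bundle adjustment, and this is not the studio’s software; the image paths and camera poses are placeholders.

```python
# Two-view sketch of feature matching followed by triangulation.
import cv2
import numpy as np

img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)  # placeholder photos
img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)

# Detect local features and match them across the two photos.
orb = cv2.ORB_create(5000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)
matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

pts1 = np.float32([kp1[m.queryIdx].pt for m in matches]).T  # 2 x N
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches]).T

# Projection matrices from calibration; camera 2 is assumed to sit one
# unit to the right of camera 1 (a stand-in for the real camera rig).
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Triangulate each matched feature into a 3D point.
pts4d = cv2.triangulatePoints(P1, P2, pts1, pts2)
pts3d = (pts4d[:3] / pts4d[3]).T  # homogeneous -> Euclidean
print(f"Reconstructed {len(pts3d)} points from {len(matches)} matches")
```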

The 3D-modeled scene and scanned actor were joined with the mocap data and brought into the Unity game engine, where the functionality students would need to film within the 3D set was developed. A virtual camera was built with the same settings you would find on a film camera of that era. When viewed in a virtual reality headset like the Oculus Rift, Solomon’s students can pick up the camera and physically move around to position it at different locations in the CGI environment, often capturing shots that would otherwise be difficult to achieve on a conventional film set. The footage students film within the app can be exported as MP4 video and then edited in their editing software of choice, just like any other camera footage.
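One concrete example of an era-appropriate camera setting is field of view, which follows directly from the lens focal length and the width of the film gauge. The sketch below assumes the 35 mm Academy aperture (about 21.95 mm wide); the actual settings exposed by the app are not documented here.

```python
# Derive a virtual camera's horizontal field of view from the focal
# length of a lens on 35 mm film (Academy aperture ~21.95 mm wide).
import math

def horizontal_fov(focal_length_mm: float, aperture_width_mm: float = 21.95) -> float:
    """Horizontal field of view, in degrees."""
    return math.degrees(2 * math.atan(aperture_width_mm / (2 * focal_length_mm)))

for lens in (25, 35, 50, 75):  # typical prime lenses of the period, in mm
    print(f"{lens} mm lens -> {horizontal_fov(lens):.1f} degree FOV")
```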

Having been used in his course in the Winter 2020 semester, Matthew Solomon’s project with the Duderstadt Center was recently on display as part of iLRN’s 2020 Immersive Learning Project Showcase & Competition. With Covid-19 making the conference a remote experience, attendees could explore the Citizen Kane project in virtual reality through the FrameVR platform. Highlighting innovative ways of teaching with VR technologies, the showcase let attendees from around the world learn about the project and watch student edits made using the application.

Citizen Kane on display for iLRN’s 2020 Immersive Learning Project Showcase & Competition using Frame VR

Novels in VR – Experiencing Uncle Tom’s Cabin

A Unique Perspective

Stephanie O’Malley


This past semester, English Professor Sara Blair taught a course at the University titled “The Novel and Virtual Realities.” The purpose of the course was to expose students to different methods of analyzing novels and to new ways of understanding them by utilizing platforms like VR and AR.

Designed as a hybrid course, her class was split between a traditional classroom environment and an XR lab, providing a comparison between traditional learning methods and more hands-on, experiential lessons through the use of immersive, interactive VR and AR simulations.

As part of her class curriculum, students were exposed to a variety of experiential XR content. Using the Visualization Studio’s Oculus Rifts, her class was able to view Dr. Courtney Cogburn’s “1000 Cut Journey” installation, a VR experience that puts viewers in the shoes of a Black American man and lets them see firsthand how racism affects every facet of his life. They also had the opportunity to view Asad J. Malik’s “Terminal 3” using augmented reality devices like the Microsoft HoloLens. Students engaging with Terminal 3 see how Muslim identities in the U.S. are approached through the lens of an airport interrogation.

Wanting to create a similar experience for her students at the University of Michigan, Sara approached the Duderstadt Center about the possibility of turning another novel into a VR experience: Uncle Tom’s Cabin.

She wanted her students to understand the novel from the perspective of its lead character, Eliza, during the pivotal moment where, as a slave, she tries to escape her captors and reach freedom. But she also wanted to give her students the perspective of the slave owner and the other slaves tasked with her pursuit, as well as the perspective of an innocent bystander watching the scene unfold.

Adapted for VR by the Duderstadt Center: Uncle Tom’s Cabin

Using Unreal Engine, the Duderstadt Center was able to make this a reality. An expansive winter environment was created based on imagery detailed in the novel, and CGI characters for Eliza and her captors were produced and then paired with motion capture data to drive their movements. When students put on the Oculus Rift headset, they can choose to experience the moment of escape through the perspective of Eliza, her captors, or a bystander. To better evaluate what components contributed to students’ feelings during the simulation, versions of these scenarios were provided with and without sound. With sound enabled as Eliza, you hear footsteps in the snow gaining on you, the crack of the ice beneath your feet as you leap across a tumultuous river, and the barking of a vicious dog at your heels, all adding to the tension of the moment. While viewers are able to freely look around the environment, they are passive observers: they have no control over the choices Eliza makes or where she goes.

Adapted for VR by the Duderstadt Center: Uncle Tom’s Cabin – Freedom for Eliza lies on the other side of the frozen Ohio river.

The scene ends with Eliza reaching freedom on the opposite side of the Ohio River, leaving her pursuers behind. What followed the students’ experience with the VR version of the novel was a deep class discussion on how the scene felt in VR versus how it felt reading the same passage in the book. Some students wondered what it might feel like to control the situation and decide where Eliza goes, or, as a bystander, to move freely through the environment as the scene plays out, choosing which party (Eliza or her pursuers) was of most interest to follow in the moment.

While Sara’s class has concluded for the semester, you can still try this experience for yourself – Uncle Tom’s Cabin is available to demo on all Visualization Studio workstations equipped with an Oculus Rift.

Using Mobile VR to Assess Claustrophobia During an MRI

new methods for exposure therapy

Stephanie O’Malley


Dr. Richard Brown and his colleague Dr. Jadranka Stojanovska had an idea for how VR could be used in a clinical setting. Having seen patients undergoing MRI scans experience claustrophobia, they wanted to use VR simulations to introduce prospective patients to what being inside an MRI machine feels like.

Duderstadt Center programmer Sean Petty and director Dan Fessahazion alongside Dr. Richard Brown

Claustrophobia in this situation is a surprisingly common problem. While 360 videos exist that convey what an MRI might look like, they fail to address the major factor contributing to claustrophobia: the perceived confined space within the bore. 360 video tends to skew the environment, making it seem further away than it would be in reality, and thereby fails to induce the same feelings of claustrophobia that the bore produces in person. With funding from the Patient Education Award Committee, Dr. Brown approached the Duderstadt Center to see if a better solution could be produced.

VR MRI: Character customization
A patient enters feet-first into the bore of the MRI machine.

In order to simulate the effects of an MRI accurately, a CGI MRI machine was constructed and brought into the Unity game engine. A customizable avatar representing the viewer’s body was also added to give viewers a sense of self. When a VR headset is worn, the viewer can see their avatar body and the true proportions of the MRI machine as they are slowly transported into the bore. Verbal instructions mimic what would be said over the course of a real MRI, with the intimidating boom of the machine playing as the simulated scan proceeds.
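The scripted pacing of instructions and scanner noise can be thought of as a simple cue timeline. The sketch below is illustrative only: the wording and timings are assumptions, not taken from the actual app.

```python
# Illustrative cue timeline for the simulated exam: verbal instructions,
# then scanner noise as the scan proceeds.
import time

TIMELINE = [
    (0.0,  "voice", "The table will now move into the scanner."),
    (8.0,  "voice", "Hold still, please. The scan is about to begin."),
    (12.0, "sound", "gradient_boom_loop.wav"),
    (90.0, "voice", "You are doing great. Two minutes remaining."),
]

def run(timeline):
    start = time.monotonic()
    for at, kind, cue in timeline:
        time.sleep(max(0.0, at - (time.monotonic() - start)))
        print(f"[{at:5.1f}s] {kind}: {cue}")  # stand-in for audio playback

run(TIMELINE)
```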

Two modes are provided within the app, feet-first and head-first, to accommodate the most common scanning procedures that have been shown to induce claustrophobia.

In order to make this accessible to patients, the MRI app was developed with mobile VR in mind, allowing anyone (patients or clinicians) with a VR-capable phone to download the app and use it with a budget-friendly headset like Google Daydream or Cardboard.

Dr. Brown’s VR simulator was recently featured as the cover story in the September edition of Tomography magazine.

Students Learn 3D Modeling for Virtual Reality

making tiny worlds

Stephanie O’Malley


ArtDes240 is a course offered by the Stamps School of Art & Design, taught by Stephanie O’Malley, that teaches students 3D modeling and animation. As one of only a few 3D digital classes offered at the University of Michigan, AD240 draws interest from several schools across campus, with students looking to gain a better understanding of 3D art as it pertains to the video game industry.

The students in AD240 are given a crash course in 3D modeling in 3D Studio Max and in level creation within the Unreal Editor. It is within Unreal that all of their objects are positioned, terrain is sculpted, and atmospheric effects such as time of day, weather, or fog are added.

“Candyland” – Elise Haadsma & Heidi Liu, developed using 3D Studio Max and Unreal Engine

Students had just 5 weeks to model their entire environment, bring it into Unreal, package it as an executable, and test it in the MIDEN (or on the Oculus Rift), and the resulting projects were truly impressive. Art & Design students Elise Haadsma and Heidi Liu took inspiration from the classic board game Candyland to create a life-size game-board environment in Unreal, consisting of a lollipop forest, mountains of Hershey’s Kisses, even a gingerbread house and a chocolate river.

Lindsay Balaka, from the School of Music, Theatre & Dance, chose to create her scene using the Duderstadt Center’s in-house rendering software, Jugular, instead of Unreal Engine. Her creation, “Galaxy Cakes,” is a highly stylized cupcake shop, reminiscent of an episode of the 1960s cartoon The Jetsons, complete with spatial audio emanating from the corner jukebox.

Lindsay Balaka’s “Galaxy Cakes” environment
An abandoned school, created by Vicki Liu in 3D Studio Max and Unreal Engine

Vicki Liu, also of Art & Design, created a realistic horror scene using Unreal. After navigating down a poorly lit hallway of an abandoned nursery school, you find yourself in a run-down classroom inhabited by some kind of madman. A tally of days passed has been scratched into the walls, an eerie message is scrawled on the chalkboard, and furniture haphazardly barricades the windows.

While the goal of the final project was to create a traversable environment for virtual reality, some students took it a step further.

Art & Design student Gus Schissler created an environment composed of neurons in Unreal, intended for viewing within the Oculus Rift. He then integrated data from an EPOC neuroheadset (a device capable of reading brain waves) to allow the viewer to interact with the environment using thought alone. The viewer’s mood, when picked up by the headset, changed the way the environment looked by adjusting the intensity of the light emitted by the neurons, and thinking specific commands (push, pull, etc.) let the viewer navigate past various obstacles in the environment.
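In outline, the integration amounts to polling the headset and mapping its readings onto scene parameters. The sketch below is a hypothetical stand-in: read_affect and read_command substitute for whatever SDK calls the project actually used.

```python
# Hypothetical sketch of the brain-computer interaction loop.
import random

def read_affect():
    """Stand-in: the viewer's excitement level in [0, 1] from the headset."""
    return random.random()

def read_command():
    """Stand-in: a trained mental command such as 'push' or 'pull'."""
    return random.choice(["push", "pull", "neutral"])

def update_neurons(neuron_count):
    mood = read_affect()
    # Brighten or dim each neuron's emitted light with the viewer's mood.
    intensities = [0.2 + 0.8 * mood] * neuron_count
    command = read_command()
    if command == "push":
        print("pushing obstacle away")
    elif command == "pull":
        print("pulling obstacle closer")
    return intensities

print(update_neurons(5))
```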

Students spend the last two weeks of the semester scheduling time with Ted Hall and Sean Petty to test their scenes and ensure everything runs and looks correct on the day of their presentations. This was a class that not only introduced students to the design process, but also let them get hands-on experience with emerging technologies as virtual reality continues to expand in the game and film industries.

Student Gus Schissler demonstrates his neuron environment for the Oculus Rift, which uses inputs from an EPOC neuroheadset for interaction.

Passion & Violence

Anna Galeotti’s MIDEN Installation

Ph.D. Fulbright Scholar Anna Galeotti (Winter 2014) explored the concept of “foam” or “bubbles” as a possible model for audiovisual design elements and their relationships. Her art installation, “Passion and Violence in Brazil,” was displayed in the Duderstadt Center’s MIDEN.

Interested in using the MIDEN to do something similar? Contact us.

Extended Reality: changing the face of learning, teaching, and research

Written by Laurel Thomas, Michigan News

Students in a film course can evoke new emotions in an Orson Welles classic by virtually changing the camera angles in a dramatic scene.

Any one of us could take a smartphone, laptop, paper, Play-Doh and an app developed at U-M, and with a little direction become a mixed reality designer.

A patient worried about an upcoming MRI may be able to put fears aside after virtually experiencing the procedure in advance.

Dr. Jadranka Stojanovska, one of the collaborators on the virtual MRI, tries on the device

This is XR—Extended Reality—and the University of Michigan is making a major investment in how to utilize the technology to shape the future of learning. 

Recently, Provost Martin Philbert announced a funded three-year initiative, led by the Center for Academic Innovation, to advance XR, a term used to encompass augmented reality, virtual reality, mixed reality and other variations of computer-generated real and virtual environments and human-machine interactions.

The Initiative will explore how XR technologies can strengthen the quality of a Michigan education, cultivate interdisciplinary practice, and enhance a national network of academic innovation. 

Throughout the next three years, the campus community will explore new ways to integrate XR technologies in support of residential and online learning and will seek to develop innovative partnerships with external organizations around XR in education.

“Michigan combines its public mission with a commitment to research excellence and innovation in education to explore how XR will change the way we teach and learn from the university of the future,” said Jeremy Nelson, new director of the XR Initiative in the Center for Academic Innovation.

Current Use of XR

Applications of the technology are already changing the learning experience across the university, in classrooms and research labs, with practical applications for patients in health care settings.

In January 2018, a group of students created the Alternative Reality Initiative to provide a community for hosting development workshops, discussing industry news, and connecting students in the greater XR ecosystem.

In Matthew Solomon’s film course, students can alter a scene in Orson Welles’ classic “Citizen Kane.” U-M is home to one of the largest Orson Welles collections in the world.

Solomon’s concept for developers was to take a clip from the movie and model a scene to look like a virtual reality setting, almost like a video game. The goal was to bring a virtual camera into the space so students could choose shot angles to change the look and feel of the scene.

This VR tool will be used fully next semester to help students talk about filmmaker style, meaning and choice.

“We can look at clips in class and be analytical but a tool like this can bring these lessons home a little more vividly,” said Solomon, associate professor in the Department of Film, Television and Media.

A scene from Orson Welles’ “Citizen Kane” from the point of view of a virtual camera that allows students to alter the action.

Sara Eskandari, who just graduated with a Bachelor of Arts from the Penny Stamps School of Art and a minor in Computer Science, helped develop the tool for Solomon’s class as a member of the Visualization Studio team.

“I hope students can enter an application like ‘Citizen Kane’ and feel comfortable experimenting, iterating, practicing, and learning in a low-stress environment,” Eskandari said. “Not only does this give students the feeling of being behind an old-school camera, and supplies them with practice footage to edit, but the recording experience itself removes any boundaries of reality. 

“Students can float to the ceiling to take a dramatic overhead shot with the press of a few buttons, and a moment later record an extreme close up with entirely different lighting.”

Sara Blair, vice provost for academic and faculty affairs, and the Patricia S. Yaeger Collegiate Professor of English Language and Literature, hopes to see more projects like “Citizen Kane.”

“An important part of this project, which will set it apart from experiments with XR on many other campuses, is our interest in humanities-centered perspectives to shape innovations in teaching and learning at a great liberal arts institution,” Blair said. “How can we use XR tools and platforms to help our students develop historical imagination or to help students consider the value and limits of empathy, and the way we produce knowledge of other lives than our own? 

“We hope that arts and humanities colleagues won’t just participate in this [initiative] but lead in developing deeper understandings of what we can do with XR technologies, as we think critically about our engagements with them.”

UM Faculty Embracing XR

Mark W. Newman, professor of information and of electrical engineering, and chair of the Augmented and Virtual Reality Steering Committee, said indications are that many faculty are thinking about ways to use the technology in teaching and research, as evidenced by progress on an Interdisciplinary Graduate Certificate Program in Augmented and Virtual Reality.

Newman chaired the group that pulled together a number of faculty working in XR to identify the scope of the work under way on campus, and to recommend ways to encourage more interdisciplinary collaboration. He’s now working with deans and others to move forward with the certificate program that would allow experiential learning and research collaborations on XR projects.

“Based on this, I can say that there is great enthusiasm across campus for increased engagement in XR and particularly in providing opportunities for students to gain experience employing these technologies in their own academic work,” Newman said, addressing the impact the technology can have on education and research.

“With a well-designed XR experience, users feel fully present in the virtual environment, and this allows them to engage their senses and bodies in ways that are difficult if not impossible to achieve with conventional screen-based interactive experiences. We’ve seen examples of how this kind of immersion can dramatically aid the communication and comprehension of otherwise challenging concepts, but so far we’re only scratching the surface in terms of understanding exactly how XR impacts users and how best to design experiences that deliver the effects intended by experience creators.”

Experimentation for All

Encouraging everyone to explore the possibilities of mixed reality (MR) is a goal of Michael Nebeling, assistant professor in the School of Information, who has developed unique tools that can turn just about anyone into an augmented reality designer using his ProtoAR or 360proto software.

Most AR projects begin with a two-dimensional design on paper that is then made into a 3D model, typically by a team of experienced 3D artists and programmers.

Michael Nebeling’s mixed reality app for everyone.

With Nebeling’s ProtoAR app, content can be sketched on paper or molded with Play-Doh; the designer then either moves the camera around the object or spins the piece in front of the lens to create motion. ProtoAR then blends the physical and digital content to produce various AR applications.
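The capture step can be approximated with any webcam and OpenCV. The sketch below is not ProtoAR’s code, just an illustration of grabbing a frame sequence while a prop is spun in front of the lens.

```python
# Sketch: capture a sequence of frames while the designer spins a paper
# or Play-Doh prop in front of the default webcam.
import time
import cv2

cap = cv2.VideoCapture(0)
frames = []
while len(frames) < 36:        # roughly one frame per 10 degrees of spin
    ok, frame = cap.read()
    if not ok:
        break
    frames.append(frame)
    time.sleep(0.1)            # crude pacing between captures
cap.release()

for i, frame in enumerate(frames):
    cv2.imwrite(f"prop_{i:02d}.png", frame)  # sequence for the AR overlay
```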

Using his latest tool, 360proto, designers can even make the paper sketches interactive, so that users can experience the AR app live on smartphones and headsets without spending hours and hours refining and implementing the design in code.

These are the kinds of technologies that not only allow his students to learn about AR/VR in his courses, but also have practical applications. For example, people can experience their dream kitchen at home, rather than having to use their imagination when clicking things together on home improvement sites. He is also working on getting many solutions directly into future web browsers so that people can access AR/VR modes when visiting home improvement sites, cooking a recipe in the kitchen, planning a weekend trip with museum or gallery visits, or reading articles on Wikipedia or the news.

Nebeling is committed to “making mixed reality a thing that designers do and users want.”

“As a researcher, I can see that mixed reality has the potential to fundamentally change the way designers create interfaces and users interact with information,” he said. “As a user of current AR/VR applications, however, it’s difficult to see that potential even for me.”

He wants to enable a future in which “mixed reality will be mainstream, available and accessible to anyone, at the snap of a finger. Where everybody will be able to ‘play,’ be it as consumer or producer.”

XR and the Patient Experience

A team in the Department of Radiology, in collaboration with the Duderstadt Center Visualization Studio, has developed a Virtual Reality tool to simulate an MRI, with the goal of reducing last-minute cancellations due to claustrophobia, which occur in an estimated 4-14% of patients. The clinical trial is currently enrolling patients.
VR MRI Machine

“The collaboration with the Duderstadt team has enabled us to develop a cutting-edge tool that allows patients to truly experience an MRI before having a scan,” said Dr. Richard K.J. Brown, professor of radiology. The patient puts on a headset and is ‘virtually transported’ into an MRI tube. A calming voice explains the MRI exam, as the patient hears the realistic sounds of the magnet in motion, simulating an exam experience. 

The team also is developing an Augmented Reality tool to improve the safety of CT-guided biopsies.

Team members include doctors Brown, Jadranka Stojanovska, Matt Davenport, Ella Kazerooni and Elaine Caoili from Radiology, and Dan Fessahazion, Sean Petty, Stephanie O’Malley, Theodore Hall and several students from the Visualization Studio.

“The Duderstadt Center and the Visualization Studio exist to foster exactly these kinds of collaborations,” said Daniel Fessahazion, the center’s associate director for emerging technologies. “We have a deep understanding of the technology and collaborate with faculty to explore its applicability to create unique solutions.”

Dr. Elaine Caoili, Saroja Adusumilli Collegiate Professor of Radiology, demonstrates an Augmented Reality tool under development that will improve the safety of CT-guided biopsies.

Academic Innovation’s Nelson said the first step of the new initiative is to assess and convene early innovators in XR from across the university to shape how this technology may best serve residential and online learning.

“We have a unique opportunity with this campus-wide initiative to build upon the efforts of engaged students, world-class faculty, and our diverse alumni network to impact the future of learning,” he said.

Customer Discovery Using 360 Video

Year after year, students in Professor Dawn White’s Entrepreneurship 411 course are tasked with “customer discovery”: a process in which students interested in creating a business interview professionals in a given field to assess their needs, and to learn how the products the students develop could address those needs and alleviate some of the difficulties the professionals encounter on a daily basis.

Often when given this assignment, students would defer to their peers for feedback instead of reaching out to strangers working in the fields of interest. Because this demographic is so similar to the students themselves, the outcome was fairly biased and didn’t get to the root of why someone might want or need a specific product. Looking for an alternative approach, Dawn teamed up with her longtime friend Professor Alison Bailey, who teaches DEI at the University, and Aileen Huang-Saad from Biomedical Engineering, and approached the Duderstadt Center with their idea: what if students could interact with a simulated, and more diverse, professional to conduct their customer discovery?

After exploring the many routes development could take, including motion-capture-driven CGI avatars, 360 video was chosen as the platform for the simulation. 360 video viewed within an Oculus Rift VR headset ultimately gave the highest sense of realism and immersion when conducting an interview, which was important for making the interview process feel authentic.

Up until this point, 360 videos were largely passive experiences: they did not allow users to tailor the experience based on their choices or interact with the scene in any way. This Customer Discovery project required the 360 videos to be responsive; when a student asked a recognized customer discovery question, the appropriate video response needed to play. Doing this required programming logic to trigger different videos, as well as integrated voice recognition software, so students could ask a question out loud and have their speech recognized within the application.
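In outline, the required logic is a lookup from recognized questions to frame ranges in the footage, with an idle loop filling the gaps between answers. The sketch below uses hypothetical intent names and frame numbers.

```python
# Sketch of the trigger logic: recognized questions map to frame ranges
# in the 360 footage; an idle loop fills time between answers.
RESPONSES = {
    "ask_daily_tasks": (1200, 2650),  # "What does a typical day look like?"
    "ask_pain_points": (2700, 4100),  # "What is the hardest part of your job?"
}
GENERIC_ANSWER = (4200, 4900)         # filmed response for unplanned questions
IDLE_LOOP = (100, 420)                # actor patiently waiting

def next_clip(intent):
    """Pick the frame range to play; None means no question is pending."""
    if intent is None:
        return IDLE_LOOP
    return RESPONSES.get(intent, GENERIC_ANSWER)

print(next_clip("ask_daily_tasks"))   # -> (1200, 2650)
print(next_clip("ask_for_a_raise"))   # unrecognized -> generic answer
```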

Dawn and Alison sourced three professionals to serve as their simulated actors for this project:

Fritz discusses his career as an IT professional

Fritz – a young Black man with a career as an IT professional

Cristina – a middle-aged woman with a noticeable accent, working in education

Charles – a white adult man employed as a barista

These actors were chosen for their authenticity and diversity, having qualities that might lead interviewers to make certain assumptions or expose biases in their interactions with them. With the help of talented students at the Visualization Studio, these professionals were filmed responding to various customer discovery questions using a Ricoh Theta 360 camera and a spatial microphone (which allows for spatial audio in VR, so the sound seems to come from the direction where the actor is sitting). For footage of one response to blend with the next, the actors had to return their hands and faces to the same pose between responses so the footage could be aligned. They were also filmed giving generic responses to any unplanned questions that might be asked, as well as twiddling their thumbs and patiently waiting: footage that could be looped to fill any idle time between questions.

Once the footage was acquired, the frame ranges for each response were noted and passed off to programmers to implement in the Duderstadt Center’s in-house VR rendering software, Jugular. As an initial prototype of the concept, the application was intended to run as a proctored simulation: students would wear an Oculus Rift and ask their questions out loud, with the proctor listening in and triggering the appropriate actor response using keyboard controls. For a more natural feel, Dawn was interested in exploring voice recognition to make the process more automated.

Within Jugular, students view an interactive 360 video in which they are seated across from one of three professionals available for interviewing. Using the embedded microphone in the Oculus Rift, they can ask questions that are recognized using Dialogue Flow, which in turn triggers the appropriate video response, allowing students to conduct mock interviews.

With Dawn employing some computer science students to tackle the voice recognition element over the summer, the feature was integrated into Jugular using a Dialogue Flow agent with Python scripts. Students could now be immersed in an Oculus Rift, speaking to an actor filmed in 360 video, and have their voice interpreted as they asked their questions out loud, using the embedded microphone on the Rift.
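A minimal detect-intent call of the kind described, using Google’s Dialogflow Python client, might look like the following; the project and session IDs are placeholders, and it assumes the microphone audio has already been transcribed to text.

```python
# Map a transcribed question to a Dialogflow intent name, which then
# keys the video response (see the frame-range lookup sketched earlier).
from google.cloud import dialogflow

def recognize_question(text, project_id="customer-discovery-demo",
                       session_id="student-1"):
    sessions = dialogflow.SessionsClient()
    session = sessions.session_path(project_id, session_id)
    query = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code="en-US")
    )
    result = sessions.detect_intent(
        request={"session": session, "query_input": query}
    ).query_result
    return result.intent.display_name

print(recognize_question("What does a typical day look like for you?"))
```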

Upon its completion, the Customer Discovery application was piloted in the Visualization Studio with Dawn’s students during the Winter 2019 semester.

Leisure & Luxury in the Age of Nero: VR Exhibit for the Kelsey Museum

The Kelsey Museum’s most grandiose exhibition to date, Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii, features over 230 artifacts from the ancient world. These artifacts originate from the ancient villa of Oplontis, near Pompeii, which was destroyed when Mt. Vesuvius erupted.

Real world location of the ancient villa of Oplontis

The traveling exhibit explores the lavish lifestyle and economic interests of ancient Rome’s wealthiest. The site is still being excavated and is off limits to the general public, but as part of the Kelsey’s exhibit, visitors get to experience the location with the assistance of virtual reality headsets like the Oculus Rift and tablet devices.

The Duderstadt Center worked closely with curator Elaine Gazda and the Oplontis Project team from the University of Texas to optimize a virtual re-creation for the Oculus Rift and MIDEN and to generate panoramic viewers for tablet devices. The virtual environment maps high-resolution photos and scan data captured on location onto the surfaces of 3D models, giving a very real sense of being at the real-world location.

Visitors to the Kelsey can navigate Oplontis in virtual reality through the use of an Oculus Rift headset, or through panoramas presented on tablet devices.

Visitors to the Kelsey can traverse this re-creation using the Rift, or they can travel to the Duderstadt to experience it in the MIDEN. Not only can viewers experience the villa as it appears in the modern day, they can also toggle on an artist’s re-creation of what it would have looked like before its destruction. In the re-created version of the scene, the ornate murals covering the villa’s walls are restored, and foliage and ornate statues populate the scene. Alongside the virtual reality experience, the Kelsey Museum will also house a physically reconstructed replica of one of the rooms found in the villa as part of the exhibit.

Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii is a traveling exhibit that will also be on display at the Museum of the Rockies at Montana State University in Bozeman, and at the Smith College Museum of Art in Northampton, Massachusetts.

On Display at the Kelsey Museum, Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii