Extended Reality: changing the face of learning, teaching, and research

Written by Laurel Thomas, Michigan News

Students in a film course can evoke new emotions in an Orson Welles classic by virtually changing the camera angles in a dramatic scene.

Any one of us could take a smartphone, laptop, paper, Play-Doh and an app developed at U-M, and with a little direction become a mixed reality designer.

A patient worried about an upcoming MRI may be able to put fears aside after virtually experiencing the procedure in advance.

Dr. Jadranka Stojanovska, one of the collaborators on the virtual MRI, tries on the device

This is XR—Extended Reality—and the University of Michigan is making a major investment in how to utilize the technology to shape the future of learning. 

Recently, Provost Martin Philbert announced a three-year funded initiative, led by the Center for Academic Innovation, to advance XR, a term that encompasses augmented reality, virtual reality, mixed reality and other variations of computer-generated real and virtual environments and human-machine interactions.

The Initiative will explore how XR technologies can strengthen the quality of a Michigan education, cultivate interdisciplinary practice, and enhance a national network of academic innovation. 

Throughout the next three years, the campus community will explore new ways to integrate XR technologies in support of residential and online learning and will seek to develop innovative partnerships with external organizations around XR in education.

“Michigan combines its public mission with a commitment to research excellence and innovation in education to explore how XR will change the way we teach and learn from the university of the future,” said Jeremy Nelson, new director of the XR Initiative in the Center for Academic Innovation.

Current Use of XR

Applications of the technology are already changing the learning experience across the university in classrooms and research labs with practical application for patients in health care settings. 

In January 2018, a group of students created the Alternate Reality Initiative to provide a community for hosting development workshops, discussing industry news, and connecting students in the greater XR ecosystem.

In Matthew Solomon’s film course, students can alter a scene in Orson Welles’ classic “Citizen Kane.” U-M is home to one of the largest Orson Welles collections in the world.

Solomon’s concept for developers was to take a clip from the movie and model a scene to look like a virtual reality setting—almost like a video game. The goal was to bring a virtual camera into the space so students could choose shot angles to change the look and feel of the scene.

This VR tool will be used fully next semester to help students talk about filmmaker style, meaning and choice.

“We can look at clips in class and be analytical but a tool like this can bring these lessons home a little more vividly,” said Solomon, associate professor in the Department of Film, Television and Media.

A scene from Orson Welles’ “Citizen Kane” from the point of view of a virtual camera that allows students to alter the action.

Sara Eskandari, who just graduated with a Bachelor of Arts from the Penny W. Stamps School of Art & Design and a minor in Computer Science, helped develop the tool for Solomon’s class as a member of the Visualization Studio team.

“I hope students can enter an application like ‘Citizen Kane’ and feel comfortable experimenting, iterating, practicing, and learning in a low-stress environment,” Eskandari said. “Not only does this give students the feeling of being behind an old-school camera, and supplies them with practice footage to edit, but the recording experience itself removes any boundaries of reality. 

“Students can float to the ceiling to take a dramatic overhead shot with the press of a few buttons, and a moment later record an extreme close up with entirely different lighting.”

Sara Blair, vice provost for academic and faculty affairs, and the Patricia S. Yaeger Collegiate Professor of English Language and Literature, hopes to see more projects like “Citizen Kane.”

“An important part of this project, which will set it apart from experiments with XR on many other campuses, is our interest in humanities-centered perspectives to shape innovations in teaching and learning at a great liberal arts institution,” Blair said. “How can we use XR tools and platforms to help our students develop historical imagination or to help students consider the value and limits of empathy, and the way we produce knowledge of other lives than our own? 

“We hope that arts and humanities colleagues won’t just participate in this [initiative] but lead in developing deeper understandings of what we can do with XR technologies, as we think critically about our engagements with them.”

U-M Faculty Embracing XR

Mark W. Newman, professor of information and of electrical engineering, and chair of the Augmented and Virtual Reality Steering Committee, said indications are that many faculty are thinking about ways to use the technology in teaching and research, as evidenced by progress on an Interdisciplinary Graduate Certificate Program in Augmented and Virtual Reality.

Newman chaired the group that pulled together a number of faculty working in XR to identify the scope of the work under way on campus, and to recommend ways to encourage more interdisciplinary collaboration. He’s now working with deans and others to move forward with the certificate program that would allow experiential learning and research collaborations on XR projects.

“Based on this, I can say that there is great enthusiasm across campus for increased engagement in XR and particularly in providing opportunities for students to gain experience employing these technologies in their own academic work,” Newman said, addressing the impact the technology can have on education and research.

“With a well-designed XR experience, users feel fully present in the virtual environment, and this allows them to engage their senses and bodies in ways that are difficult if not impossible to achieve with conventional screen-based interactive experiences. We’ve seen examples of how this kind of immersion can dramatically aid the communication and comprehension of otherwise challenging concepts, but so far we’re only scratching the surface in terms of understanding exactly how XR impacts users and how best to design experiences that deliver the effects intended by experience creators.”

Experimentation for All

Encouraging everyone to explore the possibilities of mixed reality (MR) is a goal of Michael Nebeling, assistant professor in the School of Information, who has developed unique tools that can turn just about anyone into an augmented reality designer using his ProtoAR or 360proto software.

Most AR projects begin with a two-dimensional design on paper that is then made into a 3D model, typically by a team of experienced 3D artists and programmers.

Michael Nebeling’s mixed reality app for everyone.

With Nebeling’s ProtoAR app, content can be sketched on paper or molded with Play-Doh; the designer then either moves the camera around the object or spins the piece in front of the lens to create motion. ProtoAR blends the physical and digital content to produce various AR applications.

Using his latest tool, 360proto, designers can even make the paper sketches interactive so that users can experience the AR app live on smartphones and headsets, without spending hours refining and implementing the design in code.

These are the kinds of technologies that not only allow his students to learn about AR/VR in his courses, but also have practical applications. For example, people can experience their dream kitchen at home rather than having to use their imaginations when clicking things together on home improvement sites. He also is working on getting many of these solutions directly into future web browsers so that people can access AR/VR modes when visiting home improvement sites, cooking a recipe in the kitchen, planning a weekend trip with museum or gallery visits, or reading articles on Wikipedia or in the news.

Nebeling is committed to “making mixed reality a thing that designers do and users want.”

“As a researcher, I can see that mixed reality has the potential to fundamentally change the way designers create interfaces and users interact with information,” he said. “As a user of current AR/VR applications, however, it’s difficult to see that potential even for me.”

He wants to enable a future in which “mixed reality will be mainstream, available and accessible to anyone, at the snap of a finger. Where everybody will be able to ‘play,’ be it as consumer or producer.”

XR and the Patient Experience

A team in the Department of Radiology, in collaboration with the Duderstadt Center Visualization Studio, has developed a virtual reality tool to simulate an MRI, with the goal of reducing last-minute cancellations due to claustrophobia, which occur in an estimated 4-14% of patients. The clinical trial is currently enrolling patients.
VR MRI Machine

“The collaboration with the Duderstadt team has enabled us to develop a cutting-edge tool that allows patients to truly experience an MRI before having a scan,” said Dr. Richard K.J. Brown, professor of radiology. The patient puts on a headset and is ‘virtually transported’ into an MRI tube. A calming voice explains the MRI exam, as the patient hears the realistic sounds of the magnet in motion, simulating an exam experience. 

The team also is developing an Augmented Reality tool to improve the safety of CT-guided biopsies.

Team members include Drs. Brown, Jadranka Stojanovska, Matt Davenport, Ella Kazerooni and Elaine Caoili from Radiology, and Dan Fessahazion, Sean Petty, Stephanie O’Malley, Theodore Hall and several students from the Visualization Studio.

“The Duderstadt Center and the Visualization Studio exist to foster exactly these kinds of collaborations,” said Daniel Fessahazion, the center’s associate director for emerging technologies. “We have a deep understanding of the technology and collaborate with faculty to explore its applicability to create unique solutions.”

Dr. Elaine Caoili, Saroja Adusumilli Collegiate Professor of Radiology, demonstrates an Augmented Reality tool under development that will improve the safety of CT-guided biopsies.

Academic Innovation’s Nelson said the first step of the new initiative is to assess and convene early innovators in XR from across the university to shape how the technology may best serve residential and online learning.

“We have a unique opportunity with this campus-wide initiative to build upon the efforts of engaged students, world-class faculty, and our diverse alumni network to impact the future of learning,” he said.

The Jewish Tradition of Tsedakah as Exemplified in Pushkes – Online Exhibit

The pushke exhibit first appeared at the Jean & Samuel Frankel Center for Judaic Studies in the summer of 2015. The exhibit was composed of 40 pushkes (charitable donation boxes) of all shapes and sizes, situated in a series of display cases. The many diverse charity boxes reflect the breadth of the Jewish Heritage Collection Dedicated to Mark and Dave Harris, and illustrate the value of giving in Jewish communities throughout the world. Prior to being moved into storage for safekeeping, the collection underwent a lengthy scanning process with help from the Duderstadt Center to convert it into digitized 3D objects, expanding accessibility by allowing the exhibit to be preserved and viewable online.

The Pushke Collection was digitized by the Duderstadt Center using photogrammetry. In this process, several high-fidelity digital photographs are captured 360 degrees around the subject. These photos are analyzed by a computer algorithm that identifies matching features between photographs on a per-pixel basis. The identified features are then used to triangulate positions in 3D space, allowing a 3D model of the object to be generated. The color information from the initial photographs is then mapped to the surface of the object to achieve a realistic digital replica. Select pieces of the Pushke Collection were further refined by an artist using digital sculpting and painting software to correct imperfections resulting from the capture process, and the entire digital collection was optimized for more efficient viewing on the web.
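
The triangulation step at the heart of photogrammetry can be sketched in a few lines. This is a generic illustration of the Direct Linear Transform, not the Duderstadt Center's actual pipeline: given two camera projection matrices and a matched point in each photo, the 3D position falls out of a small singular value decomposition.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Triangulate one 3D point from two views via the Direct
    Linear Transform (DLT). P1, P2 are 3x4 camera projection
    matrices; x1, x2 are the matched 2D pixel coordinates."""
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous solution is the right singular vector
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

# Two toy cameras: one at the origin, one shifted 1 unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])

# Project a known 3D point into both cameras, then recover it.
X_true = np.array([0.5, 0.2, 4.0])
x1 = P1 @ np.append(X_true, 1.0)
x1 = x1[:2] / x1[2]
x2 = P2 @ np.append(X_true, 1.0)
x2 = x2[:2] / x2[2]

X_est = triangulate(P1, P2, x1, x2)
print(np.allclose(X_est, X_true))  # True
```

In a real reconstruction this is repeated for thousands of matched features, and the recovered points are meshed and textured as described above.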

A web viewer was then developed and integrated into the Frankel Center’s WordPress site to display and allow manipulation of the various pushkes in the collection. The viewer allows each pushke to be rotated 360 degrees and zoomed in or out, offering more detailed viewing than traditional photographs typically allow.

The result of this effort, the Frankel Center’s online exhibit, “Charity Saves from Death: The Jewish Tradition of Tsedakah as Exemplified in Pushkes” can be viewed here: https://exhibits.judaic.lsa.umich.edu/pushke

Leisure & Luxury in the Age of Nero: VR Exhibit for the Kelsey Museum

The Kelsey Museum’s most grandiose exhibition to date, Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii, features over 230 artifacts from the ancient world. These artifacts originate from the ancient villas of Oplontis, an area near Pompeii that was destroyed when Mt. Vesuvius erupted.

Real world location of the ancient villa of Oplontis

The traveling exhibit explores the lavish lifestyle and economic interests of ancient Rome’s wealthiest. The site is still being excavated and is currently off limits to the general public, but as part of the Kelsey’s exhibit, visitors will get to experience the location with the assistance of virtual reality headsets like the Oculus Rift and tablet devices.

The Duderstadt Center worked closely with curator Elaine Gazda as well as the Oplontis Project team from the University of Texas to optimize a virtual re-creation for the Oculus Rift & MIDEN and to generate panoramic viewers for tablet devices. The virtual environment uses high resolution photos and scan data captured on location, mapped to the surface of 3D models to give a very real sense of being at the real-world location.

Visitors to the Kelsey can navigate Oplontis in virtual reality through the use of an Oculus Rift headset, or through panoramas presented on tablet devices.

Visitors to the Kelsey can traverse this recreation using the Rift, or they can travel to the Duderstadt to experience it in the MIDEN. Not only can viewers experience the villas as they appear in the modern day, they can also toggle on an artist’s re-creation of what the villas would have looked like before their destruction. In the re-created version of the scene, the ornate murals covering the walls of the villa are restored, and foliage and ornate statues populate the scene. Alongside the virtual reality experience, the Kelsey Museum will also house a physically reconstructed replica of one of the rooms found in the villa as part of the exhibit.

Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii is a traveling exhibit that will also be on display at the Museum of the Rockies at the Montana State University, Bozeman, and the Smith College Museum of Art in Northampton, Massachusetts.

On Display at the Kelsey Museum, Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii

S.C.I Hard Available in App Store

Those with spinal cord injuries (SCI) encounter a drastically different world when they are released from the hospital. With varying degrees of disability, mobility and function, the world around them becomes a collection of physical and mental challenges, a complete departure from their previous lifestyles. Whether they are on crutches or in manual or automatic wheelchairs, they need to learn mobility, scheduling, and social tasks once again.

Players in S.C.I Hard must navigate a chaotic club scene to wrangle escaped tarsier monkeys

S.C.I Hard is a mobile game developed by the Duderstadt Center and designed by Dr. Michelle Meade for the Center for Technology & Independence (TIKTOC RERC) with funding from a NIDRR Field Initiated Development Grant.

Its purpose is to assist persons with spinal cord injury in developing and applying the necessary skills to keep their bodies healthy while managing the many aspects of SCI care, serving as a fun and engaging manual for individuals with spinal cord injuries learning independence. Tasks such as scheduling, mobility, and social interaction are all integrated subtly into the game. Players engage in goofy quests, from befriending roid-raging girl scouts in the park to collecting tarsier monkeys running rampant at a night club. The goal of S.C.I Hard was to be different from most medically oriented games, so players don’t feel like they’re being lectured or bombarded with boring medical jargon, and instead learn the important concepts of their condition in a more light-hearted and engaging way.

Players shop for a handicap accessible vehicle to take their road test as they learn independence

With more than 30 different scenarios and mini-games, a full cast of odd characters to talk with, and dozens of collectible items and weapons, only you can save the town from impending doom. S.C.I Hard puts you, the player, in the chair of someone with a spinal cord injury, introducing you to new challenges and obstacles, all while trying to save the world from legions of mutated animals. Join the fight and kick a** while sitting down!

S.C.I Hard is now available for free on Apple and Android devices through their respective app stores, but requires participation in the subsequent study or feedback group to play:

Apple Devices: https://itunes.apple.com/us/app/sci-hard/id1050205395?mt=8

Android Devices: https://play.google.com/store/apps/details?id=edu.umich.mobile.SciHard&hl=en

To learn more about the subsequent study or to participate in the study involving S.C.I Hard, visit:
http://cthi.medicine.umich.edu/projects/tiktoc-rerc/projects/r2

Michigan Alumnus: Libraries with No Limits

The Duderstadt Center’s MIDEN is featured on the cover of the Michigan Alumnus with the caption “Libraries of the Future”. This tribute to Michigan’s high-tech libraries is continued on page 36 with an article that explores the new additions to our libraries that enhance student and instructor experiences. The article introduces new visualization stations in the Duderstadt Center (dubbed “VizHubs”) that are similar to the type of collaborative work spaces found at Google and Apple.

Read the full article here.

Steel Structures – Collaborative Learning with Oculus Rift

Civil & Environmental Engineering: Design of Metal Structures (CEE413) uses a cluster of Oculus Rift head-mounted displays to visualize buckling metal columns in virtual reality. The cluster is configured in the Duderstadt Center’s Jugular software so that the instructor leads a guided tour using a joystick while three students follow his navigation. This configuration allows the instructor to control movement around the virtual object while students are only able to look around.
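
The leader/follower setup can be illustrated with a toy model. Jugular's actual API is not public, so the class and method names below are hypothetical; the point is the asymmetry the article describes: the instructor's joystick translates every viewer, while students may only rotate their own view.

```python
import numpy as np
from dataclasses import dataclass, field

@dataclass
class Viewer:
    """One display node: shares the leader's position but keeps
    its own independent head orientation (yaw, in radians)."""
    yaw: float = 0.0
    position: np.ndarray = field(default_factory=lambda: np.zeros(3))

class GuidedTour:
    """Hypothetical sketch of a guided-tour cluster: the leader's
    joystick moves the whole group; followers can only look around."""
    def __init__(self, leader, followers):
        self.leader = leader
        self.followers = followers

    def joystick_move(self, delta):
        # Only the leader's input translates the group, but the
        # translation is applied to every viewer in the cluster.
        for v in [self.leader] + self.followers:
            v.position = v.position + np.asarray(delta, dtype=float)

    def look(self, viewer, dyaw):
        # Any viewer may rotate their own view independently.
        viewer.yaw += dyaw

instructor = Viewer()
students = [Viewer() for _ in range(3)]
tour = GuidedTour(instructor, students)

tour.joystick_move([0.0, 0.0, -2.0])   # instructor steps the group forward
tour.look(students[0], np.pi / 2)      # one student looks to the side
print(students[0].position)            # matches the instructor's position
```

Every viewer ends up at the same place in the virtual scene, while each student's gaze direction remains their own.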

Developed in collaboration with the Visualization Studio using the Duderstadt Center’s Jugular software, the simulation can run either on an Oculus Rift or within the MIDEN.

Robert Alexander’s “Audification Explained” Featured on BBC World Service

Sonification is the conversion of data sets to audio. Robert Alexander II, a sonification specialist working with NASA, uses satellite recordings of the sun’s emissions to discover new solar phenomena. The Duderstadt Center worked with Robert to produce a short video explaining the concept of data audification.
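
At its simplest, audification means playing a data series back directly as a waveform. Here is a minimal sketch of that idea (not Alexander's actual tooling) using only NumPy and Python's standard wave module: the samples are normalized to the 16-bit PCM range and written out as a mono WAV file.

```python
import wave
import numpy as np

def audify(data, path, rate=44100):
    """Audification: write a data series straight into a WAV file,
    normalized to the 16-bit PCM range, one data point per sample."""
    x = np.asarray(data, dtype=float)
    x = x - x.mean()                      # remove any DC offset
    peak = np.abs(x).max() or 1.0
    pcm = np.int16(x / peak * 32767)      # scale into int16 range
    with wave.open(path, "wb") as w:
        w.setnchannels(1)                 # mono
        w.setsampwidth(2)                 # 16-bit samples
        w.setframerate(rate)
        w.writeframes(pcm.tobytes())

# Toy stand-in for a satellite data series: an oscillation plus noise.
t = np.arange(44100)
series = np.sin(2 * np.pi * 440 * t / 44100) + 0.1 * np.random.randn(t.size)
audify(series, "sun.wav")                 # one second of audio
```

Real solar data spans days or weeks, so in practice the playback rate is chosen to compress long stretches of measurements into a few audible seconds.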

Recently Robert was featured in a BBC World Service clip along with his video about making music from the sun: http://www.bbc.co.uk/programmes/p03crzsv

Lia Min: RAW, April 7 – 8

Research fellow Lia Min will be exhibiting “RAW” in the 3D Lab’s MIDEN April 7 and 8 from 4-6 p.m. All are welcome to attend. Lia Min’s exhibit is an intersection of art and science, assembled through her training as a neuroscientist. Her data set, commonly referred to as a “Brainbow,” focuses on a lobe of a fruit fly brain at the base of an antenna. The visualization scales microns to centimeters to enlarge the specimen to an overall visual volume of about 1.8 x 1.8 x 0.4 meters.

Mammoth Calf Lyuba On Display

Mammoth Calf Lyuba, a Collaborative Exploration of Data

On Nov. 17-19, the Duderstadt Center’s visualization expert, Ted Hall, will be in Austin, Texas, representing the Duderstadt Center at SC15, a supercomputing conference. The technology on display will allow people in Austin to be projected into the MIDEN, the University of Michigan’s immersive virtual reality cave, allowing visitors in both Ann Arbor and Austin to explore the body of a mummified mammoth.

The mummified remains of Lyuba.

The mammoth in question is a calf called Lyuba, found in Siberia in 2007 after being preserved underground for 50,000 years. This specimen is considered the best preserved mammoth mummy in the world, and is currently on display in the Shemanovskiy Museum and Exhibition Center in Salekhard, Russia.

University of Michigan Professor Daniel Fisher and his colleagues at the University of Michigan Museum of Paleontology arranged to have the mummy scanned using X-Ray computed tomography in Ford Motor Company’s Nondestructive Evaluation Laboratory. Adam Rountrey then applied a color map to the density data to reveal the internal anatomical structures.

Lyuba with her skeleton visible.

The Duderstadt Center received this data as an image stack for interactive volumetric visualization. The stack comprises 1,132 JPEG image slices at 762×700 pixels per slice; each resulting voxel is 1 mm cubed.

When this data is brought into the Duderstadt Center’s Jugular software, the user can interactively slice through the mammoth’s total volume by manipulating a series of hexagonal planes, revealing the internal structure. In the MIDEN, the user can explore the mammoth in the same way while the mammoth appears to exist in front of them in three virtual dimensions. The MIDEN’s Virtual Cadaver used a similar process.
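
The mechanics of turning an image stack into a sliceable volume can be sketched with a tiny synthetic example (the dimensions and function here are illustrative, not Jugular's implementation): each 2D slice becomes one layer of a 3D array, and a cutting plane is just an index along one axis.

```python
import numpy as np

# Build a toy volume the way the Lyuba stack is described: one 2D
# image per slice, stacked along the z axis. The real data is 1,132
# slices of 762x700 pixels at 1 mm per voxel; here we use a small
# synthetic gradient volume so the example runs instantly.
slices = [np.full((8, 10), z, dtype=np.uint8) for z in range(12)]
volume = np.stack(slices, axis=0)          # shape: (z, y, x)

def axis_slice(vol, axis, index):
    """Extract one axis-aligned cutting plane from the volume,
    like dragging a slicing plane through the specimen. (The real
    viewer also supports oblique planes; this sketch keeps to the
    axis-aligned case.)"""
    return np.take(vol, index, axis=axis)

print(volume.shape)                        # (12, 8, 10)
print(axis_slice(volume, 0, 5).shape)      # a z-slice: (8, 10)
print(axis_slice(volume, 2, 3).shape)      # an x-slice: (12, 8)
```

Color-mapping each slice by density, as Rountrey did for the CT data, then amounts to passing these 2D arrays through a lookup table before display.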

For the demo at SC15, users in Texas can occupy the same virtual space as another user in Ann Arbor’s MIDEN. Via a Kinect sensor in Austin, a 3D mesh of the user will be projected into the MIDEN alongside Lyuba, allowing for simultaneous interaction and exploration of the data.
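
The first step in projecting a Kinect user into a shared virtual space is back-projecting each depth pixel into 3D through the pinhole camera model. The sketch below is generic, with made-up intrinsics, not the actual SC15 pipeline; the resulting point cloud is what gets meshed and rendered in the remote scene.

```python
import numpy as np

def depth_to_points(depth, fx, fy, cx, cy):
    """Back-project a depth image (in meters) to a 3D point cloud
    using pinhole intrinsics: X = (u - cx) * Z / fx,
    Y = (v - cy) * Z / fy, Z = depth."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    z = depth
    x = (u - cx) * z / fx
    y = (v - cy) * z / fy
    return np.dstack([x, y, z]).reshape(-1, 3)

# Toy 4x4 depth image: a flat wall 2 m from the sensor. The focal
# length and principal point are illustrative values, not calibrated
# Kinect intrinsics.
depth = np.full((4, 4), 2.0)
pts = depth_to_points(depth, fx=365.0, fy=365.0, cx=2.0, cy=2.0)
print(pts.shape)   # (16, 3): one 3D point per depth pixel
```

A live system repeats this at 30 frames per second and streams the meshed result over the network to the MIDEN's renderer.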

Showings will take place in the MIDEN

Sean Petty and Ted Hall simultaneously explore the Lyuba data set, with Ted’s form being projected into the virtual space of the MIDEN via Kinect sensor.

More about the Lyuba specimen:
Fisher, Daniel C.; Shirley, Ethan A.; Whalen, Christopher D.; Calamari, Zachary T.; Rountrey, Adam N.; Tikhonov, Alexei N.; Buigues, Bernard; Lacombat, Frédéric; Grigoriev, Semyon; Lazarev, Piotr A. (July 2014). “X-ray Computed Tomography of Two Mammoth Calf Mummies.” Journal of Paleontology 88(4):664-675. DOI: http://dx.doi.org/10.1666/13-092
https://en.wikipedia.org/wiki/Lyuba
http://www.dallasnews.com/lifestyles/travel/headlines/20100418-42-000-year-old-baby-mammoth-4566.ece

Surgical Planning for Dentistry: Digital Manipulation of the Jaw

CT data was brought into Zbrush & Topogun to be segmented and re-topologized. Influence was then added to the skin mesh allowing it to deform as the bones were manipulated.

Hera Kim-Berman is a Clinical Assistant Professor with the University of Michigan School of Dentistry. She recently approached the Duderstadt Center with an idea that would allow surgeons to prototype jaw surgery for a specific patient using data extracted from CT scans. Hera’s concept involved the ability to digitally manipulate portions of the skull in virtual reality, just as surgeons would when physically working with a patient, allowing them to preview different scenarios and evaluate how effective a procedure might be prior to engaging in surgery.

Before re-positioning the jaw segments, the jaw has a shallow profile.

After providing the Duderstadt Center with CT scan data, Shawn O’Grady was able to extract 3D meshes of the patient’s skull and skin using Magics. From there, Stephanie O’Malley worked with the models to make them interactive and suitable for real-time platforms. This involved bringing the skull into software such as ZBrush and creating slices in the mesh to correspond to areas identified by Hera as places where the skull would potentially be segmented during surgery. The mesh was then optimized to perform at a higher frame rate when incorporated into real-time platforms. The skin mesh also underwent a process called “re-topologizing,” which allowed it to deform more smoothly. From there, the segmented pieces of the skull were re-assembled and assigned influence over areas of the skin in a process called “rigging.” This allowed areas of the skin to move with selected bones as they were separated and shifted by a surgeon in 3D space.
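
The “rigging” step described above is commonly implemented as linear blend skinning: each skin vertex moves as the weight-blended sum of its bones' transforms. The sketch below is illustrative (toy vertices and weights, not the actual project data), with one bone held fixed for the skull and one translated to stand in for an advanced jaw segment.

```python
import numpy as np

def skin(vertices, weights, transforms):
    """Linear blend skinning: each deformed vertex is the
    weight-blended sum of every bone's 4x4 transform applied
    to the rest-pose vertex."""
    v_h = np.hstack([vertices, np.ones((len(vertices), 1))])  # homogeneous
    out = np.zeros_like(v_h)
    for b, T in enumerate(transforms):
        out += weights[:, b:b + 1] * (v_h @ T.T)
    return out[:, :3]

def translate(dx, dy, dz):
    T = np.eye(4)
    T[:3, 3] = [dx, dy, dz]
    return T

# Two "bones": the skull stays put; the jaw segment slides forward.
rest = np.array([[0.0,  0.0, 0.0],    # vertex bound to the skull
                 [0.0, -1.0, 0.0],    # vertex bound to the jaw
                 [0.0, -0.5, 0.0]])   # vertex on the cheek, shared
weights = np.array([[1.0, 0.0],
                    [0.0, 1.0],
                    [0.5, 0.5]])      # each row sums to 1
bones = [np.eye(4), translate(0.0, 0.0, 1.0)]  # advance the jaw 1 unit

deformed = skin(rest, weights, bones)
print(deformed)
# The skull vertex stays fixed, the jaw vertex moves the full unit,
# and the shared cheek vertex moves halfway, following both bones.
```

This halfway blend on shared vertices is exactly why the skin appears to stretch naturally as a surgeon drags a bone segment in the simulation.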

After re-positioning of the jaw segments, the jaw is more pronounced.

Once a working model was achieved, it was passed off to Ted Hall and student programmer Zachary Kiekover to be implemented in the Duderstadt Center’s Jugular engine, allowing the demo to run at large scale in stereoscopic 3D within the virtual reality MIDEN as well as on smaller head-mounted displays like the Oculus Rift. Additionally, more intuitive user controls were added, allowing easier selection of the various bones using a game controller or motion-tracked hand gestures via the Leap Motion. This meant surgeons could not only view the procedure from all angles in stereoscopic 3D, but also physically grab the bones they wanted to manipulate and transpose them in 3D space.

Zachary demonstrates the ability to manipulate the model using the Leap Motion.