Fall 2024 XR Classes

Looking for Classes that incorporate XR?

EECS 440 – Extended Reality for Social Impact (Capstone / MDE)

More Info Here
Contact with Questions:
Austin Yarger
ayarger@umich.edu

Extended Reality for Social Impact — Design, development, and application of virtual and augmented reality software for social impact. Topics include: virtual reality, augmented reality, game engines, ethics / accessibility, interaction design patterns, agile project management, stakeholder outreach, XR history / culture, and portfolio construction. Student teams develop and exhibit socially impactful new VR / AR applications.


ENTR 390.005 & 390.010 – Intro to Entrepreneurial Design, VR Lab

More Info Here
Contact with Questions:
Sara ‘Dari’ Eskandari
seskanda@umich.edu

In this lab, you’ll learn how to develop virtual reality content for immersive experiences on the Meta Quest, in the MIDEN, or for virtual production, using Unreal Engine and 3D modeling software. You’ll also be introduced to asset creation and scene assembly by bringing assets into Unreal Engine and creating interactive experiences. At the end of the class you’ll be capable of developing virtual reality experiences, simulations, and tools to address real-world problems.

Students will:
    • understand how to generate digital content for Virtual Reality platforms;
    • be knowledgeable about versatile file formats, content pipelines, hardware platforms, and industry standards;
    • understand methods of iterative design and the creation of functional prototypes using this medium;
    • employ what is learned in the lecture section of this course to determine what is possible, what is marketable, and what distribution methods are available within this platform;
    • become familiar with documenting their design process and pitching their ideas to others, receiving and providing quality feedback.


UARTS 260 – Empathy in Pointclouds

More Info Here
Contact with Questions:
Dawn Gilpin
dgilpin@umich.edu

Empathy In Point Clouds: Spatializing Design Ideas and Storytelling through Immersive Technologies integrates LiDAR scanning, photogrammetry, and Unreal Engine into education, expanding the possible methodologies and processes of architectural design. Entering our third year of the FEAST program, we turn our attention to storytelling and worldbuilding using site-specific point cloud models as the context for our narratives. This year the team will produce 1-2 spatial narratives for the three immersive technology platforms we are working with: the Meta Quest VR headset, the MIDEN (VR CAVE), and the LED stage.


ARTDES 217 – Bits and Atoms

More Info Here
Contact with Questions:
Sophia Brueckner
sbrueckn@umich.edu

This is an introduction to digital fabrication within the context of art and design. Students learn about the numerous types of software and tools available and develop proficiency with the specific software and tools at Stamps. Students discuss the role of digital fabrication in creative fields.


ARTDES 420 – Sci-Fi Prototyping

More Info Here
Contact with Questions:
Sophia Brueckner
sbrueckn@umich.edu

This course ties science fiction with speculative/critical design as a means to encourage the ethical and thoughtful design of new technologies. With a focus on the creation of functional prototypes, this course combines the analysis of science fiction with physical fabrication or code-based interpretations of the technologies they depict.


SI 559 – Introduction to AR/VR Application Design

More Info Here
Contact with Questions:
Michael Nebeling
nebeling@umich.edu

This course will introduce students to Augmented Reality (AR) and Virtual Reality (VR) interfaces. This course covers basic concepts; students will create two mini-projects, one focused on AR and one on VR, using prototyping tools. The course requires neither special background nor programming experience.


FTVM 394 / DIGITAL 394 – Topics in Digital Media Production, Virtual Reality

More Info Here
Contact with Questions:
Yvette Granata
ygranata@umich.edu

This course provides an introduction to key software tools, techniques, and fundamental concepts supporting digital media arts production and design. Students will learn and apply the fundamentals of design and digital media production with software applications and web-based coding techniques, and study the principles of design that translate across multiple forms of media production.


UARTS 260/360/460/560 – THE BIG CITY: Lost & Found in XR

More Info Here
Contact with Questions:
Matthew Solomon & Sara Eskandari
mpsolo@umich.edu / seskanda@umich.edu

No copies are known to exist of the 1928 lost film THE BIG CITY; only still photographs, a cutting continuity, and a detailed scenario of the film survive. This is truly a shame because the film featured a critical mass of Black performers — something extremely uncommon at the time. Using Unreal Engine, detailed 3D model renderings, and live performance, students will take users back in time into the fictional Harlem Black Bottom cabaret and clubs shown in the film. Students will experience working in a small game development team to create a high-fidelity historical recreation of the sets using 3D modeling, 2D texturing, level design, and game development pipelines. They will experience a unique media pipeline of game design for live performance and cutting-edge virtual production. The project will also dedicate focus to detailed documentation, honoring the preservation of THE BIG CITY that allows us to attempt this endeavor and the Black history that fuels it.


MOVESCI 313 – The Art of Anatomy

Contact with Questions:
Melissa Gross & Jenny Gear
mgross@umich.edu / gearj@umich.edu

Learn about human anatomy and how it has been taught throughout history, covering a variety of mediums including the recent adoption of XR tools. Students will get hands-on experience integrating and prototyping AR and VR visualization technologies for medical and anatomical study.


ARCH 565 – Research in Environmental Technology

Contact with Questions:
Mojtaba Navvab
moji@umich.edu

The focus of this course is the introduction to research methods in environmental technology. Qualitative and quantitative research results are studied with regard to their impact on architectural design. Each course participant undertakes an investigation in a selected area of environmental technology. The experimental approach may use physical modeling, computer simulation, or other appropriate methods, including VR.


FTVM 455.004 – Topics in Film: Eco Imaginations
WGS 412.001 – Fem Art Practices

Contact with Questions:
Petra Kuppers
petra@umich.edu

These courses will include orientations to XR technologies and sessions leveraging Unreal Engine and Quixel 3D assets to create immersive virtual reality environments.

Security Robots Study

Using XR to conduct studies in robotics

Maredith Byrd


Xin Ye is a University of Michigan Master’s student at the School of Information. She approached the Duderstadt Center with her Master’s thesis defense project to test the favorability of humanoid robots. Stephanie O’Malley at the Visualization Studio helped Xin develop a simulation featuring three types of security robots with varying features to see whether a more humanoid robot is perceived more favorably.

Panoramic of Umich Hallway

The simulation’s goal is to make participants feel as if they are interacting with a real robot standing in front of them, so the MIDEN was the perfect tool for this experiment. The MIDEN (Michigan Immersive Digital Experience Nexus) is a 10 x 10 x 10 foot cube that relies on projections so the user can walk naturally within a virtual environment. An environment is constructed in Unreal Engine and projected into the MIDEN, allowing the user to see their physical body within a highly detailed, projected digital world.

Panoramic of the MIDEN

Users step into the MIDEN and, wearing 3D glasses, are immersed in a digital environment that recreates common locations on a college campus, such as a university hallway/commons area or an outdoor parking lot. After a short while, the participant gains the attention of the security robot, which approaches to question them.

Setting up the MIDEN

Xin Ye then triggers the appropriate response so users think the robot is responding intelligently. The robots were all configured with different triggerable answers that Xin Ye could initiate from behind the curtains of the MIDEN. This technique is referred to in studies as “Wizard of Oz” because the participant believes the robotic projection has the artificial intelligence a real robot in this situation would possess, when in reality a human is deciding the appropriate response.
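The mechanics of such a setup can be illustrated with a small sketch. The snippet below is a generic, hypothetical “Wizard of Oz” trigger console (not the study’s actual Unreal/MIDEN implementation): the hidden operator presses a key and a pre-scripted robot line is dispatched, so the participant experiences an apparently intelligent reply.

    # Minimal "Wizard of Oz" trigger sketch (illustrative only; not the study's
    # actual Unreal/MIDEN setup). The hidden operator presses a key and a
    # pre-scripted line is sent to whatever plays the robot's voice.
    RESPONSES = {  # hypothetical key-to-line mapping
        "1": "Hello. May I please see your MCard?",
        "2": "Please put on a face mask while you are in this building.",
        "3": "Have you witnessed anything suspicious in this area?",
        "4": "Thank you for your cooperation. Have a nice day.",
    }

    def wizard_console(play_line):
        """Loop until the operator quits; play_line() delivers a line to the robot."""
        while True:
            key = input("Trigger response [1-4, q to quit]: ").strip()
            if key == "q":
                break
            line = RESPONSES.get(key)
            if line:
                play_line(line)  # e.g. forward to the game engine or a TTS engine
            else:
                print("No response mapped to that key.")

    if __name__ == "__main__":
        wizard_console(print)  # stand-in for the real playback hook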

Knightscope
Ramsee
Pepper

This project aimed to evaluate human perception of different types of security robots, some more humanoid than others, to see whether a more humanoid robot was viewed more favorably. Three different types of robots were used: Knightscope, Ramsee, and Pepper. Knightscope is a cone-shaped robot that lacks any humanoid features. Ramsee is a little more humanoid, with simple facial features, while Pepper is the most humanoid, with more complex features as well as arms and legs.

Participants interacted with one of the three robot types. The robot would approach the participant in the MIDEN and question them, asking them to present an MCard, to put on a face mask, or whether they had witnessed anything suspicious. To ensure that the robots all had a fair chance, each used the same “Microsoft David” automated male voice. Once the dialogue chain is complete, the robot thanks the participant and moves away. The participant then removes the 3D glasses and is taken to another location in the building for an exit interview about their interactions with the robot. If any participant realized that it was a human controlling the robot, they were disqualified from the study.
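The study’s actual audio pipeline is not documented here, so the following is only a plausible sketch of how a single fixed Windows voice could deliver the scripted dialogue chain. It assumes a Windows machine with the “Microsoft David” voice installed and the pyttsx3 Python package; the prompt wording is paraphrased from the description above.

    # Sketch: speak the scripted dialogue chain with one fixed voice so every
    # robot condition sounds identical (assumes Windows + pyttsx3).
    import pyttsx3

    DIALOGUE_CHAIN = [  # paraphrased prompts; ordering is hypothetical
        "Hello. May I please see your MCard?",
        "Please put on a face mask.",
        "Have you witnessed anything suspicious nearby?",
        "Thank you. Have a nice day.",
    ]

    engine = pyttsx3.init()
    for voice in engine.getProperty("voices"):
        if "David" in voice.name:  # select the same male voice for every robot
            engine.setProperty("voice", voice.id)
            break

    for line in DIALOGUE_CHAIN:
        engine.say(line)
    engine.runAndWait()  # speaks the queued lines in order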

Knightscope in Hallway
Ramsee in Hallway

Xin Ye presented her findings in a paper titled “Human Security Robot Interaction and Anthropomorphism: An Examination of Pepper, RAMSEE, and Knightscope Robots” at the 32nd IEEE International Conference on Robot & Human Interactive Communication in Busan, South Korea.

Fall 2023 XR Classes

Looking for Classes that incorporate XR?

EECS 498 – Extended Reality & Society


Credits : 4
More Info Here
Contact with Questions:
Austin Yarger
ayarger@umich.edu

From pediatric medical care, advanced manufacturing, and commerce to film analysis, first-responder training, and unconscious bias training, the fledgling, immersive field of extended reality may take us far beyond the realm of traditional video games and entertainment, and into the realm of diverse social impact.

“EECS 498: Extended Reality and Society” is a programming-intensive senior capstone / MDE course that empowers students with the knowledge and experience to…

    • Implement medium-sized virtual and augmented reality experiences using industry-standard techniques and technologies.
        • Game Engines (Unreal Engine / Unity), Design Patterns, Basic Graphics Programming, etc.
    • Design socially-conscious, empowering user experiences that engage diverse audiences.
    • Contribute to cultural discourse on the hopes, concerns, and implications of an XR-oriented future.
        • Privacy / security concerns, XR film review (The Matrix, Black Mirror, etc.)
    • Carry out user testing and employ feedback after analysis.
        • Requirements + Customer Analysis, Iterative Design Process, Weekly Testing, Analytics, etc.
    • Work efficiently in teams of 2-4 using agile production methods and software.
        • Project Management Software (Jira), Version Control (Git), Burndown Charting and Resource Allocation, Sprints, etc.

Students will conclude the course with at least three significant, socially-focused XR projects in their public portfolios.

 

ENTR 390 – Intro to Entrepreneurial Design, VR Lab


Credits : 3
More Info Here
Contact with Questions:
Sara ‘Dari’ Eskandari
seskanda@umich.edu

In this lab, you’ll learn how to develop virtual reality content for immersive experiences on the Oculus Rift, in the MIDEN, or for virtual production, using Unreal Engine and 3D modeling software. You’ll also be introduced to asset creation and scene assembly by bringing assets into Unreal Engine and creating interactive experiences. At the end of the class you’ll be capable of developing virtual reality experiences, simulations, and tools to address real-world problems.

Students will:
    • understand how to generate digital content for Virtual Reality platforms;
    • be knowledgeable about versatile file formats, content pipelines, hardware platforms, and industry standards;
    • understand methods of iterative design and the creation of functional prototypes using this medium;
    • employ what is learned in the lecture section of this course to determine what is possible, what is marketable, and what distribution methods are available within this platform;
    • become familiar with documenting their design process and pitching their ideas to others, receiving and providing quality feedback.

 

FTVM 307 – Film Analysis for Filmmakers


Credits : 3
More Info Here
Contact with Questions:
Matthew Solomon
mpsolo@umich.edu

Filmmakers learn about filmmaking by watching films. This course reverse engineers movies to understand how they were produced. The goal is to learn from a finished film how the scenes were produced in front of the camera and microphone and how the captured material was edited. Students in this class use VR to reimagine classic film scenes, giving them the ability to record and edit footage from a virtual set.

 

UARTS 260 / EIPC FEAST – Empathy in Pointclouds


Credits: 1-5
More Info Here
Contact with Questions:
Dawn Gilpin
dgilpin@umich.edu

Empathy In Point Clouds: Spatializing Design Ideas and Storytelling through Immersive Technologies integrates LiDAR scanning, photogrammetry, and Unreal Engine into education, expanding the possible methodologies and processes of architectural design. Entering our third year of the FEAST program, we turn our attention to storytelling and worldbuilding using site-specific point cloud models as the context for our narratives. This year the team will produce 1-2 spatial narratives for the three immersive technology platforms we are working with: VR headset, the MIDEN (VR CAVE), and the LED stage.

 

 

ARTDES 217 – Bits and Atoms


Credits: 3
More Info Here
Contact with Questions:
Sophia Brueckner
sbrueckn@umich.edu

This is an introduction to digital fabrication within the context of art and design. Students learn about the numerous types of software and tools available and develop proficiency with the specific software and tools at Stamps. Students discuss the role of digital fabrication in creative fields.

 

ARTDES 420 – Sci-Fi Prototyping


Credits: 3
More Info Here
Contact with Questions:
Sophia Brueckner
sbrueckn@umich.edu

This course ties science fiction with speculative/critical design as a means to encourage the ethical and thoughtful design of new technologies. With a focus on the creation of functional prototypes, this course combines the analysis of science fiction with physical fabrication or code-based interpretations of the technologies they depict.

 

SI 559 – Introduction to AR/VR Application Design

Credits: 3
More Info Here
Contact with Questions:
Michael Nebeling
nebeling@umich.edu

This course will introduce students to Augmented Reality (AR) and Virtual Reality (VR) interfaces. This course covers basic concepts; students will create two mini-projects, one focused on AR and one on VR, using prototyping tools. The course requires neither special background nor programming experience.

 

FTVM 394 – Digital Media Production, Virtual Reality

Credits: 4
More Info Here
Contact with Questions:
Yvette Granata
ygranata@umich.edu

This course provides an introduction to key software tools, techniques, and fundamental concepts supporting digital media arts production and design. Students will learn and apply the fundamentals of design and digital media production with software applications and web-based coding techniques, and study the principles of design that translate across multiple forms of media production.

Architectural Lighting Scenarios Envisioned in the MIDEN

ARCH 535 & ARCH 545, Winter 2022

Mojtaba Navvab, Ted Hall


Prof. Mojtaba Navvab teaches environmental technology in the Taubman College of Architecture and Urban Planning, with particular interests in lighting and acoustics.  He is a regular user of the Duderstadt Center’s MIDEN (Michigan Immersive Digital Experience Nexus) – in teaching as well as sponsored research.

On April 7, 2022, he brought a combined class of ARCH 535 and ARCH 545 students to the MIDEN to see, and in some cases hear, their projects in full-scale virtual reality.

Recreating the sight and sound of the 18-story atrium space of the Hyatt Regency Louisville, where the Kentucky All State Choir gathers to sing the National Anthem.

ARCH 535: To understand environmental technology design techniques through case studies and compliance with building standards. VR applications are used to view the design solutions.

ARCH 545: To apply the theory, principles, and lighting design techniques using a virtual reality laboratory.

“The objectives are to bring whatever you imagine to reality in a multimodal perception; in the MIDEN environment, whatever you create becomes a reality.  This aims toward simulation, visualization, and perception of light and sound in a virtual environment.”

Recreating and experiencing one of the artworks by James Turrell.

“Human visual perception is psychophysical because any attempt to understand it necessarily draws upon the disciplines of physics, physiology, and psychology.  A ‘Perceptionist’ is a person concerned with the total visual environment as interpreted in the human mind.”

“Imagine if you witnessed or viewed a concert hall or a choir performance in a cathedral.  You could describe the stimulus generated by the architectural space by considering each of the senses independently as a set of unimodal stimuli.  For example, your eyes would be stimulated with patterns of light energy bouncing off the simulated interior surfaces or luminous environment while you listen to an orchestra playing or choir singing with a correct auralized room acoustics.”

A few selected images photographed in the MIDEN are included in this article.  For the user wearing the stereoscopic glasses, the double images resolve into an immersive 3D visual experience that they can step into, with 270° of peripheral vision.

Students explore a daylight design solution for a library.

Learning to Develop for Virtual Reality – The ENTR 390 “VR Lab”

XR Prototyping

For the past several years, students enrolled in the Center for Entrepreneurship’s Intro to Entrepreneurial Design Virtual Reality course have been introduced to programming and content creation pipelines for XR development using a variety of Visualization Studio resources. Their goal? Create innovative applications for XR. From creating video games to changing the way class material is accessed with XR-capable textbooks, if you have an interest in learning how to make your own app for the Oculus Rift, the MIDEN, or even a smartphone, this might be a class to enroll in. Students interested in this course are not required to have any prior programming or 3D modeling knowledge, and if you’ve never used a VR headset, that’s OK too. This course will teach you everything you need to know.

Henry Duhaime presents his VR game for Oculus Rift, in which players explore the surface of Mars in search of a missing NASA rover.
Michael Meadows prototypes AR capable textbooks using a mobile phone and Apple’s ARKit.

Multi-Sensing the Universe

Envisioning a Toroidal Universe

Robert Alexander teamed with Danielle Battaglia, a senior in Art & Design, to compose and integrate audio effects into her conceptual formal model of the Toroidal Universe.  Danielle combined Plato’s notion of the universe as a dodecahedron with modern notions of black holes, worm holes, and child universes.  Their multi-sensory multiverse came together in the MIDEN and was exhibited there as part of the Art & Design senior integrative art exhibition.

Interested in using the MIDEN to do something similar? Contact us.

Students Learn 3D Modeling for Virtual Reality

making tiny worlds

Stephanie O’Malley


ArtDes 240 is a course offered by the Stamps School of Art & Design and taught by Stephanie O’Malley that teaches students 3D modeling and animation. As one of only a few 3D digital classes offered at the University of Michigan, AD240 sees student interest from several schools across campus, with students looking to gain a better understanding of 3D art as it pertains to the video game industry.

The students in AD240 are given a crash-course in 3D modeling in 3D Studio Max and level creation within the Unreal Editor. It is then within Unreal that all of their objects are positioned, terrain is sculpted, and atmospheric effects such as time of day, weather, or fog can be added.

“Candyland” – Elise Haadsma & Heidi Liu, developed using 3D Studio Max and Unreal Engine

With just five weeks to model an entire environment, bring it into Unreal, package it as an executable, and test it in the MIDEN (or on the Oculus Rift), students produced truly impressive projects. Art & Design students Elise Haadsma and Heidi Liu took inspiration from the classic board game Candyland to create a life-size game board environment in Unreal consisting of a lollipop forest, mountains of Hershey’s Kisses, even a gingerbread house and a chocolate river.

Lindsay Balaka, from the School of Music, Theatre & Dance, chose to create her scene using the Duderstadt Center’s in-house rendering software, Jugular, instead of Unreal Engine. Her creation, “Galaxy Cakes,” is a highly stylized cupcake shop (reminiscent of an episode of the 1960s cartoon The Jetsons), complete with spatial audio emanating from the corner jukebox.

Lindsay Balaka’s “Galaxy Cakes” environment
An abandoned school, created by Vicki Liu in 3D Studio Max and Unreal Engine

Vicki Liu, also of Art & Design, created a realistic horror scene using Unreal. After navigating down a poorly lit hallway of an abandoned nursery school, you find yourself in a run-down classroom inhabited by some kind of madman. A tally of days has been scratched into the walls, an eerie message is scrawled on the chalkboard, and furniture haphazardly barricades the windows.

While the goal of the final project was to create a traversable environment for virtual reality, some students took it a step further.

Art & Design student Gus Schissler created an environment composed of neurons in Unreal, intended for viewing within the Oculus Rift. He then integrated data from an EPOC headset (a device capable of reading brain waves) to allow the viewer to interact with the environment using only their mind. The viewer’s mood, as detected by the headset, changed the way the environment looked by adjusting the intensity of the light emitted by the neurons, and the viewer could also think specific commands (push, pull, etc.) to navigate past various obstacles in the environment.
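As a rough illustration of that mapping (the project’s actual Unreal integration and headset SDK calls are not documented here, so every name below is hypothetical), a mood score and a classified mental command might drive the scene like this:

    # Hypothetical sketch: a mood score scales the neuron glow and a classified
    # mental command nudges the viewer through the scene. Not the real project code.
    def apply_brain_input(scene, mood, command):
        """mood in [0, 1]; command is 'push', 'pull', or 'neutral'."""
        mood = max(0.0, min(1.0, mood))
        scene["neuron_light_intensity"] = 0.2 + 0.8 * mood  # brighter with higher mood
        if command == "push":
            scene["viewer_z"] += 1.0   # move forward past an obstacle
        elif command == "pull":
            scene["viewer_z"] -= 1.0   # back away

    scene = {"neuron_light_intensity": 0.5, "viewer_z": 0.0}
    apply_brain_input(scene, mood=0.8, command="push")
    print(scene)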

Students spent the last two weeks of the semester scheduling time with Ted Hall and Sean Petty to test their scenes and ensure everything ran and looked correct on the day of their presentations. This class not only introduced students to the design process, but also gave them hands-on experience with emerging technologies as virtual reality continues to expand in the game and film industries.

Student Gus Schissler demonstrates his neuron environment for the Oculus Rift, which uses inputs from an EPOC headset for interaction.

Stamps Student Uses Photogrammetry to Miniaturize Herself

Stamps student Annie Turpin came to the Duderstadt Center with an idea for her sophomore studio project: she wanted to create a hologram system, similar to a “Pepper’s Pyramid” or “Pepper’s Ghost” display, that would allow her to project a miniaturized version of herself into a pinhole camera.

Pepper’s Ghost relied on carefully placed mirrors to give the illusion of a transparent figure

The concept of Pepper’s Pyramid is derived from an illusion technique created by John Henry Pepper in 1862. Originally coined “Pepper’s Ghost,” the trick initially relied on a large pane of glass to reflect an illuminated room or person hidden from view. This gave the impression of a “ghost” and became a technique frequently used in theatre to create a phantasmagoria. Similar methods are still used today, often substituting Mylar foil for glass and using CG content (such as the 2012 Coachella performance, in which a “holographic” Tupac was resurrected to sing alongside Dr. Dre).

Pepper’s Pyramid takes the concept of Pepper’s Ghost and gives it three dimensions using a pyramid of Plexiglas instead of mirrors.

“Pepper’s Pyramid” is a similar concept. Instead of a single pane of glass reflecting a single angle, a video is duplicated four times and projected downward onto a pyramid of Plexiglas, allowing the illusion to be viewed from multiple angles and the content to be animated.
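The four-way duplication is straightforward to stage in software. Below is a generic sketch (not Annie’s actual project files) that uses OpenCV to tile one frame four times, rotated so each face of the Plexiglas pyramid reflects an upright copy toward its own viewing angle; the file names and canvas size are assumptions for illustration.

    # Build the classic four-way "pyramid hologram" layout from a single frame.
    import cv2
    import numpy as np

    def pyramid_layout(frame, canvas_size=1080):
        side = canvas_size // 3                     # each copy fills one third
        tile = cv2.resize(frame, (side, side))
        canvas = np.zeros((canvas_size, canvas_size, 3), dtype=np.uint8)
        c = canvas_size // 2 - side // 2            # offset that centers a tile
        canvas[0:side, c:c+side] = cv2.rotate(tile, cv2.ROTATE_180)           # top
        canvas[-side:, c:c+side] = tile                                       # bottom
        canvas[c:c+side, 0:side] = cv2.rotate(tile, cv2.ROTATE_90_CLOCKWISE)  # left
        canvas[c:c+side, -side:] = cv2.rotate(tile, cv2.ROTATE_90_COUNTERCLOCKWISE)  # right
        return canvas

    img = cv2.imread("scan_frame.png")              # hypothetical input frame
    if img is not None:
        cv2.imwrite("pyramid_frame.png", pyramid_layout(img))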

For her project, Annie re-created a small version of Pepper’s Pyramid to fit inside a pinhole camera she had constructed, using a mobile phone instead of a monitor to play the video. She then had herself 3D scanned using the Duderstadt Center’s photogrammetry rig to generate a realistic 3D model of herself, which was animated and exported as an MP4 video.

Annie’s pinhole camera

The process of photogrammetry allows an existing object or person to be converted into a full-color, highly detailed 3D model. This is done using a series of digital photographs captured 360 degrees around the subject. While photogrammetry can be done at home for most static subjects, the Duderstadt Center’s photogrammetry resources are set up to allow moving subjects like people to be scanned as well. The process uses surface detail on the subject to plot points in 3D space and construct a 3D model. For scans of people, these models can even have a digital skeleton created to drive their motion and be animated as CGI characters. Annie’s resulting scan was animated to rotate in place and projected into the Plexiglas pyramid as a “hologram” for viewing through her pinhole camera.
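The underlying idea of plotting surface detail in 3D can be shown with a toy two-view example. The sketch below (a generic OpenCV illustration, not the Duderstadt Center’s capture pipeline) matches feature points between two photos of the same subject, recovers the relative camera pose, and triangulates the matches into a sparse point cloud; the file names and camera intrinsics are assumptions for illustration.

    # Toy two-view photogrammetry: match surface detail, recover pose, triangulate.
    import cv2
    import numpy as np

    img1 = cv2.imread("view1.jpg", cv2.IMREAD_GRAYSCALE)   # hypothetical photos
    img2 = cv2.imread("view2.jpg", cv2.IMREAD_GRAYSCALE)
    K = np.array([[1000.0, 0, 640], [0, 1000.0, 360], [0, 0, 1]])  # assumed intrinsics

    orb = cv2.ORB_create(5000)                              # detect surface detail
    kp1, des1 = orb.detectAndCompute(img1, None)
    kp2, des2 = orb.detectAndCompute(img2, None)
    matches = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True).match(des1, des2)

    pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

    # Estimate the relative camera pose from the matched detail, then triangulate.
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])       # first camera at origin
    P2 = K @ np.hstack([R, t])                              # second camera pose
    points_h = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    points_3d = (points_h[:3] / points_h[3]).T              # sparse 3D point cloud
    print(points_3d.shape)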

The result of 3D printing Annie’s photogrammetry scan

Annie made use of photogrammetry again the following year, when she had herself 3D scanned for the purpose of 3D printing the resulting model for a diorama. In this instance, she was scanned in what is referred to as a “T-pose,” a pose in which the subject stands with their arms and legs apart so their limbs can be articulated into a different position later. After Annie’s model was generated, it was posed to have her sitting in a computer chair working on a laptop. This model was sent to the Duderstadt Center’s J750 3D color printer to produce a 6-inch-high 3D printed model.

This printer allows for full-spectrum color and encases the model in a support structure that must be carefully removed, enabling more intricate features and overhangs on the model.

Annie carefully removes the support structure from her 3D printed model

A duplicate print of Annie’s creation can now be viewed in the display case within the Duderstadt Center’s Fabrication Studio.

Passion & Violence

Anna Galeotti’s MIDEN Installation

Ph.D. Fulbright Scholar Anna Galeotti (Winter 2014) explored the concept of “foam” or “bubbles” as a possible model for audiovisual design elements and their relationships. Her art installation, “Passion and Violence in Brazil,” was displayed in the Duderstadt Center’s MIDEN.

Interested in using the MIDEN to do something similar? Contact us.

Creating Cave-Like Digital Structures with Photogrammetry

Students in Professor Matias Del Campo’s Architecture Thesis class have been exploring organic, cave-like structures for use in a real-world underground architectural space.

His students were tasked with constructing textured surfaces reminiscent of cave interiors, such as stalactites and stalagmites, rocky surfaces, and erosion, using a variety of mediums, from spray foam to poured concrete.

These creations were then scanned at the Duderstadt Center using photogrammetry to convert them to digital form. The resulting digital models could then be altered by the students (retouched, scaled, or mirrored, for example) for design purposes when incorporating the forms into the planned space.