Delving into the art (instead of science) of anatomy

New XR Course for Fall 2024

The first thing the students saw were the bones.

There were more than a hundred of them, stacked neatly in plastic bins on a long table in the front of classroom 2060. Some were long and slender, others bulbous and asymmetrical. All had the same glossy sheen.

From far away, they resembled delicate china figurines. Up close, it was easier to tell that they were 3D-printed versions of the same bones you’d find in a human pelvis or mouth or arm.

The students in the “Art of Anatomy” mini-course rummaged through the bone-like objects, serious expressions on their faces as they deliberated which to choose.

Their assignment that day was to create a sculptural arrangement. It could be anatomically correct; it could resemble nothing that you’d typically find in a skeleton. Then they had to take a picture of their designs and draw the shadows they’d made, using graphite pencils or charcoal.

The activity was intended to explore the body from a different perspective, to discover the angles and shapes its parts could make, and to ask the question that lay at the core of every session in this course: How does interacting with models of our anatomy, which try to approximate the experience of real human bodies, compare to encountering the real thing?

The next hour or so was nearly silent, except for the clunking of the tiny bones and the scratching of the pencils. Maya Moufawad, a pre-dental and art major, had chosen two halves of a jaw, complete with teeth. She fit them together and then affixed them to two smaller bones, making it look like the Flintstones had gotten their hands on a dental mold and decided to display it as art, using bones as the frame.

Movement science student Abby Kramer went with a thoracic vertebra, a lumbar vertebra, and a sacral bone. She liked being able to hold the bones, to turn them around and flip them upside down to better understand their structure and proportion.

She connected the lumbar vertebra directly to the sacrum, which would have been the appropriate location anatomically. But, as she noted later: “There were still a lot of unknowns.” You couldn’t fully understand the body by looking at these bones. They were, both literally and figuratively, missing connective tissue.

“We’re trying to get them to understand that even the most factual anatomical model is still a fiction,” says Jennifer Gear, an art history and movement science lecturer who co-designed and taught the course. “It’s still removed from the body. In what ways and for what reasons? How do you stop thinking about these things as objective truths — but rather, to see them as believable fictions?”

***

When movement science student Regan Lee walked into the Capuchin Crypt in Rome, she, too, was fascinated by the bones.

In this case, they were real human bones from deceased Catholic friars, used to adorn a mausoleum that is like few others on Earth. The crypt is literally decorated with human remains — skulls framing archways, tibias and femurs arranged in elaborate crosses and mandalas on the walls and ceilings.

At the time, Lee was on a day trip during movement science associate professor Melissa Gross’ class, “Art and Anatomy in the Italian Renaissance,” for which students travel to Italy and use the classical statues and paintings of the Renaissance era as a guide to learning about anatomical structures.

As she walked away from the unique crypt, Lee was “nerding out.”

“I think everyone should see this,” she told Gross.

Gross had a different idea.

“What if we made this a class?” she pondered. “Let’s have students make their own art with the bones they’re used to looking at. We could 3D print bones that the students could think critically about.”

“That’s crazy,” Lee responded. “Are you serious?”

***

Gross was indeed. She’d 3D-printed a small number of bones for previous anatomy courses, so she knew it could be done. And she’d spent her career creating innovative interdisciplinary courses in an attempt to engage students, stimulating them to learn in ways that worked better for them.

Together with Gear, she’d applied that paradigm of thinking to create the “Art and Anatomy in the Italian Renaissance” course Lee was so enjoying. She thought she and Gear could build off that successful partnership and come up with a class that challenged students to revisit their preconceptions about both art and the body.

The pair started brainstorming. They wanted to teach a project-based class — one with no tests, plenty of guest speakers, and lots of hands-on activities. They wanted to take students to different locations: the Hatcher Library’s Special Collections Research Center to look at Renaissance-era anatomy books, the Taubman Health Sciences Library to examine digital cadavers via the interactive Anatomage table, and the Visualization Studio at the James and Anne Duderstadt Center on North Campus to play around with bones in virtual and augmented reality.

“I think of the classroom as a sandbox,” Gear says, “and I’m going to bring my best toys. Because I’ve got to be there with the students every day, too, and I don’t want to be bored. So I try to think about what would be fun to do, and this was a class that could lend itself to fun things.”

They wanted to ground the course in an arts-based approach, using critical thinking to respectfully challenge assumptions and foster dialogue that valued different perspectives. To do so, they planned to advertise in different schools on campus to attract students with varying backgrounds.

“Our goal was to open the students’ minds to other ways of seeing, of moving, of experiencing,” Gross says.

***

Coincidentally, the U-M Arts Initiative was looking for proposals for its Arts & Curriculum grant, which promotes the integration of arts into course development and teaching. In November 2022, the initiative gave its approval — and $19,611 worth of funds — to support Gross and Gear’s seven-week-long mini-course.

The pair used some of the grant money to pay Lee, who began the arduous task of printing the bones. Even the smallest ones took hours, and the printers often malfunctioned. Lee stuffed the ones that failed to print in her bag, and they clanked around as she walked.

“Even my apartment had bones everywhere,” Lee says.

Eventually, most of the bones made their way to SKB’s classroom 2060, as did 20 students — some from Kinesiology, some from Engineering, some from the Stamps School of Art and Design.

The students drew the bones, sometimes asking those who specialized in art to help the others portray the structures accurately.

Maya Moufawad drawing her 3D-printed bone sculpture

They paged through 16th-century books full of woodcut illustrations of bodies and bones, their faces full of wonder at the opportunity.

Ariana Ravitz looking at ancient anatomy books

They manipulated a digital cadaver on the Anatomage table, working as a group to make decisions about which bones and muscles and tissues to look at first and how to explore them. In that case, the Kines and biomedical engineering students often took the lead in explaining the names of the bones and where they were located.

They dissected five real animal carcasses and bones that Gross had gotten from generous butchers at Plum Market; one student, who disliked the smell of meat, was able to overcome her discomfort enough to participate with the support of her fellow classmates.

They talked about the ethics of using bones and bodies for research or education. In their reflection for that class session, students discussed whether they would donate their bodies to science given what they’d learned, noting that it was rare for them to feel this comfortable talking about such a difficult topic in class.

It began to feel like a kind of alchemy was taking place on Fridays from noon to 2 p.m.

“Every single class, I found myself being encouraged to think deeper, within my own knowledge and with the help of my peers,” one student wrote in a reflection. “The class’ emphasis on helping each other to understand is something I value so much. In fact, these discussions were so interesting to me that I always called my mom about them afterwards, because I was so excited to continue the conversation.”

“To see them bring their authentic selves to the challenges we’re setting every week, for them to treat it so seriously,” Gross says, “it feels like we’ve touched something important.”

***

On the final day of class, the students had one last opportunity to see the bones in a new way.

The Emerging Technologies Group at the Duderstadt Center had taken the digital files used to 3D print the bones and uploaded them to their visualization platforms, including virtual and augmented reality set-ups.

Movement science student Gordon Luo held a controller in one hand, using his index finger to press a button that grabbed the bone on his computer screen and moved it around. Then he found a way to digitally measure the bone.

“That’s so cool,” he says.

He was so immersed in the experience that he nearly tripped over the desk, more attuned to the virtual world of the bones than to his physical surroundings.

“It’s cool to realize this is where we’re at with technology,” he says.

Art student Summer Pengelly and biomedical engineering student Angel Rose Sajan were wearing HoloLens headsets that projected the bones hologram-style onto their surroundings.

“We’re building an elephant,” Pengelly tells me. “Or placing the bones so they’re shaped like an elephant head. I wish I could take a photo so I could show you. Oh, I just did.”

The photo existed only inside the software, so Pengelly picked up a piece of paper and started drawing the arrangement they’d made.


She and Sajan both agreed that they liked the HoloLens better than the VR headsets.

“It’s easier to manipulate the bones,” Pengelly says. “Using your hands as controllers gives you more access.”

“I kept turning the controller to figure out how to hold it,” Sajan says.

In the back of the visualization studio lay yet another digital environment to explore. Called the MIDEN, for Michigan Immersive Digital Experience Nexus, it projects images onto the walls and floor of a room. Users wear headsets that place them within the created environment and give them tools to manipulate the objects in the space. In this case, students were able to slice a cadaver along different planes.

Cece Crowther and another student explore the MIDEN in the Duderstadt Center.

“MIDEN might be my favorite [of the technologies],” says Cece Crowther, a biomedical engineering student. “The Anatomage Table had the same energy as medical school. This felt more artistic.”

“But I could call three different [sessions] my favorite in this class,” she says. “Every class has been unique.”

***

A cake with an artistic pattern made from repeating bone shapes

When the mixed reality class wound down, everyone gathered to eat celebratory cake. The top of the cake had an artistic design, made by creating a repeating pattern of one of the bone sculptures a student had designed early in the course.

“We’ve touched, looked at, manipulated, and drawn bones,” Gross says to the group. “Now we’re eating bones to wrap it all up.”

As students ate their cake, they reflected on the course, sharing feedback like, “I will not stop recommending this class to people” and “I made my schedule around this class.” Several mentioned that they’d gained so much from working alongside folks with different backgrounds.

“I appreciate this class so much because it normalizes the idea of art and science working together,” Moufawad, the art and pre-dental major, tells me. “Whenever I tell people what I’m studying, they always think it’s random, but it’s really not. There’s so much at the intersection of these two topics, and I love that this class celebrates that.”

A few weeks later, after the students have written their final reflections, I meet Gross in her first-floor office. She’s giddy over the success of the course. Her eyes light up and her tone becomes reverential as she talks about what she and Gear, with the help of some committed students, have managed to achieve.

“This experience we spent so many hours designing and thinking about, it actually worked,” Gross says. “Some important vein got exposed, and we’re not sure what’s flowing. It’ll take some time to unpack what was so empowering for so many students, but it’s a big fulfillment for us as teachers.”

“Delight,” she says, “is too soft a word.”

The Art of Anatomy course was made possible by a grant from the Arts Initiative at the University of Michigan to recipient Melissa Gross. Gross and Gear plan to offer the course again in fall 2024.

Full Article from the University of Michigan School of Kinesiology:

https://www.kines.umich.edu/news-events/news/delving-art-instead-science-anatomy

Planting Disabled Futures – A call for artists to collaborate

Planting Disabled Futures

Open Call for Artist Collaborators

Petra Kuppers is a disability culture activist and a community performance artist. She creates participatory community performance environments that think/feel into public space, tenderness, site-specific art, access and experimentation. Petra grounds herself in disability culture methods, and uses ecosomatics, performance, and speculative writing to engage audiences toward more socially just and enjoyable futures.


Her latest project, Planting Disabled Futures, is funded by a Just Tech fellowship.

In the Planting Disabled Futures project, Petra aims to use live performance approaches and virtual reality (and other) technologies to share energy, liveliness, ongoingness, crip joy and experiences of pain. 

In the development of the Virtual Reality (VR) components of the project, we will ask: How can VR allow us to celebrate difference, rather than engage in hyper-mobile fantasies of overcoming and of disembodied life? How can our disabled bodymindspirits develop non-extractive intimacies, in energetic touch, using VR as a tool toward connecting with plants, with the world, even in pain, in climate emergency, in our ongoing COVID world?

A watercolor mock-up of the Crip Cave, with Moira Williams’ Stim Tent, two VR stations, a potential sound bed, and a table for drawing/writing.

Petra envisions a sensory art installation equipped with a VR experience, a stimming tent, a soundbed, and a drawing and writing table. The VR experience would be supplemented by actors offering unique taste, touch, and smell sensations as participants navigate the environment.

A cyanotype (blue) and watercolor mock-up of what the VR app might look like: a violet leaf with sensation hubs, little white ink portals, that might lead to an audio dream journey

The VR experience in the Crip Cave is expected to be a tree-like environment that allows participants to select either a visual or an auditory experience. Participants can travel down to the roots to encounter earth critters, or up the branches and into the leafy canopy. In both locations, “sensory hubs” would take participants on a journey to other worlds – worlds potentially populated with content produced by fellow artists.

A cyanotype/watercolor mock-up of little critters that might accompany you on your journey through the environment.

Artist collaborators are welcome to contribute their talents by generating 3D worlds in Unreal Engine, reciting poetry, animating, or composing music to create a dream journey in virtual reality. Artists generating digital content they would like considered for inclusion in this unique art installation can reach out to: [email protected]


To learn more about Planting Disabled Futures, visit:
https://www.petrakuppers.com/planting-disabled-futures

Engineering Grants for XR

The Enhancing Engineering Education Grants Program is designed to support innovative strategies for engaging and supporting all learners in Michigan Engineering undergraduate courses. This program springs from a collaboration among ADUE, ADGPE, CAEN, CRLT-Engin, and Nexus. Proposals will be invited across the range of innovations in engineering education, including instructional practices, course design and content, and instructional technology.

As part of the initial Enhancing Education using Technology (EET) proposal to the College to support the instructional needs of faculty, grants were offered to fund the implementation of innovative ideas that instructors lacked the money to pursue. The first year of the grants program was FY23, and all grant money was awarded to faculty. It included three major grants of $50K each on the topics of XR, DEI, and Tandem, along with additional smaller grants. At the end of that first year, the team used what it had learned to propose improvements and changes to the program.

For AY 2024-2025, there are three grants available to support instructional faculty members:

Education Innovation Grants

Grants of up to $10K are available to COE faculty & staff

About the Grant

Grants of up to $10K are available to individual or small groups of Michigan Engineering instructional faculty and staff members seeking to implement innovative teaching methods and/or tools.

Group 2 applications are now being accepted. This call for proposals is open to all eligible applicants and does not require a previous Group 1 submission.


Proposal Evaluation Criteria
  • Applies a novel method or tool to achieve objectives
  • Reflects innovation in teaching methods or approaches
  • For online courses, utilizes the Quality Matters framework in partnership with Nexus
  • Involves partnering with Nexus or CRLT-Engin to co-teach a new faculty development workshop
  • Builds upon evidence-based best practices for enhancing student learning
  • Promotes equitable instruction for all learners
  • Implements practices or tools that have the potential for great impact (either a large impact on a small population, or something that could be applied to a larger population at Michigan Engineering)
  • Addresses work for which other funding opportunities do not exist
  • Achieves synergy with goals, strengths, and ongoing work of the College of Engineering, especially as it relates to Michigan Engineering’s Strategic Vision

Group 2 applications close Wednesday, May 1, 2024

Project Statement:

  • Clearly describe the proposed project
  • Explain the value of the project
  • Identify the specific innovation and its relation to evidence-based practices
  • Explain how the project supports equitable instruction and enhanced student learning
  • Discuss the project’s potential for application in broader contexts

Project Evaluation Plan:
Explain how the success of this project will be evaluated, documented, and disseminated. Approaches might include midterm course assessments, focus groups, and surveys, among others.

Budget Request:

  • Graduate or undergraduate student salaries
  • Materials and supplies
  • Project evaluation expenses
  • Travel and registration fees for teaching-related conferences, seminars or workshops
  • Faculty member summer salary (up to $2K of the project’s total budget)

Timeline:
Submissions will be accepted until Wednesday, May 1, 2024, with funding decisions announced in late May.

Strategic Technology Grants

COE Projects Focused on XR, online/hybrid learning, and/or generative artificial intelligence

About the Grant

Grants of up to $50,000 are available to teams of at least three Michigan Engineering instructional faculty and staff members to implement innovative teaching methods and/or tools that require an investment of time/resources and collaboration for deployment that is larger than what is available via Education Innovation Grants. Projects should focus on strategic themes of XR, online/hybrid learning and/or generative artificial intelligence.


Proposal Evaluation Criteria
  • Applies a novel method, modality or tool to achieve objectives
  • Reflects innovation in teaching methods or approaches
  • Builds upon evidence-based best practices for enhancing student learning
  • Promotes equitable instruction for all learners
  • If online, leverages the Quality Matters rubric and best practices with online course design and development
  • Implements practices or tools that have the potential for great impact (either large impact on a small population, or be something that could be applied to a larger population at Michigan Engineering)
  • Achieves synergy with goals, strengths, and ongoing work of ADUE, ADGPE, CAEN, CRLT-Engin, Nexus, and/or the broader College of Engineering, especially as it relates to Michigan Engineering’s Strategic Vision

Applications closed March 4, 2024

Identify your proposal’s strategic theme:

  • Online/hybrid learning
  • Generative artificial intelligence
  • XR

Project Statement:

  • Clearly describe the proposed project
  • Explain the value of the project
  • Identify the specific innovation and its relation to evidence-based practices
  • Explain how the project supports equitable instruction and enhanced student learning
  • If online, describe how the course design and development effort will leverage the Quality Matters rubric
  • Discuss the project’s potential for great impact
  • Describe your goals for collaboration with at least one E3 grant sponsor (ADUE, ADGPE, CAEN, CRLT-Engin, and/or Nexus)

Project Evaluation Plan:
Explain how the success of this project will be evaluated, documented, and disseminated.

Budget Request:

  • Graduate or undergraduate student salaries
  • Instructional software and classroom technology
  • Materials and supplies
  • Project evaluation expenses
  • Travel and registration fees for teaching-related conferences, seminars or workshops
  • Faculty member summer salary (up to $2K of the project’s total budget)

Team Roster:
Provide a list of all team members, with descriptions of their respective roles and very brief bios.

Timeline:

Submissions are due on Monday, March 4, 2024, with funding decisions announced in April.

Software Pilot Grants

Grant funding of up to $10K for COE faculty & staff seeking to pilot instructional software

About the Grant

Grants of up to $10K are available to instructional faculty and staff members seeking to pilot innovative and results-oriented instructional software that has the potential to improve teaching and learning in Michigan Engineering. Proposals may be submitted by individuals requesting software for a specific class or a team of faculty members requesting software for a group of classes.

In the spirit of innovation, all ideas are welcome. Proposals that call for the use of collaborative teaching and learning strategies are encouraged. Priority will be given to projects that, if proven successful, can be replicated throughout the College.

Please note that there are many routes for procuring software licenses at the University of Michigan. We encourage you to reach out to our team at [email protected] to help determine if this grant program is appropriate for your request before submitting a proposal.


REQUIRED DELIVERABLES
  • Presentation to the Michigan Engineering faculty community of a case study describing how you applied the software and how it impacted your students’ learning objectives
  • Engagement with CAEN on evaluation of software for possible college adoption
  • Acting as a faculty advocate for this software and sharing how you are using it in your class

Applications for Fall 2024 close April 1, 2024

Course Information:
Logistical course details, including how frequently the course is taught, an enrollment summary, etc.

Learning Gaps:
Describe the learning gap(s) you have identified in your lesson/module/unit/course.

Teaching Intervention (Pedagogical Support):
Explain the teaching and technology intervention(s) that will close the stated learning gaps. Identify the evidence-based practices that support the efficacy of the proposed software solution.

Comparative Tool Evaluation:

  • Identify 3-4 comparable software tools (including your proposed tool) that could fill the established learning gaps.
  • List the criteria you will use to evaluate the 3-4 comparable tools to inform your decision making.

Project Evaluation Plan:

  • Explain how the success of this software will be evaluated, documented, and disseminated. Approaches might include midterm course assessments, focus groups, and surveys, among others.
  • Explain how you will evaluate if this software met the needs of you and your students. How will you identify if it has improved the educational experience?

Budget Request:
Provide the number of licenses, estimated cost per license, and estimated total cost for this software.

Timeline:
To use the software for instruction in the Fall 2024 term, proposals must be submitted by April 1, 2024.

Fall 2024 XR Classes

Looking for Classes that incorporate XR?

EECS 440 – Extended Reality for Social Impact (Capstone / MDE)

More Info Here
Contact with Questions:
Austin Yarger
[email protected]

Extended Reality for Social Impact — Design, development, and application of virtual and augmented reality software for social impact. Topics include: virtual reality, augmented reality, game engines, ethics / accessibility, interaction design patterns, agile project management, stakeholder outreach, XR history / culture, and portfolio construction. Student teams develop and exhibit socially impactful new VR / AR applications.


ENTR 390.005 & 390.010 – Intro to Entrepreneurial Design, VR Lab

More Info Here
Contact with Questions:
Sara ‘Dari’ Eskandari
[email protected]

In this lab, you’ll learn how to develop virtual reality content for immersive experiences in the Meta Quest, the MIDEN, or for virtual production, using Unreal Engine and 3D modeling software. You’ll also be introduced to asset creation and scene assembly by bringing assets into the Unreal Engine and creating interactive experiences. At the end of the class you’ll be capable of developing virtual reality experiences, simulations, and tools to address real-world problems.

Students will learn how to generate digital content for virtual reality platforms; become knowledgeable about versatile file formats, content pipelines, hardware platforms, and industry standards; practice iterative design and the creation of functional prototypes in this medium; apply what is learned in the lecture section of this course to determine what is possible, what is marketable, and what distribution methods are available for this platform; and become familiar with documenting their design process and pitching their ideas to others, receiving and providing quality feedback.


UARTS 260 – Empathy in Pointclouds

More Info Here
Contact with Questions:
Dawn Gilpin
[email protected]

Empathy In Point Clouds: Spatializing Design Ideas and Storytelling through Immersive Technologies integrates LiDAR scanning, photogrammetry, and Unreal Engine into education, expanding the possible methodologies and processes of architectural design. Entering our third year of the FEAST program, we turn our attention to storytelling and worldbuilding using site-specific point cloud models as the context for our narratives. This year the team will produce 1-2 spatial narratives for the three immersive technology platforms we are working with: the Meta Quest VR headset, the MIDEN/VR CAVE, and the LED stage.


ARTDES 217 – Bits and Atoms

More Info Here
Contact with Questions:
Sophia Brueckner
[email protected]

This is an introduction to digital fabrication within the context of art and design. Students learn about the numerous types of software and tools available and develop proficiency with the specific software and tools at Stamps. Students discuss the role of digital fabrication in creative fields.


ARTDES 420 – Sci-Fi Prototyping

More Info Here
Contact with Questions:
Sophia Brueckner
[email protected]

This course ties science fiction with speculative/critical design as a means to encourage the ethical and thoughtful design of new technologies. With a focus on the creation of functional prototypes, this course combines the analysis of science fiction with physical fabrication or code-based interpretations of the technologies they depict.


SI 559 – Introduction to AR/VR Application Design

More Info Here
Contact with Questions:
Michael Nebeling
[email protected]

This course will introduce students to Augmented Reality (AR) and Virtual Reality (VR) interfaces. This course covers basic concepts; students will create two mini-projects, one focused on AR and one on VR, using prototyping tools. The course requires neither special background nor programming experience.


FTVM 394 / DIGITAL 394 – Topics in Digital Media Production, Virtual Reality

More Info Here
Contact with Questions:
Yvette Granata
[email protected]

This course provides an introduction to key software tools, techniques, and fundamental concepts supporting digital media arts production and design. Students will learn and apply the fundamentals of design and digital media production with software applications and web-based coding techniques, and study the principles of design that translate across multiple forms of media production.


UARTS 260/360/460/560 – THE BIG CITY: Lost & Found in XR

More Info Here
Contact with Questions:
Matthew Solomon & Sara Eskandari
[email protected] / [email protected]

No copies are known to exist of the 1928 lost film THE BIG CITY; only still photographs, a cutting continuity, and a detailed scenario of the film survive. This is truly a shame, because the film featured a critical mass of Black performers — something extremely uncommon at the time. Using Unreal Engine, detailed 3D model renderings, and live performance, students will take users back in time into the fictional Harlem Black Bottom cabaret and clubs shown in the film. Students will experience working in a small game development team to create a high-fidelity historical recreation of the sets using 3D modeling, 2D texturing, level design, and game development pipelines. They will experience a unique media pipeline of game design for live performance and cutting-edge virtual production. The project will also dedicate focus to detailed documentation, in order to honor the preservation of THE BIG CITY that allows us to attempt this endeavor and the Black history that fuels it.


MOVESCI 313 – The Art of Anatomy

Contact with Questions:
Melissa Gross & Jenny Gear
[email protected] / [email protected]

Learn about human anatomy and how it has been taught throughout history, across a variety of mediums, including the recent adoption of XR tools. Students will get hands-on experience integrating and prototyping AR and VR visualization technologies for medical and anatomical study.


ARCH 565 – Research in Environmental Technology

Contact with Questions:
Mojtaba Navvab
[email protected]

The focus of this course is the introduction to research methods in environmental technology. Qualitative and quantitative research results are studied with regard to their impact on architectural design. Each course participant undertakes an investigation in a selected area of environmental technology. The experimental approach may use physical modeling, computer simulation, or other appropriate methods (VR).


FTVM 455.004 – Topics in Film: Eco Imaginations
WGS 412.001 – Fem Art Practices

Contact with Questions:
Petra Kuppers
[email protected]

These courses will include orientations to XR technologies and sessions leveraging Unreal Engine and Quixel 3D assets to create immersive virtual reality environments.

New MIDEN – Unveiling of the upgraded MIDEN for Fall ‘23

New dual view capabilities

Maredith Byrd


We have upgraded the MIDEN! The four new Christie Digital M 4K25 RGB laser projectors are much brighter and higher resolution than the lamp-based projectors they replace, and their light sources last far longer. We used to have to limit how often and how long the MIDEN was run because the previous lamps had a very limited lifespan of just 1,250 hours; the new projectors are rated for 25,000 hours at 100% brightness and 50,000 hours at 50% brightness. For a 10′ x 10′ screen, the resolution is now 2160×2160 per screen, double the previous resolution. The upgrade also adds dual view: two people can experience the MIDEN at once, each seeing the same virtual content aligned to their own unique perspective while simultaneously interacting with it.

In a typical setup, 3D stereoscopic content (like what you would experience in a 3D movie) is projected onto three walls and the floor and stitched seamlessly together. Users wear a set of motion-tracked glasses that allow their perspective to be updated depending on where they are standing or looking, and use a motion-tracked video game controller to navigate beyond the confines of the 10’x10’ room. To the user wearing the 3D glasses, the projected content appears entirely to scale and has realistic depth – they can look underneath tables that appear to be situated in front of them, despite the table being projected onto one of the walls.
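The head-tracked rendering described above boils down to an off-axis (asymmetric) projection: each wall’s image is computed from the viewer’s tracked eye position relative to that wall’s corners. Here is a minimal sketch of that math, following Robert Kooima’s “Generalized Perspective Projection”; the function name, corner coordinates, and units are illustrative, not the MIDEN’s actual Jugular code.

```python
import numpy as np

def wall_projection(eye, pa, pb, pc, near=0.1, far=100.0):
    """Off-axis frustum for one wall of a CAVE-style display.

    eye        -- tracked eye position (from the motion-tracked glasses)
    pa, pb, pc -- wall corners: lower-left, lower-right, upper-left
    Returns a 4x4 OpenGL-style projection matrix.
    """
    vr = pb - pa; vr /= np.linalg.norm(vr)            # screen-right axis
    vu = pc - pa; vu /= np.linalg.norm(vu)            # screen-up axis
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)   # screen normal, toward viewer

    va, vb, vc = pa - eye, pb - eye, pc - eye         # eye-to-corner vectors
    d = -(va @ vn)                                    # eye-to-wall distance

    # Frustum extents, scaled onto the near plane
    l = (vr @ va) * near / d
    r = (vr @ vb) * near / d
    b = (vu @ va) * near / d
    t = (vu @ vc) * near / d

    return np.array([
        [2*near/(r-l), 0.0,          (r+l)/(r-l),            0.0],
        [0.0,          2*near/(t-b), (t+b)/(t-b),            0.0],
        [0.0,          0.0,          -(far+near)/(far-near), -2*far*near/(far-near)],
        [0.0,          0.0,          -1.0,                   0.0],
    ])

# Example: the front wall of a 10' x 10' room, eye tracked off-center (feet).
# A full renderer also rotates world space into wall space, translates by
# -eye, and computes one such matrix per wall and per eye of the stereo pair.
pa, pb, pc = (np.array(p, float) for p in [(-5, -5, -5), (5, -5, -5), (-5, 5, -5)])
P = wall_projection(np.array([0.5, 0.2, 1.0]), pa, pb, pc)
```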

The MIDEN supports 3D model formats exported by the most popular modeling software: Blender, 3ds Max, Maya, SketchUp, Rhino, Revit, etc. Models can be exported in OBJ, FBX, STL, or VRML format and then imported into our “Jugular” software. The MIDEN can also render Unreal Engine scenes, using the nDisplay plugin to split the scene across four cameras that correspond to the MIDEN’s four projectors.
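As a small illustration of that import pipeline, the format-conversion step can be done with the open-source `trimesh` library; this is a hedged sketch with hypothetical file names, not the DMC’s own tooling.

```python
import trimesh

# Load a model exported from Blender, Maya, etc. (file name is hypothetical).
mesh = trimesh.load("femur.stl")

# Sanity-check real-world scale before bringing the model into an immersive
# display, where content is rendered life-size.
print(mesh.bounds)

# Re-export as OBJ, one of the interchange formats Jugular imports.
mesh.export("femur.obj")
```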

MIDEN users experience immersion in a virtual environment without it blocking their view of themselves or their surroundings the way a VR headset does. Since VR “CAVE” is a trademarked term, ours is called the MIDEN, which stands for Michigan Immersive Digital Experience Nexus. The MIDEN also takes traditional “CAVE” technology further: it is driven by our in-house rendering engine, which affords more flexibility than a typical “CAVE” setup.

The MIDEN is more accessible than VR headsets in that it takes less time to set up and begin using. The game controller is a standard Xbox-type gamepad, familiar to most gamers. And because the view of the real world is not hidden, users do not have to worry about trip hazards or becoming disoriented; they see their real bodies, unlike in a VR headset where the body is most likely a virtual avatar, which results in less motion sickness.

The MIDEN can be used for architectural review, data analysis, art installations, learning 3D modeling, and much more. From seeing the true scale of a structure in relation to the body to sensory experiences with unique visuals and spatialized audio, the MIDEN can take these projects to a new level.

The MIDEN is available to anyone for a project, class exercise, or tour by request; contact [email protected] to arrange a session. Use of the MIDEN does require staff to run it, and we recommend that anyone looking to view custom content in the MIDEN arrange a few sessions ahead of their event to test the content and ensure the scene is configured properly.

Two individuals in the MIDEN point to the same virtual image with different views.
This is how the MIDEN configures itself.

Security Robots Study

Security Robots

Using XR to conduct studies in robotics

Maredith Byrd


Xin Ye is a Master’s student at the University of Michigan School of Information. She approached the Duderstadt Center about her Master’s thesis project, which tests the favorability of humanoid robots. Stephanie O’Malley at the Visualization Studio helped Xin develop a simulation featuring three types of security robots with varying features, to see whether a more humanoid robot is viewed more favorably.

Panoramic of Umich Hallway

The simulation’s goal was to make participants feel as though they were interacting with a real robot standing in front of them, so the MIDEN was the perfect tool for this experiment. The MIDEN (Michigan Immersive Digital Experience Nexus) is a 10′ x 10′ x 10′ room that relies on projections, so the user can walk naturally through a virtual environment. The highly detailed environment is constructed in Unreal Engine and projected into the MIDEN, allowing the user to still see their physical body within the projected digital world.

Panoramic of the MIDEN

Users step into the MIDEN and, by wearing 3D glasses, are immersed in a digital environment that recreates common locations on a college campus: a university hallway/commons area or an outdoor parking lot. After a short while, the participant gains the attention of the security robot, which approaches to question them.

Setting up the MIDEN

Xin Ye then triggered the appropriate response, so users would think the robot was responding intelligently. The robots were all configured with different triggerable answers that Xin Ye could initiate from behind the curtains of the MIDEN. This technique is referred to in studies as “Wizard of Oz”: the participant thinks the robotic projection possesses artificial intelligence, just as a real robot in this situation would, when in reality a human is deciding the appropriate response.
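To make the “Wizard of Oz” setup concrete, here is a minimal sketch of what a hidden operator’s trigger console could look like; the key bindings, response names, and network address are invented for illustration and are not from the actual study.

```python
import socket

# Scripted responses the operator can fire at the machine rendering the
# robot in the MIDEN (names and prompts are hypothetical).
TRIGGERS = {
    "1": "greet",            # "Hello, may I see your MCard?"
    "2": "ask_mask",         # "Please put on a face mask."
    "3": "ask_suspicious",   # "Have you witnessed anything suspicious?"
    "4": "thank_and_leave",  # robot thanks the participant and moves away
}

RENDER_HOST = ("192.168.0.10", 5005)  # illustrative address of the render PC
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

while True:
    key = input("trigger [1-4, q to quit]: ").strip()
    if key == "q":
        break
    if key in TRIGGERS:
        # The participant only sees the robot react; the human choice stays hidden.
        sock.sendto(TRIGGERS[key].encode(), RENDER_HOST)
```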

Knightscope
Ramsee
Pepper

This project aimed to evaluate human perception of different types of security robots – some more humanoid than others – to see whether a more humanoid robot was viewed more favorably. Three different robots were used: Knightscope, Ramsee, and Pepper. Knightscope is a cone-shaped robot that lacks any humanoid features. Ramsee is a little more humanoid, with simple facial features, while Pepper is the most humanoid, with more complex facial features as well as arms.

Participants interacted with one of the three robot types. The robot would approach the participant in the MIDEN and question them – asking them to present an MCard, to put on a face mask, or whether they had witnessed anything suspicious. To ensure that the robots all had a fair chance, each used the same “Microsoft David” automated male voice. Once the dialogue chain was complete, the robot thanked the participant and moved away. The participant then removed the 3D glasses and was taken to another location in the building for an exit interview about their interactions with the robot. Any participant who realized that a human was controlling the robot was disqualified from the study.

Knightscope in Hallway
Ramsee in Hallway

Xin Ye presented her findings in a paper titled “Human Security Robot Interaction and Anthropomorphism: An Examination of Pepper, RAMSEE, and Knightscope Robots” at the 32nd IEEE International Conference on Robot & Human Interactive Communication in Busan, South Korea.

Fall 2023 XR Classes

Looking for Classes that incorporate XR?

EECS 498 – Extended Reality & Society


Credits: 4
More Info Here
Contact with Questions:
Austin Yarger
[email protected]

From pediatric medical care, advanced manufacturing, and commerce to film analysis, first-responder training, and unconscious bias training, the fledgling, immersive field of extended reality may take us far beyond the realm of traditional video games and entertainment, and into the realm of diverse social impact.

“EECS 498: Extended Reality and Society” is a programming-intensive senior capstone / MDE course that empowers students with the knowledge and experience to…

    • Implement medium-sized virtual and augmented reality experiences using industry-standard techniques and technologies.
    • Game Engines (Unreal Engine / Unity), Design Patterns, Basic Graphics Programming, etc.
    • Design socially-conscious, empowering user experiences that engage diverse audiences.
    • Contribute to cultural discourse on the hopes, concerns, and implications of an XR-oriented future.
    • Privacy / security concerns, XR film review (The Matrix, Black Mirror, etc)
    • Carry out user testing and employ feedback after analysis.
    • Requirements + Customer Analysis, Iterative Design Process, Weekly Testing, Analytics, etc.
    • Work efficiently in teams of 2-4 using agile production methods and software.
    • Project Management Software (Jira), Version Control (Git), Burndown Charting and Resource Allocation, Sprints, etc.

Students will conclude the course with at least three significant, socially-focused XR projects in their public portfolios.

 

ENTR 390 – Intro to Entrepreneurial Design, VR Lab


Credits: 3
More Info Here
Contact with Questions:
Sara ‘Dari’ Eskandari
[email protected]

In this lab, you’ll learn how to develop virtual reality content for immersive experiences in the Oculus Rift, the MIDEN, or for virtual production, using Unreal Engine and 3D modeling software. You’ll also be introduced to asset creation and scene assembly by bringing assets into the Unreal Engine and creating interactive experiences. At the end of the class you’ll be capable of developing virtual reality experiences, simulations, and tools to address real-world problems.

Students will learn how to generate digital content for virtual reality platforms; become knowledgeable about versatile file formats, content pipelines, hardware platforms, and industry standards; practice iterative design and the creation of functional prototypes in this medium; apply what is learned in the lecture section of this course to determine what is possible, what is marketable, and what distribution methods are available for this platform; and become familiar with documenting their design process and pitching their ideas to others, receiving and providing quality feedback.

 

FTVM 307 – Film Analysis for Filmmakers


Credits: 3
More Info Here
Contact with Questions:
Matthew Solomon
[email protected]

Filmmakers learn about filmmaking by watching films. This course reverse-engineers movies to understand how they were produced. The goal is to learn from a finished film how the scenes were produced in front of the camera and microphone and how the captured material was edited. Students in this class use VR to reimagine classic film scenes – giving them the ability to record and edit footage from a virtual set.

 

UARTS 260 / EIPC FEAST – Empathy in Pointclouds


Credits: 1-5
More Info Here
Contact with Questions:
Dawn Gilpin
[email protected]

Empathy In Point Clouds: Spatializing Design Ideas and Storytelling through Immersive Technologies integrates LiDAR scanning, photogrammetry, and Unreal Engine into education, expanding the possible methodologies and processes of architectural design. Entering our third year of the FEAST program, we turn our attention to storytelling and worldbuilding using site-specific point cloud models as the context for our narratives. This year the team will produce 1-2 spatial narratives for the three immersive technology platforms we are working with: VR headset, the MIDEN/VR CAVE, and the LED stage.

 

 

ARTDES 217 – Bits and Atoms


Credits: 3
More Info Here
Contact with Questions:
Sophia Brueckner
[email protected]

This is an introduction to digital fabrication within the context of art and design. Students learn about the numerous types of software and tools available and develop proficiency with the specific software and tools at Stamps. Students discuss the role of digital fabrication in creative fields.

 

ARTDES 420 – Sci-Fi Prototyping


Credits: 3
More Info Here
Contact with Questions:
Sophia Brueckner
[email protected]

This course ties science fiction with speculative/critical design as a means to encourage the ethical and thoughtful design of new technologies. With a focus on the creation of functional prototypes, this course combines the analysis of science fiction with physical fabrication or code-based interpretations of the technologies they depict.

 

SI 559 – Introduction to AR/VR Application Design

Credits: 3
More Info Here
Contact with Questions:
Michael Nebeling
[email protected]

This course will introduce students to Augmented Reality (AR) and Virtual Reality (VR) interfaces. This course covers basic concepts; students will create two mini-projects, one focused on AR and one on VR, using prototyping tools. The course requires neither special background nor programming experience.

 

FTVM 394 – Digital Media Production, Virtual Reality

Credits: 4
More Info Here
Contact with Questions:
Yvette Granata
[email protected]

This course provides an introduction to key software tools, techniques, and fundamental concepts supporting digital media arts production and design. Students will learn and apply the fundamentals of design and digital media production with software applications and web-based coding techniques, and study the principles of design that translate across multiple forms of media production.

Scientific Visualization of Pain

XR at the Headache & Orofacial Pain Effort (HOPE) Lab

Dr. Alexandre DaSilva is an Associate Professor in the School of Dentistry, an Adjunct Associate Professor of Psychology in the College of Literature, Science & Arts, and a neuroscientist in the Molecular and Behavioral Neuroscience Institute.  Dr. DaSilva and his associates study pain – not only its cause, but also its diagnosis and treatment – in his Headache & Orofacial Pain Effort (HOPE) Lab, located in the 300 N. Ingalls Building.

Dr. Alex DaSilva slices through a PET scan of a “migraine brain” in the MIDEN, to find areas of heightened μ-opioid activity.

Virtual and augmented reality have been important tools in this endeavor, and Dr. DaSilva has brought several projects to the Digital Media Commons (DMC) in the Duderstadt Center over the years.

In one line of research, Dr. DaSilva has obtained positron emission tomography (PET) scans of patients in the throes of migraine headaches.  The raw data obtained from these scans are three-dimensional arrays of numbers that encode the activation levels of dopamine or μ-opioid in small “finite element” volumes of the brain.  As such, they’re incomprehensible.  But we bring the data to life through DMC-developed software that maps the numbers into a blue-to-red color gradient and renders the elements in stereoscopic 3D virtual reality (VR) – in the Michigan Immersive Digital Experience Nexus (MIDEN), or in head-mounted displays such as the Oculus Rift.  In VR, the user can effortlessly slide section planes through the volumes of data, at any angle or offset, to hunt for the red areas where the dopamine or μ-opioid signals are strongest.  Understanding how migraine headaches affect the brain may help in devising more focused and effective treatments.
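As a sketch of those two operations, the blue-to-red mapping and the section planes, assuming the scan has been exported as a NumPy array; the file name, array shape, and colormap are illustrative, not the DMC’s actual implementation.

```python
import numpy as np
from matplotlib import cm

# Activation levels (dopamine or mu-opioid) per finite-element volume of the
# brain, as a 3D array; file name and shape are hypothetical.
volume = np.load("migraine_pet.npy")              # e.g. shape (128, 128, 96)

# Map values onto a blue-to-red gradient: normalize to [0, 1], then apply a
# diverging colormap, yielding one RGBA color per element.
lo, hi = float(volume.min()), float(volume.max())
colors = cm.coolwarm((volume - lo) / (hi - lo))   # shape (128, 128, 96, 4)

# An axis-aligned section plane is just an array slice; an oblique plane,
# like the ones swept freely in VR, amounts to resampling the volume along
# that plane (e.g. with scipy.ndimage.map_coordinates).
section = colors[:, :, volume.shape[2] // 2]      # middle transverse plane
```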

Dr. Alex DaSilva’s associate, Hassan Jassar, demonstrates the real-time fNIRS-to-AR brain activation visualization, as seen through a HoloLens, as well as the tablet-based app for painting pain sensations on an image of a head. [Photo credit: Hour Detroit magazine, March 28, 2017. https://www.hourdetroit.com/health/virtual-reality-check/]

In another line of research, Dr. DaSilva employs functional near-infrared spectroscopy (fNIRS) to directly observe brain activity associated with pain in “real time”, as the patient experiences it.  As Wikipedia describes it: “Using fNIRS, brain activity is measured by using near-infrared light to estimate cortical hemodynamic activity which occur in response to neural activity.”  [https://en.wikipedia.org/wiki/Functional_near-infrared_spectroscopy]  The study participant wears an elastic skullcap fitted with dozens of fNIRS sensors wired to a control box, which digitizes the signal inputs and sends the numeric data to a personal computer running a MATLAB script.  From there, a two-part software development by the DMC enables neuroscientists to visualize the data in augmented reality (AR).  The first part is a MATLAB function that opens a Wi-Fi connection to a Microsoft HoloLens and streams the numeric data out to it.  The second part is a HoloLens app that receives that data stream and renders it as blobs of light that change hue and size to represent the ± polarity and intensity of each signal.  The translucent nature of HoloLens AR rendering allows the neuroscientist to overlay this real-time data visualization on the actual patient.  Being able to directly observe neural activity associated with pain may enable a more objective scale, versus asking a patient to verbally rate their pain, for example “on a scale of 1 to 5”.  Moreover, it may be especially helpful for diagnosing or empathizing with patients who are unable to express their sensations verbally at all, whether due to simple language barriers or due to other complicating factors such as autism, dementia, or stroke.
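The streaming half of that pipeline is conceptually simple: open a network connection and send numbers as they arrive. Below is a minimal Python analogue of that MATLAB sender; the host name, port, channel count, and message format are all assumptions for illustration.

```python
import json
import socket
import time

HOLOLENS = ("hololens.local", 9000)   # illustrative address of the headset app
N_CHANNELS = 24                       # illustrative fNIRS sensor count

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_frame(values):
    """Stream one frame of per-channel hemodynamic values.

    On the receiving end, each value drives one rendered blob of light:
    the sign of the value selects the hue, the magnitude scales the size.
    """
    packet = json.dumps({"t": time.time(), "ch": values}).encode()
    sock.sendto(packet, HOLOLENS)

# In the real pipeline the values arrive from the fNIRS control box; here we
# stream a flat frame at ~10 Hz just to show the shape of the loop.
while True:
    send_frame([0.0] * N_CHANNELS)
    time.sleep(0.1)
```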

Yet another DMC software development, the “PainTrek” mobile application, also started by Dr. DaSilva, allows patients to “paint their pain” on an image of a manikin head that can be rotated freely on the screen, as a more convenient and intuitive reporting mechanism than filling out a common questionnaire.

PainTrek app allows users to “paint” regions of the body experiencing pain to indicate and track pain intensity.

Architectural Lighting Scenarios Envisioned in the MIDEN

ARCH 535 & ARCH 545, Winter 2022

Mojtaba Navvab, Ted Hall


Prof. Mojtaba Navvab teaches environmental technology in the Taubman College of Architecture and Urban Planning, with particular interests in lighting and acoustics.  He is a regular user of the Duderstadt Center’s MIDEN (Michigan Immersive Digital Experience Nexus) – in teaching as well as sponsored research.

On April 7, 2022, he brought a combined class of ARCH 535 and ARCH 545 students to the MIDEN to see, and in some cases hear, their projects in full-scale virtual reality.

Recreating the sight and sound of the 18-story atrium space of the Hyatt Regency Louisville, where the Kentucky All State Choir gathers to sing the National Anthem.

ARCH 535: To understand environmental technology design techniques through case studies and compliance with building standards.  VR applications are used to view the design solutions.

ARCH 545: To apply the theory, principles, and lighting design techniques using a virtual reality laboratory.

“The objectives are to bring whatever you imagine to reality in a multimodal perception; in the MIDEN environment, whatever you create becomes a reality.  This aims toward simulation, visualization, and perception of light and sound in a virtual environment.”

Recreating and experiencing one of the artworks by James Turrell.

“Human visual perception is psychophysical because any attempt to understand it necessarily draws upon the disciplines of physics, physiology, and psychology.  A ‘Perceptionist’ is a person concerned with the total visual environment as interpreted in the human mind.”

“Imagine if you witnessed or viewed a concert hall or a choir performance in a cathedral.  You could describe the stimulus generated by the architectural space by considering each of the senses independently as a set of unimodal stimuli.  For example, your eyes would be stimulated with patterns of light energy bouncing off the simulated interior surfaces or luminous environment while you listen to an orchestra playing or choir singing with a correct auralized room acoustics.”

A few selected images photographed in the MIDEN are included in this article.  For the user wearing the stereoscopic glasses, the double images resolve into an immersive 3D visual experience that they can step into, with 270° of peripheral vision.

Students explore a daylight design solution for a library.

Learning to Develop for Mixed Reality – The ENTR 390 “VR Lab”

XR Prototyping

For the past several years, students enrolled in the Center for Entrepreneurship’s Intro to Entrepreneurial Design Virtual Reality course have been introduced to programming and content-creation pipelines for XR development using a variety of Visualization Studio resources. Their goal? Create innovative applications for XR. From creating video games to changing the way class material is accessed with XR-capable textbooks, if you have an interest in learning how to make your own app for the Oculus Rift, the MIDEN, or even a smartphone, this might be a class to enroll in. Students interested in this course are not required to have any prior programming or 3D modeling knowledge, and if you’ve never used a VR headset, that’s OK too. This course will teach you everything you need to know.

Henry Duhaime presents his VR game for Oculus Rift, in which players explore the surface of Mars in search of a missing NASA rover.
Michael Meadows prototypes AR-capable textbooks using a mobile phone and Apple’s ARKit.