Planting Disabled Futures – A call for artists to collaborate

Planting Disabled Futures

Open Call for Artist Collaborators

Author


Petra Kuppers is a disability culture activist and a community performance artist. She creates participatory community performance environments that think/feel into public space, tenderness, site-specific art, access and experimentation. Petra grounds herself in disability culture methods, and uses ecosomatics, performance, and speculative writing to engage audiences toward more socially just and enjoyable futures.


Her latest project, Planting Disabled Futures, is funded by a Just Tech fellowship.

In the Planting Disabled Futures project, Petra aims to use live performance approaches and virtual reality (and other) technologies to share energy, liveliness, ongoingness, crip joy and experiences of pain. 

In the development of the Virtual Reality (VR) components of the project, we will ask: How can VR allow us to celebrate difference, rather than engage in hyper-mobile fantasies of overcoming and of disembodied life? How can our disabled bodymindspirits develop non-extractive intimacies, in energetic touch, using VR as a tool toward connecting with plants, with the world, even in pain, in climate emergency, in our ongoing COVID world?

A watercolor mock-up of the Crip Cave, with Moira Williams’ Stim Tent, two VR stations, a potential sound bed, and a table for drawing/writing.

Petra envisions a sensory art installation equipped with a VR experience, a stimming tent, a soundbed, and a drawing and writing table. The VR experience would be supplemented by actors providing opportunities to engage with unique taste, touch, and smell sensations as the environment is navigated.

A cyanotype (blue) and watercolor mock-up of what the VR app might look like: a violet leaf with sensation hubs, little white ink portals, that might lead to an audio dream journey

The VR experience in the Crip Cave is expected to be a tree-like environment that allows participants to select either a visual or an auditory experience. Participants can travel down to the roots to encounter earth critters, or up through the branches into the leafy canopy. In both locations, “sensory hubs” would take participants on a journey to other worlds – worlds potentially populated with content produced by fellow artists.

A cyanotype/watercolor mock-up of little critters that might accompany you on your journey through the environment.

Artist collaborators are welcome to contribute their talents by generating 3D worlds in Unreal Engine, reciting poetry, animating, or composing music to create a dream journey in virtual reality. Artists generating digital content they would like considered for inclusion in this unique art installation can reach out to: [email protected]


To learn more about Planting Disabled Futures, visit:
https://www.petrakuppers.com/planting-disabled-futures

Renew Scleroderma – Mobile Health Tracking App for Managing Scleroderma

Renew Scleroderma

Mobile Health Tracking App for managing Scleroderma

The Renew Scleroderma app aims to assist individuals with scleroderma by giving them access to a full list of resources and activities designed to help manage their condition. RENEW stands for Resilience-based Energy Management to Enhance Well-being.

Scleroderma is a rare autoimmune condition that causes inflammation and thickening of the skin in the hands and face. The excess collagen can also advance to internal organs, potentially causing complications in multiple bodily systems. Those who are diagnosed with scleroderma have a high symptom burden and need to learn strategies for self-management.

The mobile app presents users with information on scleroderma as well as weekly activities they can perform to manage the condition. Users set goals within the app and track health behaviors in real time – such as activity pacing, sleep, relaxation, and engagement in physical activity – specific to the learning modules of the RENEW program. Progress toward these goals is accessible to an assigned health coach through a secure web portal: patients have regular check-ins with their health coach, discuss their progress, and adjust their management plan based on what they have recorded in the app.

Image of a mobile device demonstrating the app

Renew is quick and easy to use: users download the mobile app from the Google Play or Apple App Store and create an account. The app is developed for both iOS and Android mobile devices, making it widely accessible to the general public.

The benefit of Renew is that the app relays a user’s progress to a database accessed through a secure web portal. The portal connects each user to an assigned University of Michigan health coach, drawn from a pool of qualified health coaches at Michigan Medicine – all of whom have scleroderma themselves. The health coach can view the information their mentees enter into the app, see how they are doing, and prepare feedback for their one-on-one meetings.

One main consideration in the design process was to ensure that the app is physically easy for users to interact with. Most people with scleroderma have limited hand function, so the team consulted directly with users on where to put navigation buttons, how big the buttons needed to be, and how information should be entered into the app to reduce fatigue.

Susan Murphy acted as the faculty member for the development team consisting of Sara ‘Dari’ Eskandari, Daniel Vincenz, and Sean Petty. Development of this application was supported by the LiveWell App Factory to Support Health and Function of People with Disabilities, funded by a grant from the National Institute on Disability, Independent Living and Rehabilitation Research in the U.S. Department of Health and Human Services. With a working prototype completed and piloted with patients, future iterations of the app have been handed off to Atomic Object, a custom software development and design company local to Ann Arbor.

Video of the Mobile App Preview

Fall 2024 XR Classes

Fall 2024 XR Classes

Looking for Classes that incorporate XR?

EECS 440 – Extended Reality for Social Impact (Capstone / MDE)

More Info Here
Contact with Questions:
Austin Yarger
[email protected]

Extended Reality for Social Impact — Design, development, and application of virtual and augmented reality software for social impact. Topics include: virtual reality, augmented reality, game engines, ethics / accessibility, interaction design patterns, agile project management, stakeholder outreach, XR history / culture, and portfolio construction. Student teams develop and exhibit socially impactful new VR / AR applications.


ENTR 390.005 & 390.010 – Intro to Entrepreneurial Design, VR Lab

More Info Here
Contact with Questions:
Sara ‘Dari’ Eskandari
[email protected]

In this lab, you’ll learn how to develop virtual reality content for immersive experiences on the Meta Quest, in the MIDEN, or for virtual production, using Unreal Engine and 3D modeling software. You’ll also be introduced to asset creation and scene assembly by bringing assets into Unreal Engine and creating interactive experiences. At the end of the class you’ll be capable of developing virtual reality experiences, simulations, and tools to address real-world problems.

Students will learn how to generate digital content for virtual reality platforms; become knowledgeable about versatile file formats, content pipelines, hardware platforms, and industry standards; understand methods of iterative design and the creation of functional prototypes in this medium; apply what is learned in the lecture section of the course to determine what is possible, what is marketable, and what distribution methods are available for the platform; and become familiar with documenting their design process and pitching their ideas to others, receiving and providing quality feedback.


UARTS 260 – Empathy in Pointclouds

More Info Here
Contact with Questions:
Dawn Gilpin
[email protected]

Empathy In Point Clouds: Spatializing Design Ideas and Storytelling through Immersive Technologies integrates LiDAR scanning, photogrammetry, and Unreal Engine into education, expanding the possible methodologies and processes of architectural design. Entering our third year of the FEAST program, we turn our attention to storytelling and worldbuilding using site-specific point cloud models as the context for our narratives. This year the team will produce 1-2 spatial narratives for the three immersive technology platforms we are working with: the Meta Quest VR headset, the MIDEN/VR CAVE, and the LED stage.


ARTDES 217 – Bits and Atoms

More Info Here
Contact with Questions:
Sophia Brueckner
[email protected]

This is an introduction to digital fabrication within the context of art and design. Students learn about the numerous types of software and tools available and develop proficiency with the specific software and tools at Stamps. Students discuss the role of digital fabrication in creative fields.


ARTDES 420 – Sci-Fi Prototyping

More Info Here
Contact with Questions:
Sophia Brueckner
[email protected]

This course ties science fiction with speculative/critical design as a means to encourage the ethical and thoughtful design of new technologies. With a focus on the creation of functional prototypes, this course combines the analysis of science fiction with physical fabrication or code-based interpretations of the technologies they depict.


SI 559 – Introduction to AR/VR Application Design

More Info Here
Contact with Questions:
Michael Nebeling
[email protected]

This course will introduce students to Augmented Reality (AR) and Virtual Reality (VR) interfaces. This course covers basic concepts; students will create two mini-projects, one focused on AR and one on VR, using prototyping tools. The course requires neither special background nor programming experience.


FTVM 394 / DIGITAL 394 – Topics in Digital Media Production, Virtual Reality

More Info Here
Contact with Questions:
Yvette Granata
[email protected]

This course provides an introduction to key software tools, techniques, and fundamental concepts supporting digital media arts production and design. Students will learn and apply the fundamentals of design and digital media production with software applications and web-based coding techniques, and study the principles of design that translate across multiple forms of media production.


UARTS 260/360/460/560 – THE BIG CITY: Lost & Found in XR

More Info Here
Contact with Questions:
Matthew Solomon & Sara Eskandari
[email protected] / [email protected]

No copies are known to exist of the lost 1928 film THE BIG CITY; only still photographs, a cutting continuity, and a detailed scenario of the film survive. This is truly a shame because the film featured a critical mass of Black performers – something extremely uncommon at the time. Using Unreal Engine, detailed 3D model renderings, and live performance, students will take users back in time into the fictional Harlem Black Bottom cabaret and clubs shown in the film. Students will experience working in a small game development team to create a high-fidelity historical recreation of the sets using 3D modeling, 2D texturing skills, level design, and game development pipelines. They will experience a unique media pipeline of game design for live performance and cutting-edge virtual production. The project will also dedicate focus to detailed documentation, in order to honor the preservation of THE BIG CITY that allows us to attempt this endeavor and the Black history that fuels it.


MOVESCI 313 – The Art of Anatomy

Contact with Questions:
Melissa Gross & Jenny Gear
[email protected] / [email protected]

Learn about human anatomy and how it has been taught throughout history, across a variety of mediums including the recent adoption of XR tools. Students will get hands-on experience integrating and prototyping AR and VR visualization technologies for medical and anatomical study.


ARCH 565 – Research in Environmental Technology

Contact with Questions:
Mojtaba Navvab
[email protected]

The focus of this course is the introduction to research methods in environmental technology. Qualitative and quantitative research results are studied with regard to their impact on architectural design. Each course participant undertakes an investigation in a selected area of environmental technology. The experimental approach may use physical modeling, computer simulation, or other appropriate methods (VR).


FTVM 455.004 – Topics in Film: Eco Imaginations
WGS 412.001 – Fem Art Practices

Contact with Questions:
Petra Kuppers
[email protected]

These courses will include orientations to XR technologies and sessions leveraging Unreal Engine and Quixel 3D assets to create immersive virtual reality environments.

Security Robots Study

Security Robots

Using XR to conduct studies in robotics

Maredith Byrd


Xin Ye is a Master’s student at the University of Michigan School of Information. She approached the Duderstadt Center with her Master’s thesis project to test the favorability of humanoid robots. Stephanie O’Malley at the Visualization Studio helped Xin develop a simulation featuring three types of security robots with varying features, to see whether a more humanoid robot is viewed more favorably.

Panoramic of Umich Hallway

The simulation’s goal is to make participants feel as if they were interacting with a real robot standing in front of them, so the MIDEN was the perfect tool for this experiment. The MIDEN (Michigan Immersive Digital Experience Nexus) is a 10 × 10 × 10 foot cube that relies on projections, so the user can walk naturally through a virtual environment. A highly detailed environment is constructed in Unreal Engine and projected into the MIDEN, allowing the user to still see their physical body within the projected digital world.

Panoramic of the MIDEN

Users step into the MIDEN and, by wearing 3D glasses, are immersed in a digital environment that recreates common locations on a college campus, such as a university hallway/commons area or an outdoor parking lot. After a short while, the participant gains the attention of the security robot, which approaches to question them.

Setting up the MIDEN

Xin Ye then triggers the appropriate response so users think the robot is responding intelligently. Each robot was configured with different triggerable answers that Xin Ye could initiate from behind the curtains of the MIDEN. This technique is referred to in studies as “Wizard of Oz”: the participant believes the robotic projection possesses artificial intelligence, just as a real robot in this situation would, when in reality a human is deciding the appropriate response.
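Such a triggerable-response setup can be sketched as a simple key-to-clip mapping (a hypothetical illustration; the key bindings and audio file names below are invented, and the actual study logic lived inside Unreal Engine):

```python
# Hypothetical "Wizard of Oz" control mapping: a hidden operator presses a key,
# and the simulated robot plays the corresponding pre-recorded response.
# The keys and clip names are invented for illustration only.
RESPONSES = {
    "1": "greeting.wav",          # e.g. "Hello! May I see your MCard?"
    "2": "mask_request.wav",      # e.g. "Please put on a face mask."
    "3": "suspicious_query.wav",  # e.g. "Have you witnessed anything suspicious?"
    "4": "thank_you.wav",         # e.g. "Thank you. Have a nice day."
}

def trigger_response(key):
    """Return the audio clip the robot should play, or None for unmapped keys."""
    return RESPONSES.get(key)
```

The apparent intelligence is simply a human operator selecting from scripted responses in real time.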

Knightscope
Ramsee
Pepper

This project aimed to evaluate human perception of different types of security robots – some more humanoid than others – to see if a more humanoid robot was viewed more favorably. Three different robots were used: Knightscope, Ramsee, and Pepper. Knightscope is a cone-shaped robot that lacks any humanoid features. Ramsee is somewhat more humanoid, with simple facial features, while Pepper is the most humanoid, with more complex features as well as arms and legs.

Participants interacted with one of the three robot types. The robot would approach the participant in the MIDEN and question them, asking them to present an MCard, to put on a face mask, or whether they had witnessed anything suspicious. To ensure that the robots all had a fair chance, each used the same “Microsoft David” automated male voice. Once the dialogue chain was complete, the robot thanked the participant and moved away. The participant then removed the 3D glasses and was taken to another location in the building for an exit interview about their interactions with the robot. Any participant who realized that a human was controlling the robot was disqualified from the study.

Knightscope in Hallway
Ramsee in Hallway

Xin Ye presented her findings in a paper titled, “Human Security Robot Interaction and Anthropomorphism: An Examination of Pepper, RAMSEE, and Knightscope Robots” at the 32nd IEEE International Conference on Robot & Human Interactive Communication in Busan, South Korea.

Fall 2023 XR Classes

Fall 2023 XR Classes

Looking for Classes that incorporate XR?

EECS 498 – Extended Reality & Society


Credits : 4
More Info Here
Contact with Questions:
Austin Yarger
[email protected]

From pediatric medical care, advanced manufacturing, and commerce to film analysis, first-responder training, and unconscious bias training, the fledgling, immersive field of extended reality may take us far beyond the realm of traditional video games and entertainment, and into the realm of diverse social impact.

“EECS 498 : Extended Reality and Society” is a programming-intensive senior capstone / MDE course that empowers students with the knowledge and experience to…

    • Implement medium-sized virtual and augmented reality experiences using industry-standard techniques and technologies.
        • Game Engines (Unreal Engine / Unity), Design Patterns, Basic Graphics Programming, etc.
    • Design socially-conscious, empowering user experiences that engage diverse audiences.
    • Contribute to cultural discourse on the hopes, concerns, and implications of an XR-oriented future.
        • Privacy / security concerns, XR film review (The Matrix, Black Mirror, etc.)
    • Carry out user testing and employ feedback after analysis.
        • Requirements + Customer Analysis, Iterative Design Process, Weekly Testing, Analytics, etc.
    • Work efficiently in teams of 2-4 using agile production methods and software.
        • Project Management Software (Jira), Version Control (Git), Burndown Charting and Resource Allocation, Sprints, etc.

Students will conclude the course with at least three significant, socially-focused XR projects in their public portfolios.

 

ENTR 390 – Intro to Entrepreneurial Design, VR Lab


Credits : 3
More Info Here
Contact with Questions:
Sara ‘Dari’ Eskandari
[email protected]

In this lab, you’ll learn how to develop virtual reality content for immersive experiences on the Oculus Rift, in the MIDEN, or for virtual production, using Unreal Engine and 3D modeling software. You’ll also be introduced to asset creation and scene assembly by bringing assets into Unreal Engine and creating interactive experiences. At the end of the class you’ll be capable of developing virtual reality experiences, simulations, and tools to address real-world problems.

Students will learn how to generate digital content for virtual reality platforms; become knowledgeable about versatile file formats, content pipelines, hardware platforms, and industry standards; understand methods of iterative design and the creation of functional prototypes in this medium; apply what is learned in the lecture section of the course to determine what is possible, what is marketable, and what distribution methods are available for the platform; and become familiar with documenting their design process and pitching their ideas to others, receiving and providing quality feedback.

 

FTVM 307 – Film Analysis for Filmmakers


Credits : 3
More Info Here
Contact with Questions:
Matthew Solomon
[email protected]

Filmmakers learn about filmmaking by watching films. This course reverse-engineers movies to understand how they were produced. The goal is to learn from a finished film how the scenes were produced in front of the camera and microphone, and how the captured material was edited. Students in this class use VR to reimagine classic film scenes, giving them the ability to record and edit footage from a virtual set.

 

UARTS 260 / EIPC FEAST – Empathy in Pointclouds


Credits: 1-5
More Info Here
Contact with Questions:
Dawn Gilpin
[email protected]

Empathy In Point Clouds: Spatializing Design Ideas and Storytelling through Immersive Technologies integrates LiDAR scanning, photogrammetry, and Unreal Engine into education, expanding the possible methodologies and processes of architectural design. Entering our third year of the FEAST program, we turn our attention to storytelling and worldbuilding using site-specific point cloud models as the context for our narratives. This year the team will produce 1-2 spatial narratives for the three immersive technology platforms we are working with: VR headset, the MIDEN/VR CAVE, and the LED stage.

 

 

ARTDES 217 – Bits and Atoms


Credits: 3
More Info Here
Contact with Questions:
Sophia Brueckner
[email protected]

This is an introduction to digital fabrication within the context of art and design. Students learn about the numerous types of software and tools available and develop proficiency with the specific software and tools at Stamps. Students discuss the role of digital fabrication in creative fields.

 

ARTDES 420 – Sci-Fi Prototyping


Credits: 3
More Info Here
Contact with Questions:
Sophia Brueckner
[email protected]

This course ties science fiction with speculative/critical design as a means to encourage the ethical and thoughtful design of new technologies. With a focus on the creation of functional prototypes, this course combines the analysis of science fiction with physical fabrication or code-based interpretations of the technologies they depict.

 

SI 559 – Introduction to AR/VR Application Design

Credits: 3
More Info Here
Contact with Questions:
Michael Nebeling
[email protected]

This course will introduce students to Augmented Reality (AR) and Virtual Reality (VR) interfaces. This course covers basic concepts; students will create two mini-projects, one focused on AR and one on VR, using prototyping tools. The course requires neither special background nor programming experience.

 

FTVM 394 – Digital Media Production, Virtual Reality

Credits: 4
More Info Here
Contact with Questions:
Yvette Granata
[email protected]

This course provides an introduction to key software tools, techniques, and fundamental concepts supporting digital media arts production and design. Students will learn and apply the fundamentals of design and digital media production with software applications and web-based coding techniques, and study the principles of design that translate across multiple forms of media production.

Scientific Visualization of Pain

Scientific Visualization of Pain

XR at the Headache & Orofacial Pain Effort (HOPE) Lab

Dr. Alexandre DaSilva is an Associate Professor in the School of Dentistry, an Adjunct Associate Professor of Psychology in the College of Literature, Science & Arts, and a neuroscientist in the Molecular and Behavioral Neuroscience Institute.  Dr. DaSilva and his associates study pain – not only its cause, but also its diagnosis and treatment – in his Headache & Orofacial Pain Effort (HOPE) Lab, located in the 300 N. Ingalls Building.

Dr. Alex DaSilva slices through a PET scan of a “migraine brain” in the MIDEN, to find areas of heightened μ-opioid activity.

Virtual and augmented reality have been important tools in this endeavor, and Dr. DaSilva has brought several projects to the Digital Media Commons (DMC) in the Duderstadt Center over the years.

In one line of research, Dr. DaSilva has obtained positron emission tomography (PET) scans of patients in the throes of migraine headaches.  The raw data obtained from these scans are three-dimensional arrays of numbers that encode the activation levels of dopamine or μ-opioid in small “finite element” volumes of the brain.  As such, they’re incomprehensible.  But, we bring the data to life through DMC-developed software that maps the numbers into a blue-to-red color gradient and renders the elements in stereoscopic 3D virtual reality (VR) – in the Michigan Immersive Digital Experience Nexus (MIDEN), or in head-mounted displays such as the Oculus Rift.  In VR, the user can effortlessly slide section planes through the volumes of data, at any angle or offset, to hunt for the red areas where the dopamine or μ-opioid signals are strongest.  Understanding how migraine headaches affect the brain may help in devising more focused and effective treatments.
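The blue-to-red mapping at the heart of that software can be sketched in a few lines (a hedged illustration; the DMC software’s actual transfer function and color stops are not specified here, so the linear interpolation below is an assumption):

```python
def value_to_rgb(v, v_min, v_max):
    """Map a scalar activation value to a blue-to-red color.

    Values at v_min render pure blue, values at v_max pure red,
    with a linear blend in between (an assumed transfer function;
    the actual DMC software may use a different gradient).
    """
    t = (v - v_min) / (v_max - v_min)   # normalize to [0, 1]
    t = max(0.0, min(1.0, t))           # clamp out-of-range voxels
    return (t, 0.0, 1.0 - t)            # (R, G, B): blue at 0, red at 1
```

Applied per finite element, a rule like this turns a raw array of activation numbers into the colored volume the researcher slices through in VR.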

Dr. Alex DaSilva’s associate, Hassan Jassar, demonstrates the real-time fNIRS-to-AR brain activation visualization, as seen through a HoloLens, as well as the tablet-based app for painting pain sensations on an image of a head. [Photo credit: Hour Detroit magazine, March 28, 2017. https://www.hourdetroit.com/health/virtual-reality-check/]

In another line of research, Dr. DaSilva employs functional near-infrared spectroscopy (fNIRS) to directly observe brain activity associated with pain in “real time”, as the patient experiences it.  As Wikipedia describes it: “Using fNIRS, brain activity is measured by using near-infrared light to estimate cortical hemodynamic activity which occur in response to neural activity.”  [https://en.wikipedia.org/wiki/Functional_near-infrared_spectroscopy]  The study participant wears an elastic skullcap fitted with dozens of fNIRS sensors wired to a control box, which digitizes the signal inputs and sends the numeric data to a personal computer running a MATLAB script.  From there, a two-part software development by the DMC enables neuroscientists to visualize the data in augmented reality (AR).  The first part is a MATLAB function that opens a Wi-Fi connection to a Microsoft HoloLens and streams the numeric data out to it.  The second part is a HoloLens app that receives that data stream and renders it as blobs of light that change hue and size to represent the ± polarity and intensity of each signal.  The translucent nature of HoloLens AR rendering allows the neuroscientist to overlay this real-time data visualization on the actual patient.  Being able to directly observe neural activity associated with pain may enable a more objective scale, versus asking a patient to verbally rate their pain, for example “on a scale of 1 to 5”.  Moreover, it may be especially helpful for diagnosing or empathizing with patients who are unable to express their sensations verbally at all, whether due to simple language barriers or due to other complicating factors such as autism, dementia, or stroke.
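The rendering rule on the HoloLens side can be summarized as mapping each channel’s signed value to a hue and a size (a rough sketch; the real app is a compiled HoloLens application, and the size range and color choices below are illustrative assumptions):

```python
def signal_to_blob(value, max_abs=1.0, base_size=0.01, max_size=0.05):
    """Map one fNIRS channel value to a blob's hue and size.

    Positive values render warm (red), negative values cool (blue),
    and magnitude drives blob size. The constants are illustrative,
    not the actual app's values.
    """
    intensity = min(abs(value) / max_abs, 1.0)          # normalized magnitude
    hue = "red" if value >= 0 else "blue"               # +/- polarity
    size = base_size + intensity * (max_size - base_size)
    return hue, size
```

On the MATLAB side, a streaming loop would then push each new frame of channel values over the Wi-Fi connection for the app to re-render with this rule.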

Yet another DMC software development, the “PainTrek” mobile application, also initiated by Dr. DaSilva, allows patients to “paint their pain” on an image of a manikin head that can be rotated freely on the screen – a more convenient and intuitive reporting mechanism than filling out a common questionnaire.

PainTrek app allows users to “paint” regions of the body experiencing pain to indicate and track pain intensity.

Architectural Lighting Scenarios Envisioned in the MIDEN

Architectural Lighting Scenarios Envisioned in the MIDEN

ARCH 535 & ARCH 545, Winter 2022

Mojtaba Navvab, Ted Hall


Prof. Mojtaba Navvab teaches environmental technology in the Taubman College of Architecture and Urban Planning, with particular interests in lighting and acoustics.  He is a regular user of the Duderstadt Center’s MIDEN (Michigan Immersive Digital Experience Nexus) – in teaching as well as sponsored research.

On April 7, 2022, he brought a combined class of ARCH 535 and ARCH 545 students to the MIDEN to see, and in some cases hear, their projects in full-scale virtual reality.

Recreating the sight and sound of the 18-story atrium space of the Hyatt Regency Louisville, where the Kentucky All State Choir gathers to sing the National Anthem.

ARCH 535: To understand environmental technology design techniques through case studies and compliance with building standards.  VR applications are used to view the design solutions.

ARCH 545: To apply the theory, principles, and lighting design techniques using a virtual reality laboratory.

“The objectives are to bring whatever you imagine to reality in a multimodal perception; in the MIDEN environment, whatever you create becomes a reality.  This aims toward simulation, visualization, and perception of light and sound in a virtual environment.”

Recreating and experiencing one of the artworks by James Turrell.

“Human visual perception is psychophysical because any attempt to understand it necessarily draws upon the disciplines of physics, physiology, and psychology.  A ‘Perceptionist’ is a person concerned with the total visual environment as interpreted in the human mind.”

“Imagine if you witnessed or viewed a concert hall or a choir performance in a cathedral.  You could describe the stimulus generated by the architectural space by considering each of the senses independently as a set of unimodal stimuli.  For example, your eyes would be stimulated with patterns of light energy bouncing off the simulated interior surfaces or luminous environment while you listen to an orchestra playing or choir singing with a correct auralized room acoustics.”

A few selected images photographed in the MIDEN are included in this article.  For the user wearing the stereoscopic glasses, the double images resolve into an immersive 3D visual experience that they can step into, with 270° of peripheral vision.

Students explore a daylight design solution for a library.

Learning to Develop for Mixed Reality – The ENTR 390 “VR Lab”

Learning to Develop for Virtual Reality – The ENTR 390 “VR Lab”

XR Prototyping

For the past several years, students enrolled in the Center for Entrepreneurship’s Intro to Entrepreneurial Design: Virtual Reality course have been introduced to programming and content creation pipelines for XR development using a variety of Visualization Studio resources. Their goal? Create innovative applications for XR. From creating video games to changing the way class material is accessed with XR-capable textbooks, if you have an interest in learning how to make your own app for the Oculus Rift, the MIDEN, or even a smartphone, this might be a class to enroll in. Students interested in this course are not required to have any prior programming or 3D modeling knowledge, and if you’ve never used a VR headset, that’s OK too. This course will teach you everything you need to know.

Henry Duhaime presents his VR game for Oculus Rift, in which players explore the surface of Mars in search of a missing NASA rover.
Michael Meadows prototypes AR capable textbooks using a mobile phone and Apple’s ARKit.

Multi-Sensing the Universe

Multi-Sensing the Universe

Envisioning a Toroidal universe

Robert Alexander teamed with Danielle Battaglia, a senior in Art & Design, to compose and integrate audio effects into her conceptual formal model of the Toroidal Universe.  Danielle combined Plato’s notion of the universe as a dodecahedron with modern notions of black holes, worm holes, and child universes.  Their multi-sensory multiverse came together in the MIDEN and was exhibited there as part of the Art & Design senior integrative art exhibition.

Interested in using the MIDEN to do something similar? Contact us.

Robots Who Goof: Can We Trust Them?

Robotics in Unreal Engine

Robots Who Goof: Can We Trust Them?

EVERYONE MAKES MISTAKES

The human-like, android robot used in the virtual experimental task of handling boxes.

When robots make mistakes—and they do from time to time—reestablishing trust with human co-workers depends on how the machines own up to the errors and how human-like they appear, according to University of Michigan research.

In a study that examined multiple trust repair strategies—apologies, denials, explanations or promises—the researchers found that certain approaches directed at human co-workers are better than others and often are impacted by how the robots look.

“Robots are definitely a technology but their interactions with humans are social and we must account for these social interactions if we hope to have humans comfortably trust and rely on their robot co-workers,” said Lionel Robert, associate professor at the U-M School of Information and core faculty of the Robotics Institute.

“Robots will make mistakes when working with humans, decreasing humans’ trust in them. Therefore, we must develop ways to repair trust between humans and robots. Specific trust repair strategies are more effective than others and their effectiveness can depend on how human the robot appears.”

For their study published in the Proceedings of the 30th IEEE International Conference on Robot and Human Interactive Communication, Robert and doctoral student Connor Esterwood examined how the repair strategies—including a new strategy of explanations—impact the elements that drive trust: ability (competency), integrity (honesty) and benevolence (concern for the trustor).

The mechanical arm robot used in the virtual experiment.

The researchers recruited 164 participants to work with a robot in a virtual environment, loading boxes onto a conveyor belt. The human acted as the quality assurance person, working alongside a robot tasked with reading serial numbers and loading 10 specific boxes. One robot was anthropomorphic, or more humanlike; the other was more mechanical in appearance.

Sara Eskandari and Stephanie O’Malley of the Emerging Technology Group at U-M’s James and Anne Duderstadt Center helped develop the experimental virtual platform.

The robots were programmed to intentionally pick up a few wrong boxes and to make one of the following trust repair statements: “I’m sorry I got the wrong box” (apology), “I picked the correct box so something else must have gone wrong” (denial), “I see that was the wrong serial number” (explanation), or “I’ll do better next time and get the right box” (promise).

Previous studies have examined apologies, denials and promises as factors in trust or trustworthiness but this is the first to look at explanations as a repair strategy, and it had the highest impact on integrity, regardless of the robot’s appearance.

When the robot was more humanlike, trust was even easier to restore for integrity when explanations were given and for benevolence when apologies, denials and explanations were offered.

As in the previous research, apologies from robots produced higher integrity and benevolence than denials. Promises outpaced apologies and denials when it came to measures of benevolence and integrity.

Esterwood said this study is ongoing with more research ahead involving other combinations of trust repairs in different contexts, with other violations.

“In doing this we can further extend this research and examine more realistic scenarios like one might see in everyday life,” Esterwood said. “For example, does a barista robot’s explanation of what went wrong and a promise to do better in the future repair trust more or less than a construction robot?”

This originally appeared on Michigan News.

More information: