Renew Scleroderma – Mobile Health Tracking App for Managing Scleroderma


The Renew Scleroderma app aims to assist individuals with Scleroderma by giving them access to a full set of resources and activities designed to help manage their condition. RENEW stands for Resilience-based Energy Management to Enhance Well-being.

Scleroderma is a rare autoimmune condition that causes inflammation and thickening of the skin, typically in the hands and face. Excess collagen production can advance beyond the skin to internal organs, potentially causing complications in multiple bodily systems. Those diagnosed with scleroderma have a high symptom burden and need to learn strategies for self-management.

The mobile app presents users with information on Scleroderma as well as weekly activities they can perform to manage the condition. Users set goals within the app and track health behaviors such as activity pacing, sleep, relaxation, and engagement in physical activity, with symptoms tracked in real time. Progress toward these goals is accessible to an assigned health coach through a secure web portal. Patients have regular check-ins with their health coach to discuss their progress and adjust their management plan based on what they have logged in the app. Participants can also track health behaviors specific to the learning modules of the RENEW program.
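The goal-and-progress loop described above can be sketched as a tiny data model. This is a hypothetical illustration (the `Goal` class and its fields are invented here), not the actual Renew schema:

```python
from dataclasses import dataclass

# Hypothetical sketch of weekly goal tracking -- not the Renew app's
# real data model. A user commits to a weekly target for a behavior,
# logs sessions, and a coach reviews fractional progress.

@dataclass
class Goal:
    behavior: str        # e.g. "sleep", "relaxation", "activity pacing"
    weekly_target: int   # sessions per week the user commits to
    completed: int = 0   # sessions logged so far this week

    def log_session(self) -> None:
        """Record one completed session toward the weekly target."""
        self.completed += 1

    def progress(self) -> float:
        """Fraction of the weekly target achieved, capped at 1.0."""
        return min(self.completed / self.weekly_target, 1.0)

# What a health coach's view of one mentee's week might aggregate:
goals = [Goal("sleep", weekly_target=7), Goal("relaxation", weekly_target=3)]
goals[1].log_session()
goals[1].log_session()
summary = {g.behavior: g.progress() for g in goals}
# summary now maps each behavior to its fractional progress
```

The coach-facing portal described above would read the same records from a shared database rather than local objects; the point here is only the goal/target/progress shape of the data.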

Image of a mobile device demonstrating the app

Renew is quick and easy to use: users download the mobile app from either the Google Play or Apple app store and create an account. The app is developed for both iOS and Android mobile devices, allowing wide accessibility to the general public.

The benefit of Renew is that it relays a user's progress to a database accessed through a secure web portal. The portal connects users to an assigned University of Michigan health coach, who can view the information they enter in the app. Coaches are drawn from a pool of qualified health coaches at Michigan Medicine, all of whom have Scleroderma themselves. By reviewing a mentee's progress in the app, a coach can provide feedback and prepare for their one-on-one meetings.

One main consideration in the design process was to ensure that the app is physically easy for users to interact with. Most people with scleroderma have limited hand function, so the team consulted directly with users on where to put navigation buttons, how big the buttons needed to be, and how information should be entered into the app to reduce fatigue.

Susan Murphy acted as faculty lead for the development team consisting of Sara ‘Dari’ Eskandari, Daniel Vincenz, and Sean Petty. Development was supported by the LiveWell App Factory to Support Health and Function of People with Disabilities, funded by a grant from the National Institute on Disability, Independent Living, and Rehabilitation Research in the U.S. Department of Health and Human Services. With a working prototype completed and piloted with patients, future iterations of the app have been handed off to Atomic Object, a custom software development and design company local to Ann Arbor.

Video of the Mobile App Preview

New MIDEN – Unveiling of the upgraded MIDEN for Fall ‘23


New dual view capabilities

Maredith Byrd


We have upgraded the MIDEN! The new system uses four Christie Digital M 4K25 RGB laser projectors, which are much brighter and higher resolution than the lamps they replace. The previous lamps had a lifespan of just 1,250 hours, so we had to limit how often and how long the MIDEN was run; the new laser light sources are rated for 25,000 hours at 100% brightness and 50,000 hours at 50% brightness. Each 10′ × 10′ screen now displays at 2160×2160, double the previous resolution. The upgrade also adds dual-view capability: two people can experience the MIDEN at once, each seeing the same virtual content aligned to their own unique perspective while simultaneously interacting with it.

In a typical setup, 3D stereoscopic content (like what you would experience in a 3D movie) is projected onto three walls and the floor and stitched seamlessly together. Users wear a set of motion-tracked glasses that allow their perspective to be updated depending on where they are standing or looking, and use a motion-tracked video game controller to navigate beyond the confines of the 10’x10’ room. To the user wearing the 3D glasses, the projected content appears entirely to scale and has realistic depth – they can look underneath tables that appear to be situated in front of them, despite the table being projected onto one of the walls.
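The head-tracked perspective described above relies on an off-axis (generalized) perspective projection: each wall's view frustum is recomputed every frame from the tracked eye position, so the imagery stays aligned to the viewer. A minimal sketch of that math, following Kooima's generalized perspective projection formulation (illustrative only, not the MIDEN's actual rendering code):

```python
import numpy as np

def off_axis_frustum(pa, pb, pc, pe, near):
    """Per-wall frustum bounds for a tracked eye position.

    pa, pb, pc: lower-left, lower-right, upper-left screen corners
    (world space); pe: tracked eye position; near: near-plane distance.
    Returns (left, right, bottom, top) at the near plane.
    """
    pa, pb, pc, pe = map(np.asarray, (pa, pb, pc, pe))
    vr = (pb - pa) / np.linalg.norm(pb - pa)   # screen right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)   # screen up axis
    vn = np.cross(vr, vu)                      # screen normal, toward the eye
    d = -np.dot(pa - pe, vn)                   # eye-to-screen distance
    scale = near / d
    left   = np.dot(vr, pa - pe) * scale
    right  = np.dot(vr, pb - pe) * scale
    bottom = np.dot(vu, pa - pe) * scale
    top    = np.dot(vu, pc - pe) * scale
    return left, right, bottom, top

# Eye centered in front of a 2 m x 2 m wall: a symmetric frustum.
print(off_axis_frustum((-1, -1, 0), (1, -1, 0), (-1, 1, 0), (0, 0, 1), 0.1))
# Eye shifted right: the frustum skews off-axis to keep the projected
# imagery correct from the viewer's new position.
print(off_axis_frustum((-1, -1, 0), (1, -1, 0), (-1, 1, 0), (0.5, 0, 1), 0.1))
```

Feeding these bounds into a standard frustum matrix, once per wall and once per tracked eye, is what makes a flat projection read as a to-scale 3D scene, including objects that appear to sit in front of the wall they are drawn on.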

The MIDEN supports 3D model formats exported by the most popular modeling software: Blender, 3ds Max, Maya, SketchUp, Rhino, Revit, etc. Models can be exported in OBJ, FBX, STL, or VRML format and then imported into our “Jugular” software. The MIDEN can also render Unreal Engine scenes, using the nDisplay plugin to split the scene into four cameras corresponding to the four projectors in the MIDEN.

MIDEN users experience immersion in a virtual environment without it blocking their view of themselves or their surroundings as a VR headset does. Because “CAVE” is a trademarked term, ours is called the MIDEN, which stands for Michigan Immersive Digital Experience Nexus. The MIDEN also takes traditional “CAVE” technology further: it is driven by our in-house rendering engine, which affords more flexibility than a typical “CAVE” setup.

The MIDEN is also more accessible than VR headsets: it takes less time to set up and begin using. The controller is a standard Xbox-style gamepad, familiar to most gamers. Immersion is increased without hiding the real world, so users do not have to worry about trip hazards or becoming disoriented, and they see their real body rather than a virtual avatar, which results in less motion sickness.

It can be used for architectural review, data analysis, art installations, learning 3D modeling, and much more. From seeing the true scale of a structure in relation to the body to sensory experiences with unique visuals and spatialized audio, the MIDEN can take these projects to a new level.

The MIDEN is available to anyone for a project, class exercise, or tour by request; contact emergingtech@umich.edu to arrange a session. Use of the MIDEN does require staff to run it, and we recommend anyone looking to view custom content in the MIDEN arrange a few sessions ahead of their event to test their content and ensure their scene is configured properly.

Two individuals in the MIDEN point to the same virtual image with different views.
This is how the MIDEN configures itself.

Security Robots Study


Using XR to conduct studies in robotics

Maredith Byrd


Xin Ye is a Master’s student at the University of Michigan School of Information. She approached the Duderstadt Center with her Master’s thesis defense project: testing the favorability of humanoid robots. Stephanie O’Malley at the Visualization Studio helped Xin develop a simulation featuring three types of security robots with varying features, to see whether a more humanoid robot is viewed more favorably.

Panoramic of Umich Hallway

The simulation’s goal was to make participants feel like they were interacting with a real robot standing in front of them, so the MIDEN was the perfect tool for this experiment. The MIDEN (Michigan Immersive Digital Experience Nexus) is a 10′ × 10′ × 10′ room that relies on projections, letting the user walk naturally through a virtual environment. The environment is constructed in Unreal Engine and projected into the MIDEN in high detail, while the user can still see their physical body within the projected digital world.

Panoramic of the MIDEN

Users step into the MIDEN and, wearing 3D glasses, are immersed in a digital environment that recreates common locations on a college campus: a university hallway/commons area or an outdoor parking lot. After a short while, the participant gains the attention of the security robot, which approaches to question them.

Setting up the MIDEN

The robots were all configured with different triggerable responses that Xin Ye could initiate from behind the curtains of the MIDEN, so participants believed the robot was responding intelligently. This technique is referred to in studies as “Wizard of Oz”: the participant thinks the projected robot possesses artificial intelligence, as a real robot in this situation would, when in reality a human is choosing the appropriate response.
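In software terms, a Wizard-of-Oz setup can be as simple as mapping hidden-operator keypresses to pre-scripted robot lines. A hypothetical sketch (the keys and dialogue strings here are invented for illustration, not the study's actual software):

```python
# Illustrative "Wizard of Oz" control mapping -- the hidden operator
# presses a key to fire a pre-scripted robot response, so the
# participant experiences what appears to be autonomous behavior.

RESPONSES = {
    "g": "Hello. May I see your MCard, please?",
    "m": "Campus policy requires a face mask in this building.",
    "s": "Have you witnessed anything suspicious nearby?",
    "t": "Thank you for your cooperation. Have a good day.",
}

def wizard_trigger(key: str) -> str:
    """Map an operator keypress to the robot line to play back."""
    return RESPONSES.get(key, "")  # unknown keys: robot stays silent

# Operator presses "g" as the participant approaches:
print(wizard_trigger("g"))
```

In the real study the triggered line would be sent to the rendering engine for text-to-speech playback and an accompanying robot animation; the mapping itself is the whole "wizard".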

Knightscope
Ramsee
Pepper

This project aimed to evaluate human perception of different types of security robots, some more humanoid than others, to see if a more humanoid robot was viewed more favorably. Three robots were used: Knightscope, Ramsee, and Pepper. Knightscope is a cone-shaped robot that lacks any humanoid features. Ramsee is somewhat more humanoid, with simple facial features, while Pepper is the most humanoid, with more complex facial features as well as arms.

Participants interacted with one of the three robot types. The robot approached the participant in the MIDEN and questioned them: asking them to present an MCard, to put on a face mask, or whether they had witnessed anything suspicious. To give each robot a fair chance, all three used the same “Microsoft David” automated male voice. Once the dialogue chain was complete, the robot thanked the participant and moved away. The participant then removed the 3D glasses and was taken to another location in the building for an exit interview about their interactions with the robot. Any participant who realized a human was controlling the robot was disqualified from the study.

Knightscope in Hallway
Ramsee in Hallway

Xin Ye presented her findings in a paper titled, “Human Security Robot Interaction and Anthropomorphism: An Examination of Pepper, RAMSEE, and Knightscope Robots” at the 32nd IEEE International Conference on Robot & Human Interactive Communication in Busan, South Korea.