Customer Discovery Using 360 Video

Year after year, students in Professor Dawn White’s Entrepreneurship 411 course are tasked with conducting a “customer discovery”: a process in which students interested in creating a business interview professionals in a given field to assess their needs, how the products they develop could address those needs, and how they might alleviate some of the difficulties those professionals encounter on a daily basis.

Often when given this assignment, students would defer to their peers for feedback instead of reaching out to strangers working in their fields of interest. Because this demographic is so similar to the students themselves, the outcome was fairly biased and never truly got to the root of why someone might want or need a specific product. Looking for an alternative approach, Dawn teamed up with her longtime friend Professor Alison Bailey, who teaches DEI at the University, and Aileen Huang-Saad from Biomedical Engineering, and approached the Duderstadt Center with their idea: What if students could interact with a simulated, more diverse professional to conduct their customer discovery?

After exploring the many routes development could take, including options like motion capture-driven CGI avatars, 360 video was chosen as the platform for the simulation. 360 video viewed within an Oculus Rift VR headset ultimately gave the highest sense of realism and immersion when conducting an interview, which was important for making the interview process feel authentic.

Up until this point, 360 videos were largely passive experiences: they did not allow users to tailor the experience based on their choices or interact with the scene in any way. This customer discovery project required the 360 videos to be responsive – when a student asked a recognized customer discovery question, the appropriate video response needed to be triggered. Doing this required both programming logic to trigger different videos and integrated voice recognition software, so students could ask a question out loud and have their speech recognized within the application.

Dawn and Alison sourced three professionals to serve as their simulated actors for this project:

Fritz discusses his career as an IT professional

Fritz – a young Black man with a career as an IT professional

Cristina – a middle-aged woman with a noticeable accent, working in education

Charles – a white adult man employed as a barista

These actors were chosen for their authenticity and diversity, having qualities that might lead interviewers to make certain assumptions or expose biases in their interactions with them. With the help of talented students at the Visualization Studio, these professionals were filmed responding to various customer discovery questions using the Ricoh Theta 360 camera and a spatial microphone (this allows for spatial audio in VR, so the sound feels like it is coming from the direction where the actor is sitting). So that footage of one response could be blended with the next, the actors had to remember to return their hands and face to the same pose between responses, allowing the footage to be aligned. They were also filmed giving generic responses to any unplanned questions that might be asked, as well as twiddling their thumbs and patiently waiting – footage that could be looped to fill any idle time between questions.

Once the footage was acquired, the frame ranges for each response were noted and passed off to programmers to implement in the Duderstadt Center’s in-house VR rendering software, Jugular. As an initial prototype of the concept, the application was intended to run as a proctored simulation: students engaging in the simulation would wear an Oculus Rift and ask their questions out loud, with a proctor listening in and triggering the appropriate actor response using keyboard controls. For a more natural feel, Dawn was interested in exploring voice recognition to make the process more automated.
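The proctored flow can be pictured as a simple clip-selection loop: a keypress selects a response clip by its noted frame range, and playback falls back to the looped idle footage otherwise. The sketch below is purely illustrative – the clip names, frame ranges, and key bindings are invented, not the actual Jugular data:

```python
# Hypothetical sketch of the proctored playback logic: the proctor presses a
# key, the matching response clip plays, then playback returns to the idle loop.
IDLE = ("idle_loop", 0, 240)  # looped "patiently waiting" footage

# Illustrative frame ranges noted during filming (not the real project data).
RESPONSES = {
    "1": ("career_background", 241, 1020),
    "2": ("daily_challenges", 1021, 1800),
    "3": ("generic_fallback", 1801, 2100),  # canned reply for unplanned questions
}

def next_clip(key_pressed):
    """Return (clip_name, start_frame, end_frame) for a proctor keypress,
    falling back to the idle loop when no response is mapped."""
    return RESPONSES.get(key_pressed, IDLE)
```

Keeping the mapping in data like this is what later made it straightforward to swap the proctor's keyboard for an automated trigger.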

Within Jugular, students view an interactive 360 video in which they are seated across from one of three professionals available for interviewing. Using the embedded microphone in the Oculus Rift, they can ask questions that are recognized using Dialogflow, which in turn triggers the appropriate video response, allowing students to conduct mock interviews.

With Dawn employing some computer science students to tackle the voice recognition element over the summer, this feature was integrated into Jugular using a Dialogflow agent with Python scripts. Students could now be immersed in an Oculus Rift, speaking to an actor filmed in 360 video, and have their voice interpreted as they asked their questions out loud using the Rift’s embedded microphone.
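The essence of that integration is mapping a recognized utterance to one of the filmed response clips. The real project delegated this to a Dialogflow agent; the stand-in below uses simple keyword-overlap scoring just to illustrate the shape of the problem, and every intent and clip name in it is invented:

```python
# Minimal stand-in for Dialogflow-style intent matching: score each intent by
# keyword overlap with the spoken question and return the clip to play. The
# actual project sent transcribed audio to a Dialogflow agent instead.

INTENTS = {
    "daily_tasks": ({"typical", "day", "tasks", "routine"}, "clip_daily_tasks"),
    "pain_points": ({"difficult", "frustrating", "challenges", "problems"}, "clip_pain_points"),
}
FALLBACK_CLIP = "clip_generic_response"  # the actors' canned generic reply

def match_intent(utterance):
    """Pick the response clip whose keyword set best overlaps the utterance,
    falling back to the generic response when nothing matches."""
    words = set(utterance.lower().split())
    best_clip, best_score = FALLBACK_CLIP, 0
    for keywords, clip in INTENTS.values():
        score = len(words & keywords)
        if score > best_score:
            best_clip, best_score = clip, score
    return best_clip
```

A fallback clip matters here for the same reason the actors filmed generic responses: a live interview will always produce questions the system was not trained on.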

Upon its completion, the Customer Discovery application was piloted in the Visualization Studio with Dawn’s students during the Winter 2019 semester.

The Jewish Tradition of Tsedakah as Exemplified in Pushkes – Online Exhibit

The pushke exhibit first appeared at the Jean & Samuel Frankel Center for Judaic Studies in the summer of 2015. The exhibit was composed of 40 pushkes (charitable donation boxes) of all shapes and sizes, situated in a series of display cases. The many diverse charity boxes reflect the breadth of the Jewish Heritage Collection Dedicated to Mark and Dave Harris, and illustrate the value of giving in Jewish communities throughout the world. Prior to being moved into storage for safekeeping, the collection underwent a lengthy scanning process with help from the Duderstadt Center to convert it into digitized 3D objects, expanding accessibility by allowing the exhibit to be preserved and viewable online.

The Pushke Collection was digitized by the Duderstadt Center using photogrammetry. In this process, several high-fidelity digital photographs are captured 360 degrees around the subject. These photos are analyzed by a computer algorithm that identifies matching features between photographs on a per-pixel basis. The identified features are then used to triangulate positions in 3D space, allowing a 3D model of the object to be generated. The color information from the initial photographs is then mapped to the surface of the model to achieve a realistic digital replica. Select pieces of the Pushke collection were further refined by an artist using digital sculpting and painting software to correct imperfections from the capture process, and the entire digital collection was also optimized for more efficient viewing on the web.
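The triangulation step can be illustrated in miniature: once the same feature is matched in two photos, each camera defines a ray toward it, and the 3D point sits where those rays (nearly) intersect. The toy sketch below triangulates one point from two rays via the midpoint of their closest approach; the camera positions and directions are invented for illustration, and real photogrammetry pipelines solve this for millions of matched features at once:

```python
# Toy triangulation: recover a 3D point from two camera rays, taking the
# midpoint of the segment where the (possibly skew) rays pass closest.
import math

def _dot(a, b): return sum(x * y for x, y in zip(a, b))
def _sub(a, b): return tuple(x - y for x, y in zip(a, b))
def _add(a, b): return tuple(x + y for x, y in zip(a, b))
def _scale(a, s): return tuple(x * s for x in a)
def _unit(a):
    n = math.sqrt(_dot(a, a))
    return _scale(a, 1.0 / n)

def triangulate(p1, d1, p2, d2):
    """Midpoint of closest approach between rays p1 + t*d1 and p2 + s*d2."""
    d1, d2 = _unit(d1), _unit(d2)
    r = _sub(p1, p2)
    b = _dot(d1, d2)            # cosine of the angle between the rays
    f, g = _dot(d1, r), _dot(d2, r)
    denom = 1.0 - b * b         # vanishes only if the rays are parallel
    t = (b * g - f) / denom
    s = (g - b * f) / denom
    closest1 = _add(p1, _scale(d1, t))
    closest2 = _add(p2, _scale(d2, s))
    return _scale(_add(closest1, closest2), 0.5)

# Two cameras a little apart, both sighting a feature at (0, 0, 5):
point = triangulate((-1.0, 0.0, 0.0), (1.0, 0.0, 5.0),
                    (1.0, 0.0, 0.0), (-1.0, 0.0, 5.0))
```

In practice the rays never intersect exactly because of pixel noise, which is why production tools average over many photographs rather than relying on any single pair.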

A web viewer was then developed and integrated into the Frankel Center’s WordPress site to display and allow manipulation of the various pushkes in the collection. The viewer allows each pushke to be rotated 360 degrees and zoomed in or out, allowing for more detailed viewing than traditional photographs typically allow.

The result of this effort, the Frankel Center’s online exhibit, “Charity Saves from Death: The Jewish Tradition of Tsedakah as Exemplified in Pushkes” can be viewed here: https://exhibits.judaic.lsa.umich.edu/pushke

S.C.I Hard Available in App Store

Those with spinal cord injuries (SCI) encounter a drastically different world when they are released from the hospital. With varying degrees of disability, mobility, and function, the world around them becomes a collection of physical and mental challenges – a complete departure from their previous lifestyles. Whether they are on crutches or in manual or powered wheelchairs, they need to learn mobility, scheduling, and social tasks once again.

Players in S.C.I Hard must navigate a chaotic club scene to wrangle escaped tarsier monkeys

S.C.I Hard is a mobile game developed by the Duderstadt Center and designed by Dr. Michelle Meade for the Center for Technology & Independence (TIKTOC RERC) with funding from a NIDRR Field Initiated Development Grant.

Its purpose is to assist persons with spinal cord injury in developing and applying the necessary skills to keep their bodies healthy while managing the many aspects of SCI care, serving as a fun and engaging manual for individuals with spinal cord injuries learning independence. Tasks such as scheduling, mobility, and social interaction are all integrated subtly into the game. Players engage in goofy quests, from befriending roid-raging girl scouts in the park to collecting tarsier monkeys running rampant at a night club. The goal of S.C.I Hard was to be different from most medically oriented games, so players don’t feel like they’re being lectured or bombarded with boring medical jargon, and instead learn the important concepts of their condition in a more light-hearted and engaging way.

Players shop for a handicap accessible vehicle to take their road test as they learn independence

With more than 30 different scenarios and mini-games, a full cast of odd characters to talk with, and dozens of collectible items and weapons, only you can save the town from impending doom. S.C.I Hard puts you, the player, in the chair of someone with a spinal cord injury, introducing you to new challenges and obstacles all while trying to save the world from legions of mutated animals. Join the fight and kick a** while sitting down!

S.C.I Hard is now available for free on Apple and Android devices through their respective app stores, but requires participation in the subsequent study or feedback group to play:

Apple Devices: https://itunes.apple.com/us/app/sci-hard/id1050205395?mt=8

Android Devices: https://play.google.com/store/apps/details?id=edu.umich.mobile.SciHard&hl=en

To learn more about the subsequent study or to participate in the study involving S.C.I Hard, visit:
http://cthi.medicine.umich.edu/projects/tiktoc-rerc/projects/r2