Customer Discovery Using 360 Video

Year after year, students in Professor Dawn White’s Entrepreneurship 411 course are tasked with conducting a “customer discovery” – a process in which students interested in creating a business interview professionals in a given field to assess their needs, how the products the students develop could address those needs, and how they might alleviate some of the difficulties these professionals encounter on a daily basis.

Often when given this assignment, students would defer to their peers for feedback instead of reaching out to strangers working in their fields of interest. Because this demographic is so similar to the students themselves, the results were fairly biased and didn’t truly get to the root issue of why someone might want or need a specific product. Looking for an alternative approach, Dawn teamed up with her longtime friend, Professor Alison Bailey, who teaches DEI at the University, and Aileen Huang-Saad from Biomedical Engineering, and approached the Duderstadt Center with their idea: what if students could interact with a simulated and more diverse professional to conduct their customer discovery?

After exploring the many routes development could take, including options like motion capture-driven CGI avatars, 360 video was chosen as the platform for the simulation. 360 video viewed within an Oculus Rift VR headset ultimately gave the highest sense of realism and immersion when conducting an interview, which was important for making the interview process feel authentic.

Up until this point, 360 videos were largely passive experiences: they did not allow users to tailor the experience based on their choices or interact with the scene in any way. This customer discovery project required the 360 videos to be responsive – when a student asked a recognized customer discovery question, the appropriate video response needed to be triggered. Achieving this required both programming logic to trigger the different videos and integrated voice recognition software, so students could ask a question out loud and have their speech recognized within the application.
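Conceptually, the interaction logic boils down to mapping each recognized question to its filmed response, with a generic reply for unrecognized questions and a looping idle clip between them. The sketch below illustrates this idea in Python; the clip names, question labels, and player interface are hypothetical, since Jugular's internal API is not documented here.

```python
# Hypothetical sketch of the response-triggering logic described above.
# Clip identifiers, question labels, and the player interface are illustrative.

IDLE_CLIP = "idle_loop"          # actor patiently waiting, looped between questions
FALLBACK_CLIP = "generic_reply"  # filmed response used for unrecognized questions

# Map recognized customer-discovery questions to 360-video response clips.
RESPONSE_CLIPS = {
    "describe_typical_day": "fritz_typical_day",
    "biggest_pain_point": "fritz_pain_point",
    "current_tools": "fritz_current_tools",
}

def clip_for_question(question_label: str) -> str:
    """Return the clip to play for a recognized question, or the generic reply."""
    return RESPONSE_CLIPS.get(question_label, FALLBACK_CLIP)

def handle_question(player, question_label: str) -> None:
    """Play the matching response, then return to the looping idle footage."""
    player.play(clip_for_question(question_label))
    player.loop(IDLE_CLIP)
```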

Dawn and Alison sourced three professionals to serve as their simulated actors for this project:

Fritz – a young black man with a career as an IT professional

Cristina – a middle-aged woman with a noticeable accent, working in education

Charles – a white adult man employed as a barista

These actors were chosen for their authenticity and diversity, having qualities that might lead interviewers to make certain assumptions or expose biases in their interactions with them. With the help of talented students at the Visualization Studio, these professionals were filmed responding to various customer discovery questions using a Ricoh Theta 360 camera and a spatial microphone (this allows for spatial audio in VR, so the sound feels like it is coming from the direction where the actor is sitting). So that footage of one response could be blended with the next, the actors had to remember to return their hands and face to the same pose between responses, allowing the footage to be aligned. They were also filmed giving generic responses to any unplanned questions that might be asked, as well as twiddling their thumbs and patiently waiting – footage that could be looped to fill any idle time between questions.

Once the footage was acquired, the frame ranges for each response were noted and passed off to programmers to implement in the Duderstadt Center’s in-house VR rendering software, Jugular. As an initial prototype of the concept, the application was intended to run as a proctored simulation: students would wear an Oculus Rift and ask their questions out loud, with a proctor listening in and triggering the appropriate actor response using keyboard controls. For a more natural feel, Dawn was interested in exploring voice recognition to make the process more automated.
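As an illustration of how the proctored prototype could work, the sketch below maps keyboard keys to the noted frame ranges of the 360 capture. The frame numbers, key bindings, and player interface are invented for the example and are not Jugular's actual controls.

```python
# Illustrative proctor-mode sketch: each key triggers playback of a noted
# frame range within the full 360 capture. Values are hypothetical.

FPS = 30  # assumed capture frame rate

# (start_frame, end_frame) for each filmed response, as logged during editing.
FRAME_RANGES = {
    "1": (120, 540),    # response to "walk me through your typical day"
    "2": (600, 1050),   # response to "what frustrates you most at work?"
    "0": (0, 90),       # idle/waiting footage, looped between questions
}

def on_key_press(player, key: str) -> None:
    """Seek to the start of the selected response and play it through."""
    if key in FRAME_RANGES:
        start, end = FRAME_RANGES[key]
        player.seek(start / FPS)       # jump to the response's start time (seconds)
        player.play_until(end / FPS)   # play to the end, then return to idle
```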

Within Jugular, students view an interactive 360 video in which they are seated across from one of three professionals available for interviewing. Using the embedded microphone in the Oculus Rift, they are able to ask questions that are recognized using Dialogflow, which in turn triggers the appropriate video response, allowing students to conduct mock interviews.

With Dawn employing some computer science students to tackle the voice recognition element over the summer, this feature was integrated into Jugular using a Dialogflow agent driven by Python scripts. Students could now be immersed in an Oculus Rift, speaking to an actor captured on 360 video, and have their questions interpreted as they asked them out loud using the Rift’s embedded microphone.
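For readers curious how a Dialogflow integration of this kind might look, the sketch below uses the google-cloud-dialogflow Python client to match a transcribed question against an agent's intents. The project ID, session handling, and the way transcribed speech reaches the script are assumptions for illustration, not the project's actual code.

```python
# Minimal sketch: send a transcribed question to a Dialogflow agent and
# return the matched intent name, which can then select the 360-video response.
from google.cloud import dialogflow

PROJECT_ID = "customer-discovery-demo"   # hypothetical Dialogflow project ID

def detect_intent_text(session_id: str, text: str, language_code: str = "en-US") -> str:
    """Query the Dialogflow agent with transcribed speech and return the intent name."""
    client = dialogflow.SessionsClient()
    session = client.session_path(PROJECT_ID, session_id)

    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code=language_code)
    )
    response = client.detect_intent(
        request={"session": session, "query_input": query_input}
    )
    return response.query_result.intent.display_name

# Example: a question captured from the Rift's microphone (already transcribed)
# is matched to an intent, which the application maps to a video clip.
intent = detect_intent_text("student-001", "What does a typical day look like for you?")
print(intent)  # e.g. "describe_typical_day"
```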

Upon its completion, the Customer Discovery application was piloted in the Visualization Studio with Dawn’s students during the Winter 2019 semester.