Behind the Scenes: Re-creating Citizen Kane in VR



Stephanie O’Malley


Students in Matthew Solomon's classes are used to critically analyzing film. Now they get the chance to be the director of arguably one of the most influential films ever made: Citizen Kane.

Using an application developed at the Duderstadt Center with grant funding provided by LSA Technology Services, students take on the role of the film's director and record a prominent scene from the movie with a virtual camera. The film set, which no longer exists, has been meticulously re-created in black-and-white CGI from reference photographs of the original set, with a CGI Orson Welles acting out the scene on repeat. His actions were performed by motion capture actor Matthew Henerson, carefully chosen for his likeness to Orson Welles, and the Orson avatar was generated from a photogrammetry scan of Henerson.

Top down view of the CGI re-creation of the film set for Citizen Kane

To arrive at a best estimate of the set's scale for 3D modeling, the original film footage was analyzed: doorways were measured, actor heights compared, and footsteps counted. With feedback from Citizen Kane expert Harlan Lebo, fine details could be pinned down, right down to the topics of the books on the bookshelves.
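The article doesn't spell out the arithmetic, but the basic idea of scaling from a known reference dimension can be sketched in a few lines of Python. The reference height and the measured values below are made-up placeholders, not figures from the production:

```python
# Hypothetical illustration of a scale estimate: use a dimension with a known
# real-world value (here, an assumed actor height) to convert measurements
# taken in arbitrary "screen" units into feet.

KNOWN_ACTOR_HEIGHT_FT = 6.0  # assumed reference value, not from the article

def scale_factor(reference_screen_units: float, reference_real_ft: float) -> float:
    """Feet per screen unit, derived from one known reference dimension."""
    return reference_real_ft / reference_screen_units

def estimate_real_size(measured_screen_units: float, factor: float) -> float:
    """Convert any other on-screen measurement into an estimated real size."""
    return measured_screen_units * factor

factor = scale_factor(reference_screen_units=120.0, reference_real_ft=KNOWN_ACTOR_HEIGHT_FT)
doorway_ft = estimate_real_size(measured_screen_units=160.0, factor=factor)
print(f"Estimated doorway height: {doorway_ft:.1f} ft")
```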

Archival photograph provided by Vincent Longo of the original film set

Motion capture actor Matthew Henerson was flown in to play the role of the digital Orson Welles. In a carefully choreographed session directed by Matthew Solomon's PhD student, Vincent Longo, the iconic scene from Citizen Kane was re-enacted while the original footage played on an 80″ TV in the background, ensuring every step aligned with the original performance.

Actor Matthew Henerson in full mocap attire amidst the makeshift set for Citizen Kane – Props constructed using PVC. Photo provided by Shawn Jackson.

The boundaries of the set were taped on the floor so the captured data could be aligned to the digitally re-created set. Eight Vicon motion capture cameras, the same kind used throughout Hollywood for films like Lord of the Rings and Planet of the Apes, formed a circle around the makeshift set. These cameras track the actor's motion from infrared light reflected off small markers affixed to the motion capture suit. Any props used during the recording were constructed out of cardboard and PVC (and later 3D modeled) so as not to obstruct his movements. Re-creating the roughly three minutes of footage took three days, comprising over 100 individual mocap takes and several hours of material, which were then compared for accuracy and stitched together to complete the full route Orson travels through the environment.
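The alignment step isn't detailed in the article, but a standard way to register one coordinate system to another is a rigid Kabsch (Procrustes) fit. Below is a minimal NumPy sketch, assuming corresponding reference points, such as the taped floor corners, are known in both the mocap and CGI coordinate systems:

```python
import numpy as np

def kabsch_align(source: np.ndarray, target: np.ndarray):
    """
    Rigid alignment (rotation + translation) of one set of 3D points onto another.
    source, target: (N, 3) arrays of corresponding points, e.g. the taped set
    corners as seen by the mocap system vs. the same corners in the CGI set.
    """
    src_centroid = source.mean(axis=0)
    tgt_centroid = target.mean(axis=0)
    src = source - src_centroid
    tgt = target - tgt_centroid

    # Optimal rotation via SVD of the covariance matrix (Kabsch algorithm)
    H = src.T @ tgt
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    D = np.diag([1.0, 1.0, d])
    R = Vt.T @ D @ U.T
    t = tgt_centroid - R @ src_centroid
    return R, t

# Applying the transform to every marker of a captured frame:
#   aligned = frames @ R.T + t     where frames is (num_markers, 3)
```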

Matthew Henerson
Orson Welles

Matthew Henerson then swapped his motion capture suit for an actual suit, similar to the one Orson wears in the film, and underwent 3D scanning using the Duderstadt Center's photogrammetry resources.

Actor Matthew Henerson wears asymmetrical markers to assist the scanning process

Photogrammetry is a method of scanning existing objects or people, commonly used in Hollywood and throughout the video game industry to create CGI likenesses of famous actors. The technology has been used in films like Star Wars (an actress similar in appearance to Carrie Fisher was scanned and then further sculpted to create a more youthful Princess Leia), and entire studios are now devoted to photogrammetry scanning. The process relies on several digital cameras surrounding the subject and taking simultaneous photographs.

Matthew Henerson being processed for Photogrammetry

The photos are submitted to software that analyzes them on a per-pixel basis, looking for similar features across multiple photos. When a feature is recognized, it is triangulated using the camera's focal length and its position relative to other identified features, allowing millions of tracking points to be generated. From these an accurate 3D model can be produced, with the original digital photos mapped onto its surface to preserve photo-realistic color. These models can be further manipulated: sometimes they are sculpted by an artist, or, with the addition of a digital "skeleton," they can be driven by motion data to become a fully articulated digital character.
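As a rough illustration of the triangulation step described above, the sketch below uses OpenCV to recover one feature's 3D position from two calibrated views. The camera parameters and pixel coordinates are illustrative assumptions; production photogrammetry packages do this for millions of features across many photos and refine the result with bundle adjustment:

```python
import numpy as np
import cv2

# Two-view triangulation of a single matched feature (illustrative values only).

# 3x4 projection matrices P = K [R | t] for two calibrated cameras
K = np.array([[1200.0, 0.0, 960.0],
              [0.0, 1200.0, 540.0],
              [0.0, 0.0, 1.0]])
P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])        # reference camera
R2 = cv2.Rodrigues(np.array([0.0, 0.2, 0.0]))[0]         # second camera, rotated
t2 = np.array([[-0.5], [0.0], [0.0]])                    # and translated sideways
P2 = K @ np.hstack([R2, t2])

# Pixel coordinates of the same feature as seen in each photo (2 x N arrays)
pts1 = np.array([[1010.0], [602.0]])
pts2 = np.array([[955.0], [598.0]])

# Triangulate and convert from homogeneous to 3D coordinates
point_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
point_3d = (point_h[:3] / point_h[3]).ravel()
print("Estimated 3D position:", point_3d)
```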

The 3D-modeled scene and the scanned actor model were joined with the mocap data and brought into the Unity game engine to develop the functionality students would need to film within the 3D set. A virtual camera was developed with the same settings you would find on a film camera of that era. Viewed in a virtual reality headset like the Oculus Rift, Matthew Solomon's students can pick up the camera and physically move around to position it anywhere in the CGI environment, often capturing shots that would be difficult to achieve on a conventional film set. The footage students film within the app can be exported as MP4 video and then edited in their editing software of choice, just like any other camera footage.
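The in-engine export pipeline isn't described in the article; as a loose illustration of that final step, here is how a folder of rendered frames could be encoded into an MP4 clip outside the engine. This assumes an FFmpeg-enabled OpenCV build, and the folder name and frame rate are placeholders:

```python
import glob
import cv2

# Illustrative only: encode a folder of rendered PNG frames into an MP4 clip.
frames = sorted(glob.glob("captured_frames/*.png"))   # hypothetical output folder
first = cv2.imread(frames[0])
height, width = first.shape[:2]

writer = cv2.VideoWriter(
    "student_take.mp4",
    cv2.VideoWriter_fourcc(*"mp4v"),   # MPEG-4 codec; needs FFmpeg support
    24.0,                              # classic-era frame rate of 24 fps
    (width, height),
)
for path in frames:
    writer.write(cv2.imread(path))
writer.release()
```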

Matthew Solomon used the application in his course in the Winter of 2020, and his project with the Duderstadt Center was recently on display as part of iLRN's 2020 Immersive Learning Project Showcase & Competition. With Covid-19 making the conference a remote experience, attendees experienced the Citizen Kane project in virtual reality using the FrameVR platform. Highlighting innovative ways of teaching with VR technologies, the showcase let attendees from around the world learn about the project and watch student edits made using the application.

Citizen Kane on display for iLRN’s 2020 Immersive Learning Project Showcase & Competition using Frame VR

Breaking Ground at Taubman College


The Taubman College of Architecture at the University of Michigan is adding another wing to the college.  Located on North Campus, the College of Architecture currently shares its 240,000-square-foot building with the Penny Stamps School of Art and Design.  The architecture studio itself, located on the third floor of the building, occupies around 30,000 square feet, making it the largest academic studio space in the world.

Following a recent gift of $12.5 million from Alfred Taubman, the college plans to build a new addition, to be called the A. Alfred Taubman Wing.  The completed wing will be 36,000 square feet and will house new studios, new faculty offices, and new classrooms.

On April 25, 2015, University of Michigan President Mark Schlissel and Taubman College Dean Monica Ponce de Leon, with donor A. Alfred Taubman present, broke ground at the site of the new wing.  The ceremonial shoveling was performed by Taubman College's Kuka robot, a robot designed for architectural fabrication research that was modified to assist in the ceremony.

The Duderstadt Center helped program the robot for the ceremony by recording a person shoveling in the lab.  The motion was captured with motion capture cameras, and a program was developed so the robot could mimic the motion.
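The actual robot programming isn't documented in the article; the sketch below only illustrates the general idea of replaying a captured motion, resampling recorded joint angles onto a control rate and streaming them through a placeholder `send_joint_angles` function standing in for the real robot interface:

```python
import numpy as np

# Hypothetical sketch: turn a recorded shoveling motion (a time series of joint
# angles derived from the mocap data) into a slower, evenly sampled trajectory
# that a robot arm controller could replay.

def resample_trajectory(timestamps, joint_angles, rate_hz=10.0, slowdown=2.0):
    """Resample captured joint angles onto the robot's control rate."""
    timestamps = np.asarray(timestamps) * slowdown       # play back at half speed
    joint_angles = np.asarray(joint_angles)               # shape (frames, joints)
    new_times = np.arange(timestamps[0], timestamps[-1], 1.0 / rate_hz)
    resampled = np.column_stack([
        np.interp(new_times, timestamps, joint_angles[:, j])
        for j in range(joint_angles.shape[1])
    ])
    return new_times, resampled

def replay(resampled, send_joint_angles):
    """Stream each waypoint to the (hypothetical) robot controller."""
    for angles in resampled:
        send_joint_angles(angles)   # placeholder for the real vendor API call
```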

Integrated Design Solutions, a firm based in Troy, Michigan, along with architect Preston Scott Cohen, is in charge of the design for the new wing.  The building is scheduled to be completed in 2017.

An article about the ceremony was released on Taubman College's website.

User Story: Rachael Miller and Carlos Garcia


Rachael Miller and Carlos Garcia discuss how their individual experiences with the Digital Media Commons (DMC) shaped their projects and ambitions. Rachael, an undergraduate in computer science, was able to expand her horizons by working in the Duderstadt Center on projects which dealt with virtual reality. She gained vital knowledge about motion capture by working in the MIDEN with the Kinect, and continues to apply her new skills to projects and internships today.

Carlos Garcia worked to combine technology and art in the form of projection mapping for his senior thesis, Out of the Box. To approach the project, he began by searching for resources and found the DMC to be the perfect fit. By establishing connections with staff in the 3D Lab, Groundworks, the Video Studio, and elsewhere, he was able to complete his project and go on to teach others the process as well. For a behind-the-scenes look at both Carlos Garcia's and Rachael Miller's projects and process, please watch the video above!


Using Motion Capture To Test Robot Movement


Student analyzing movement of his group’s robot

At the end of every year, seniors in the College of Engineering work hard to finish their capstone design projects. These projects are guided by a professor but built entirely by students. Keteki Saoji, a mechanical engineering student focusing on manufacturing, took inspiration from Professor Revzen, who studies legged locomotion in both insects and robots. Earlier in the year, Professor Revzen published the results of experiments in which cockroaches were tripped, indicating that insects can use their body mechanics and momentum to stabilize their motion rather than relying on their nervous system to interpret the environment and send electrical messages to the muscles. The study predicts that robots which similarly lack feedback can be designed to be remarkably stable while running.

Saoji and her three teammates took on the challenge of creating a robot that would maintain such stability on very rough terrain. They worked with a hexapedal robot designed at the University of Pennsylvania that was shown to follow the same mechanically stabilizing dynamics as cockroaches. The team had to design new legs with sensors that let the robot detect when its feet hit the ground. The changes in motion introduced by sensing were so subtle that special equipment was needed to see them. Using the Duderstadt Center's eight-camera motion capture system, the team was able to track the intricacies of how the robot moved both with and without sensory information. They used the data collected from the motion capture sessions to track how the robot moved with their mechanical and programming revisions, establishing that ground-contact sensing allows robot motions to adapt more effectively to rougher ground.
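The team's actual analysis isn't published here, but one simple way to quantify such an effect from mocap data would be to compare how much the robot's body height varies between runs with and without ground-contact sensing, as in this assumed NumPy sketch:

```python
import numpy as np

# Illustrative comparison (not the team's documented method): given mocap
# trajectories of a marker on the robot's body for two runs, compare how much
# the body height varies while crossing rough terrain.

def height_variability(body_positions: np.ndarray) -> float:
    """Standard deviation of the body marker's vertical (z) coordinate, in mm."""
    return float(np.std(body_positions[:, 2]))

def compare_runs(run_with_sensing: np.ndarray, run_without_sensing: np.ndarray):
    print(f"height std dev with sensing:    {height_variability(run_with_sensing):.1f} mm")
    print(f"height std dev without sensing: {height_variability(run_without_sensing):.1f} mm")
```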

A student’s robot covered in sensors.

Motion Capture and Kinects Analyze Movement in Tandem


As part of their research under the Dynamic Project Management (DPM) group, PhD candidates Joon Oh Seo and SangUk Han, with UROP student Drew Nikolai, used the motion capture system to study the ergonomics and biomechanics of climbing a ladder. The team, advised by Professor SangHyun Lee, is analyzing the movements of construction workers to identify behaviors that may lead to injury or undue stress on the body. With MoCap the team can collect data on joint movement, and with the Kinect they can collect depth information. By comparing the two data sets of Nikolai climbing and descending the ladder, Seo and Han can evaluate the Kinect's accuracy and potentially use Kinects to collect information at actual construction sites.
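The researchers' exact comparison method isn't given; one plausible approach, sketched below with NumPy, is to resample both sensors' joint trajectories onto a common time base and compute a per-axis root-mean-square error:

```python
import numpy as np

# Assumed analysis sketch: compare one joint's trajectory as reported by the
# mocap system and by the Kinect, after both are on the same time base.

def resample_to(times_src, values_src, times_ref):
    """Linearly interpolate a (frames, 3) trajectory onto reference timestamps."""
    values_src = np.asarray(values_src)
    return np.column_stack([
        np.interp(times_ref, times_src, values_src[:, axis]) for axis in range(3)
    ])

def rmse_per_axis(mocap_xyz: np.ndarray, kinect_xyz: np.ndarray) -> np.ndarray:
    """Root-mean-square error per axis between two (frames, 3) trajectories."""
    diff = mocap_xyz - kinect_xyz
    return np.sqrt(np.mean(diff ** 2, axis=0))
```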

Autonomous Control of Helicopters


Under the guidance of Prof. Girard of Aerospace Engineering, and with the help of the University of Michigan Duderstadt Center, two students developed novel algorithms to give autonomous motion control to miniature helicopters. Using the Duderstadt Center's motion capture system, they were able to determine each helicopter's precise location in 3D space. With this information, their algorithm determined the proper throttle and heading for the helicopter to reach its goal. One classic example of their work involved two helicopters that were told to "defend" one of the creators (Zahid). Zahid wore a helmet with markers on it so the computer, and the helicopters, knew where he was. Then, as he walked around the room, the two helicopters followed beside him, protecting him from any potential aggressors.
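The students' actual control algorithm isn't described beyond throttle and heading; the following proportional-control sketch only illustrates the flavor of such a follow behavior, with all gains, offsets, and the command interface being assumptions:

```python
import math

# Simplified follow-behavior sketch: the mocap system supplies positions, and
# the controller steers the helicopter toward a point offset to the tracked
# person's side while holding a target altitude.

K_HEADING = 1.0      # yaw command per radian of heading error (assumed gain)
K_THROTTLE = 0.5     # throttle command per metre of altitude error (assumed gain)
TARGET_ALTITUDE = 1.5
SIDE_OFFSET = 1.0    # hover one metre to the person's right

def follow_command(heli_pos, heli_yaw, person_pos, person_yaw):
    """Return (yaw_rate_cmd, throttle_cmd, forward_cmd) toward the offset goal."""
    # Goal point: beside the tracked person, at the person's right-hand side
    goal_x = person_pos[0] + SIDE_OFFSET * math.cos(person_yaw - math.pi / 2)
    goal_y = person_pos[1] + SIDE_OFFSET * math.sin(person_yaw - math.pi / 2)

    # Heading error toward the goal, wrapped to [-pi, pi]
    desired_heading = math.atan2(goal_y - heli_pos[1], goal_x - heli_pos[0])
    heading_error = math.atan2(math.sin(desired_heading - heli_yaw),
                               math.cos(desired_heading - heli_yaw))

    yaw_rate_cmd = K_HEADING * heading_error
    throttle_cmd = K_THROTTLE * (TARGET_ALTITUDE - heli_pos[2])
    forward_cmd = math.hypot(goal_x - heli_pos[0], goal_y - heli_pos[1])
    return yaw_rate_cmd, throttle_cmd, forward_cmd
```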

The Museum of Life and Death


Andy Kirshner, a resident faculty member in the School of Music and Theater, used the University of Michigan Duderstadt Center's motion capture service to record several movements for his production The Museum of Life and Death, which is described as:

“Set in the post-human 26th-century, The Museum of Life and Death is a radical reconsideration of the medieval Play of Everyman. Framed as a kind of post-human Masterpiece Theatre, and hosted by a chipper cyborg named Virgil, The Museum mixes 3D animation, projected video, live action, buddhist sutras, and original music to consider essential questions of Life, Death — and extinction — in our own time.”

Performance Homepage

Remote Dance Performances


Shortly after the acquisition of the University of Michigan's first motion capture system, faculty and students began exploring its use for the performing arts. One such project involved two dancers who coordinated their performances remotely. With one dancer performing in the MIDEN and the other in the Video Studio, they effectively created a complete performance. The MIDEN performer wore our motion capture suit and had her point cloud (a visualization of just her joints) streamed to the Video Studio, where the other dancer was performing.
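The original streaming implementation isn't described; as a minimal sketch of the idea, joint positions for each frame could be serialized and sent over the network from the MIDEN machine to the Video Studio. The address, port, and joint names below are placeholders:

```python
import json
import socket

# Assumed implementation sketch: send each frame of joint positions from the
# MIDEN machine to the Video Studio machine over UDP as a small JSON payload.

VIDEO_STUDIO_ADDR = ("127.0.0.1", 9000)   # replace with the Video Studio host/port

def stream_frame(sock: socket.socket, joints: dict) -> None:
    """joints maps joint names to (x, y, z) positions for one mocap frame."""
    payload = json.dumps(joints).encode("utf-8")
    sock.sendto(payload, VIDEO_STUDIO_ADDR)

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
stream_frame(sock, {"head": (0.0, 1.7, 0.0), "left_hand": (-0.4, 1.2, 0.3)})
```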

Another project related to remote performances involved a faculty member from the School of Music who specialized in jazz composition. He conducted a collection of performers remotely using methods similar to those of the dancers above. One unique challenge was the expressiveness and articulation of the composer's hands and face. To address this, we placed additional markers on his face and hands so the remote musicians could identify his facial expressions and hand poses.