Behind the Scenes: Re-creating Citizen Kane in VR


Stephanie O’Malley

Students in Matthew Solomon’s classes are used to critically analyzing film. Now they get the chance to direct arguably one of the most influential films ever produced: Citizen Kane.

Using an application developed at the Duderstadt Center with grant funding provided by LSA Technology Services, students take on the role of the film’s director and record a prominent scene from the movie using a virtual camera. The film set, which no longer exists, has been meticulously re-created in black-and-white CGI using reference photographs of the original set, with a CGI Orson Welles acting out the scene on repeat. His performance was provided by motion capture actor Matthew Henerson, carefully chosen for his likeness to Welles, and the Orson avatar itself was generated from a photogrammetry scan of Henerson.

Top down view of the CGI re-creation of the film set for Citizen Kane

To determine a best estimate of the set’s scale for 3D modeling, the original film footage was analyzed: doorways were measured, actor heights compared, and footsteps counted. With feedback from Citizen Kane expert Harlan Lebo, fine details down to the topics of the books on the bookshelves could be determined.

Archival photograph provided by Vincent Longo of the original film set

Motion capture actor Matthew Henerson was flown in to play the role of the digital Orson Welles. In a carefully choreographed session directed by Solomon’s PhD student, Vincent Longo, the iconic scene from Citizen Kane was re-enacted while the original footage played on an 80″ TV in the background, ensuring every step aligned perfectly with the original.

Actor Matthew Henerson in full mocap attire amidst the makeshift set for Citizen Kane – Props constructed using PVC. Photo provided by Shawn Jackson.

The boundaries of the set were taped on the floor so the data could be aligned to the digitally re-created set. Eight Vicon motion capture cameras, the same kind used throughout Hollywood for films like Lord of the Rings and Planet of the Apes, formed a circle around the makeshift set. These cameras track the actor’s motion using infrared light reflected off small markers affixed to the motion capture suit. Any props used during the recording were carefully constructed out of cardboard and PVC (to be 3D modeled later) so as not to obstruct his movements. Re-creating the three minutes of footage took three days and comprised over 100 individual mocap takes and several hours of footage, which were then compared for accuracy and stitched together to complete the full route Orson travels through the environment.
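The stitching step can be illustrated with a small sketch. This is a hypothetical simplification, not the pipeline actually used: it finds where one take’s opening frames best overlap a previous take by minimizing the average marker distance.

```python
import numpy as np

def best_stitch_offset(take_a, take_b, overlap):
    """Find the frame in take_a where take_b's first `overlap` frames
    line up best, by minimizing mean marker distance over the overlap.
    take_a: (frames_a, markers, 3); take_b: (frames_b, markers, 3)."""
    head = take_b[:overlap]
    best, best_err = 0, np.inf
    for start in range(len(take_a) - overlap + 1):
        window = take_a[start:start + overlap]
        err = np.linalg.norm(window - head, axis=-1).mean()
        if err < best_err:
            best, best_err = start, err
    return best

# Toy data: take_b begins by repeating the last 3 frames of a 10-frame take_a.
rng = np.random.default_rng(0)
take_a = rng.normal(size=(10, 5, 3))
take_b = np.concatenate([take_a[7:], rng.normal(size=(4, 5, 3))])
offset = best_stitch_offset(take_a, take_b, overlap=3)  # aligns at frame 7
```

In practice a real pipeline would also solve for a rigid transform between takes, but the time-alignment idea is the same.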

Matthew Henerson
Orson Welles

Matthew Henerson then swapped his motion capture suit for an actual suit, similar to the one worn by Orson in the film, and underwent 3D scanning using the Duderstadt Center’s photogrammetry resources.

Actor Matthew Henerson wears asymmetrical markers to assist the scanning process

Photogrammetry is a method of scanning existing objects or people, commonly used in Hollywood and throughout the video game industry to create CGI likenesses of famous actors. The technology has been used in films like Star Wars (an actress similar in appearance to Carrie Fisher was scanned and then further sculpted to create a more youthful Princess Leia), and entire studios are now devoted to photogrammetry scanning. The process relies on several digital cameras surrounding the subject and taking simultaneous photographs.

Matthew Henerson being processed for Photogrammetry

The photos are fed into software that analyzes them on a per-pixel basis, looking for similar features across multiple photos. When a feature is recognized, it is triangulated using the focal length of the camera and its position relative to other identified features, allowing millions of tracking points to be generated. From these an accurate 3D model can be produced, with the original digital photos mapped to its surface to preserve photo-realistic color. These models can be further manipulated: sometimes they are sculpted by an artist, or, with the addition of a digital “skeleton,” they can be driven by motion data to become a fully articulated digital character.
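The triangulation at the heart of this step can be sketched with the standard direct linear transform (DLT). This is a minimal illustration of the geometry, not the actual implementation used by photogrammetry software; the toy camera matrices below are invented for the example.

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Recover a 3D point from its 2D projections in two calibrated cameras.
    P1, P2: 3x4 camera projection matrices; x1, x2: (u, v) image coordinates."""
    # Each observation contributes two linear constraints on the 3D point.
    A = np.array([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    # The homogeneous 3D point is the null vector of A (last row of V^T).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def project(P, X):
    """Project a 3D point through a 3x4 camera matrix to image coordinates."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two toy cameras: one at the origin, one shifted one unit along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
```

Real pipelines repeat this for millions of matched features across many photos, then refine all points and cameras jointly.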

The 3D-modeled scene and scanned actor model were joined with the mocap data and brought into the Unity game engine to develop the functionality students would need to film within the 3D set. A virtual camera was developed with all of the same settings you would find on a film camera from that era. When viewed in a virtual reality headset like the Oculus Rift, students can pick up the camera and physically move around to position it at different locations in the CGI environment, often capturing shots that would be difficult to achieve on a conventional film set. The footage students film within the app can be exported as MP4 video and then edited in their editing software of choice, just like any other camera footage.

Having utilized the application for his course in the Winter of 2020, Matthew Solomon’s project with the Duderstadt Center was recently on display as part of iLRN’s 2020 Immersive Learning Project Showcase & Competition. With COVID-19 making the conference a remote experience, the Citizen Kane project could be experienced in virtual reality by conference attendees using the FrameVR platform. The showcase highlighted innovative ways of teaching with VR technologies, and attendees from around the world were able to learn about the project and watch student edits made using the application.

Citizen Kane on display for iLRN’s 2020 Immersive Learning Project Showcase & Competition using Frame VR

Stamps Student Uses Photogrammetry to Miniaturize Herself

Stamps student Annie Turpin came to the Duderstadt Center with an idea for her sophomore studio project: she wanted to create a hologram system, similar to a “Pepper’s Pyramid” or “Pepper’s Ghost” display, that would allow her to project a miniaturized version of herself into a pinhole camera.

Pepper’s Ghost relied on carefully placed mirrors to give the illusion of a transparent figure

The concept of Pepper’s Pyramid is derived from an illusion technique created by John Henry Pepper in 1862. Originally coined “Pepper’s Ghost,” the trick relied on a large pane of glass to reflect an illuminated room or person hidden from view. This gave the impression of a “ghost” and became a technique frequently used in theatre to create phantasmagoria. Similar methods are still used today, often substituting Mylar foil for glass and using CG content (as in the 2012 Coachella performance, in which a “holographic” Tupac was resurrected to sing alongside Dr. Dre).

Pepper’s Pyramid takes the concept of Pepper’s Ghost, and gives it 3 dimensions using a pyramid of Plexiglas instead of mirrors.

“Pepper’s Pyramid” is a similar concept. Instead of a single pane of glass reflecting a single angle, a video is duplicated four times and projected downward onto a pyramid of Plexiglas, allowing the illusion to be viewed from multiple angles and the content to be animated.
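The four-way duplication can be sketched as image manipulation. This is a hypothetical illustration (the layout function and frame data are invented): each copy of the frame is rotated to face its side of the pyramid and placed around the center of a square canvas.

```python
import numpy as np

def pyramid_layout(frame):
    """Tile one video frame four times around the center of a square canvas,
    one copy per face of the Plexiglas pyramid.
    `frame` is an (h, w) or (h, w, 3) array with h == w for simplicity."""
    h, w = frame.shape[:2]
    assert h == w, "this sketch assumes a square frame"
    s = 3 * h  # canvas is a 3x3 grid of frame-sized cells; center stays empty
    canvas = np.zeros((s, s) + frame.shape[2:], dtype=frame.dtype)
    canvas[0:h, h:2*h] = frame                    # top face, upright
    canvas[2*h:3*h, h:2*h] = np.rot90(frame, 2)   # bottom face, rotated 180°
    canvas[h:2*h, 0:h] = np.rot90(frame, 1)       # left face, rotated 90° CCW
    canvas[h:2*h, 2*h:3*h] = np.rot90(frame, 3)   # right face, rotated 90° CW
    return canvas

# Toy 4x4 grayscale "frame" standing in for a video frame.
frame = np.arange(16, dtype=np.uint8).reshape(4, 4)
layout = pyramid_layout(frame)
```

Applied per frame of a video, the resulting canvas is what plays on the phone screen above the pyramid.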

For her project, Annie re-created a small version of Pepper’s Pyramid to fit inside a pinhole camera she had constructed, using a mobile phone rather than a monitor to play the video. She then had herself 3D scanned using the Duderstadt Center’s photogrammetry rig to generate a realistic 3D model of herself, which was animated and exported as an MP4 video.

Annie’s pinhole camera

The process of photogrammetry allows an existing object or person to be converted into a full-color, highly detailed 3D model. This is done using a series of digital photographs captured 360 degrees around the subject. While photogrammetry can be done at home for most static subjects, the Duderstadt Center’s photogrammetry resources are set up to allow moving subjects like people to be scanned as well. The process uses surface detail on the subject to plot points in 3D space and construct a 3D model. For scans of people, these models can even have a digital skeleton created to drive their motion, and be animated as CGI characters. Annie’s resulting scan was animated to rotate in place, and projected onto the Plexiglas pyramid as a “hologram” for viewing through her pinhole camera.
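The idea of a digital skeleton driving a scanned mesh can be sketched with linear blend skinning, the standard technique for this: each vertex moves by a weighted blend of its bones’ transforms. The bones, weights, and vertices below are toy values for illustration, not data from any actual scan.

```python
import numpy as np

def skin_vertices(vertices, weights, bone_transforms):
    """Linear blend skinning.
    vertices: (n, 3) rest-pose positions; weights: (n, b) per-bone influence
    (rows sum to 1); bone_transforms: (b, 4, 4) current bone matrices."""
    n = len(vertices)
    homo = np.hstack([vertices, np.ones((n, 1))])            # homogeneous coords
    # Per-vertex blended transform: sum over bones of weight * transform.
    blended = np.einsum("vb,bij->vij", weights, bone_transforms)
    skinned = np.einsum("vij,vj->vi", blended, homo)
    return skinned[:, :3]

# Toy rig: a stationary bone and a bone rotating 90 degrees about z.
identity = np.eye(4)
rot_z = np.eye(4)
rot_z[:2, :2] = [[0.0, -1.0], [1.0, 0.0]]
verts = np.array([[1.0, 0.0, 0.0]])
# Bound entirely to the rotating bone, the vertex swings to (0, 1, 0).
out = skin_vertices(verts, np.array([[0.0, 1.0]]), np.array([identity, rot_z]))
```

Animation software evaluates this for every vertex of the scanned mesh each frame, which is how a static photogrammetry model becomes a moving character.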

The result of 3D printing Annie’s photogrammetry scan

Annie would make use of photogrammetry again the following year, when she had herself 3D scanned once more, this time for the purpose of 3D printing the resulting model for a diorama. In this instance she was scanned in what is referred to as a “T-pose,” a pose in which the subject stands with their arms and legs apart so their limbs can be articulated into a different position later. After Annie’s model was generated, it was posed to have her sitting in a computer chair and working on a laptop. This model was sent to the Duderstadt Center’s J750 3D color printer to produce a 6″ high 3D printed model.

This printer allows for full-spectrum color and encases the model in a support structure that must be carefully removed, enabling more intricate features and overhangs on the model.

Annie carefully removes the support structure from her 3D printed model

A duplicate print of Annie’s creation can now be viewed in the display case within the Duderstadt Center’s Fabrication Studio.

The Jewish Tradition of Tsedakah as Exemplified in Pushkes – Online Exhibit


The pushke exhibit first appeared at the Jean & Samuel Frankel Center for Judaic Studies in the summer of 2015. The exhibit was composed of 40 pushkes (charitable donation boxes) of all shapes and sizes, situated in a series of display cases. The many diverse charity boxes reflect the breadth of the Jewish Heritage Collection Dedicated to Mark and Dave Harris, and illustrate the value of giving in Jewish communities throughout the world. Prior to being moved into storage for safekeeping, the collection underwent a lengthy scanning process with help from the Duderstadt Center to convert it into digitized 3D objects, expanding accessibility by allowing the exhibit to be preserved and viewed online.

The Pushke Collection was digitized by the Duderstadt Center using photogrammetry. In this process, several high-fidelity digital photographs are captured 360 degrees around the subject. These photos are analyzed by a computer algorithm that identifies matching features between photographs on a per-pixel basis. The identified features are then used to triangulate positions in 3D space, allowing a 3D model of the object to be generated. The color information from the initial photographs is then mapped to the surface of the object to achieve a realistic digital replica. Select pieces of the collection have been further refined by an artist using digital sculpting and painting software to correct imperfections from the capturing process, and the entire digital collection has been optimized for more efficient viewing on the web.

A web viewer was then developed and integrated into the Frankel Center’s WordPress site to display and allow manipulation of the various pushkes in the collection. The viewer allows each pushke to be rotated 360 degrees and zoomed in or out, permitting more detailed viewing than traditional photographs typically allow.

The result of this effort, the Frankel Center’s online exhibit, “Charity Saves from Death: The Jewish Tradition of Tsedakah as Exemplified in Pushkes” can be viewed here:

Leisure & Luxury in the Age of Nero: VR Exhibit for the Kelsey Museum


As part of the Kelsey Museum’s most grandiose exhibition to date, Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii features over 230 artifacts from the ancient world. These artifacts originate from the ancient villa of Oplontis, an area near Pompeii that was destroyed when Mt. Vesuvius erupted.

Real world location of the ancient villa of Oplontis

The traveling exhibit explores the lavish lifestyle and economic interests of ancient Rome’s wealthiest. The site is still being excavated and is off limits to the general public, but as part of the Kelsey’s exhibit, visitors get to experience the location with the assistance of virtual reality headsets like the Oculus Rift and tablet devices.

The Duderstadt Center worked closely with curator Elaine Gazda as well as the Oplontis Project team from the University of Texas to optimize a virtual re-creation for the Oculus Rift and MIDEN and to generate panoramic viewers for tablet devices. The virtual environment uses high-resolution photos and scan data captured on location, mapped to the surfaces of 3D models to give a very real sense of being at the real-world location.

Visitors to the Kelsey can navigate Oplontis in virtual reality through the use of an Oculus Rift headset, or through panoramas presented on tablet devices.

Visitors to the Kelsey can traverse this re-creation using the Rift, or they can travel to the Duderstadt to experience it in the MIDEN. Not only can viewers experience the villa as it appears in the modern day, they can also toggle on an artist’s re-creation of what the villas would have looked like before their destruction. In the re-created version of the scene, the ornate murals covering the walls of the villa are restored, and foliage and ornate statues populate the scene. Alongside the virtual reality experience, the Kelsey Museum will also house a physically reconstructed replica of one of the rooms found in the villa as part of the exhibit.

Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii is a traveling exhibit that will also be on display at the Museum of the Rockies at the Montana State University, Bozeman, and the Smith College Museum of Art in Northampton, Massachusetts.

On Display at the Kelsey Museum, Leisure & Luxury in the Age of Nero: The Villas of Oplontis Near Pompeii

Creating Cave-Like Digital Structures with Photogrammetry


Students in Professor Matias Del Campo’s Architecture Thesis class have been exploring organic, cave-like structures for use in a real-world underground architectural space.

His students were tasked with constructing textured surfaces reminiscent of cave interiors, such as stalactites and stalagmites, rocky surfaces, and erosion, using a variety of mediums, from spray foam to poured concrete.

These creations were then scanned at the Duderstadt Center using photogrammetry to convert them to digital form. The resulting digital models could then be altered (retouched, scaled, or mirrored, for example) by the students for design purposes when incorporating the forms into the planned space.

Art Students Model With Photogrammetry


The Stamps School of Art and Design offers a fabrication class called Bits and Atoms. The course is taught by Sophia Brueckner and focuses on detailed and accurate modeling for 3D digital fabrication and manufacturing.

Sophia brought her students into the Duderstadt Center to use our new photogrammetry rig. The rig features three cameras that take multiple photos of a subject placed on a rotating platform, with each photograph capturing a different angle of the subject. When the photos are imported into photogrammetry software, the program tracks the movement of reference points across the images to construct a 3D model of the subject.

The art students created digital models of themselves by sitting on the rotating platform. Their 3D models were then manipulated using Rhino and Meshmixer.

Photogrammetry for the Stearns Collection


Photogrammetry results from the Stearns Collection: here a drum is captured. Visible are the original digital photographs taken inside Stearns, the drum generated as a point cloud, the point cloud developed into a 3D mesh, and the fully textured 3D model.

Donated in 1899 by wealthy Detroit drug manufacturer Frederick Stearns, the Stearns Collection is a university collection comprising over 2,500 historical and contemporary musical instruments from all over the world, many of them particularly fragile or one of a kind. In 1966 the collection grew to include the only complete Javanese gamelan in the world, and as home to such masterpieces, the Stearns Collection has become internationally recognized as unique. In 1974, due to concerns about preservation and display, much of the collection was relocated out of public view. Once residing in Hill Auditorium, the majority of the collection now sits in storage inside an old factory near downtown Ann Arbor.

The current location of the Stearns Collection. Photo Credit:

Current preservation efforts have involved photographing the collection and making the nearly 13,000 resulting images available online. Over the past year, however, the Duderstadt Center has been working with Chris Dempsey, curator of the Stearns Collection, and Jennifer Brown, a University Library Associate in Learning & Teaching, on a new process for preservation: utilizing photogrammetry to document the collection. Photogrammetry is a process that relies on several digital photographs of an artifact to reconstruct the physical object as a digital 3D model. While traditional methods of obtaining 3D models often rely on markers placed atop the object, photogrammetry is largely non-invasive, allowing minimal, and sometimes no, direct handling of an artifact. Models resulting from this process, when captured properly, are typically very precise and allow the viewer to rotate the object 360 degrees, zoom in and out, measure, or otherwise analyze the object in many cases as though it were actually in front of them.

Equipped with a high resolution digital SLR camera, Jennifer traveled to the warehouse where much of the Stearns collection is now held to document some of the instruments that are not currently on display and have limited accessibility to the general public. Feeding the resulting images into an experimental Photogrammetry software developed for research purposes (“Visual SFM” and “CMVS”), Jennifer processed the photos taken of various instruments into high resolution 3D models that could eventually be placed on the web for more accessible public viewing and student interaction.