An Application for Greek Transcription

Practice is the only way to learn a new language. When learning ancient languages such as Greek, however, it can be difficult to get immediate, reliable feedback on practice work. That is why Professor Pablo Alvarez of Papyrology is working with Duderstadt Center student programmer Edward Wijaya to create an app for students to practice transcribing ancient Greek manuscripts into digital text.

The app is divided into three modes: professor/curator mode, student mode, and discovery mode. Professor mode allows the curator to upload a picture of a manuscript and post a line-by-line digital transcription of the document; these transcriptions serve as the “answers” for that document. In student mode, students transcribe these manuscripts themselves. When a student clicks the check button, the app presents a line-by-line comparison against the curator’s answers. Finally, discovery mode allows individuals with no Greek training to learn about the letters and read descriptions of the notations used.
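The check itself amounts to a line-by-line comparison against the curator's answer key. As a rough sketch of that idea only (the app's actual implementation and data model are not documented here, so every name below is illustrative):

```cpp
#include <algorithm>
#include <iostream>
#include <string>
#include <vector>

// Hypothetical check: compare each student line against the curator's
// answer line and report which lines match. Names are illustrative,
// not taken from the actual app.
struct LineResult {
    std::size_t lineNumber;  // 1-based line index
    bool matches;            // true if the transcription equals the answer
};

std::vector<LineResult> checkTranscription(
    const std::vector<std::string>& curatorLines,
    const std::vector<std::string>& studentLines)
{
    std::vector<LineResult> results;
    const std::size_t count = std::max(curatorLines.size(), studentLines.size());
    for (std::size_t i = 0; i < count; ++i) {
        // Missing lines on either side count as a mismatch.
        std::string answer  = i < curatorLines.size() ? curatorLines[i] : std::string();
        std::string attempt = i < studentLines.size() ? studentLines[i] : std::string();
        results.push_back({i + 1, answer == attempt});
    }
    return results;
}

int main() {
    std::vector<std::string> answers = {"εν αρχη ην ο λογος", "και ο λογος ην προς τον θεον"};
    std::vector<std::string> attempt = {"εν αρχη ην ο λογος", "και ο λογος ην προς τον θεων"};
    for (const LineResult& r : checkTranscription(answers, attempt))
        std::cout << "Line " << r.lineNumber << ": "
                  << (r.matches ? "correct" : "check again") << "\n";
}
```

A real grader would likely normalize accents, breathings, and spacing before comparing, but the line-by-line structure is the same.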

A wide variety of fragile manuscripts that are often inaccessible to students are available in the app, allowing students to gain experience with diverse handwriting and histories.

Surgical Planning for Dentistry: Digital Manipulation of the Jaw

CT data was brought into ZBrush and Topogun to be segmented and re-topologized. Influence was then added to the skin mesh, allowing it to deform as the bones were manipulated.

Hera Kim-Berman is a Clinical Assistant Professor with the University of Michigan School of Dentistry. She recently approached the Duderstadt Center with an idea that would allow surgeons to prototype jaw surgery specific to patient data extracted from CT scans. Hera’s concept involved the ability to digitally manipulate portions of the skull in virtual reality, just as surgeons would when physically working with a patient, allowing them to preview different scenarios and evaluate how effective a procedure might be prior to engaging in surgery.

Before re-positioning the jaw segments, the jaw has a shallow profile.

After Hera provided the Duderstadt Center with CT scan data, Shawn O’Grady extracted 3D meshes of the patient’s skull and skin using Magics. From there, Stephanie O’Malley worked with the models to make them interactive and suitable for real-time platforms. This involved bringing the skull into software such as ZBrush and creating slices in the mesh corresponding to areas identified by Hera as places where the skull would potentially be segmented during surgery. The mesh was also optimized to perform at a higher frame rate when incorporated into real-time platforms, and the skin mesh was “re-topologized” so that it could be deformed more smoothly. From there, the segmented pieces of the skull were re-assembled and assigned influence over areas of the skin in a process called “rigging.” This allowed areas of the skin to move with selected bones as they were separated and shifted by a surgeon in 3D space.
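The “rigging” step described above boils down to giving each skin vertex a set of weighted bone influences, so that when a bone segment is moved the nearby skin follows. A minimal linear-blend-skinning sketch of that idea is shown below; it is illustrative only, since the actual rig was authored in 3D tools rather than hand-written code:

```cpp
#include <array>
#include <vector>

// Minimal linear blend skinning sketch. Types and weights are illustrative.
struct Vec3 { float x, y, z; };
struct Mat4 { float m[4][4]; };              // row-major 4x4 transform

// Multiply a point by a 4x4 transform (assumes w = 1).
Vec3 transformPoint(const Mat4& t, const Vec3& p) {
    return {
        t.m[0][0]*p.x + t.m[0][1]*p.y + t.m[0][2]*p.z + t.m[0][3],
        t.m[1][0]*p.x + t.m[1][1]*p.y + t.m[1][2]*p.z + t.m[1][3],
        t.m[2][0]*p.x + t.m[2][1]*p.y + t.m[2][2]*p.z + t.m[2][3],
    };
}

struct SkinnedVertex {
    Vec3 restPosition;                        // position on the un-deformed skin mesh
    std::array<int, 4>   boneIndices;         // up to four influencing bones
    std::array<float, 4> boneWeights;         // weights, expected to sum to 1
};

// boneMatrices[i] maps a rest-pose point into the current pose of bone i
// (i.e. currentBonePose * inverse(restBonePose)).
Vec3 skinVertex(const SkinnedVertex& v, const std::vector<Mat4>& boneMatrices) {
    Vec3 result{0.0f, 0.0f, 0.0f};
    for (int k = 0; k < 4; ++k) {
        float w = v.boneWeights[k];
        if (w <= 0.0f) continue;
        Vec3 p = transformPoint(boneMatrices[v.boneIndices[k]], v.restPosition);
        result.x += w * p.x;
        result.y += w * p.y;
        result.z += w * p.z;
    }
    return result;
}
```

When a surgeon drags a segmented jaw piece, only the bone matrices change; re-running this blend per vertex is what makes the skin follow the bone in real time.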

After re-positioning of the jaw segments, the jaw is more pronounced.

Once a working model was achieved, it was passed off to Ted Hall and student programmer Zachary Kiekover to be implemented in the Duderstadt Center’s Jugular Engine, allowing the demo to run at large scale and in stereoscopic 3D not only within the virtual reality MIDEN but also on smaller head-mounted displays like the Oculus Rift. Additionally, more intuitive user controls were added, allowing easier selection of the various bones using a game controller or motion-tracked hand gestures via the Leap Motion. This meant surgeons could not only view the procedure from all angles in stereoscopic 3D, but also physically grab the bones they wanted to manipulate and transpose them in 3D space.

Zachary demonstrates the ability to manipulate the model using the Leap Motion.

Tour the Michigan Ion Beam Laboratory in 3D

3D Model of the Michigan Ion Beam Laboratory

The Michigan Ion Beam Laboratory (MIBL) was established in 1986 as part of the Department of Nuclear Engineering and Radiological Sciences in the College of Engineering. Located on the University of Michigan’s North Campus, the MIBL provides unique and extensive facilities to support research and development. Recently, Professor Gary Was, Director of the MIBL, reached out to the Duderstadt Center for assistance with developing content for the MIBL website to better introduce users to the capabilities of the lab as construction of a new particle accelerator reached completion.

Gary’s group provided the Duderstadt Center with a scale model of the Ion Beam Laboratory generated in Inventor and a detailed synopsis of its various components and executable experiments. From there, Stephanie O’Malley of the Duderstadt Center optimized and beautified the provided model, adding corresponding materials, labels, and lighting. A series of fly-throughs, zoom-ins, and experiment animations were generated from this model to introduce visitors to the lab’s various capabilities.

These interactive animations were then integrated into the MIBL’s WordPress platform by student programmer Yun-Tzu Chang. Visitors to the MIBL website can now compare the simplified digital replica of the space with actual photos of the equipment, as well as run various experiments to better understand how each component functions. To learn more about the Michigan Ion Beam Laboratory and to explore the space yourself, visit their website at mibl.engin.umich.edu.

Unreal Engine in Stereoscopic Virtual Reality

Until now, the Oculus Rift has been the go-to system for gamers seeking the ultimate immersive experience, offering stereo compatibility with game engines like Unreal and Unity 3D. Recently, the Duderstadt Center pushed this experience even further, with graphics programmer Sean Petty adapting the Unreal Engine to work within the Duderstadt Center’s MIDEN, a fully immersive, stereoscopic 3D virtual reality environment.

Visitors entering the MIDEN are equipped with a pair of stereo glasses and a game controller, both outfitted with reflective markers that are tracked by a series of Vicon cameras positioned around the top perimeter of the space. The existing capabilities of the MIDEN allow viewers not only to explore a space beyond the confines of the 10′ x 10′ room, but also to interact with objects using the provided game controller.

The services of the Duderstadt Center are open to all departments within the University, making visualization services, professional studio spaces, and exploratory technology accessible to artists, engineers, architects, and more. The diverse atmosphere of the Digital Media Commons generates a multitude of cross-curricular collaborative projects each year, from live performances featuring orchestras manipulated via brain waves to exploring the anatomy of a digital cadaver in virtual reality.

In the past, the Duderstadt Center’s MIDEN has been used to prototype architectural spaces, host artistic installations, assess human behavior, and simulate training scenarios. Incorporating the Unreal Engine into a space like the MIDEN allows visitors to experience a level of realism never before achieved in this sort of environment, opening new doors not just for gamers but for those seeking high-quality visualizations for research and exploration. The Unreal Engine brings a wide range of materials and visual effects to any scene, from realistic water, foliage, and glass to effects like fire and transitions in the time of day.

Sean Petty, graphics programmer of the Duderstadt Center, explains his process for getting Unreal to operate from within the MIDEN:

The MIDEN requires us to render a different view of the scene to each of the four walls from the perspective of the user. To achieve this, we must track the location and orientation of the user’s eyes, which is accomplished by motion tracking a pair of glasses worn by the user. In the MIDEN, a dedicated computer performs the necessary calculations, so the first step to enabling MIDEN support in Unreal is to modify the engine to interface with this computer.
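The article does not describe how Unreal actually talks to that tracking computer, so the following is only a hypothetical sketch of the general idea: listening for pose updates over the network. The port number and the packet layout (seven floats: position x, y, z followed by an orientation quaternion x, y, z, w) are assumptions, not the real protocol:

```cpp
#include <arpa/inet.h>
#include <cstdio>
#include <cstring>
#include <iostream>
#include <sys/socket.h>
#include <unistd.h>

// Hypothetical pose receiver. The real MIDEN interface may differ entirely.
struct HeadPose {
    float position[3];      // tracked eye position
    float orientation[4];   // orientation quaternion (x, y, z, w)
};

int main() {
    int sock = socket(AF_INET, SOCK_DGRAM, 0);
    if (sock < 0) { std::perror("socket"); return 1; }

    sockaddr_in addr{};
    addr.sin_family      = AF_INET;
    addr.sin_port        = htons(5000);          // assumed port
    addr.sin_addr.s_addr = htonl(INADDR_ANY);
    if (bind(sock, reinterpret_cast<sockaddr*>(&addr), sizeof(addr)) < 0) {
        std::perror("bind");
        return 1;
    }

    float packet[7];
    for (int i = 0; i < 10; ++i) {               // read a few packets, then exit
        ssize_t n = recvfrom(sock, packet, sizeof(packet), 0, nullptr, nullptr);
        if (n != static_cast<ssize_t>(sizeof(packet))) continue;  // ignore malformed packets

        HeadPose pose{};
        std::memcpy(pose.position,    packet,     3 * sizeof(float));
        std::memcpy(pose.orientation, packet + 3, 4 * sizeof(float));
        std::cout << "eye position: " << pose.position[0] << ", "
                  << pose.position[1] << ", " << pose.position[2] << "\n";
    }
    close(sock);
}
```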

Visitors to the MIDEN are motion tracked within the space via reflective markers placed on a pair of stereo glasses and a handheld game controller. These markers are monitored by eight Vicon cameras located along the perimeter of the MIDEN.

Once the location of the user has been determined, we must project the user’s view onto each of the four walls. When rendering a scene in a standard desktop environment, the camera is positioned in the center of the screen. A centered camera only requires a symmetric frustum projection, which is the native transformation supported by Unreal. In the MIDEN, the center of the camera may be anywhere within the space and will often not be centered on a screen. This requires an asymmetric frustum projection, functionality that had to be added to the engine.
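The off-axis projection described here is commonly computed from the tracked eye position and the corners of the screen, the standard “generalized perspective projection” construction. The sketch below shows that math in isolation; it is not Unreal’s actual modified code, and the names and units are illustrative:

```cpp
#include <cmath>

// Off-axis (asymmetric) frustum parameters for one wall, given three of the
// wall's corners and the tracked eye position, all in the same tracker space.
struct Vec3 { double x, y, z; };

static Vec3   sub(Vec3 a, Vec3 b) { return {a.x - b.x, a.y - b.y, a.z - b.z}; }
static double dot(Vec3 a, Vec3 b) { return a.x*b.x + a.y*b.y + a.z*b.z; }
static Vec3   cross(Vec3 a, Vec3 b) {
    return {a.y*b.z - a.z*b.y, a.z*b.x - a.x*b.z, a.x*b.y - a.y*b.x};
}
static Vec3 normalize(Vec3 v) {
    double len = std::sqrt(dot(v, v));
    return {v.x / len, v.y / len, v.z / len};
}

struct Frustum { double left, right, bottom, top, nearPlane, farPlane; };

// pa = screen lower-left, pb = lower-right, pc = upper-left, pe = eye position.
Frustum offAxisFrustum(Vec3 pa, Vec3 pb, Vec3 pc, Vec3 pe,
                       double nearPlane, double farPlane) {
    Vec3 vr = normalize(sub(pb, pa));        // screen right axis
    Vec3 vu = normalize(sub(pc, pa));        // screen up axis
    Vec3 vn = normalize(cross(vr, vu));      // screen normal, toward the eye

    Vec3 va = sub(pa, pe);                   // eye to lower-left corner
    Vec3 vb = sub(pb, pe);                   // eye to lower-right corner
    Vec3 vc = sub(pc, pe);                   // eye to upper-left corner

    double d = -dot(va, vn);                 // distance from eye to screen plane
    double s = nearPlane / d;                // scale extents onto the near plane

    Frustum f;
    f.left      = dot(vr, va) * s;
    f.right     = dot(vr, vb) * s;
    f.bottom    = dot(vu, va) * s;
    f.top       = dot(vu, vc) * s;
    f.nearPlane = nearPlane;
    f.farPlane  = farPlane;
    return f;
}
```

Each wall has its own corner coordinates, so each of the four renders uses a different asymmetric frustum for the same tracked eye position.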

Images for each wall are projected through a corresponding projector located behind the walls of the MIDEN. The floor is projected using a mirror located at the top of the space.

Unreal has native support for stereo by rendering the left and right views next to each other in a single image. This setup is used for devices such as the Oculus Rift, where the images for the left and right eyes are displayed at the same time. The MIDEN uses a technology called “active stereo,” in which the displayed image flickers rapidly between the left and right images. This required a modification to the engine so that the left and right images are rendered to two separate buffers rather than to two sides of a single image.
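The engine changes themselves are not published, but the underlying active-stereo idea can be illustrated with plain OpenGL quad buffering, where the left- and right-eye images go to separate back buffers and the display alternates between them in sync with the glasses. This is a generic sketch using GLFW, not Unreal’s renderer, and it requires a GPU and driver that expose quad-buffered stereo:

```cpp
#include <GLFW/glfw3.h>

int main() {
    if (!glfwInit()) return 1;
    glfwWindowHint(GLFW_STEREO, GLFW_TRUE);          // request left/right back buffers
    GLFWwindow* window = glfwCreateWindow(1280, 1280, "Active stereo sketch", nullptr, nullptr);
    if (!window) { glfwTerminate(); return 1; }      // stereo contexts are not always available
    glfwMakeContextCurrent(window);

    while (!glfwWindowShouldClose(window)) {
        glDrawBuffer(GL_BACK_LEFT);                  // left-eye image
        glClearColor(0.1f, 0.1f, 0.2f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        // ... draw the scene here using the left-eye off-axis frustum ...

        glDrawBuffer(GL_BACK_RIGHT);                 // right-eye image
        glClearColor(0.2f, 0.1f, 0.1f, 1.0f);
        glClear(GL_COLOR_BUFFER_BIT);
        // ... draw the scene here using the right-eye off-axis frustum ...

        glfwSwapBuffers(window);                     // glasses shutter in sync with the display
        glfwPollEvents();
    }
    glfwDestroyWindow(window);
    glfwTerminate();
}
```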

Unreal Engine as seen from within the Duderstadt Center’s virtual reality MIDEN. The MIDEN is a 10′ x 10′ room composed of five walls utilizing stereoscopic projection. Visitors are tracked using Vicon cameras, allowing them to travel beyond the confines of the physical space.

The final step for displaying Unreal scenes in the MIDEN is to get the four rendering computers communicating with each other. This ensures that when the user moves, all the screens are updated appropriately to give a consistent view of the scene. The networking is accomplished using Unreal’s built-in network replication functionality, which is designed for use in multiplayer games.
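As a hypothetical illustration of that replication approach (class, property, and helper names here are invented, not taken from the actual integration), the tracked head transform could be declared as a replicated property so that Unreal keeps it consistent across the render nodes:

```cpp
// Sketch of keeping a tracked head transform in sync across render machines
// using Unreal's built-in replication. Illustrative only.
#include "CoreMinimal.h"
#include "GameFramework/Actor.h"
#include "Net/UnrealNetwork.h"
#include "MidenHeadActor.generated.h"

UCLASS()
class AMidenHeadActor : public AActor
{
    GENERATED_BODY()

public:
    AMidenHeadActor()
    {
        bReplicates = true;                      // replicate this actor to clients
        PrimaryActorTick.bCanEverTick = true;
    }

    // Tracked eye transform, updated on the machine that talks to the
    // tracking computer and replicated to the other render nodes.
    UPROPERTY(Replicated)
    FTransform TrackedHeadTransform;

    virtual void GetLifetimeReplicatedProps(
        TArray<FLifetimeProperty>& OutLifetimeProps) const override
    {
        Super::GetLifetimeReplicatedProps(OutLifetimeProps);
        DOREPLIFETIME(AMidenHeadActor, TrackedHeadTransform);
    }

    virtual void Tick(float DeltaSeconds) override
    {
        Super::Tick(DeltaSeconds);
        if (HasAuthority())
        {
            // On the authoritative node, update from the tracking data; the
            // engine then replicates the new transform to the client nodes.
            // TrackedHeadTransform = ReadPoseFromTracker();  // assumed helper
        }
    }
};
```

Each render node then builds its own wall-specific off-axis frustum from the same replicated transform, which is what keeps the four views consistent.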

With this latest development, researchers across all disciplines can now use this technology to reproduce lifelike environments for their studies, giving subjects a deeply immersive experience. It is hoped that the higher level of immersion offered by the Unreal Engine will have a dramatic impact on studies involving human behavior and environmental effects.

In addition to incorporating Unreal, the MIDEN also continues to operate using an in-house engine developed by Ted Hall & Sean Petty, called “Jugular,” which provides support for a broad range of models, materials, and interactivity. While Unreal offers finer elements of photo-realism for mesh-based geometry, Jugular supports easier import of a wider range of file types from a variety of sources, including not only meshes but also solid volumes and informatics graphs.

Xplore Engineering

Xplore Engineering is a summer camp program designed for Engineering alumni and their children in 4th through 7th grade. Through a series of experiential workshops, participants get hands-on experience in a variety of engineering disciplines. This marked the second year the Duderstadt Center was invited to participate in the Xplore Engineering workshops, this time offering students the chance to design and then 3D print custom fashion rings.

Kids were introduced to activities provided by Cubify.com that allow for the creation of simple 3D-printed objects like dog tags, bracelets, or rings. In a workshop led by Stephanie O’Malley, each child had an opportunity to work with their guardian to design a custom ring in the style of their choice. Some created designs incorporating their initials; others went with unique designs or simple shapes.

Once each child had completed their design, they were given an introduction to how 3D printers work by Shawn O’Grady. Their files were assembled for printing in the Cubify software, and then each child had a chance to send their print to the Cube 3 printers, a unique opportunity to get involved in operating the technology. As they watched their creations being printed, the group was introduced to unique applications for 3D printing, from the creation of assets in stop-motion movies like Coraline to the 3D printing of prosthetics! For more information on the Xplore Engineering summer camp, and other interesting opportunities with the College of Engineering, visit www.engin.umich.edu/mconnex

Printing in 3D
Use 3-dimensional printers in the U of M 3D printing lab to program a 3D model and even take home one of your own. You’ll also get a behind-the-scenes tour of the 3D lab.
Thursday session 3
Photo: Jessica Knedgen

Virtual Cadaver Featured in Proto Magazine

Proto Magazine features articles on biomedicine and health care, targeting physicians, researchers and policy makers.

Proto is a natural science magazine produced by Massachusetts General Hospital in collaboration with Time Inc. Content Solutions. Launched in 2005, the magazine covers topics in the field of biomedicine and health care, targeting physicians, researchers, and policy makers. In June, Proto featured an article, “Mortal Remains,” that discusses alternatives to using real cadavers in the study of medicine.

Preserving human remains for use as a cadaver over a school semester carries tremendous costs. The article in Proto magazine discusses options for revolutionizing this area of study, from older techniques like 17th-century anatomically correct wax models and plastination (the process of removing fluids from the body and replacing them with a polymer) to new technology utilizing the Visible Human data, with a specific mention of the Duderstadt Center’s Virtual Cadaver.

To learn more, the full article from Proto Magazine can be found here.

Sean Petty manipulates cross-sections of the Virtual Cadaver from within the 3D Lab’s virtual reality environment, the MIDEN.

Exploring Human Anatomy with the Anatomage Table

 

The Anatomage table is a technologically advanced anatomy visualization system that allows users to explore the complex anatomy of the human body in digital form, eliminating the need for a human cadaver. The table presents a human figure at 1:1 scale and utilizes data from the Visible Human effort, with the additional capability of loading real patient data (CT, MRI, etc.), making it a great resource for research, collaborative discovery, and the study of surgical procedures. Funding to obtain the table was a collaborative effort between the schools of Dentistry, Movement Science, and Nursing, although utilization is expected to expand to include Biology. Currently on display in the Duderstadt Center for exploration, the Anatomage table will relocate to its more permanent home inside the Taubman Health Library in early July.

The Anatomage table allows users to explore the complex anatomy of the human body.

Photogrammetry for the Stearns Collection

Photogrammetry results from the Stearns Collection: here a drum is captured. Visible are the original digital photographs taken inside Stearns, the drum generated as a point cloud, the point cloud developed into a 3D mesh, and the fully textured 3D model.

Donated in 1899 by wealthy Detroit drug manufacturer Frederick Stearns, the Stearns Collection is a university collection comprising over 2,500 historical and contemporary musical instruments from all over the world, many of them particularly fragile or one of a kind. In 1966 the collection grew to include the only complete Javanese gamelan in the world, and as home to such masterpieces, the Stearns Collection has become recognized internationally as unique. In 1974, due to concerns about preservation and display, much of the collection was relocated out of public view. Once residing in Hill Auditorium, the majority of the collection now sits in storage inside an old factory near downtown Ann Arbor.

The current location of the Stearns Collection. Photo Credit: www.dailymail.co.uk

Current preservation efforts have involved photographing the collection and making the nearly 13,000 resulting images available online. Over the past year, however, the Duderstadt Center has been working with Chris Dempsey, curator of the Stearns Collection, and Jennifer Brown, a University Library Associate in Learning & Teaching, on a new process for preservation: using photogrammetry to document the collection. Photogrammetry is a process that relies on several digital photographs of an artifact to reconstruct the physical object as a digital 3D model. While traditional methods of obtaining 3D models often rely on markers placed on the object, photogrammetry is largely non-invasive, allowing for minimal, and sometimes no, direct handling of an artifact. Models resulting from this process, when captured properly, are typically very precise and allow the viewer to rotate the object 360 degrees, zoom in and out, measure, or otherwise analyze the object, in many cases as though it were actually in front of them.

Equipped with a high-resolution digital SLR camera, Jennifer traveled to the warehouse where much of the Stearns Collection is now held to document some of the instruments that are not currently on display and have limited accessibility to the general public. Feeding the resulting images into experimental photogrammetry software developed for research purposes (VisualSFM and CMVS), Jennifer processed the photos of the various instruments into high-resolution 3D models that could eventually be placed on the web for more accessible public viewing and student interaction.

Sonar Visualized in EECS

Original point cloud data brought into Autodesk Recap

Professor Kamal Sarabandi of Electrical Engineering and Computer Science and student Samuel Cook were looking into the accuracy of sonar equipment and came to the 3D Lab for assistance with visualizing their data. Their goal was to generate an accurate, to-scale 3D model of the EECS atrium that could be used to align their data to a physical space.

Gaps in point cloud data indicate an obstruction encountered by the sonar.

The Duderstadt Center’s Stephanie O’Malley and student consultant, Maggie Miller, used precise measurements and photo reference provided by Sam to re-create the atrium in 3D Studio Max. The point cloud data produced by their sonar was then exported as a *.PTS file, and brought into Autodesk Recap to quickly determine if everything appeared correct. When viewing point cloud data from the sonar, any significant gaps in the cloud indicate an obstruction, such as furniture, plants, or people.

Using the origin of the sonar device, positioned on the second-floor balcony, the data was aligned to the scene and colored appropriately. When the images produced by the sonar were aligned with the re-created EECS atrium, the team could see where the sonar had encountered large objects such as benches or posts, because those areas did not produce data points. Professor Sarabandi’s research focus encompasses a wide range of topics in the area of applied electromagnetics, and the Duderstadt Center’s visualization efforts furthered this research by helping to assess and improve the accuracy of the sonar.
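The alignment itself was done interactively in Recap and 3D Studio Max, but as a rough sketch of the kind of preprocessing involved, an ASCII *.PTS cloud can be read and offset so that the sonar’s origin sits at its measured position. The file name, coordinates, and assumed column layout below are all illustrative:

```cpp
#include <fstream>
#include <iostream>
#include <sstream>
#include <string>
#include <vector>

// Illustrative only: reads an ASCII *.PTS point cloud (assumed layout: a
// point count on the first line, then "x y z ..." per point) and translates
// every point so the sonar's origin lands at a known position.
struct Point { double x, y, z; };

std::vector<Point> loadPts(const std::string& path) {
    std::ifstream file(path);
    std::vector<Point> points;
    std::string line;
    std::getline(file, line);                      // skip the point-count header
    while (std::getline(file, line)) {
        std::istringstream fields(line);
        Point p;
        if (fields >> p.x >> p.y >> p.z)           // ignore intensity/color columns
            points.push_back(p);
    }
    return points;
}

int main() {
    // Hypothetical sonar origin on the second-floor balcony (assumed units/axes).
    const Point sonarOrigin{0.0, 7.5, 0.0};

    std::vector<Point> cloud = loadPts("eecs_atrium.pts");  // assumed file name
    for (Point& p : cloud) {
        p.x += sonarOrigin.x;
        p.y += sonarOrigin.y;
        p.z += sonarOrigin.z;
    }
    std::cout << "Loaded and offset " << cloud.size() << " points\n";
}
```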

Sonar data aligned to a model of the EECS atrium

Museum of Natural History – Planetarium Shows

Shredding Stars: Stars are consumed by a black hole
The Museum of Natural History will soon host several animations produced by the Duderstadt Center covering an array of space-related subjects. From understanding the behavior of black holes to demonstrating the life cycle of stars, Stephanie O’Malley, a digital artist at the Duderstadt Center, created the animations in collaboration with Matthew Linke, the planetarium director; Lydia Bieri, a professor of mathematics; and Kayhan Gultekin, an assistant researcher in astronomy.
Kicked out black holes: The gravitational pull of a black hole can cause multiple black holes to merge together. This spinning motion then causes the merged black holes to be kicked out of orbit.
The Museum of Natural History houses a vast collection of natural history objects ranging from local bird species, to larger mammals, to the skeletons of mammoths. The museum is located on campus and provides educational opportunities and exhibits open to both the campus and the wider community. The planetarium is located on the top floor of the museum. Since 1958, the planetarium has put on informative shows about astronomy for visitors. A full-dome screen is used to immerse guests in the night sky, and throughout the year staff put on seasonal star talks using the dome to visualize what the sky looks like at that time of year.
 
The collaboration between visualization artists and scientists produced well-researched visualizations on an array of astronomy topics. These animations are unique in that much of what has been visualized stems from raw data: nobody has ever photographed these events actually occurring in space, and in some cases they are largely hypothetical. The animations are scheduled to be projected on the museum’s full-dome screen and used as a tool in classes to better acquaint students with concepts discussed in class. They are also being featured for a short time in a separate exhibit outside of the planetarium space.
 
Those familiar with Saturday Morning Physics lessons may recognize some of the animations, as they were shown recently during Lydia Bieri’s segment discussing gravitational lensing and gravitational waves (click here for the link to the video).
Gravitational Lensing: A gravitational lens refers to a distribution of matter (such as a cluster of galaxies) between a distant source and an observer that is capable of bending the light from the source as it travels toward the observer.

The animations were each part of grants funded by the National Science Foundation. They were created in After Effects and 3D Studio Max, using a special plugin (the Domemaster 3D camera shader) for full-dome planetarium warping; this is what gives the individual frames of an animation the correct distortion to be projected onto the planetarium’s curved ceiling. Frames were then rendered at resolutions from 1200 pixels up to 4K to accommodate even very large planetariums looking to feature these animations.
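For a sense of what the dome warp does, the domemaster format is an angular fisheye: each view direction maps to a point on a circular image whose distance from the center corresponds to the angle away from the dome’s zenith. The sketch below illustrates that mapping for a 180-degree dome; the actual renders relied on the Domemaster 3D shader inside 3D Studio Max rather than hand-written code, and the axis convention here is an assumption:

```cpp
#include <cmath>
#include <optional>

// Angular-fisheye ("domemaster") mapping sketch. The dome's zenith is taken
// to be the +Z axis and the dome covers a 180-degree field of view.
struct Vec3 { double x, y, z; };
struct Uv   { double u, v; };                    // normalized image coordinates in [0, 1]

std::optional<Uv> directionToDomemaster(Vec3 dir) {
    const double kPi = 3.14159265358979323846;

    double len = std::sqrt(dir.x*dir.x + dir.y*dir.y + dir.z*dir.z);
    if (len == 0.0) return std::nullopt;
    dir = {dir.x / len, dir.y / len, dir.z / len};

    double theta = std::acos(dir.z);             // angle away from the dome zenith
    if (theta > kPi / 2.0) return std::nullopt;  // below the dome rim: not visible

    double r   = theta / (kPi / 2.0);            // 0 at the zenith, 1 at the rim
    double phi = std::atan2(dir.y, dir.x);       // azimuth around the dome axis
    return Uv{0.5 + 0.5 * r * std::cos(phi),
              0.5 + 0.5 * r * std::sin(phi)};
}
```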