Surgical Planning for Dentistry: Digital Manipulation of the Jaw
Hera Kim-Berman is a Clinical Assistant Professor with the University of Michigan School of Dentistry. She recently approached the Duderstadt Center with an idea that would allow surgeons to prototype jaw surgery specific to patient data extracted from CT scans. Hera’s concept involved the ability to digitally manipulate portions of the skull in virtual reality, just as surgeons would when physically working with a patient, allowing them to preview different scenarios and evaluate how effective a procedure might be prior to engaging in surgery.
After providing the Duderstadt Center with CT scan data, Shawn O’Grady was able to extract 3D meshes of the patient’s skull and skin using Magics. From there, Stephanie O’Malley worked with the models to make them interactive and suitable for real-time platforms. This involved bringing the skull into software such as ZBrush and creating slices in the mesh corresponding to areas identified by Hera as places where the skull would potentially be segmented during surgery. The mesh was then optimized to perform at a higher frame rate when incorporated into real-time platforms. The skin mesh was also altered, undergoing a process called “re-topologizing” that allowed it to deform more smoothly. From there, the segmented pieces of the skull were re-assembled and assigned influence over areas of the skin in a process called “rigging.” This allowed areas of the skin to move with selected bones as they were separated and shifted by a surgeon in 3D space.
Once a working model was achieved, it was passed off to Ted Hall and student programmer Zachary Kiekover, to be implemented into the Duderstadt Center’s Jugular Engine, allowing the demo to run at large scale and in stereoscopic 3D from within the virtual reality MIDEN but also on smaller head mounted displays like the Oculus Rift. Additionally, more intuitive user controls were added which allowed for easier selection of the various bones using a game controller or motion tracked hand gestures via the Leap Motion. This meant surgeons could not only view the procedure from all angles in stereoscopic 3D, but they could also physically grab the bones they wanted to manipulate and transpose them in 3D space.
The Michigan Ion Beam Laboratory (MIBL) was established in 1986 as part of the Department of Nuclear Engineering and Radiological Sciences in the College of Engineering. Located on the University of Michigan’s North Campus, the MIBL serves to provide unique and extensive facilities to support research and development. Recently, Professor Gary Was, Director of the MIBL reached out to the Duderstadt Center for assistance with developing content for the MIBL website to better introduce users to the capabilities of their lab as construction on a new particle accelerator reached completion.
Gary’s group was able to provide the Duderstadt Center with a scale model of the Ion Beam Laboratory generated in Inventor and a detailed synopsis of the various components and executable experiments. From there, Stephanie O’Malley of the Duderstadt Center optimized and beautified the provided model, adding corresponding materials, labels and lighting. A series of fly-throughs, zoom-ins, and experiment animations were generated from this model that would serve to introduce visitors to the various capabilities of the lab.
These interactive animations were then integrated into the MIBL’s WordPress platform by student programmer Yun-Tzu Chang. Visitors to the MIBL website are now able to compare the simplified digital replica of the space with actual photos of the equipment as well as run various experiments to better understand how each component functions. To learn more about the Michigan Ion Beam Laboratory and to explore the space yourself, visit their website at mibl.engin.umich.edu.
Up until now, the Oculus Rift has been the go-to system for gamers seeking the ultimate immersive experience, offering stereo compatibility with game engines like Unreal and Unity 3D. Recently, the Duderstadt Center was able to push this experience even further, with Graphics Programmer Sean Petty adapting the Unreal Engine to work within the Duderstadt Center’s MIDEN – a fully immersive, stereoscopic 3D virtual reality environment.
Visitors entering the MIDEN are equipped with a pair of stereo glasses and a game controller, both outfitted with reflective markers that are then tracked by a series of Vicon cameras positioned around the top perimeter of the space. The existing capabilities of the MIDEN allow viewers to not only explore a space beyond the confines of the 10’x10′ room, but to also interact with objects using the provided game controller.
The services of the Duderstadt Center are open to all departments within the University, making visualization services, professional studio spaces, and exploratory technology accessible to artists, engineers, architects and more. The diverse atmosphere of the Digital Media Commons generates a multitude of cross-curricular collaborative projects each year – from live performances featuring orchestras manipulated via brain waves to exploring the anatomy of a digital cadaver in virtual reality.
In the past the Duderstadt Center’s MIDEN has been used to prototype architectural spaces, host artistic installations, assess human behavior and simulate training scenarios. Incorporating the Unreal Engine into a space like the MIDEN allows visitors to experience a level of realism never before achieved in this sort of environment, opening new doors not just for gamers, but for those seeking high-quality visualizations for research and exploration. Unreal Engine brings a wide range of materials and visual effects to any scene, from realistic water, foliage, and glass to effects like fire and transitions in the time of day.
Sean Petty, graphics programmer of the Duderstadt Center, explains his process for getting Unreal to operate from within the MIDEN:
The MIDEN requires us to render a different view of the scene to each of the four walls from the perspective of the user. In order to achieve this we must track the location and orientation of the user’s eyes, which is accomplished by motion tracking a pair of glasses worn by the user. In the MIDEN there is a dedicated computer performing the necessary calculations, so the first step to enabling MIDEN support in Unreal is to modify the engine to interface with this computer.
Once the location of the user has been determined we must project the user’s view to each of the four walls. When rendering a scene in a standard desktop environment the camera is positioned in the center of the screen. A centered camera only requires a symmetric frustum projection which is the native transformation supported by Unreal. In the MIDEN, the center of the camera may be anywhere within the space and will often not be centered on a screen. This requires the use of an asymmetric frustum projection, which is functionality that had to be added to the engine.
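The asymmetric frustum described above can be computed directly from the tracked eye position and the corners of a wall. The sketch below is not the Duderstadt Center’s engine modification itself, just a minimal stand-alone illustration of the underlying math (a generalized perspective projection); the corner naming and near/far values are assumptions for the example.

```python
import math

def sub(a, b): return tuple(x - y for x, y in zip(a, b))
def dot(a, b): return sum(x * y for x, y in zip(a, b))
def normalize(a):
    n = math.sqrt(dot(a, a))
    return tuple(x / n for x in a)
def cross(a, b):
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def off_axis_projection(eye, ll, lr, ul, near, far):
    """Projection matrix for one wall, given the tracked eye position
    and the wall's lower-left, lower-right and upper-left corners."""
    vr = normalize(sub(lr, ll))          # wall "right" axis
    vu = normalize(sub(ul, ll))          # wall "up" axis
    vn = cross(vr, vu)                   # wall normal, toward the viewer
    d = dot(vn, sub(eye, ll))            # eye-to-wall distance
    # Frustum extents on the near plane; left/right (and bottom/top)
    # are unequal whenever the eye is not centered on the wall.
    l = dot(vr, sub(ll, eye)) * near / d
    r = dot(vr, sub(lr, eye)) * near / d
    b = dot(vu, sub(ll, eye)) * near / d
    t = dot(vu, sub(ul, eye)) * near / d
    return [
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ]
```

For an eye centered on the wall this reduces to the usual symmetric frustum; as the viewer walks off-center, the (r + l)/(r - l) term skews the projection so the image remains correct from their point of view.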
Unreal has native support for stereo by rendering the left and right views next to each other in a single image. This setup is used for devices such as the Oculus Rift, where the images for the left and right eye are displayed at the same time. The MIDEN uses a technology called “active stereo,” in which the displayed image flickers rapidly back and forth between the left and right images. This required a modification to the engine so the left and right images are rendered to two separate buffers rather than to two sides of a single image.
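Rendering two separate eye buffers also means deriving a distinct camera position for each eye from the single tracked head pose. A minimal sketch of that step (the interpupillary distance and axis convention here are illustrative assumptions, not values taken from the engine):

```python
IPD = 0.065  # typical interpupillary distance in meters (assumed value)

def eye_positions(head_pos, right_axis, ipd=IPD):
    """Offset the tracked head position by half the IPD along the head's
    'right' axis to get the left- and right-eye camera positions."""
    half = ipd / 2.0
    left = tuple(p - half * r for p, r in zip(head_pos, right_axis))
    right = tuple(p + half * r for p, r in zip(head_pos, right_axis))
    return left, right
```

Each frame, the scene is rendered once from `left` into one buffer and once from `right` into the other; the active-stereo glasses shutter in sync so each eye only ever sees its own image.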
The final step for displaying Unreal scenes in the MIDEN is to get the four rendering computers communicating with each other. This ensures that when the user moves, all the screens update appropriately to give a consistent view of the scene. The networking is accomplished using Unreal’s built-in network replication functionality, which is designed for use in multiplayer games.
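Unreal’s replication handles this internally, but the underlying idea (one machine publishing the tracked pose so every render node draws a consistent view) can be sketched with a few lines of plain UDP. The message format and addresses below are invented for illustration, not how the engine actually does it:

```python
import json
import socket

def send_pose(sock, addr, pose):
    """Publish the tracked head pose to one render node as a JSON datagram."""
    sock.sendto(json.dumps(pose).encode("utf-8"), addr)

def recv_pose(sock):
    """Receive the latest pose on a render node."""
    data, _ = sock.recvfrom(4096)
    return json.loads(data.decode("utf-8"))
```

In a real cluster the tracking host would broadcast every frame and each of the four wall machines would apply the most recent pose before rendering.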
With this latest development, researchers across all disciplines are now able to utilize this technology to reproduce lifelike environments for their studies giving subjects the ultimate immersive experience. It is hoped that this higher level of immersion offered by the Unreal Engine will have a dramatic impact in studies involving human behavior and environmental effects.
In addition to incorporating Unreal, the MIDEN also continues to operate using an in-house engine developed by Ted Hall & Sean Petty, called “Jugular,” which provides support for a broad range of models, materials, and interactivity. While Unreal offers finer elements of photo-realism for mesh-based geometry, Jugular supports easier import of a wider range of file types from a variety of sources, including not only meshes but also solid volumes and informatics graphs.
Proto is a natural science magazine produced by Massachusetts General Hospital in collaboration with Time Inc. Content Solutions. Launched in 2005, the magazine covers topics in the field of biomedicine and health care, targeting physicians, researchers and policy makers. In June, Proto featured an article, “Mortal Remains” that discusses alternatives to using real cadavers in the study of medicine.
Preserving human remains for use as a cadaver during a school semester has tremendous costs associated with it. The article in Proto magazine discusses options for revolutionizing this area of study, from the mention of old techniques like 17th Century anatomically correct wax models or Plastination (the process of removing fluids from the body and instead injecting a polymer) to new technology utilizing the Visible Human data, with a specific mention of the Duderstadt Center’s Virtual Cadaver.
To learn more, the full article from Proto Magazine can be found here.
The Anatomage table is a technologically advanced anatomy visualization system that allows users to explore the complex anatomy of the human body in digital form, eliminating the need for a human cadaver. The table presents a human figure at 1:1 scale, and utilizes data from the Visible Human effort with the additional capability of loading real patient data (CT, MRI, etc), making it a great resource for research, collaborative discovery, and the studying of surgical procedures. Funding to obtain the table was a collaborative effort between the schools of Dentistry, Movement Science, and Nursing although utilization is expected to expand to include Biology. Currently on display in the Duderstadt Center for exploration, the Anatomage table will be relocating to its more permanent home inside the Taubman Health Library in early July.
Professor Kamal Sarabandi, of Electrical Engineering and Computer Science, and student Samuel Cook were looking into the accuracy of sonar equipment and came to the 3D Lab for assistance with visualizing their data. Their goal was to generate an accurate, to-scale 3D model of the EECS atrium that would be used to align their data to a physical space.
The Duderstadt Center’s Stephanie O’Malley and student consultant, Maggie Miller, used precise measurements and photo reference provided by Sam to re-create the atrium in 3D Studio Max. The point cloud data produced by their sonar was then exported as a *.PTS file, and brought into Autodesk Recap to quickly determine if everything appeared correct. When viewing point cloud data from the sonar, any significant gaps in the cloud indicate an obstruction, such as furniture, plants, or people.
Using the origin of the sonar device positioned on the second floor balcony, the data was aligned to the scene and colored appropriately. When the images produced by the sonar were aligned with the re-created EECS atrium, they were able to see the sonar picking up large objects such as benches or posts, because those areas did not produce data points. Professor Sarabandi’s research focus encompasses a wide range of topics in the area of applied electromagnetics. The visualization efforts of the Duderstadt Center assisted in furthering his research by helping to improve the accuracy of the sonar.
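For reference, the ASCII *.PTS format mentioned above is simple enough to read by hand: the first line gives the point count, and each subsequent line holds at least x, y, z, often followed by intensity and RGB values. A minimal reader, independent of any of the tools named above:

```python
def load_pts(f):
    """Read an ASCII .PTS point cloud from a file-like object.
    Returns (declared_count, [(x, y, z), ...])."""
    declared_count = int(f.readline().split()[0])
    points = []
    for line in f:
        fields = line.split()
        if len(fields) >= 3:  # x y z, optionally intensity/RGB after
            points.append(tuple(float(v) for v in fields[:3]))
    return declared_count, points
```

Regions the sonar could not reach then show up simply as coordinates missing from the list, which is how obstructions such as furniture reveal themselves as gaps in the cloud.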
The Museum of Natural History will soon be the host of several animations produced by the Duderstadt Center covering an array of space-related subjects. From understanding the behavior of black holes, to demonstrations of the life cycle of stars, Stephanie O’Malley, digital artist of the Duderstadt Center, has created the animations in collaboration with Matthew Linke, the planetarium director, Lydia Bieri, professor in mathematics, and Kayhan Gultekin, an assistant researcher in astronomy.
The Museum of Natural History houses a vast collection of natural history objects ranging from local bird species, to larger mammals, to the skeletons of mammoths. The museum is located on campus and provides educational opportunities and exhibits open to both the campus and the wider community. The planetarium is located on the top floor of the museum. Since 1958 the planetarium has put on informative shows about astronomy for visitors. A full-dome screen is used to immerse guests in the night sky, and throughout the year staff put on seasonal star talks using the dome to visualize what the sky looks like at that time of the year.
The collaboration between visualization artists and scientists produced well-researched visualizations on an array of astronomy topics. These animations are unique in that much of what has been visualized stems from raw data; nobody has ever photographed these events actually occurring in space, and in some cases they are largely hypothetical. These animations are scheduled to be projected on the museum’s full-dome screen and used as a tool in classes to better acquaint students with concepts discussed in class. They are also being featured for a short time in a separate exhibit outside of the planetarium space.
Those familiar with Saturday Morning Physics lessons may recognize some of the animations, as they were shown recently during Lydia Bieri’s spot discussing gravitational lensing and gravity waves (Click here for the link to the video).
The animations were each created as part of National Science Foundation-funded grants. They were created in After Effects and 3D Studio Max, using a special plugin (the Domemaster 3D camera shader) for full-dome planetarium warping; this is what gives single frames of an animation the correct distortion to be projected onto the planetarium’s curved ceiling. Frames were then rendered at 1200 to 4K pixel resolutions to accommodate even very large planetariums looking to feature these animations.
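The dome warping can be pictured as a fisheye mapping: each pixel’s distance from the center of the square domemaster frame becomes an angle up the dome. A small sketch of the idea, assuming an equidistant (“angular”) fisheye, which is a common domemaster convention rather than the plugin’s exact math:

```python
import math

def domemaster_to_dome(u, v):
    """Map normalized domemaster coordinates (u, v in [0, 1]) to dome
    angles, assuming an equidistant fisheye: radius from the frame
    center maps linearly to zenith angle. Returns (azimuth_deg,
    zenith_deg), or None for corner pixels outside the dome circle."""
    x, y = 2 * u - 1, 2 * v - 1
    r = math.hypot(x, y)
    if r > 1:
        return None  # corners of the square frame fall off the dome
    azimuth = math.degrees(math.atan2(y, x))
    zenith = r * 90.0  # 0 at the dome's top, 90 at the springline
    return azimuth, zenith
```

Inverting this mapping per pixel is what lets a flat rendered frame land undistorted on the curved ceiling.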
The Kelsey Museum – Visualizing Lost Cylinder Seals
The Kelsey Museum houses a collection of more than 100,000 ancient and medieval objects from the civilizations of the Mediterranean and the Near East. Margaret Root, curator of the Greek and Near Eastern Collections at the Kelsey Museum, came to the Duderstadt Center with the impressions of several ancient cylinder seals. A cylinder seal is a small cylindrical tool, about one inch long, used in ancient times to engrave symbols or marks. When rolled in wet clay, the seal would leave an impression equivalent to a person’s “signature.” These signatures were commonly used to sign for goods when trading. Some of the earliest cylinder seals were found in the Mesopotamian region.

The Kelsey Museum wanted to re-create these seals from the impressions to generate 3D prototypes or for use in a digital exhibit. These exhibits would allow visitors to the Kelsey to experience the cylinder seal tradition first-hand by providing seals and clay to roll their own impressions. The problem was that these seals tend to get lost over time, so the museum did not have the original seals, only the imprints.

To recover a seal’s three-dimensional form, Margaret Root provided the Duderstadt Center with an outline of the imprints in Adobe Illustrator. From the outline, Stephanie O’Malley of the Duderstadt Center added varying amounts of grey to generate a depth map, where the darkest areas were the most inset and the lightest areas were the most protruding. With a depth map in place, she was then able to inset areas on a cylindrical mesh in ZBrush (a 3D sculpting software) to re-create what the cylinder seal (the example seal is the “queen’s seal”) would have looked like. Shawn O’Grady has printed one of these seals already.
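The depth-map step can be expressed numerically: each grey value becomes a radial offset on the cylinder, with darker pixels sitting deeper. A toy sketch of that mapping (the value range and parameter names are illustrative, not taken from the ZBrush workflow):

```python
def displace_cylinder(gray, base_radius=1.0, depth=0.1):
    """Turn a grayscale depth map (rows of values in 0..255) into radii
    for points on a cylinder's surface: darker pixels are inset below
    the base radius, lighter pixels protrude above it, matching the
    seal-recovery workflow described above."""
    return [[base_radius + depth * (v / 255.0 - 0.5) for v in row]
            for row in gray]
```

Sweeping those radii around the cylinder axis reconstructs the relief that would stamp the original imprint back into wet clay.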
The Duderstadt Center has since obtained the new Projet 3D printer, and plans are now underway to eventually print one of these on the Projet since it has a much higher print resolution and these seals are typically quite small.
To check out more at the Kelsey Museum, click here.
Architectural Visualization of Renovated Space for the Department of Pathology
The University of Michigan Health System, Department of Pathology has recently started making preparations to move to the North Campus Research Complex. Previously, the Department of Pathology had labs dispersed around the campus. Now there is a proposed $160 million effort to centralize the labs of the Department of Pathology and other health system branches in a space that will be more flexible and adept at accommodating future research and developments in technology.
Tsoi/Kobus and Associates, an architecture firm based in Cambridge, Massachusetts, was chosen to design the new labs. The firm specializes in architecture and interior design for technology and science, university, and healthcare projects.
The Duderstadt Center played host to the design review led by Christine Baker of the UMH Facilities Projects and Corrie Pennington-Block of CW Mott Administration. The Department of Pathology staff were invited and asked to give feedback on the designs. These meetings continued for a week, from April 20-24, 2015, with various participating sub-groups. Sessions consisted of an introduction and orientation to the design using standard hard-copy architectural floor plans. Computer-generated walk-through videos and Google Earth maps were put up on the large Tiled Display to assist in visualizing the space.
The designers wanted the staff’s opinion on questions of space utilization and adjacencies. To assist in visualizing this, the designs were uploaded to the MIDEN as FBX files, a file format exported from Autodesk Revit. Through the use of the MIDEN, Pathology staff could walk through a full-scale virtual replication of the architects’ floor plan, allowing participants to explore the proposed layout and give more in-depth feedback. Based in part on that feedback, the architects revised the design and ran another review session on May 14.
Click here to read The Ann Arbor News article about the proposed lab space.
Robert Alexander is a Design Science Ph.D. graduate and member of the Solar and Heliospheric Research Group. Working with NASA, Robert aims to use data audification to teach us something new about the Sun’s solar wind, and is using mixed media coupled with unique interaction methods to pull viewers into the experience. The Duderstadt Center worked with Robert to put his research into video form.