Robert Alexander’s “Audification Explained” Featured on BBC World Service

Sonification is the conversion of data sets into sound. Robert Alexander II is a Sonification Specialist working with NASA who uses satellite recordings of the sun’s emissions to discover new solar phenomena. The Duderstadt Center worked with Robert to produce a short video explaining the concept of data audification.
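
In its simplest form, audification maps a stream of measurements directly onto audio samples. The sketch below is a minimal, hypothetical illustration of that idea in Python; it is not Robert’s actual NASA pipeline, and the sine sweep stands in for real solar-wind data:

```python
# Hypothetical audification sketch: scale a 1-D data series into 16-bit
# samples and write it as a mono WAV file so the data can be listened to.
import wave
import numpy as np

def audify(series, out_path="audified.wav", sample_rate=44100):
    data = np.asarray(series, dtype=np.float64)
    span = (data.max() - data.min()) or 1.0
    data = 2.0 * (data - data.min()) / span - 1.0      # normalize to [-1, 1]
    samples = (data * 32767).astype(np.int16)          # 16-bit PCM
    with wave.open(out_path, "wb") as wav:
        wav.setnchannels(1)
        wav.setsampwidth(2)
        wav.setframerate(sample_rate)
        wav.writeframes(samples.tobytes())

# A noisy sine sweep stands in for a real stream of solar measurements.
t = np.linspace(0.0, 10.0, 44100 * 10)
audify(np.sin(2 * np.pi * (50 + 20 * t) * t) + 0.1 * np.random.randn(t.size))
```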

Recently Robert was featured in a BBC World Service clip along with his video about making music from the sun: http://www.bbc.co.uk/programmes/p03crzsv

Mammoth Calf Lyuba On Display

Mammoth Calf Lyuba, a Collaborative Exploration of Data

On Nov. 17th-19th, the Duderstadt Center’s Visualization Expert, Ted Hall, will be in Austin, Texas, representing the Duderstadt Center at SC15, a supercomputing conference. The technology on display will allow people in Austin to be projected into the MIDEN, the University of Michigan’s immersive virtual reality cave, allowing visitors in both Ann Arbor and Austin to explore the body of a mummified mammoth.

The mummified remains of Lyuba.

The mammoth in question is a calf called Lyuba, found in Siberia in 2007 after being preserved underground for roughly 42,000 years. This specimen is considered the best-preserved mammoth mummy in the world and is currently on display in the Shemanovskiy Museum and Exhibition Center in Salekhard, Russia.

University of Michigan Professor Daniel Fisher and his colleagues at the University of Michigan Museum of Paleontology arranged to have the mummy scanned using X-ray computed tomography in Ford Motor Company’s Nondestructive Evaluation Laboratory. Adam Rountrey then applied a color map to the density data to reveal the internal anatomical structures.

Lyuba with her skeleton visible.

The Duderstadt Center received this data as an image stack for interactive volumetric visualization. The stack comprises 1,132 JPEG slices at 762×700 pixels per slice, with each resulting voxel measuring 1 mm on a side.
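
As a rough sketch of how such an image stack becomes an explorable volume (assuming a simple directory of ordered JPEGs; this is not the Duderstadt Center’s Jugular loader):

```python
# Hypothetical volume loader: stack ordered JPEG slices into a 3-D array so
# that arbitrary planes through the data can later be sampled and displayed.
import glob
import numpy as np
from PIL import Image

def load_volume(slice_dir):
    paths = sorted(glob.glob(f"{slice_dir}/*.jpg"))             # ordered slices
    slices = [np.asarray(Image.open(p).convert("L")) for p in paths]
    return np.stack(slices, axis=0)                             # (depth, H, W)

volume = load_volume("lyuba_slices")       # hypothetical directory name
mid_cut = volume[volume.shape[0] // 2]     # one axial cut through the calf
```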

When this data is brought into the Duderstadt Center’s Jugular software, the user can interactively slice through the mammoth’s volume by manipulating a series of hexagonal planes, revealing its internal structure. In the MIDEN, the user can explore the mammoth in the same way while it appears to exist in front of them in three dimensions. The MIDEN’s Virtual Cadaver used a similar process.

For the demo at SC15, users in Texas can occupy the same virtual space as a user in Ann Arbor’s MIDEN. Via a Kinect sensor in Austin, a 3D mesh of the remote user will be projected into the MIDEN alongside Lyuba, allowing for simultaneous interaction and exploration of the data.

Showings will take place in the MIDEN

Sean Petty and Ted Hall simultaneously explore the Lyuba data set, with Ted’s form projected into the virtual space of the MIDEN via a Kinect sensor.

More about the Lyuba specimen:
Fisher, Daniel C.; Shirley, Ethan A.; Whalen, Christopher D.; Calamari, Zachary T.; Rountrey, Adam N.; Tikhonov, Alexei N.; Buigues, Bernard; Lacombat, Frédéric; Grigoriev, Semyon; Lazarev, Piotr A. (2014). “X-ray Computed Tomography of Two Mammoth Calf Mummies.” Journal of Paleontology 88(4): 664-675. DOI: http://dx.doi.org/10.1666/13-092
https://en.wikipedia.org/wiki/Lyuba
http://www.dallasnews.com/lifestyles/travel/headlines/20100418-42-000-year-old-baby-mammoth-4566.ece

Sonar Visualized in EECS

Original point cloud data brought into Autodesk Recap

Professor Kamal Sarabandi of Electrical Engineering and Computer Science and student Samuel Cook were looking into the accuracy of sonar equipment and came to the 3D Lab for assistance with visualizing their data. Their goal was to generate an accurate, to-scale 3D model of the EECS atrium that would be used to align their data to a physical space.

Gaps in point cloud data indicate an obstruction encountered by the sonar.

The Duderstadt Center’s Stephanie O’Malley and student consultant Maggie Miller used precise measurements and photo references provided by Sam to re-create the atrium in 3D Studio Max. The point cloud data produced by the sonar was then exported as a *.PTS file and brought into Autodesk Recap to quickly determine whether everything appeared correct. In point cloud data from the sonar, any significant gap in the cloud indicates an obstruction, such as furniture, plants, or people.

Using the position of the sonar device on the second-floor balcony as the origin, the data was aligned to the scene and colored appropriately.  Once the sonar imagery was aligned with the re-created EECS atrium, large objects such as benches and posts became apparent as areas that produced no data points.  Professor Sarabandi’s research focus encompasses a wide range of topics in the area of applied electromagnetics.  The Duderstadt Center’s visualization efforts furthered his research by helping to assess the accuracy of the group’s sensing equipment.
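
For readers curious about the data handling, the sketch below assumes the common text-based *.PTS layout (a point count on the first line, then “x y z intensity r g b” rows) and a made-up sensor position; it is not the group’s actual processing pipeline:

```python
# Hypothetical point cloud handling: load a .PTS export and re-express the
# points relative to the sonar device's position so it becomes the origin.
import numpy as np

def load_pts(path):
    with open(path) as f:
        count = int(f.readline())                          # declared point count
        return np.loadtxt(f, usecols=(0, 1, 2), max_rows=count)

def align_to_sensor(points, sensor_position):
    return points - np.asarray(sensor_position, dtype=float)

cloud = load_pts("eecs_atrium.pts")                 # hypothetical filename
aligned = align_to_sensor(cloud, (10.2, 4.5, 6.0))  # assumed balcony position
```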

Sonar data aligned to a model of the EECS atrium

Museum of Natural History – Planetarium Shows

Shredding Stars: Stars are consumed by a black hole

The Museum of Natural History will soon host several animations produced by the Duderstadt Center covering an array of space-related subjects, from the behavior of black holes to the life cycle of stars. Stephanie O’Malley, a digital artist at the Duderstadt Center, created the animations in collaboration with Matthew Linke, the planetarium director; Lydia Bieri, professor of mathematics; and Kayhan Gultekin, an assistant researcher in astronomy.

Kicked-out black holes: The gravitational pull between black holes can draw them together until they merge; the spins involved in the merger can then give the resulting black hole a kick that ejects it from its orbit.

The Museum of Natural History houses a vast collection of natural history objects ranging from local bird species, to larger mammals, to the skeletons of mammoths.  The museum is located on campus and provides educational opportunities and exhibits open to both the campus and the wider community.  The planetarium is located on the top floor of the museum.  Since 1958, the planetarium has put on informative shows about astronomy for visitors.  A full-dome screen is used to immerse guests in the night sky, and throughout the year staff put on seasonal star talks using the dome to visualize what the sky looks like at that time of year.
 
The collaboration between visualization artists and scientists produced well-researched visualizations on an array of astronomy topics.  These animations are unique in that much of what is visualized stems from raw data; nobody has ever photographed these events actually occurring in space, and some are largely hypothetical.  The animations are scheduled to be projected on the museum’s full-dome screen and used as a tool in classes to better acquaint students with concepts discussed in lecture.  They are also being featured for a short time in a separate exhibit outside of the planetarium space.
 
Those familiar with Saturday Morning Physics lessons may recognize some of the animations, as they were shown recently during Lydia Bieri’s spot discussing gravitational lensing and gravitational waves (Click here for the link to the video).
Gravitational Lensing: A gravitational lens is a distribution of matter (such as a cluster of galaxies) between a distant source and an observer that is capable of bending the light from the source as it travels toward the observer.

Each of the animations was created as part of National Science Foundation-funded grants. They were built in After Effects and 3D Studio Max using a special plugin (the Domemaster 3D camera shader) for full-dome planetarium warping; this is what gives each frame of an animation the correct distortion to be projected onto the planetarium’s curved ceiling. Frames were then rendered at resolutions from 1200 pixels up to 4K to accommodate even very large planetariums looking to feature these animations.
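
For those curious about that distortion, the sketch below shows the standard angular-fisheye mapping used by dome-master frames, where the dome’s zenith sits at the center of a square image and the horizon lies on the edge of the circle. It is a generic illustration, not the Domemaster 3D shader itself:

```python
# Angular-fisheye (dome-master) mapping: convert a 3-D view direction into
# normalized (u, v) coordinates inside the square dome-master frame.
import math

def domemaster_uv(x, y, z):
    theta = math.atan2(math.hypot(x, y), z)   # angle away from the zenith (z up)
    phi = math.atan2(y, x)                    # azimuth around the zenith
    r = theta / (math.pi / 2)                 # 0 at the center, 1 at the horizon
    return 0.5 + 0.5 * r * math.cos(phi), 0.5 + 0.5 * r * math.sin(phi)

print(domemaster_uv(0, 0, 1))   # straight up -> image center (0.5, 0.5)
print(domemaster_uv(1, 0, 0))   # at the horizon -> edge of the dome circle
```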

The Kelsey Museum – Visualizing Lost Cylinder Seals

2D illustration of one of the seal imprints used to generate a 3D model

The Kelsey Museum houses a collection of more than 100,000 ancient and medieval objects from the civilizations of the Mediterranean and the Near East.  Margaret Root, curator of the Greek and Near Eastern Collections at the Kelsey Museum, came to the Duderstadt Center with the impressions of several ancient cylinder seals.  A cylinder seal is a small cylindrical tool, about one inch long, engraved in ancient times with symbols or marks.  When rolled in wet clay, the seal would leave an impression equivalent to a person’s “signature.”  These signatures were commonly used to sign for goods when trading.  Some of the earliest cylinder seals were found in the Mesopotamian region.

The Kelsey Museum wanted to re-create these seals from the impressions, either to generate 3D prototypes or for use in a digital exhibit.  Such exhibits would allow visitors to the Kelsey to experience the cylinder seal tradition first-hand by providing seals and clay to roll their own impressions.  The problem was that these seals tend to get lost over time, so the museum did not have the original seals, only the imprints.

To recover a seal’s three-dimensional form, Margaret Root provided the Duderstadt Center with an outline of the imprints in Adobe Illustrator.  From the outline, Stephanie O’Malley of the Duderstadt Center added varying amounts of grey to generate a depth map, where the darkest areas were the most inset and the lightest areas the most protruding.  With the depth map in place, she was then able to inset areas on a cylindrical mesh in ZBrush (a 3D sculpting program) to re-create what the cylinder seal (the example shown is the “queen’s seal”) would have looked like.  Shawn O’Grady has already printed one of these seals.
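
As a rough illustration of the depth-map idea (a hypothetical sketch, not the actual ZBrush workflow), the following wraps a grayscale depth image around a cylinder so that darker pixels become inset and lighter pixels protrude:

```python
# Hypothetical seal reconstruction: displace points on a cylinder's surface
# by the depth map so the relief of the original seal can be approximated.
import numpy as np
from PIL import Image

def seal_from_depth_map(path, radius=0.5, height=2.54, depth_scale=0.05):
    depth = np.asarray(Image.open(path).convert("L"), dtype=np.float64) / 255.0
    h, w = depth.shape
    theta = np.linspace(0, 2 * np.pi, w, endpoint=False)   # around the seal
    zs = np.linspace(0, height, h)                         # along its length
    r = radius + (depth - 0.5) * depth_scale               # dark = inset
    x = r * np.cos(theta)[None, :]
    y = r * np.sin(theta)[None, :]
    z = np.broadcast_to(zs[:, None], (h, w))
    return np.dstack([x, y, z])                            # (H, W, 3) points

points = seal_from_depth_map("queens_seal_depth.png")      # hypothetical file
```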

A 3D render of the re-created cylinder seal.

The Duderstadt Center has since obtained a new ProJet 3D printer, and plans are now underway to eventually print one of these seals on the ProJet, since it has a much higher print resolution and these seals are typically quite small.

To check out more at the Kelsey Museum, click here.

Two Minute Tech: Audification Explained

 

Robert Alexander is a Design Science Ph.D. graduate and a member of the Solar and Heliospheric Research Group. Working with NASA, Robert aims to use data audification to teach us something new about the solar wind, and he is using mixed media coupled with unique interaction methods to pull viewers into the experience. The Duderstadt Center worked with Robert to put his research into video form:

 

Duderstadt Center Collaboration on NASA Proposals

Cover Graphics for the Armada Proposal for NASA

Over the years, the Duderstadt Center has provided visualization services for a variety of NASA proposals. Submitting a proposal requires a packet of information and visual aids that follow a strict format and series of guidelines.

Illustration from the NASA proposal MARRSI

Most recently, the Duderstadt Center assisted with the Mars Radar and Radiometry Subsurface Investigation (MARRSI) proposal, which was submitted in December 2013 and is currently awaiting a response. This proposal aims to implement new ways of tracing evidence of water in the Martian soil by utilizing the antennas of the existing Mars rovers. These antennas would detect signals from Earth reflected off the surface of Mars, thereby probing the soil for indications of water. The Duderstadt Center worked with the professor involved, as well as NASA’s Jet Propulsion Laboratory, to design a proposal cover, diagrams, and CDs for submission that adhere to the requested format.

Satellite render for NASA proposal, AERIE

Additionally, the Duderstadt Center was involved in the Trace Gas Microwave Radiometer (TGMR) proposal, which centered on detecting the processes that produce and destroy methane gas on the surface of Mars. The goal of both proposals is to seek evidence of methane and water on Mars, which may lead to discovering signs of bacterial life.

In the past, the Duderstadt Center designed mission logos and a cover for the Armada proposal. This proposal concerned documenting atmospheric events on Earth using cube satellites.

Virtual Prototyping of Classrooms – Business School

Designing architectural spaces presents unique challenges, especially when those spaces are intended to serve specific functions. The Ross School of Business recently constructed a new building that strives to meet the needs of the school’s faculty and students. Within the new construction was a plan for new U-shaped classrooms. Because the design was unlike those used in the past and its effectiveness during daily classes was in question, the School of Business planned to construct test sites so faculty could experience the rooms before they were built. These test sites were akin to movie sets, costing hundreds of thousands of dollars, and if changes needed to be made, a site would have to be reconstructed to the new plans.

Dean Graham Mercer approached the University of Michigan Duderstadt Center looking for a more cost-effective way to identify problems in the design earlier on. Through the use of the Virtual Reality MIDEN, which has the distinct ability to display virtual worlds at true 1-to-1 scale, faculty from the School of Business were able to experience the proposed classrooms prior to the physical construction of the space and offer suggestions with confidence. This process cost the school a fraction of building physical test sites and allowed for rapid turnaround on any additions they needed.

The new classrooms can now be seen in the Ross School of Business on Central Campus.