Passion & Violence: Anna Galeotti’s MIDEN Installation

Anna Galeotti, a Ph.D. Fulbright Scholar (Winter 2014), explored the concept of “foam” or “bubbles” as a possible model for audiovisual design elements and their relationships. Her art installation, “Passion and Violence in Brazil,” was displayed in the Duderstadt Center’s MIDEN.

Interested in using the MIDEN to do something similar? Contact us.

Extended Reality: changing the face of learning, teaching, and research

Written by Laurel Thomas, Michigan News

Students in a film course can evoke new emotions in an Orson Welles classic by virtually changing the camera angles in a dramatic scene.

Any one of us could take a smartphone, laptop, paper, Play-Doh, and an app developed at U-M, and with a little direction become a mixed reality designer.

A patient worried about an upcoming MRI may be able to put fears aside after virtually experiencing the procedure in advance.

Dr. Jadranka Stojanovska, one of the collaborators on the virtual MRI, tries on the device

This is XR—Extended Reality—and the University of Michigan is making a major investment in how to utilize the technology to shape the future of learning. 

Recently, Provost Martin Philbert announced a funded three-year initiative, led by the Center for Academic Innovation, in XR, a term used to encompass augmented reality, virtual reality, mixed reality, and other variations of computer-generated real and virtual environments and human-machine interactions.

The Initiative will explore how XR technologies can strengthen the quality of a Michigan education, cultivate interdisciplinary practice, and enhance a national network of academic innovation. 

Throughout the next three years, the campus community will explore new ways to integrate XR technologies in support of residential and online learning and will seek to develop innovative partnerships with external organizations around XR in education.

“Michigan combines its public mission with a commitment to research excellence and innovation in education to explore how XR will change the way we teach and learn from the university of the future,” said Jeremy Nelson, new director of the XR Initiative in the Center for Academic Innovation.

Current Use of XR

Applications of the technology are already changing the learning experience across the university in classrooms and research labs with practical application for patients in health care settings. 

In January 2018, a group of students created the Alternative Reality Initiative to provide a community for hosting development workshops, discussing industry news, and connecting students in the greater XR ecosystem.

In Matthew Solomon’s film course, students can alter a scene in Orson Welles’ classic “Citizen Kane.” U-M is home to one of the largest Orson Welles collections in the world.

Solomon’s concept for developers was to take a clip from the movie and model a scene to look like a virtual reality setting—almost like a video game. The goal was to bring a virtual camera into the space so students could choose shot angles to change the look and feel of the scene.

This VR tool will be used fully next semester to help students talk about filmmaker style, meaning and choice.

“We can look at clips in class and be analytical but a tool like this can bring these lessons home a little more vividly,” said Solomon, associate professor in the Department of Film, Television and Media.

A scene from Orson Welles’ “Citizen Kane” from the point of view of a virtual camera that allows students to alter the action.

Sara Eskandari, who just graduated with a Bachelor of Arts from the Penny W. Stamps School of Art & Design and a minor in Computer Science, helped develop the tool for Solomon’s class as a member of the Visualization Studio team.

“I hope students can enter an application like ‘Citizen Kane’ and feel comfortable experimenting, iterating, practicing, and learning in a low-stress environment,” Eskandari said. “Not only does this give students the feeling of being behind an old-school camera, and supplies them with practice footage to edit, but the recording experience itself removes any boundaries of reality. 

“Students can float to the ceiling to take a dramatic overhead shot with the press of a few buttons, and a moment later record an extreme close up with entirely different lighting.”
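The underlying idea can be sketched in a few lines of code. The Python example below is a rough illustration, not the Visualization Studio’s actual tool: it uses a standard “look-at” construction to place a virtual camera for an overhead shot and then an extreme close-up. All positions, distances, and names are invented for illustration.

```python
# Illustrative sketch only: a generic look-at camera, not the actual
# Citizen Kane VR tool. Positions and targets are made-up values.
import numpy as np

def look_at(eye, target, up=(0.0, 1.0, 0.0)):
    """Build a 4x4 view matrix that aims a virtual camera at a target point."""
    eye, target, up = (np.asarray(v, dtype=float) for v in (eye, target, up))
    forward = target - eye
    forward /= np.linalg.norm(forward)
    right = np.cross(forward, up)
    right /= np.linalg.norm(right)
    true_up = np.cross(right, forward)

    view = np.eye(4)
    view[0, :3], view[1, :3], view[2, :3] = right, true_up, -forward
    view[:3, 3] = -view[:3, :3] @ eye        # translate the world into camera space
    return view

# A dramatic overhead shot: the "camera operator" floats near the ceiling...
overhead = look_at(eye=(0.0, 4.5, 0.5), target=(0.0, 1.0, 0.0))

# ...then a moment later drops into an extreme close-up of the same actor.
close_up = look_at(eye=(0.3, 1.6, 0.8), target=(0.0, 1.6, 0.0))

print(overhead.round(2))
print(close_up.round(2))
```

In an immersive tool, the same matrix would be recomputed every frame from the student’s chosen viewpoint, which is what makes “floating to the ceiling” a matter of pressing a few buttons rather than rigging a real crane.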

Sara Blair, vice provost for academic and faculty affairs, and the Patricia S. Yaeger Collegiate Professor of English Language and Literature, hopes to see more projects like “Citizen Kane.”

“An important part of this project, which will set it apart from experiments with XR on many other campuses, is our interest in humanities-centered perspectives to shape innovations in teaching and learning at a great liberal arts institution,” Blair said. “How can we use XR tools and platforms to help our students develop historical imagination or to help students consider the value and limits of empathy, and the way we produce knowledge of other lives than our own? 

“We hope that arts and humanities colleagues won’t just participate in this [initiative] but lead in developing deeper understandings of what we can do with XR technologies, as we think critically about our engagements with them.”

U-M Faculty Embracing XR

Mark W. Newman, professor of information and of electrical engineering, and chair of the Augmented and Virtual Reality Steering Committee, said indications are that many faculty are thinking about ways to use the technology in teaching and research, as evidenced by progress on an Interdisciplinary Graduate Certificate Program in Augmented and Virtual Reality.

Newman chaired the group that pulled together a number of faculty working in XR to identify the scope of the work under way on campus, and to recommend ways to encourage more interdisciplinary collaboration. He’s now working with deans and others to move forward with the certificate program that would allow experiential learning and research collaborations on XR projects.

“Based on this, I can say that there is great enthusiasm across campus for increased engagement in XR and particularly in providing opportunities for students to gain experience employing these technologies in their own academic work,” Newman said, addressing the impact the technology can have on education and research.

“With a well-designed XR experience, users feel fully present in the virtual environment, and this allows them to engage their senses and bodies in ways that are difficult if not impossible to achieve with conventional screen-based interactive experiences. We’ve seen examples of how this kind of immersion can dramatically aid the communication and comprehension of otherwise challenging concepts, but so far we’re only scratching the surface in terms of understanding exactly how XR impacts users and how best to design experiences that deliver the effects intended by experience creators.”

Experimentation for All

Encouraging everyone to explore the possibilities of mixed reality (MR) is a goal of Michael Nebeling, assistant professor in the School of Information, who has developed unique tools that can turn just about anyone into an augmented reality designer using his ProtoAR or 360proto software.

Most AR projects begin with a two-dimensional design on paper that is then made into a 3D model, typically by a team of experienced 3D artists and programmers.

Michael Nebeling’s mixed reality app for everyone.

With Nebeling’s ProtoAR app, content can be sketched on paper or molded with Play-Doh; the designer then either moves the camera around the object or spins the piece in front of the lens to create motion. ProtoAR then blends the physical and digital content to come up with various AR applications.
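A rough sketch of that capture idea follows. It is not ProtoAR’s actual API; the class, filenames, and angles are invented for illustration. The spinning object is photographed at regular intervals, and the frame closest to the viewer’s current angle is shown as the AR content.

```python
# Illustration of a ProtoAR-style "spin capture": frames of a rotating
# Play-Doh model are stored by capture angle, and the nearest frame to the
# viewer's current angle is displayed. Not ProtoAR's real interface.
class SpinCapture:
    def __init__(self):
        self.frames = {}          # capture angle (degrees) -> image filename

    def add_frame(self, angle_deg, image_path):
        self.frames[angle_deg % 360] = image_path

    def frame_for_view(self, view_angle_deg):
        """Return the captured frame whose angle is closest to the view angle."""
        view = view_angle_deg % 360
        return min(self.frames.items(),
                   key=lambda kv: min(abs(kv[0] - view), 360 - abs(kv[0] - view)))[1]

# Hypothetical capture session: one photo every 30 degrees of spin.
capture = SpinCapture()
for angle in range(0, 360, 30):
    capture.add_frame(angle, f"playdoh_model_{angle:03d}.jpg")

print(capture.frame_for_view(100))   # -> playdoh_model_090.jpg
```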

Using his latest tool, 360proto, designers can even make the paper sketches interactive so that users can experience the AR app live on smartphones and headsets, without spending hours and hours refining and implementing the design in code.

These are the kinds of technologies that not only allow his students to learn about AR/VR in his courses, but also have practical applications. For example, people can experience their dream kitchen at home, rather than having to use their imaginations when clicking things together on home improvement sites. He is also working on getting many solutions directly into future web browsers so that people can access AR/VR modes when visiting home improvement sites, cooking a recipe in the kitchen, planning a weekend trip with museum or gallery visits, or reading articles on Wikipedia or the news.

Nebeling is committed to “making mixed reality a thing that designers do and users want.”

“As a researcher, I can see that mixed reality has the potential to fundamentally change the way designers create interfaces and users interact with information,” he said. “As a user of current AR/VR applications, however, it’s difficult to see that potential even for me.”

He wants to enable a future in which “mixed reality will be mainstream, available and accessible to anyone, at the snap of a finger. Where everybody will be able to ‘play,’ be it as consumer or producer.”

 

XR and the Patient Experience

A team in the Department of Radiology, in collaboration with the Duderstadt Center Visualization Studio, has developed a Virtual Reality tool to simulate an MRI, with the goal of reducing last-minute cancellations due to claustrophobia, which occur in an estimated 4-14% of patients. The clinical trial is currently enrolling patients.

VR MRI Machine

“The collaboration with the Duderstadt team has enabled us to develop a cutting-edge tool that allows patients to truly experience an MRI before having a scan,” said Dr. Richard K.J. Brown, professor of radiology. The patient puts on a headset and is ‘virtually transported’ into an MRI tube. A calming voice explains the MRI exam, as the patient hears the realistic sounds of the magnet in motion, simulating an exam experience. 
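For illustration only, an exam simulation of this kind can be thought of as a timed script of narration and scanner-sound cues. The cue names and timings below are invented, not the Radiology team’s actual script.

```python
# A simplified sketch of scripting a virtual MRI exam as timed audio cues.
# Cue names and durations are placeholders invented for illustration.
from dataclasses import dataclass

@dataclass
class Cue:
    start_s: float      # seconds after the simulation begins
    kind: str           # "voice" (narration) or "scanner" (magnet sounds)
    clip: str           # audio asset to play

EXAM_SCRIPT = [
    Cue(0,   "voice",   "welcome_and_overview.wav"),
    Cue(20,  "scanner", "table_moving_into_bore.wav"),
    Cue(35,  "voice",   "hold_still_breathe_normally.wav"),
    Cue(45,  "scanner", "gradient_coil_knocking_loop.wav"),
    Cue(165, "voice",   "halfway_reassurance.wav"),
    Cue(300, "voice",   "exam_complete.wav"),
]

def cues_due(elapsed_s, already_played):
    """Return cues that should start now, given elapsed simulation time."""
    return [c for c in EXAM_SCRIPT
            if c.start_s <= elapsed_s and c not in already_played]
```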

The team also is developing an Augmented Reality tool to improve the safety of CT-guided biopsies.

Team members include doctors Brown, Jadranka Stojanovska, Matt Davenport, Ella Kazerooni, and Elaine Caoili from Radiology, and Dan Fessahazion, Sean Petty, Stephanie O’Malley, Theodore Hall, and several students from the Visualization Studio.

“The Duderstadt Center and the Visualization Studio exist to foster exactly these kinds of collaborations,” said Daniel Fessahazion, the center’s associate director for emerging technologies. “We have a deep understanding of the technology and collaborate with faculty to explore its applicability to create unique solutions.”

Dr. Elaine Caoili, Saroja Adusumilli Collegiate Professor of Radiology, demonstrates an Augmented Reality tool under development that will improve the safety of CT-guided biopsies.

Academic Innovation’s Nelson said the first step of this new initiative is to assess and convene early innovators in XR from across the university to shape how this technology may best serve residential and online learning.

“We have a unique opportunity with this campus-wide initiative to build upon the efforts of engaged students, world-class faculty, and our diverse alumni network to impact the future of learning,” he said.

Photogrammetry for the Stearns Collection

Photogrammetry results from the Stearns Collection: here a drum is captured, and visible are the original digital photographs taken inside Stearns, the drum generated as a point cloud, the point cloud developed into a 3D mesh, and finally a fully textured 3D model.

Donated in 1899 by wealthy Detroit drug manufacturer Frederick Stearns, the Stearns Collection is a university collection of over 2,500 historical and contemporary musical instruments from all over the world, many of them particularly fragile or one of a kind. In 1966 the collection grew to include the only complete Javanese gamelan in the world, and as home to such masterpieces the Stearns Collection has become internationally recognized as unique. In 1974, due to concerns about preservation and display, much of the collection was relocated out of public view. Once residing in Hill Auditorium, the majority of the collection now sits in storage inside an old factory near downtown Ann Arbor.

The current location of the Stearns Collection. Photo Credit: www.dailymail.co.uk

Current preservation efforts have involved photographing the collection and making the nearly 13,000 resulting images available online. Over the past year, however, the Duderstadt Center has been working with Chris Dempsey, curator of the Stearns Collection, and Jennifer Brown, a University Library Associate in Learning & Teaching, on a new process for preservation: using photogrammetry to document the collection. Photogrammetry is a process that relies on several digital photographs of an artifact to reconstruct the physical object as a digital 3D model. While traditional methods of obtaining 3D models often rely on markers placed atop the object, photogrammetry is largely non-invasive, allowing for minimal, and sometimes no, direct handling of an artifact. Models resulting from this process, when captured properly, are typically very precise and allow the viewer to rotate the object 360 degrees, zoom in and out, measure, or otherwise analyze the object, in many cases as though it were actually in front of them.

Equipped with a high-resolution digital SLR camera, Jennifer traveled to the warehouse where much of the Stearns Collection is now held to document some of the instruments that are not currently on display and have limited accessibility to the general public. Feeding the resulting images into experimental photogrammetry software developed for research purposes (VisualSFM and CMVS), she processed the photos of various instruments into high-resolution 3D models that could eventually be placed on the web for more accessible public viewing and student interaction.
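As a rough sketch of that photos-to-model workflow, the snippet below wraps a command-line photogrammetry run in Python. The "sfm+pmvs" invocation is an assumption based on typical command-line builds of VisualSFM (with CMVS/PMVS bundled) and should be verified against your own installation; the folder and file names are placeholders.

```python
# Minimal sketch of a structure-from-motion + dense reconstruction run over a
# folder of photos. The exact VisualSFM command and flags vary by build; treat
# "sfm+pmvs" as an assumption to check, not a documented guarantee.
import subprocess
from pathlib import Path

def reconstruct(photo_dir: str, output_dir: str) -> Path:
    """Turn a folder of overlapping photos into a reconstructed model file."""
    out = Path(output_dir)
    out.mkdir(parents=True, exist_ok=True)
    model = out / "instrument.nvm"

    # Sparse reconstruction (camera poses + point cloud), followed by the
    # bundled CMVS/PMVS dense multi-view stereo step.
    subprocess.run(["VisualSFM", "sfm+pmvs", photo_dir, str(model)], check=True)
    return model

# Example (hypothetical paths): photos of a drum from the Stearns Collection.
# reconstruct("stearns_photos/drum_042", "models/drum_042")
```

The dense point cloud produced at this stage is what then gets meshed and textured into the fully interactive 3D model shown in the image above.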

U-M Future of Visualization Committee Issues Report

The U-M Future of Visualization Committee* issued a report early this month focusing on the role visualization plays at the University of Michigan, as well as steps for addressing growing needs on campus. The report concluded that two “visualization hubs” should be created on campus to make computing visualization services more accessible to our campus research community. “The hubs envisioned by the committee would leverage existing resources and consist of advanced workstations, high bandwidth connectivity, and collaborative learning spaces, with a support model based on that of the Duderstadt Center and Flux. The hardware and software would be configured to allow departments or individuals to purchase their own resources in a way that would reduce fragmentation and allow for efficient support, training, and maintenance.” (Text courtesy of Dan Miesler and ARC)

The following excerpts from the executive summary of the report highlight the importance and educational value of visualization services:

“The University of Michigan has seen incredible growth and change over the years. The growth will continue as we innovate and adapt. How we teach, conduct research, facilitate student learning, push technological boundaries, and collaborate with our peers will create demand for new tools and infrastructure. One such need is visualization because of the imperative role it plays in facilitating innovation. When one considers the vast quantities of data currently being generated from disparate domains, methods that facilitate discovery, exploration, and integration become necessary to ensure those data are understood and effectively used.

There is a great opportunity to change the way research and education has been done but to also allow for a seamless transition between the two through advancements in connectivity, mobility, and visualization. The opportunity here is tremendous, complex, and in no way trivial. Support for a responsive and organized visualization program and its cyberinfrastructure needs is necessary to leverage the opportunities currently present at the University of Michigan.”

A full copy of the report is available here.

*The committee was created by Dan Atkins with the charge of evaluating existing visualization technologies and methods on campus; developing an action plan for addressing deficiencies in visualization needs; establishing a group of visualization leaders; and communicating with the community on visualization topics. It is composed of faculty members and staff from ARC, University Libraries, Dentistry, LSA, the Medical School, ITS, Architecture and Urban Planning, Atmospheric, Oceanic and Space Sciences, and the College of Engineering. (Text courtesy of Dan Miesler and ARC)

Using the MIDEN for Hospital Room Visualization

How can doctors and nurses walk around a hospital room that hasn’t been built yet? It may seem like an impossible riddle, but the Duderstadt Center is making it possible!

Working with the University of Michigan Hospital and a team of architects, healthcare professionals are able to preview full-scale redesigns of hospital rooms using the MIDEN. The MIDEN—or Michigan Immersive Digital Experience Nexus—is an advanced audio-visual system for virtual reality. It provides its users with the convincing illusion of being fully immersed in a computer-generated, three-dimensional world. This world is presented in life-size stereoscopic projections on four surfaces that together fill the visual field, as well as 4.1 surround sound with attenuation and the Doppler effect.
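For readers curious about those two audio effects, the short Python sketch below shows a generic inverse-distance attenuation and Doppler-shift calculation. The constants and the 1/d rolloff model are textbook defaults, not the MIDEN’s actual audio engine.

```python
# Back-of-the-envelope versions of distance attenuation and the Doppler effect
# for a moving virtual sound source and a stationary listener.
SPEED_OF_SOUND = 343.0   # m/s in air at roughly room temperature

def attenuation_gain(distance_m, reference_m=1.0):
    """Inverse-distance rolloff: full gain at the reference distance, quieter beyond."""
    return min(1.0, reference_m / max(distance_m, 1e-6))

def doppler_frequency(source_hz, source_speed_toward_listener_ms):
    """Perceived frequency when the source moves toward a stationary listener."""
    return source_hz * SPEED_OF_SOUND / (SPEED_OF_SOUND - source_speed_toward_listener_ms)

# A virtual cart beeping at 1 kHz, 8 m away and rolling toward the listener at 2 m/s:
print(round(attenuation_gain(8.0), 3))        # ~0.125 of full volume
print(round(doppler_frequency(1000.0, 2.0)))  # ~1006 Hz, slightly higher in pitch
```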

Architects and nursing staff are using the MIDEN to preview patient room upgrades in the Trauma Burn Unit of the University Hospital. Of particular interest is the placement of an adjustable wall-mounted workstation monitor and keyboard. The MIDEN offers full-scale immersive visualization of clearances and sight-lines for the workstation with respect to the walls, cabinets, and patient bed. The design is being revised based on these visualizations before any actual construction occurs, avoiding time-consuming and costly renovations later.
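A simplified example of the kind of clearance question being answered in the MIDEN is sketched below, using axis-aligned bounding boxes. All dimensions are invented placeholders rather than the actual Trauma Burn Unit layout.

```python
# Minimal clearance check between the swing envelope of a wall-mounted
# workstation arm and the patient bed, using axis-aligned bounding boxes.
# Dimensions are illustrative placeholders only.
def boxes_overlap(a, b):
    """Each box is ((min_x, min_y, min_z), (max_x, max_y, max_z)) in meters."""
    return all(a[0][i] < b[1][i] and b[0][i] < a[1][i] for i in range(3))

# Swing envelope of the wall-mounted monitor arm, fully extended.
monitor_envelope = ((0.0, 0.9, 0.0), (1.1, 1.6, 0.6))
# Patient bed footprint and height.
patient_bed      = ((0.8, 0.0, 0.3), (1.9, 0.9, 2.4))

if boxes_overlap(monitor_envelope, patient_bed):
    print("Clearance conflict: revise the mounting position before construction.")
else:
    print("No conflict at this mounting position.")
```

The value of the MIDEN is that this kind of check is experienced at full scale and by eye, with the nursing staff standing in the virtual room, rather than read off a number.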