The M.I.D.E.N. transports you into Nova’s Reality


Using Virtual Production Techniques to Film a Music Video

Authors

Emma Powell and Eilene Koo


Photo Credit: Eilene Koo


The Michigan Immersive Digital Experience Nexus (MIDEN) doesn’t seem very large at first: a 10 x 10 x 10 foot space, a blank canvas awaiting its next project. For “Nova’s Reality,” though, the MIDEN transformed into an expansive galactic environment, placing electronic jazz musician Nova Zaii in the middle of a long subway car traveling through space.

The technological centerpiece of Zaii’s performance was the Nova Portals, a cone-shaped instrument of his own invention that creates sound from motion, with no direct contact required. Zaii built the Nova Portals while a student at the University of Michigan, where he majored in Performing Arts Technology and Jazz Studies.

Several members of the U-M community helped bring Zaii’s performance to life inside the MIDEN. Akash Dewan, a recently graduated senior and the project’s director, first met Nova as a freshman. “I first met Nova in my freshman year in the spring, when I decided to shoot Masimba Hwati’s opening ceremony for his piece at the UMMA,” Dewan said. Hwati is a Zimbabwean sculptor and musician who created the Ngoromera, a sculpture assembled from many different objects and musical instruments. When Hwati performed at the opening ceremony, his drummer was Nova Zaii.

“I took photos throughout the performance and took some particularly cool photos of Nova, so I decided to chat with him after the event and show him the photos I took,” Dewan said. “Since then, I have been following his journey on Instagram as a jazz musician part of the Juju Exchange [a musical partnership between Zaii and two childhood friends] and inventor of the Nova Portals, which I’ve always taken a fascination towards due to my interest in the intersection of technology and art.”

Following major hardware upgrades, the MIDEN’s image refresh rate has doubled and its projectors output brighter light, supporting stereoscopic immersion for two simultaneous users: four perspective projections, one for each eye, refreshed 60 times per second.

Sean Petty, senior software developer at the DMC’s Emerging Technologies Group, assisted the “Nova’s Reality” team inside the MIDEN and explained how the space was used. 

“This project is using the MIDEN as a virtual production space, where the MIDEN screens will deliver a perspective correct background that will make it appear as if the actor is actually in the virtual environment from the perspective of the camera,” Petty said. “To accommodate this, we modified the MIDEN to only display one image per frame, rather than the four that would be required for the usual two user VR experience. We also reconfigured the motion tracking to track the motion of the camera, rather than the motion of the VR glasses.”
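The “perspective correct background” Petty describes is the standard off-axis (or “generalized”) perspective projection used in CAVE-style displays: each wall’s image is rendered through an asymmetric frustum computed from the tracked camera position and the fixed corners of that wall. The MIDEN’s own software is not reproduced here, but a minimal sketch of that math, with made-up screen coordinates, might look like this in Python:

import numpy as np

def off_axis_projection(eye, pa, pb, pc, near, far):
    """Asymmetric (off-axis) frustum for one CAVE-style wall.

    eye        -- tracked camera position in world space
    pa, pb, pc -- the wall's lower-left, lower-right, and upper-left corners
    Returns a 4x4 OpenGL-style projection matrix (Kooima's derivation).
    """
    eye, pa, pb, pc = (np.asarray(v, dtype=float) for v in (eye, pa, pb, pc))

    # Orthonormal basis of the screen: right, up, and normal toward the viewer.
    vr = pb - pa; vr /= np.linalg.norm(vr)
    vu = pc - pa; vu /= np.linalg.norm(vu)
    vn = np.cross(vr, vu); vn /= np.linalg.norm(vn)

    # Vectors from the eye to each corner, and distance to the screen plane.
    va, vb, vc = pa - eye, pb - eye, pc - eye
    d = -np.dot(va, vn)

    # Frustum extents projected onto the near plane.
    l = np.dot(vr, va) * near / d
    r = np.dot(vr, vb) * near / d
    b = np.dot(vu, va) * near / d
    t = np.dot(vu, vc) * near / d

    # Standard glFrustum-style matrix built from those extents.
    return np.array([
        [2 * near / (r - l), 0.0, (r + l) / (r - l), 0.0],
        [0.0, 2 * near / (t - b), (t + b) / (t - b), 0.0],
        [0.0, 0.0, -(far + near) / (far - near), -2 * far * near / (far - near)],
        [0.0, 0.0, -1.0, 0.0],
    ])

# Hypothetical 10 ft (~3 m) front wall, with the camera tracked off-center.
front_wall = dict(pa=(-1.5, 0.0, -1.5), pb=(1.5, 0.0, -1.5), pc=(-1.5, 3.0, -1.5))
print(off_axis_projection(eye=(1.2, 1.6, 0.0), near=0.1, far=100.0, **front_wall))

In a full pipeline, a matrix like this is paired with a view transform that accounts for the wall’s orientation and the tracked camera position, and the pair is recomputed for each wall on every scan.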

The high-speed projection refresh rate of 240 scans per second allowed the video camera to record flicker-free footage.
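As a back-of-the-envelope check of those numbers: four perspective images refreshed 60 times per second is where the 240 scans per second comes from, and in single-image virtual production mode every scan repeats the same picture, so any camera exposure spanning a whole number of scan periods integrates evenly. The camera settings below (24 fps with a 180-degree shutter) are purely illustrative:

# Quick sanity check of the scan-rate figures quoted above.
eye_images = 4                       # two users x two eyes in the usual VR mode
per_eye_hz = 60                      # each perspective refreshed 60 times per second
scan_rate = eye_images * per_eye_hz  # 240 scans per second
scan_period_ms = 1000.0 / scan_rate  # ~4.17 ms per scan

# Hypothetical camera settings for illustration: 24 fps, 180-degree shutter.
exposure_s = 1.0 / 48.0
scans_per_exposure = exposure_s * scan_rate  # 5.0 -> each exposure covers whole
print(scan_rate, round(scan_period_ms, 2), scans_per_exposure)  # scans, so no flicker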

“This entire process was extremely inspiring for all of us involved, and maintains my strong drive to continue to find new, fresh interdisciplinary approaches to visualizing music,” said Dewan. After this project and graduation, each member will continue their individual creative work.

The full project will be published on novazaii.com and akashdewan.com.

Photo Credit: Eilene Koo


Find more of their creative works through their Instagram profiles:

Performer & Inventor of the Nova Portals: Nova Zaii, @novazaii

Director, Co-Director of Photography, Editor, 3D Graphics Developer: Akash Dewan, @akashdewann

Audio Engineer/Technician: Adithya Sastry, @adithyasastry

3D Graphics Developer: Elvis Xiang

Co-Director of Photography: Gokul Madathil, @madlight.productions

BTS Photographer: Randall Xiao, @randysfoto

MIDEN Staff + Technical Support: Sean Petty and Theodore Hall

F’25 Brings New Course on 3D Modeling, Animation & Game Dev for Everyone

EECS 298: 3D Technical Art and Animation


Author


Coming in Fall 2025 and running yearly, EECS 298: 3D Technical Art and Animation is a new, open-to-everyone course teaching students how to create 3D characters, objects, environments, materials, armatures, animations, and more in Blender before bringing them to life in a game engine. Per the course syllabus, students will not only learn how to create these 3D art assets, but also how to integrate them into technical ecosystems such as the Unity game engine, achieving special effects and interactivity (dynamic hair, flowing lava, layered animations, particles, etc.). No background in programming or art is assumed, and there are no prerequisite course requirements.

Students will conclude the course with a portfolio of 30+ low-poly models, along with a functional, shareable 3D platforming video game (think Super Mario 3D World) featuring student-made (and student-integrated) playable characters, environments, NPCs, and objects. The course staff will provide some of the programming; students produce the 3D assets and game engine integration that bring everything together (and to life!).

Created by Austin Yarger in collaboration with Evan Marcus, the course will run once a year, in the fall semester, with a capacity of approximately 50 students. It carries 4 credits, with the credit type varying by the student's department (for example, CS students will earn FlexTech credit, while Stamps students will earn elective credit). The course is designed to complement a growing suite of game development and XR courses on campus, including EECS 494: Game Design and Development, EECS 440: Extended Reality for Social Impact, EECS 498.007: Game Engine Architecture, SI 311: Games and UX, PAT 305: Video Game Music, and EDUC 333: Games and Learning, among others.

A full list of planned topics includes:

3D content authoring, basic animation techniques, 3D topology, open source tools (Blender), basic computational geometry, 3D asset formats and representations, asset optimization, armature design, UV mapping and textures, basic materials and lighting / shader logic, asset-to-engine pipelines, 3D printing and photogrammetry, etc.
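To give a flavor of the “asset-to-engine pipelines” topic, here is a small, hypothetical Blender Python (bpy) sketch that batch-exports the meshes in a collection to an FBX file a Unity project can import; the collection name and file path are made up, and the course may well teach a different workflow:

# Hypothetical asset-to-engine step: run from Blender's Scripting workspace.
# Exports every mesh in a (made-up) "LowPoly" collection to one FBX file that
# a Unity project can pick up from its Assets/ folder.
import bpy

collection = bpy.data.collections.get("LowPoly")  # assumed collection name
if collection is None:
    raise RuntimeError("Create a 'LowPoly' collection and add your meshes to it first")

bpy.ops.object.select_all(action='DESELECT')
for obj in collection.objects:
    if obj.type == 'MESH':
        obj.select_set(True)  # export only the low-poly meshes in the collection

bpy.ops.export_scene.fbx(
    filepath="/path/to/UnityProject/Assets/Models/lowpoly_set.fbx",  # adjust to your project
    use_selection=True,                   # just the meshes selected above
    apply_scale_options='FBX_SCALE_ALL',  # keeps sizes consistent once Unity imports it
)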


Planting Disabled Futures

Open Call for Artist Collaborators

Author


Petra Kuppers is a disability culture activist and a community performance artist. She creates participatory community performance environments that think/feel into public space, tenderness, site-specific art, access and experimentation. Petra grounds herself in disability culture methods and uses ecosomatics, performance, and speculative writing to engage audiences toward more socially just and enjoyable futures.


Her latest project, Planting Disabled Futures, is funded by a Just Tech fellowship.

In the Planting Disabled Futures project, Petra aims to use live performance approaches and virtual reality (and other) technologies to share energy, liveliness, ongoingness, crip joy and experiences of pain. 

In the development of the Virtual Reality (VR) components of the project, we will ask: How can VR allow us to celebrate difference, rather than engage in hyper-mobile fantasies of overcoming and of disembodied life? How can our disabled bodymindspirits develop non-extractive intimacies, in energetic touch, using VR as a tool toward connecting with plants, with the world, even in pain, in climate emergency, in our ongoing COVID world?

A watercolor mock-up of the Crip Cave, with Moira Williams’ Stim Tent, two VR stations, a potential sound bed, and a table for drawing/writing.

Petra envisions a sensory art installation equipped with a VR experience, a stimming tent, a soundbed, and a drawing and writing table. The VR experience would be supplemented by actors offering unique taste, touch, and smell sensations as participants navigate the environment.

A cyanotype (blue) and watercolor mock-up of what the VR app might look like: a violet leaf with sensation hubs, little white ink portals that might lead to an audio dream journey.

The VR experience at the heart of the Crip Cave is expected to be a tree-like environment that lets participants choose either a visual or an auditory experience. Participants can travel down to the roots to meet earth critters, or up the branches and into the leafy canopy. In both locations, “sensory hubs” would take participants on journeys to other worlds, worlds potentially populated with content produced by fellow artists.

A cyanotype/watercolor mock-up of little critters that might accompany you on your journey through the environment.

Artist collaborators are welcome to contribute their talents by generating 3D worlds in Unreal Engine, reciting poetry, animating, or composing music to create a dream journey in virtual reality. Artists producing digital content they would like considered for inclusion in this unique art installation can reach out to: [email protected]


To learn more about Planting Disabled Futures, visit:
https://www.petrakuppers.com/planting-disabled-futures