A Configurable iOS Controller for a Virtual Reality Environment

James examining a volumetric brain in the MIDEN with an iPod controller

Traditionally, users navigate 3D virtual environments with game controllers; but game controllers are littered with ambiguously labeled buttons. While excellent for gaming, this setup makes navigating 3D space unnecessarily complicated for the average user. James Cheng, a sophomore in Computer Science in Engineering, has been working to resolve this headache by replacing game controllers with touch screens such as those found in mobile devices. Using the Jugular Engine, in development at the Duderstadt Center, he has been building a scalable UI system that can serve a wide range of immersive simulations. Want to cut through a volumetric brain? Select the “Slice” button and start dragging. Want to fly through an environment instead of walking? Switch to “Fly” mode and take off. Because every experience is different, the system aims to be highly configurable.
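
To make the idea concrete, here is a minimal Swift sketch of how a mode-based touch controller like this might be structured. The `InteractionMode` enum, the gesture handling, and the `send` transport are hypothetical illustrations, not the Jugular Engine’s actual API:

```swift
import UIKit

// Hypothetical sketch: each interaction mode interprets the same
// touch gesture differently, so one screen can walk, fly, or slice.
enum InteractionMode: String {
    case walk  = "Walk"   // drag to step through the scene
    case fly   = "Fly"    // drag to steer while flying
    case slice = "Slice"  // drag to sweep a cutting plane through a volume
}

final class ControllerViewController: UIViewController {
    private var mode: InteractionMode = .walk

    override func viewDidLoad() {
        super.viewDidLoad()
        let pan = UIPanGestureRecognizer(target: self, action: #selector(handlePan))
        view.addGestureRecognizer(pan)
    }

    // One gesture, three meanings: the current mode decides which
    // command is sent to the simulation.
    @objc private func handlePan(_ gesture: UIPanGestureRecognizer) {
        let delta = gesture.translation(in: view)
        send(command: mode.rawValue.lowercased(), dx: delta.x, dy: delta.y)
        gesture.setTranslation(.zero, in: view)
    }

    // Stand-in for whatever transport (e.g. a network socket) links
    // the mobile device to the rendering engine.
    private func send(command: String, dx: CGFloat, dy: CGFloat) {
        print("\(command): \(dx), \(dy)")
    }
}
```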

Initial development is being done for the iOS platform due to its consistent hardware and options for scalable user interfaces. James aims to make immersive experiences more intuitive and give developers more options for communicating with the user. Users can now say “good-bye!” to memorizing what buttons “X” and “Y” do in each simulation, and instead rely on clearly labeled, simulation-specific buttons.
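
One way such simulation-specific buttons could work is for each simulation to describe its controls as data, letting the app build its screen from that description. The sketch below is purely illustrative; the `ControlSchema` format is an assumption, not the project’s actual wire format:

```swift
import Foundation

// Hypothetical control schema a simulation might publish: the app
// decodes it and lays out clearly labeled, purpose-built buttons
// instead of a fixed "X"/"Y" pad.
struct ControlSchema: Codable {
    struct Button: Codable {
        let label: String   // text shown to the user, e.g. "Slice"
        let action: String  // command sent back to the simulation
    }
    let simulation: String
    let buttons: [Button]
}

let json = """
{
  "simulation": "Volumetric Brain",
  "buttons": [
    { "label": "Slice", "action": "slice" },
    { "label": "Fly",   "action": "fly" },
    { "label": "Reset", "action": "reset" }
  ]
}
""".data(using: .utf8)!

// In a real app the schema would arrive over the network; here we
// decode a literal and pretend to render the buttons it describes.
let schema = try! JSONDecoder().decode(ControlSchema.self, from: json)
for button in schema.buttons {
    print("render button '\(button.label)' -> sends '\(button.action)'")
}
```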