Low-Cost Dynamic and Immersive Gaze Tracking

From touch-screen computers to the Kinect’s full-body motion sensor—interacting with your computer is as simple as a tap on the screen or a wave of the hand. But what if you could control your computer by simply looking at it? Gaze tracking is a dynamic and immersive input system with the potential to revolutionize modern technology.

Realizing this potential, Rachael Havens, a member of the Duderstadt Center and a UROP student, investigated ways of integrating an efficient and economical gaze tracker into our system. Because this powerful tool is still overlooked by many people, the task proved to be quite a challenge. Current professional gaze-tracking tools are highly specialized and require buyers to drop tens of thousands of dollars for a single system. The open-source alternatives are not much better, as they sacrifice quality for availability. Since none of these options was ideal, a custom design was pursued.

Inspired by the EyeWriter Project, we hacked the Sony PS Eye: we systematically replaced the infrared-filtered lens and lens mount, adding a visible-light filter and installing our own 3D-printed lens mount. With little expense, we transformed a $30 webcam into an infrared, head-mounted gaze tracker. The Duderstadt Center didn’t stop there, however; we integrated the gaze tracker’s software with Jugular, our in-house interactive 3D engine. Now a glance from the user doesn’t just move the cursor on a desktop, it selects objects in a 3D virtual environment of our own design.
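
For a sense of how gaze-driven selection can work, here is a minimal Python sketch that unprojects a 2D gaze point into a world-space pick ray. The matrix math is standard, but the normalized gaze input and the `scene.raycast` call are hypothetical stand-ins, not Jugular’s actual API:

```python
# Hypothetical sketch: turning a 2D gaze point into a 3D object pick.
# `scene.raycast` is illustrative only; Jugular's real API differs.

import numpy as np

def gaze_to_ray(gaze_x, gaze_y, view_matrix, proj_matrix):
    """Unproject a normalized gaze point (0..1) into a world-space ray."""
    # Convert to normalized device coordinates (-1..1), flipping y.
    ndc_x, ndc_y = 2.0 * gaze_x - 1.0, 1.0 - 2.0 * gaze_y
    inv = np.linalg.inv(proj_matrix @ view_matrix)
    near = inv @ np.array([ndc_x, ndc_y, -1.0, 1.0])  # near-plane point
    far = inv @ np.array([ndc_x, ndc_y, 1.0, 1.0])    # far-plane point
    near, far = near[:3] / near[3], far[:3] / far[3]  # perspective divide
    direction = far - near
    return near, direction / np.linalg.norm(direction)

# origin, direction = gaze_to_ray(0.5, 0.5, view, proj)
# hit = scene.raycast(origin, direction)  # select the first object hit
```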

UROP Summer Symposium 2012 (Kinect, Virtual Reality, and iOS)

Rachael Miller and Rob Soltesz presented their summer work on Kinect development, natural user interfaces, and capturing the emotive qualities of users at the 2012 UROP Symposium for MSTEM. Rachael won a Blue Ribbon at the event for her poster, and they are both the first (that I know of) to have successfully used multiple Kinects in an immersive virtual reality space for virtual physical presence.

Rachael focused on creating a natural user interface for immersive 3D environments by combining multiple Kinects into a more robust skeleton. This stable and predictable skeleton allowed her to wrap virtual (invisible) objects around the user’s limbs and torso, effectively letting people interact with virtual objects without markers or special tracking devices. Beyond simple interaction with virtual objects, she then developed several gestures for navigating virtual reality.
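
One common way to merge skeletons from several Kinects is a confidence-weighted average of each joint after calibrating every sensor into a shared world frame. The Python sketch below illustrates that idea; the data layout and calibration matrices are assumptions for illustration, not Rachael’s actual code:

```python
# Hedged sketch of fusing skeletons from multiple Kinects: transform each
# sensor's joints into a shared world frame, then take a confidence-weighted
# average per joint. Data format and calibration are assumed, not actual.

import numpy as np

def fuse_skeletons(skeletons, calibrations):
    """skeletons: list of {joint_name: (xyz position, confidence)} per Kinect.
    calibrations: list of 4x4 matrices mapping each Kinect to world space."""
    fused = {}
    joint_names = set().union(*(s.keys() for s in skeletons))
    for name in joint_names:
        total_weight, weighted_sum = 0.0, np.zeros(3)
        for skeleton, cal in zip(skeletons, calibrations):
            if name not in skeleton:
                continue  # this Kinect lost track of the joint
            pos, confidence = skeleton[name]
            world = (cal @ np.append(pos, 1.0))[:3]  # homogeneous transform
            weighted_sum += confidence * world
            total_weight += confidence
        if total_weight > 0:
            fused[name] = weighted_sum / total_weight
    return fused
```

Because each sensor sees the user from a different angle, a joint occluded from one Kinect is often tracked confidently by another, which is what makes the combined skeleton stable enough to drive limb-wrapped collision objects.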

Rob worked with Rachael on aspects of her project but also looked into using the Kinect’s multiple microphones and internal voice recognition capabilities to extract emotive qualities from the user inside a virtual reality space.

Andrew Janke presented at a second UROP symposium on his work connecting iOS devices to a variety of applications. Getting data off of an iOS device is not always trivial, and formatting that data into a PDF and then emailing it to a specific individual can be a challenge. Andrew developed a process that allows arbitrary iOS applications to send data over simple sockets, after which it can be formatted and sent via email. This functionality was required by a few of our applications in development and proved extremely useful.
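
The receiving end of such a pipeline can be sketched with nothing but Python’s standard library: accept a payload over a plain TCP socket, then forward it by email. The addresses, SMTP server, and attachment type below are placeholders, and the PDF-formatting step is elided:

```python
# Minimal sketch of a socket-to-email relay; hosts, accounts, and the
# attachment type are placeholders, and PDF formatting is left out.

import socket
import smtplib
from email.message import EmailMessage

HOST, PORT = "0.0.0.0", 9000  # placeholder listen address

def receive_payload():
    """Accept one connection and read everything the iOS app sends."""
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.bind((HOST, PORT))
        server.listen(1)
        conn, _addr = server.accept()
        with conn:
            chunks = []
            while data := conn.recv(4096):
                chunks.append(data)
            return b"".join(chunks)

def email_payload(payload: bytes):
    """Wrap the raw payload in an email; a real pipeline would first
    format it into a PDF with a library such as ReportLab."""
    msg = EmailMessage()
    msg["From"] = "app@example.com"          # placeholder sender
    msg["To"] = "recipient@example.com"      # placeholder recipient
    msg["Subject"] = "Data from iOS app"
    msg.set_content("Payload attached.")
    msg.add_attachment(payload, maintype="application",
                       subtype="octet-stream", filename="data.bin")
    with smtplib.SMTP("smtp.example.com") as smtp:  # placeholder server
        smtp.send_message(msg)

email_payload(receive_payload())
```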

All students did a great job over the summer and we’re excited to be a part of the UROP program at the University of Michigan.

Virtual Jet Ski Driving Simulator

The Virtual Jet Ski Driving Simulator allows a user to drive a jet ski (or personal watercraft) through a lake environment presented in the immersive virtual reality MIDEN system. The user sits on a jet ski mockup and controls the ride via handlebar and throttle. While the mockup itself is stationary, the environment changes dynamically in response to handlebar and throttle input, creating a very convincing feeling of driving a jet ski. The virtual reality system provides head-referenced stereo viewing and a realistic, full-scale representation of the environment.
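
Conceptually, the simulation loop maps handlebar and throttle input onto a simple watercraft model each frame and moves the virtual viewpoint accordingly, while the physical mockup stays still. The Python sketch below illustrates the idea with made-up constants; it is not the simulator’s actual code:

```python
# Illustrative watercraft update loop (constants and model are invented).

import math

class JetSki:
    def __init__(self):
        self.x = self.y = 0.0  # position on the lake (m)
        self.heading = 0.0     # radians
        self.speed = 0.0       # m/s

    def update(self, handlebar, throttle, dt):
        """handlebar in [-1, 1], throttle in [0, 1], dt in seconds."""
        DRAG, MAX_THRUST, STEER_GAIN = 0.3, 6.0, 1.2  # made-up constants
        self.speed += (throttle * MAX_THRUST - DRAG * self.speed) * dt
        # A PWC steers by directing thrust, so turning requires forward speed.
        turn_authority = min(self.speed / 5.0, 1.0)
        self.heading += handlebar * STEER_GAIN * turn_authority * dt
        self.x += math.cos(self.heading) * self.speed * dt
        self.y += math.sin(self.heading) * self.speed * dt
```

The note in the steering line reflects a real PWC risk factor: with the throttle released there is little thrust to redirect, so steering response drops sharply, which is exactly the kind of behavior accident scenarios in the simulator can probe.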

The simulator was developed to study human risk factors related to the operation of a personal watercraft (PWC). In recreational boating, PWCs are involved in accidents in disproportionate numbers. Using the simulator, accident scenarios can be recreated and the reactions of PWC operators in specific situations studied. The simulator provides a cost-effective analysis tool for regulators and equipment designers, as well as a training device for PWC operators, enforcers, and educators.

The simulator was developed for the U.S. Coast Guard (USCG) by the University of Michigan Virtual Reality Laboratory and the Research Triangle Institute. It is now being revived with help from the Undergraduate Research Opportunity Program (UROP).