Motion Capture and Kinects Analyze Movement in Tandem

As part of their research under the Dynamic Project Management (DPM) group, PhD candidates Joon Oh Seo and SangUk Han, with UROP student Drew Nikolai, used the Motion Capture system to study the ergonomics and biomechanics of climbing a ladder. The team, advised by Professor SangHyun Lee, is analyzing the movements of construction workers to identify behaviors that may lead to injury or undue stress on the body. Using MoCap, the team can collect data on joint movement; using the Kinect, they can collect depth information. By comparing the two data sets of Nikolai climbing and descending the ladder, Seo and Han can assess the Kinect's accuracy and potentially use Kinects to collect movement data at actual construction sites.
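A comparison like this boils down to measuring how far the Kinect's joint estimates drift from the MoCap ground truth. The sketch below is illustrative only (the data shapes and values are assumptions, not the team's actual pipeline): it computes a per-joint root-mean-square error between two synchronized joint-position sequences.

```python
import math

# Hypothetical joint trajectories: for each frame, a list of (x, y, z) joint
# positions in meters -- one sequence from MoCap (treated as ground truth),
# one from the Kinect skeleton tracker, already time-aligned.
def per_joint_rmse(mocap_frames, kinect_frames):
    """Root-mean-square positional error for each joint across all frames."""
    n_frames = len(mocap_frames)
    n_joints = len(mocap_frames[0])
    rmse = []
    for j in range(n_joints):
        total = 0.0
        for m_frame, k_frame in zip(mocap_frames, kinect_frames):
            # squared Euclidean distance between the two estimates of joint j
            total += sum((a - b) ** 2 for a, b in zip(m_frame[j], k_frame[j]))
        rmse.append(math.sqrt(total / n_frames))
    return rmse

# Toy example: two frames, two joints; the Kinect is off by a constant
# 3-4-0 cm offset on joint 0 and agrees perfectly on joint 1.
mocap = [[(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)],
         [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)]]
kinect = [[(0.03, 0.04, 0.0), (1.0, 1.0, 1.0)],
          [(0.03, 0.04, 0.0), (1.0, 1.0, 1.0)]]
print(per_joint_rmse(mocap, kinect))  # ≈ [0.05, 0.0] (5 cm error on joint 0)
```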

PainTrek Released on iTunes!


Ever have a headache or facial pain that seemingly comes and goes without warning? Ever been diagnosed with migraines, TMD or facial neuralgias but feel that your ability to explain your pain is limited?

PainTrek is a novel app that was developed to make it easier to track, analyze, and talk about pain. Using an innovative “paint your pain” interface, users can easily enter the intensity and area of pain by simply dragging over a 3D head model. Pain information can be entered as often as desired, can be viewed over time, and even analyzed to provide deeper understanding of your pain.

The PainTrek application measures pain area and progression using a unique, anatomically accurate 3D system. The 3D head model is based on a square grid with vertical and horizontal coordinates anchored to anatomical landmarks. Each quadrangle frames a well-defined craniofacial area, allowing precise pain location and intensity to be indicated in real time and in a quantifiable way. This is combined with essential sensory and biopsychosocial questionnaires about previous and ongoing treatments and their rates of success or failure, with all of this information integrated and displayed intuitively.
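Conceptually, a gridded head model turns "painted" pain into data that can be quantified over time. The sketch below is a minimal illustration of that idea only; the grid-cell scheme and 0-10 intensity scale are assumptions for this example, not PainTrek's actual data model.

```python
from collections import defaultdict

class PainMap:
    """Toy model: pain intensities painted onto grid cells of a head model."""

    def __init__(self):
        # (row, col) grid cell on the head model -> intensity 0-10
        self.cells = defaultdict(int)

    def paint(self, row, col, intensity):
        # clamp to an assumed 0-10 intensity scale
        self.cells[(row, col)] = max(0, min(10, intensity))

    def affected_area(self):
        """Number of grid cells with nonzero pain."""
        return sum(1 for v in self.cells.values() if v > 0)

    def mean_intensity(self):
        """Average intensity over the painful cells only."""
        painful = [v for v in self.cells.values() if v > 0]
        return sum(painful) / len(painful) if painful else 0.0

m = PainMap()
m.paint(2, 3, 7)   # e.g. a cell in the right temple region
m.paint(2, 4, 5)
print(m.affected_area(), m.mean_intensity())  # → 2 6.0
```

Storing one such map per entry makes "viewed over time" a matter of comparing area and mean intensity across dated snapshots.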

Duderstadt Center takes 1st and 2nd Place in Mobile Apps Challenge

In December of 2012, the University of Michigan held a mobile app competition to showcase new apps developed within the university and to encourage the developer community to create innovative mobile designs. U-M students, faculty, and staff submitted a variety of apps from many different disciplines and genres. The event was sponsored and judged by individuals from Computer Science and Engineering, Google, Information and Technology Services, and Technology Transfer.

1st Place – PainTrek
Ever have a headache or facial pain that seemingly comes and goes without warning? Ever been diagnosed with migraines, TMD, or facial neuralgias but feel that medication falls short or that your ability to explain your pain is limited? PainTrek is a novel app that was developed to make it easier to track, analyze, and talk about pain.

2nd Place – PictureIt: The Epistles of St. Paul
The app gives you the feel of reading an ancient Greek book on papyrus, where the text is written without word division, punctuation, headings, or chapter and verse numbers. To aid readers without knowledge of ancient Greek, the translation mode gives a literal translation of the Greek text preserved on these pages (with chapter and verse numbers added), with explanatory notes showing where this text differs from the standard text.

Xbox Kinect used to scan surfaces in wind tunnel

At the Gorguze Family Laboratory here on North Campus, Alexander Pankonien scanned a test wing in the 5-by-7-foot wind tunnel. Aerospace engineers test their prototypes or parts in the wind tunnels to see how they fare, scanning the objects to measure how they were affected and to determine whether they will be safe to launch.

Usually, the aerospace engineers scan with a laser scanner, which picks up a single point at a time. In this instance, Alexander used the Xbox 360 Kinect, which captures entire surfaces, to scan the wing. The Kinect can scan from behind glass or acrylic screening, so it doesn't disturb wind patterns. Though the Kinect is less accurate than the laser scanner, it can scan more of the object than the single laser beam can. And if it proves accurate enough to make the measurements worthwhile, this will be a great, fast solution for scanning objects in wind tunnels.

UROP Summer Symposium 2012 (Kinect, Virtual Reality, and iOS)

Rachael Miller and Rob Soltesz presented their summer work on Kinect development, natural user interfaces, and capturing emotive qualities of users at the 2012 UROP Symposium for MSTEM. Rachael won a Blue Ribbon at the event for her poster, and they are both, to my knowledge, the first to have successfully used multiple Kinects in an immersive virtual reality space for virtual physical presence.

Rachael focused on creating a natural user interface for immersive 3D environments by combining multiple Kinects for a more robust skeleton. This stable and predictable skeleton allowed her to wrap virtual (invisible) objects around the user's limbs and torso, effectively allowing people to interact with virtual objects without markers or special tracking devices. Beyond simple interaction with virtual objects, she then developed several gestures to be used for navigation in virtual reality.
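One common way to get a more robust skeleton from multiple Kinects is to average each joint across sensors, weighted by tracking confidence. The sketch below illustrates that general approach under assumed data shapes; it is not Rachael's actual implementation, and it assumes each sensor's joints have already been transformed into a shared coordinate frame.

```python
def fuse_skeletons(skeletons):
    """Fuse skeletons from several Kinects by confidence-weighted averaging.

    skeletons: list of dicts mapping joint_name -> ((x, y, z), confidence),
    all expressed in one shared coordinate frame."""
    fused = {}
    joints = set().union(*(s.keys() for s in skeletons))
    for j in joints:
        readings = [s[j] for s in skeletons if j in s]
        total_w = sum(w for _, w in readings)
        if total_w == 0:
            continue  # no sensor tracked this joint
        fused[j] = tuple(
            sum(p[axis] * w for p, w in readings) / total_w
            for axis in range(3)
        )
    return fused

# Two Kinects see the right hand; the second has a clearer view, so its
# higher confidence pulls the fused position toward its reading.
a = {"hand_r": ((1.0, 1.0, 2.0), 0.5)}
b = {"hand_r": ((1.2, 1.0, 2.0), 1.0)}
print(fuse_skeletons([a, b]))
```

Because each Kinect loses joints to occlusion from its own viewpoint, fusing per joint this way keeps the skeleton stable even when no single sensor sees the whole body.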

Rob worked with Rachael on aspects of her project but also looked into using the Kinect’s multiple microphones and internal voice recognition capabilities to extract emotive qualities from the user inside a virtual reality space.

Andrew Janke also presented at a second UROP symposium on his work with iOS connectivity to a variety of applications. Getting data off of an iOS device is not always trivial. Formatting that data into a PDF and then sending it via email to a specific individual can be a challenge. Andrew developed a process that allows arbitrary iOS applications to send data using simple sockets, which can then be formatted and sent via email. This functionality was required by a few of our applications in development and proved to be extremely useful.
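The pattern described is simple on both ends: the app opens a plain socket and sends its data, and a small receiver collects the bytes and hands them off for formatting and emailing. The sketch below shows that round trip in Python for illustration (the original was an iOS process, and the PDF/email steps are placeholders here).

```python
import socket
import threading

# Receiver side: accept one connection and collect everything sent.
srv = socket.create_server(("127.0.0.1", 0))  # port 0: let the OS pick a port
port = srv.getsockname()[1]
received = []

def accept_once():
    conn, _ = srv.accept()
    chunks = []
    while chunk := conn.recv(4096):
        chunks.append(chunk)
    conn.close()
    # hand the payload off to the format-as-PDF-and-email step (omitted)
    received.append(b"".join(chunks))

t = threading.Thread(target=accept_once)
t.start()

# "App" side: open a simple socket, send the data, close the connection.
with socket.create_connection(("127.0.0.1", port)) as c:
    c.sendall(b"survey results: ...")

t.join()
srv.close()
print(received[0])  # → b'survey results: ...'
```

Keeping the wire protocol this dumb is what makes the process reusable by arbitrary applications: any app that can open a socket can participate, and all formatting intelligence lives on the receiving side.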

All students did a great job over the summer and we’re excited to be a part of the UROP program at the University of Michigan.

The Museum of Life and Death

Andy Kirshner, a resident faculty member in the School of Music and Theater, used the University of Michigan Duderstadt Center's motion capture service to record several movements for his production, The Museum of Life and Death, which is described as:

“Set in the post-human 26th-century, The Museum of Life and Death is a radical reconsideration of the medieval Play of Everyman. Framed as a kind of post-human Masterpiece Theatre, and hosted by a chipper cyborg named Virgil, The Museum mixes 3D animation, projected video, live action, Buddhist sutras, and original music to consider essential questions of Life, Death — and extinction — in our own time.”

Performance Homepage