
Motion analysis research effort leads to $1.4 million NSF grant

October 10, 2004

Spearheading one of the nation’s few research initiatives to blend the work of artists, engineers and other experts, ASU has received a $1.4 million grant from the National Science Foundation (NSF) to study real-time motion analysis.

This is the first large-scale NSF grant for the year-old Arts, Media and Engineering (AME) graduate program, and it follows a National Endowment for the Arts grant – ASU’s largest – for the groundbreaking motion analysis collaborative. Over the next five years, researchers hope for promising results in areas that range from development of gait-recognition security to spinal rehabilitation, from dance choreography to robotics.

AME is a joint program of ASU’s Herberger College of Fine Arts and the Ira A. Fulton School of Engineering. It governs the Interdisciplinary Research Environment for Motion Analysis (IREMA) program, in which nearly 60 faculty from 10 disciplines ranging from computer science and bioengineering to dance and psychology share in teaching and researching motion analysis and maintaining the motion-analysis labs that the NSF grant will support.

Over the past decade, human motion analysis has become an important research area with critical applications. But it is a complex problem because of the three-dimensional nature of the human body and the multiple levels of movement in terms of time, space and energy. Progress has been slow because disciplines have been addressing the issues individually.

“IREMA can serve as a new model for research and interdisciplinary collaboration, which can be adapted to other areas” to improve productivity, say NSF officials who awarded the grant.

“ASU’s AME program is supporting artists and engineers who are pushing the limits of technology and creating new applications that are considered cutting-edge in the worlds of art and science,” says Thanassis Rikakis, director of the AME program. “We are proud to be considered among the top initiatives of the NSF and the NEA.”

The entertainment industry has used motion capture to render digital effects and animate video games. But most motion analysis is not done in real time, which limits the instantaneous evaluation, feedback and correction that characterize movement learning in the real world.

“When we reach for a glass of water, we receive instantaneous and continuous feedback from our body and our environment that enables us to perform the task without spilling the water,” Rikakis says. “When you are trying to rehabilitate patients with certain neurological diseases, a motion-analysis system can evaluate their movements and offer corrective feedback in real time, becoming an effective substitute for their own senses.”

Ultimately, the technology will exist in many real-life environments such as hospitals, classrooms, stadiums, performance halls and airports. Applications may include movement rehabilitation, movement recognition for security purposes, technologies that encourage active learning in K-12, movement training for the fields of dance, theater, sports, firefighting and the military, and movement enhancement for robotics and other human-computer interaction.

The NSF grant will enable AME to take its research to the next level, helping its research teams capture human movement in its full essence and enhancing interactive, real-time feedback capabilities. The goal is to create a multimodal sensing environment that integrates high-precision, marker-based motion capture, pressure sensing in the floor and audio sensing. The sensing will provide real-time feedback and training through a multiple-view, video-based motion-capture system.
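The article does not describe AME's software, but the general pattern it points to, time-aligning several sensing channels and turning each fused frame into immediate corrective feedback, can be sketched in a few lines. The Python below is purely illustrative: every name in it (FusedFrame, read_markers, read_floor_pressure, read_audio_level, the pressure threshold) is a hypothetical stand-in using simulated readings, not part of the AME system.

```python
import time
import random
from dataclasses import dataclass


@dataclass
class FusedFrame:
    """One time-aligned snapshot combining three hypothetical sensing channels."""
    timestamp: float
    marker_positions: list   # (x, y, z) tuples from marker-based motion capture
    floor_pressure: float    # aggregate reading from a pressure-sensing floor
    audio_level: float       # coarse loudness estimate from the audio channel


def read_markers():
    """Stand-in for a marker-based motion-capture feed (jitter around a fixed pose)."""
    return [(0.1 * i + random.gauss(0, 0.01),
             1.0 + random.gauss(0, 0.01),
             0.5 + random.gauss(0, 0.01)) for i in range(4)]


def read_floor_pressure():
    """Stand-in for the pressure-sensing floor."""
    return 70.0 + random.gauss(0, 5.0)


def read_audio_level():
    """Stand-in for the audio-sensing channel."""
    return abs(random.gauss(0.2, 0.05))


def fuse(timestamp):
    """Collect one reading from each channel into a single time-stamped frame."""
    return FusedFrame(timestamp, read_markers(), read_floor_pressure(), read_audio_level())


def feedback(frame, pressure_limit=80.0):
    """Toy corrective-feedback rule: flag frames where the weight shift exceeds a limit."""
    if frame.floor_pressure > pressure_limit:
        return "shift weight back toward center"
    return "movement within target range"


if __name__ == "__main__":
    # Simulated real-time loop at roughly 10 frames per second.
    for _ in range(5):
        frame = fuse(time.time())
        print(f"{frame.timestamp:.2f}: {feedback(frame)}")
        time.sleep(0.1)
```

In a real deployment the simulated readers would be replaced by drivers for the actual capture hardware, and the feedback rule by the domain-specific analysis (rehabilitation targets, choreography cues and so on), but the fuse-then-respond loop is the core idea.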

The lead project investigator is Gang Qian, who joined AME last year in a joint appointment with the Herberger College and Fulton's electrical engineering department. The co-leading faculty are Thanassis Rikakis (AME), Todd Ingalls (AME), Jodi James (AME), Sethuraman Panchanathan (Computer Science and Engineering), Jiping He (Bioengineering) and Michael McBeath (Psychology).

The research also is supported by an NEA technology grant and a separate NSF grant to develop media flow architectures (real-time control of audio, video and lighting on an intelligent stage) for arts performances.

For more information about AME, visit http://ame.asu.edu.
