
App rapidly gathers sophisticated movement data, at 1% of the usual cost

The OpenCap app allows clinicians to gain the 'superpower' of seeing below the surface, without expensive equipment
OpenCap

A comparison of traditional motion analysis and how OpenCap does it
Uhlrich, S et al/(CC BY 4.0)

Using synchronized video from a pair of smartphones, scientists have created an open-source motion-capture app that gathers human movement data and analyzes it rapidly with an artificial intelligence system, producing results that can be used in clinical settings for rehabilitation, pre-surgery planning and disease diagnostics – at just 1% of the cost of traditional technology.

The Stanford University researchers, with funding from the US National Institutes of Health, created OpenCap, which uses two calibrated iPhones working together to measure human motion and the underlying musculoskeletal mechanisms that power movement. What’s more, it’s faster than traditional technology used to gather the same information and is a fraction of the cost of the US$150,000 setups found in specialized clinics that use around eight high-tech cameras.
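
For readers curious about the underlying geometry, combining views from two calibrated cameras lets a 3D point be recovered by triangulation. The sketch below illustrates only that general principle, not OpenCap's actual code – the camera matrices, intrinsics and landmark coordinates are invented for the example:

```python
# Minimal sketch of two-view triangulation, the geometric idea behind
# reconstructing 3D landmarks from a pair of calibrated cameras.
# Illustration only: the cameras and the "landmark" below are made up.
import numpy as np

def triangulate(P1, P2, pt1, pt2):
    """Recover a 3D point from its 2D projections in two calibrated views
    using the direct linear transform (DLT)."""
    u1, v1 = pt1
    u2, v2 = pt2
    A = np.array([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # Homogeneous least-squares solution: last right singular vector of A
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]  # dehomogenize

def project(P, X):
    """Project a 3D point into pixel coordinates with projection matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Synthetic setup: two cameras roughly 90 degrees apart, both viewing the subject
K = np.array([[1500.0, 0, 960], [0, 1500.0, 540], [0, 0, 1]])        # assumed intrinsics
P1 = K @ np.hstack([np.eye(3), [[0.0], [0.0], [3.0]]])               # camera 1
R2 = np.array([[0.0, 0, -1], [0, 1, 0], [1, 0, 0]])                  # 90-degree yaw
P2 = K @ np.hstack([R2, [[0.0], [0.0], [3.0]]])                      # camera 2

X_true = np.array([0.2, -0.1, 0.5])                                   # a "knee" landmark, say
X_est = triangulate(P1, P2, project(P1, X_true), project(P2, X_true))
print(np.round(X_est, 3))  # recovers ~[0.2, -0.1, 0.5]
```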

“OpenCap democratizes human movement analysis,” said senior author Scott Delp, professor of bioengineering and mechanical engineering at Stanford. “We hope it can put the once out-of-reach tools in more hands than ever before.”

The analysis can inform treatment for patients with movement issues, help clinicians plan for surgery and track how well various treatments are working over time. It could also potentially be used to screen for diseases whose early signs – subtle changes in gait or balance, for example – may not be easily observed during routine medical checkups.

This explainer shows the relative simplicity of the capture and analysis process
Uhlrich, S et al/(CC BY 4.0)

They tested OpenCap on 100 participants, recording video that was then analyzed by a web-based artificial intelligence system to assess muscle activation, joint loads and joint movement. Data collection took around 10 minutes per person – less than 10 hours for all 100 participants – and the automated cloud processing, which is freely available to researchers, returned the full analysis in 31 hours.

“It would take an expert engineer days to collect and process the biomechanical data that OpenCap provides in minutes,” said co-first author Scott Uhlrich, director of research in Stanford’s Human Performance Lab. “We collected data from 100 individuals in less than 10 hours – this would have previously taken us a year.”

The system tracks body 'landmarks' – knees, hips, shoulders and other joints – and how they move through three-dimensional space. It then uses physics- and biology-based models of the musculoskeletal system to assess how the body is moving and what forces are at play, providing important information about joint angles and loads.
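
As a rough illustration of the simplest such metric – not the full musculoskeletal-model fitting OpenCap actually performs – the sketch below computes a knee flexion angle directly from hypothetical hip, knee and ankle landmark positions:

```python
# Sketch: turning 3D landmark coordinates into a simple joint-angle metric.
# The landmark positions are invented; a real pipeline would use per-frame
# tracked coordinates and a fitted musculoskeletal model.
import numpy as np

def knee_flexion_deg(hip, knee, ankle):
    """Angle between the thigh (hip->knee) and shank (knee->ankle) vectors,
    reported in degrees; 0 corresponds to a fully extended leg."""
    thigh = np.asarray(knee) - np.asarray(hip)
    shank = np.asarray(ankle) - np.asarray(knee)
    cosang = np.dot(thigh, shank) / (np.linalg.norm(thigh) * np.linalg.norm(shank))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Example frame (coordinates in meters), made up for a near-extended leg
hip, knee, ankle = [0.0, 0.0, 1.0], [0.02, 0.0, 0.55], [0.10, 0.0, 0.12]
print(f"knee flexion: {knee_flexion_deg(hip, knee, ankle):.1f} deg")
```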

“It will even tell you which specific muscles are being activated,” Delp said.

The researchers believe this sort of data gathering, combined with deep-learning analysis, will usher in a new era for biomechanics research.

"There is the human genome,” said Delp, “but this is really going to be the motion-nome of the whole repertoire of human motion captured quantitatively."

“Our hope is that in democratizing access to human movement analysis, OpenCap will accelerate the incorporation of key biomechanical metrics into more and more studies, trials, and clinical practices to improve outcomes for patients across the world,” Delp added.

The study was published in PLOS Computational Biology. For more, see the video from the Stanford team demonstrating OpenCap below.

Sources: Stanford University, OpenCap
