It might soon be possible to perform large-scale 3D motion reconstructions of sporting events or other live performances, thanks to new research by scientists at Carnegie Mellon University. The researchers mounted 480 video cameras in a two-story geodesic dome that enabled them to track the motion of events such as a man swinging a baseball bat or confetti being thrown into the air.
All of this is done without the little white balls or other markers usually attached to subjects, thanks to a technique that estimates the visibility of a target point based on its motion. The trick was to leverage established methods for automatically identifying and tracking points based on visual elements – finding distinctive patterns, in other words – and following them over time as they move between different cameras.
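The researchers' actual pipeline is far more sophisticated, but the core idea of tracking a distinctive patch from one frame to the next can be sketched in a few lines of Python. The brute-force search, patch size, and sum-of-squared-differences score below are illustrative assumptions, not CMU's method:

```python
import numpy as np

def track_patch(frame_a, frame_b, center, patch=5, search=10):
    """Find where the patch around `center` in frame_a reappears in frame_b.

    Toy stand-in for feature tracking: exhaustively compare the template
    against every candidate position in a search window and keep the one
    with the lowest sum-of-squared-differences (SSD).
    """
    r = patch // 2
    y, x = center
    template = frame_a[y - r:y + r + 1, x - r:x + r + 1]
    best, best_pos = np.inf, center
    for dy in range(-search, search + 1):
        for dx in range(-search, search + 1):
            ny, nx = y + dy, x + dx
            cand = frame_b[ny - r:ny + r + 1, nx - r:nx + r + 1]
            if cand.shape != template.shape:
                continue  # candidate window ran off the edge of the frame
            ssd = np.sum((cand - template) ** 2)
            if ssd < best:
                best, best_pos = ssd, (ny, nx)
    return best_pos
```

A real system would use a pyramidal optical-flow or descriptor-matching tracker rather than exhaustive search, but the principle – follow a distinctive pattern from frame to frame – is the same.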
The video below shows the resulting 3D motion reconstructions.
The 480 cameras were able to track 100,000 points at a time, vastly more than is possible with the 10 to 20 cameras used in typical work of this nature. On one hand, the setup produced reams of data that could eliminate common problems with occlusion and motion blur. On the other, it meant huge redundancy that had to be overcome with the methods above: figuring out which cameras could see each target point at any given moment, and dropping irrelevant feeds from the analysis of each frame.
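The paper's actual selection machinery isn't reproduced here, but the pruning step it describes – keep only the cameras that can currently see a point, and skip the point if too few remain for triangulation – might look roughly like this. The `visibility` structure and `min_views` threshold are assumptions for illustration:

```python
def cameras_for_point(visibility, point_id, min_views=2):
    """Pick the camera feeds relevant to reconstructing one target point.

    `visibility` maps each camera name to the set of point IDs it can
    currently see. Triangulating a 3D position needs at least `min_views`
    cameras, so a point seen by fewer is dropped for this frame.
    """
    views = [cam for cam, seen in visibility.items() if point_id in seen]
    return views if len(views) >= min_views else []
```

Run per frame and per point, a filter like this turns 480 feeds into the small relevant subset, which is how the redundancy becomes tractable.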
These methods, coupled with the dense array of cameras, allow 3D motion reconstructions that were previously impossible, such as animal motion or the flow of individual pieces of confetti as they fall to the floor. "You couldn't put markers on the paper without changing the flow," explained Ph.D. student Hanbyul Joo.
The Panoptic Studio dome could also be used to capture fine details in interactions between people in order to understand the intricacies of socialization and possibly even to help diagnose conditions such as autism.
The researchers hope to next investigate ways to adapt the principles determined here to overcome imaging artifacts in less-controlled environments (like political rallies, pop concerts, or major sports games) that are increasingly captured by hundreds of cameras directed at the action from a multitude of angles. The 360-degree camera controls of replays in sports video games might be closer to reality than we think.
Check out the video below for an explanation of how the system works.
Source: Carnegie Mellon University