Drones may find use in simplified motion-capture animation

[Image: The system requires no more than two commercially available camera-equipped drones, a laptop, and a special suit worn by the actor]

In a typical motion-capture animation setup, an actor goes through their actions in a studio equipped with multiple cameras. A computer tracks markers on the actor's body, building a moving 3D "skeleton" that is digitally fleshed out to create the final animated character. Experimental new drone-based tech, however, may make the process much easier.

Developed by Tobias Nägeli, a computer scientist at Switzerland's ETH Zurich research institute, the system requires no more than two commercially available camera-equipped drones, a laptop, and a special suit worn by the actor. That suit has infrared-light diodes located at each joint on the body.

Instead of being confined to a studio, the actor can move around various outdoor or indoor environments, performing actions such as walking, running or climbing. As they do so, the two drones autonomously fly along with them, each one staying positioned so that it can film the actor from a different angle. The system is even able to anticipate the actor's movements, moving the drones preemptively to keep them from losing their shot.
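
The article doesn't spell out how that anticipation works; purely as a hypothetical illustration, the sketch below predicts the actor's position a fraction of a second ahead with a constant-velocity model and places each drone at a fixed standoff distance and viewing angle around the predicted point. All names and parameters here are assumptions, not part of the actual system.

```python
import numpy as np

# Hypothetical sketch only: anticipate where the actor will be, then choose
# drone positions that keep two distinct viewing angles on that point.

def predict_actor_position(position, velocity, lookahead_s=0.5):
    """Constant-velocity prediction of the actor's position (assumption)."""
    return position + velocity * lookahead_s

def drone_setpoint(predicted_pos, view_angle_rad, standoff_m=4.0, altitude_m=2.5):
    """Place a drone on a circle around the predicted actor position."""
    offset = np.array([
        standoff_m * np.cos(view_angle_rad),
        standoff_m * np.sin(view_angle_rad),
        altitude_m,
    ])
    return predicted_pos + offset

# Example: two drones kept 90 degrees apart so each films from a different angle.
actor_pos = np.array([10.0, 5.0, 0.0])   # metres
actor_vel = np.array([1.2, 0.0, 0.0])    # metres/second (walking pace)

predicted = predict_actor_position(actor_pos, actor_vel)
drone_a = drone_setpoint(predicted, view_angle_rad=0.0)
drone_b = drone_setpoint(predicted, view_angle_rad=np.pi / 2)
print(drone_a, drone_b)
```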

Each drone's camera is equipped with a light filter that allows it to "see" nothing other than the infrared markers on the body suit. This streamlines things by minimizing the amount of data that has to be transmitted to the computer. That computer in turn uses custom software to create an animated skeleton, based on the combined output of the cameras.
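
The article doesn't describe the reconstruction math, but the core idea behind combining two synchronized views is standard two-view triangulation: given each drone camera's pose for a frame, a marker seen in both images can be lifted to a 3D point, and doing this for every joint marker yields the skeleton. The sketch below is a minimal, hypothetical illustration of that step using the direct linear transform, not the authors' actual software.

```python
import numpy as np

# Hypothetical sketch: triangulate one 3D marker from two drone views,
# given each camera's 3x4 projection matrix (which changes every frame
# as the drones move) and the marker's pixel coordinates in both images.

def triangulate_point(P1, P2, uv1, uv2):
    """Linear (DLT) triangulation of a single 3D point from two views."""
    u1, v1 = uv1
    u2, v2 = uv2
    A = np.stack([
        u1 * P1[2] - P1[0],
        v1 * P1[2] - P1[1],
        u2 * P2[2] - P2[0],
        v2 * P2[2] - P2[1],
    ])
    # The homogeneous point X is the null vector of A (smallest singular value).
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]
    return X[:3] / X[3]

def triangulate_skeleton(P1, P2, markers1, markers2):
    """markers1/markers2: dicts mapping joint name -> (u, v) pixel coordinates."""
    return {joint: triangulate_point(P1, P2, markers1[joint], markers2[joint])
            for joint in markers1}
```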

Because those cameras move with the actor, only two of them are necessary to capture enough of the action. By contrast, the cameras in a traditional studio are stationary, so many of them are needed to sufficiently image the actor as they move across the room or turn around.

Nägeli is now refining the technology via his startup company, Tinamu Labs. He hopes that the system could ultimately be used not only for film-making, but also in applications such as performing gait analysis of people with mobility problems, or for tracking the performance of athletes.

There's more information in the video below.

Source: ETH Zurich

Real-time Environment-independent Multi-view Human Pose Estimation with Aerial Vehicles [SA '18]

1 comment
Malatrope
I did this with moving robotic platforms (rather than drones, which weren't capable of it at the time) in the early 90's. I couldn't even talk my company into filing a patent application for it. Guess I should have talked to Hollywood rather than Aerospace.