
3D "joystick" for animation artists takes shape

3D animation software joystick "input puppet" (Image: Interactive Geometry Lab / ETH Zurich)

Until recently, computer animators had to pose characters by using a mouse to drag virtual limbs into position, one tedious, time-consuming key frame at a time. Now researchers at the Interactive Geometry Lab (IGL) at ETH Zurich have developed a whole new way of creating movement in virtual characters, using 3D model "joysticks" that feed shape and movement inputs directly into animation software.

Working in collaboration with the Autonomous Systems Lab (ASL), the IGL team led by ETH professor Olga Sorkine-Hornung developed the "input puppet" using a set of 3D-printed modular construction blocks that can be connected to approximate almost any virtual character. Sensors integrated into each joint measure the angle of bend or degree of twist, and the assembled model transfers this information directly to software that calculates how the virtual character should move.
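
The researchers have not published the device's data protocol, but the basic idea of joint sensors driving a virtual rig can be sketched roughly as follows. The joint names, angle conventions and plain-dictionary rig below are illustrative assumptions, not the IGL/ASL implementation:

  # Minimal sketch: copy each physical joint's sensed angles onto the matching
  # virtual joint. Joint names and the serial data format are assumptions.
  import math
  from dataclasses import dataclass

  @dataclass
  class JointReading:
      name: str        # e.g. "neck" (hypothetical joint name)
      bend: float      # bend angle in degrees, reported by the joint's sensor
      twist: float     # twist angle in degrees, reported by the joint's sensor

  def apply_readings_to_rig(readings, rig):
      """Transfer sensed angles to the virtual character's matching joints."""
      for r in readings:
          joint = rig.get(r.name)
          if joint is None:
              continue                      # no matching virtual joint registered
          joint["bend"] = math.radians(r.bend)
          joint["twist"] = math.radians(r.twist)
      return rig

  # Example: two sensed joints driving a toy rig stored as a plain dictionary.
  rig = {"neck": {"bend": 0.0, "twist": 0.0}, "tail": {"bend": 0.0, "twist": 0.0}}
  readings = [JointReading("neck", 30.0, 5.0), JointReading("tail", -10.0, 0.0)]
  print(apply_readings_to_rig(readings, rig))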

"The software assists the artist in registering their newly assembled device to the character’s shape," explained Professor Sorkine-Hornung. "Thus, the artist can match each actual joint of the input device to the corresponding virtual joint of the animated character. This way, even if the input-puppet features a rather short neck, it can be fitted to an animated giraffe's long neck."

The Zurich researchers are not alone in pursuing a direct-input device; researchers at other institutions are also working on virtual-character input devices, such as puppets that can be manipulated with the changes captured using motion capture or sensors integrated directly into the puppet's joints. However, these devices are claimed to be less versatile than the IGL/ASL version, as they usually come in only one standard shape – such as a human – which is unsuitable if you want to animate, say, a dog.

To help encourage further research, the IGL/ASL researchers have made the designs for their device's building blocks available as open hardware, and plans to make a set of 25 ready-made building blocks commercially available are also being considered.

"Anyone can 3D-print the separate units and with the help of an engineer to integrate the electronics," explains Sorkine-Hornung. "We are going to present the device at the SIGGRAPH conference and exhibition on Computer Graphics and Interactive Techniques this coming August. So we hope to receive some feedback whether there is a demand for a commercially available set and for further improvements of the device's design," said Professor Sorkine-Hornung.

Though the present device is limited to bending and twisting in two separate movements rather than one fluid motion, the researchers intend to examine the idea of ball-and-socket joints to allow easier, more lifelike manipulation of the input puppet.

The results of this research will be presented at the SIGGRAPH conference in August. The video below has been produced to accompany the technical paper.

Source: ETH Zurich

2 comments
EddieG
This is a terrific contraption. Artists have used mannequins for a long time. Why stop now?
Jon A.
They used the same idea 20 years ago on Jurassic Park.
The difference here is that it's a modular kit, so if you need to build a horse with eight legs, you're good to go.