Researchers working at Queen’s University’s Human Media Lab in Ontario have created a collaborating swarm of drones that act as 3D pixels (voxels) to create giant, flying interactive displays. The researchers claim that the "BitDrones" system lets users investigate virtual information presented in 3D by directly manipulating these hovering voxels, with applications in the likes of 3D gaming, medical imaging, and molecular modelling.
The promise of large-scale 3D displays has almost invariably been a static one – such as the prototype TriLite 3D billboard display – with the viewer restricted to very specific locations and lines of sight for the illusion to be fully effective. The BitDrones system, on the other hand, uses swarms of nano quadcopters acting as hovering voxels – some covered in a lightweight deformable mesh – that can be repositioned on command, allowing viewers to move freely around the display.
"BitDrones brings flying programmable matter, such as featured in the futuristic Disney movie Big Hero 6, closer to reality," says Queen's University professor Dr. Roel Vertegaal. "It is a first step towards allowing people to interact with virtual 3D objects as real physical objects."
The BitDrones system actually comprises three separate types of drones, each serving as a hovering voxel of a different resolution. The smallest are known as "PixelDrones," and use a single LED along with a small dot matrix display. The next size up are the "ShapeDrones," whose bodies are covered with a lightweight deformable mesh over a 3D-printed geometric frame; these drones are used as building blocks for more complex 3D displays. Finally, the largest of the trio are known as "DisplayDrones" and are kitted out with a high-resolution curved flexible touchscreen, a forward-facing video camera, and an Android-based smartphone circuit board.
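The three voxel types described above lend themselves to a simple class hierarchy. The sketch below is purely illustrative – the class and field names are assumptions, not part of the actual BitDrones software:

```python
from dataclasses import dataclass
from typing import Tuple

# Hypothetical data model for the three voxel types described in the
# article; names and fields are illustrative assumptions only.

@dataclass
class BitDrone:
    """Common base: every drone is a hovering voxel with a 3D position."""
    position: Tuple[float, float, float] = (0.0, 0.0, 0.0)

@dataclass
class PixelDrone(BitDrone):
    led_rgb: Tuple[int, int, int] = (255, 255, 255)  # single LED colour
    text: str = ""                                   # small dot-matrix display

@dataclass
class ShapeDrone(BitDrone):
    mesh_shape: str = "cube"  # deformable mesh over a 3D-printed frame

@dataclass
class DisplayDrone(BitDrone):
    screen_resolution: Tuple[int, int] = (1280, 720)  # curved touchscreen
    has_camera: bool = True                           # forward-facing camera
```

A shared base class keeps swarm-wide operations (positioning, tracking) uniform while each subclass carries only its own display hardware details.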
To enable tracking and positioning in real time, all of the BitDrones have reflective markers attached so they can be detected by the motion capture technology built into the system. In addition, the researchers claim that their creation also tracks the user’s hand motions and touches, to allow users to physically position the voxels in mid-air.
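The closed loop implied above – motion capture reports where each marker-tagged drone is, and the system steers it toward where its voxel should be – can be sketched with a simple proportional controller. The function names and gain value are illustrative assumptions, not the lab's actual control code:

```python
# Minimal sketch, assuming a motion-capture system that reports each
# drone's tracked (x, y, z) position once per control tick.

def step_towards(current, target, gain=0.25):
    """One control tick: close a fraction of the remaining position error."""
    return tuple(c + gain * (t - c) for c, t in zip(current, target))

def update_swarm(tracked_positions, targets, gain=0.25):
    """Apply one control tick to every drone in the swarm."""
    return [step_towards(p, t, gain) for p, t in zip(tracked_positions, targets)]

# Repeated ticks converge each hovering voxel onto its assigned target,
# including targets set by the user's tracked hand position:
positions = [(0.0, 0.0, 0.0), (1.0, 1.0, 1.0)]
targets = [(1.0, 0.0, 2.0), (0.0, 1.0, 0.5)]
for _ in range(50):
    positions = update_swarm(positions, targets)
```

Because the user's hands are tracked by the same system, grabbing a drone simply means its target is updated to follow the hand.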
"We call this a Real Reality interface rather than a Virtual Reality interface. This is what distinguishes it from technologies such as Microsoft HoloLens and the Oculus Rift: you can actually touch these pixels, and see them without a headset," says Dr. Vertegaal.
Whilst other drone systems have been used to display low-resolution images in the night sky, the BitDrones system is touted as a far more malleable, interactive system of floating displays capable of much greater resolution and – as a consequence – many more potential applications.
In one scenario proffered by the team, users of the system are able to explore a computer file folder by touching a PixelDrone associated with it. After the folder is opened, the file contents are revealed by other PixelDrones hovering in a horizontal wheel formation below it. To browse the files in the wheel, the user then physically swipes the drones to the right or left.
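The folder-browsing scenario above amounts to placing one PixelDrone per file on a horizontal circle and rotating the circle when the user swipes. A geometric sketch of that interaction – with illustrative names, radii, and coordinates that are assumptions, not the team's implementation – might look like this:

```python
import math

# Sketch of the folder-browsing interaction: files hover on a horizontal
# "wheel" of PixelDrones, and a swipe rotates which file faces the user.
# The user-facing direction is assumed to be +x.

def wheel_positions(n_files, centre=(0.0, 0.0, 1.5), radius=0.5, front_index=0):
    """Place n_files drones evenly on a horizontal circle around centre."""
    cx, cy, cz = centre
    positions = []
    for i in range(n_files):
        angle = 2 * math.pi * (i - front_index) / n_files
        positions.append((cx + radius * math.cos(angle),
                          cy + radius * math.sin(angle),
                          cz))
    return positions

def swipe(front_index, n_files, direction):
    """direction=+1 for a right swipe, -1 for left: advance the wheel one file."""
    return (front_index + direction) % n_files
```

Each swipe changes only `front_index`; feeding the recomputed positions back as targets makes the whole wheel of drones physically rotate to bring the next file to the front.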
In other uses proposed by the researchers, users could manipulate ShapeDrones for use as building blocks for creating 3D models in real time, or they could participate in telepresence conversations by communicating through a Skype-enabled DisplayDrone. In the latter case, the researchers state that a DisplayDrone so equipped is able to automatically track and reproduce all of the user’s head movements, providing a remote user with the ability to virtually look around a location, whilst making it easier for a local user to understand a remote user’s actions.
So far the Queen's University system is made up of relatively large, 2.5- to 5-inch (63.5 to 127 mm) drones, but the researchers are planning to vastly scale up their system to support many thousands of drones no more than 0.5 in (12.7 mm) in size, which would provide higher-resolution, more seamlessly integrated programmable voxels.
The results of this research were recently presented at the ACM User Interface Software and Technology (UIST) symposium in North Carolina.
Source: Queen's University