The basis of most 3D systems is to "trick" our eyes into believing that an image shown on a flat screen has three dimensions, but what if you could throw away the screen entirely? It sounds far-fetched and impossibly difficult to choreograph, but that's exactly what researchers at MIT's SENSEable City Lab and Aerospace Robotics and Embedded Systems Laboratory (ARES Lab) have created with Flyfire - a cloud of LED-carrying micro-helicopters controlled in synchrony to show unique animated light displays in three-dimensional space.
Each "micro-helicopter" carries a small LED and is digitally controlled and choreographed as a smart pixel that emits colored light. This is achieved using self-stabilizing, precision-control technology developed by the ARES Lab, which allows the pixels to be controlled in real time. It has also been made possible by recent advances in battery technology and wireless control. Each pixel is capable of displaying digital information such as 3D writing and pictures, and of rearranging itself several times in a performance before needing to recharge.
The Flyfire canvas can reorganize itself from one shape to another or bring a two-dimensional photographic image into a 3D form. "It's like when Winnie the Pooh hits a beehive: a swarm of bees comes out and chases him while changing its configuration to resemble a beast," said E Roon Kang, research fellow at SENSEable City Lab. "Unlike traditional displays that can only be seen from the front, Flyfire becomes a three-dimensional immersive display that can be experienced from all directions," said team member Carnaven Chiu.
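To give a flavor of what "reorganizing from one shape to another" involves, here is a minimal conceptual sketch in Python. It is not the ARES Lab's actual controller - the function name, step count, and linear-interpolation scheme are illustrative assumptions - but it shows the basic idea of each pixel-copter flying smoothly from its current position to an assigned target position:

```python
def plan_transition(current, target, steps):
    """Illustrative only: linearly interpolate each pixel-copter from its
    current (x, y, z) position to its assigned target position over a
    fixed number of control steps. A real swarm controller would also
    handle collision avoidance, stabilization, and timing."""
    frames = []
    for s in range(1, steps + 1):
        t = s / steps  # fraction of the way to the target
        frame = [
            tuple(c + (g - c) * t for c, g in zip(pos, goal))
            for pos, goal in zip(current, target)
        ]
        frames.append(frame)
    return frames

# Two pixels rise from a flat (z = 0) layout into a 3D arrangement.
flat = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0)]
shape_3d = [(0.0, 0.0, 1.0), (1.0, 1.0, 2.0)]
path = plan_transition(flat, shape_3d, steps=4)
```

The last frame of `path` matches the target configuration, with the intermediate frames giving each copter a smooth trajectory to follow.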
Flyfire was developed as a public space installation, and although the SENSEable team is currently capable of controlling only a limited number of helicopters, it aims to develop the technology further to control very large numbers for more ambitious displays and applications. The researchers also suggest it could be a step towards 'smart dust' technology - a wireless network of tiny synchronized devices the size of dust particles.