HyperSurfaces uses AI to make object interfacing more natural
Back in 2012, Bruno Zamborlin revealed a tiny device that could transform any surface into a musical instrument. After some refinement, Mogees launched on Kickstarter in 2014, and again as Mogees Play in 2016. Now Zamborlin has unveiled HyperSurfaces, which taps into the power of AI and machine learning to turn "any object of any material, shape and size" into a user interface.
Imagine a wooden kitchen table that can be used to control lighting or room temperature, a floor that's able to determine if the intruder in your house is just the cat or a would-be thief, the surface of a door transformed into one big interface or the inner surface of a car door acting as a button-free control panel. These are some of the examples given for the HyperSurfaces system.
The technology combines vibration sensing with neural network algorithms running on dedicated microchips. "Every time we interact with an object, we create a distinctive vibration pattern which dedicated sensors, coupled with our patented algorithms, can transform into digital commands," said Zamborlin, who heads an international development team split between London and Los Angeles.
All data processing happens in real time on the chip itself, meaning that once a use-case model is loaded onto the system-on-chip, it can operate without reaching out to external resources such as cloud-based processing.
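The company hasn't published details of its algorithms, but the general idea of the pipeline it describes (vibration signal in, gesture label out, all computed locally) can be sketched in a few lines. The sketch below is purely illustrative: the feature choices, class names and centroid values are invented for the demo, and a real product would use a trained neural network rather than this toy nearest-centroid classifier.

```python
# Illustrative sketch, NOT HyperSurfaces' actual method: turn a surface
# vibration waveform into a discrete command entirely on-device, with no
# cloud round-trip once the "model" is loaded.
import math

def features(signal):
    """Summarize a waveform with two cheap features:
    RMS energy and zero-crossing rate."""
    n = len(signal)
    rms = math.sqrt(sum(x * x for x in signal) / n)
    zcr = sum(1 for a, b in zip(signal, signal[1:]) if a * b < 0) / (n - 1)
    return (rms, zcr)

# Hypothetical "model": per-class feature centroids, learned offline and
# shipped to the chip. All numbers here are made up for the demo.
CENTROIDS = {
    "knock": (0.3, 0.10),   # strong, low-frequency thump
    "swipe": (0.2, 0.30),   # faint, higher-frequency rubbing
}

def classify(signal):
    """Nearest-centroid lookup, cheap enough for a microcontroller."""
    f = features(signal)
    return min(CENTROIDS,
               key=lambda c: sum((a - b) ** 2
                                 for a, b in zip(f, CENTROIDS[c])))

# Synthetic signals standing in for real sensor data: a decaying
# low-frequency knock and a faint higher-frequency swipe.
knock = [math.sin(2 * math.pi * 5 * (t + 0.5) / 100) * math.exp(-t / 30)
         for t in range(100)]
swipe = [0.2 * math.sin(2 * math.pi * 30 * (t + 0.5) / 100)
         for t in range(100)]
print(classify(knock), classify(swipe))
```

The point of the sketch is the architecture, not the math: because the classifier and its parameters live on the chip, each interaction is resolved locally and only the resulting command (dim the lights, open the door) needs to leave the device.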
"HyperSurfaces aims to revolutionize the way we live, blending the data world within any object around us," the company stated in a press release. "Consumer electronics, IoT, retail, transportation, augmented reality, smart facilities, all these domains can potentially be changed forever."
Development of the system continues, but you can see what our interface future may soon look like in the promo video below.