
ZeroTouch multi-'touch' sensing technology unveiled

ZeroTouch is a prototype multi-touch system, in which users can touchlessly control applications by reaching into a picture-frame-like infrared sensing device (Image: Texas A&M)

Last November, German tech firm Evoluce unveiled a Kinect-based prototype multi-touch system that allows users to navigate through Windows 7 applications simply by moving their hands in the air. While that system uses the Kinect unit's RGB camera and depth sensor to track the user's hands, a new technology developed at Texas A&M University's Interface Ecology Lab uses a matrix of infrared light beams to do essentially the same thing. It's called ZeroTouch, and it was presented at last week's 2011 Conference on Human Factors in Computing Systems in Vancouver.

Unlike the Evoluce system, ZeroTouch incorporates an open, picture-frame-like sensing apparatus that the user reaches into. It can be placed on a desktop, mounted around the computer screen, or hung in the air with the screen visible beyond it. Around the frame's four edges is an array of infrared LEDs, whose invisible beams shine into and across the open interior. Mixed in with those LEDs are 256 modulated infrared sensors, each of which registers the beams from the emitters located across from it.
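
To make the geometry concrete, here is a minimal Python sketch of how such a beam grid could be enumerated. The frame dimensions, the LED count, the even spacing and the half-step interleaving of emitters and sensors are all assumptions for illustration; the only figure taken from the article is the count of 256 sensors.

# A hedged sketch only: coordinates, counts other than the 256 sensors, and the
# pairing rule below are illustrative, not taken from the ZeroTouch hardware.
FRAME_W, FRAME_H = 40.0, 30.0   # interior size of the frame in cm (assumed)
N_SENSORS = 256                 # sensor count from the article
N_LEDS = 256                    # LED count is not given; assumed equal

def perimeter_point(t):
    """Map t in [0, 1) to an (x, y) point travelling around the frame's edges."""
    d = t * 2 * (FRAME_W + FRAME_H)
    if d < FRAME_W:
        return (d, 0.0)                    # bottom edge
    d -= FRAME_W
    if d < FRAME_H:
        return (FRAME_W, d)                # right edge
    d -= FRAME_H
    if d < FRAME_W:
        return (FRAME_W - d, FRAME_H)      # top edge
    return (0.0, FRAME_H - (d - FRAME_W))  # left edge

# Sensors evenly spaced around the perimeter; LEDs offset by half a step so the
# two populations interleave, as the article describes.
sensors = [perimeter_point(i / N_SENSORS) for i in range(N_SENSORS)]
leds = [perimeter_point((i + 0.5) / N_LEDS) for i in range(N_LEDS)]

def same_edge(p, q):
    """True if both points sit on the same side of the frame (no useful beam)."""
    return ((p[0] == q[0] and p[0] in (0.0, FRAME_W)) or
            (p[1] == q[1] and p[1] in (0.0, FRAME_H)))

# Each LED/sensor pair on different edges defines one straight infrared beam
# crossing the open interior; the real device presumably only pairs each sensor
# with the modulated emitters it can actually see across from it.
beams = [(led, sensor) for led in leds for sensor in sensors
         if not same_edge(led, sensor)]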

When a user places one or more fingers or other objects within the frame, intersecting the grid of light beams, the system's software calculates the size, shape and location of those objects and maps them to equivalent positions on a Windows 7 computer screen. The technique, known as point-to-point visual hull sensing, can track more than 20 objects at once.
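
For a rough idea of how the hull itself can be recovered, the Python sketch below takes a list of beams together with a parallel list of blocked/unblocked readings and rasterizes the frame interior onto a grid: any cell crossed by an unblocked beam must be empty, and the cells left over form the visual hull, which is then split into one blob per finger or object. The grid resolution, frame size and all helper names are invented for the example; this is a simplified reading of point-to-point visual hull sensing, not ZeroTouch's actual implementation.

from collections import deque

FRAME_W, FRAME_H = 40.0, 30.0      # frame interior in cm (assumed)
GRID_W, GRID_H = 80, 60            # occupancy-grid resolution (assumed)

def cell_of(x, y):
    """Map a point inside the frame to a grid cell."""
    cx = min(GRID_W - 1, max(0, int(x / FRAME_W * GRID_W)))
    cy = min(GRID_H - 1, max(0, int(y / FRAME_H * GRID_H)))
    return cx, cy

def cells_on_beam(p0, p1, steps=200):
    """Grid cells crossed by the straight beam from emitter p0 to sensor p1."""
    (x0, y0), (x1, y1) = p0, p1
    return {cell_of(x0 + (x1 - x0) * t / steps,
                    y0 + (y1 - y0) * t / steps) for t in range(steps + 1)}

def visual_hull(beams, blocked):
    """beams: list of (emitter_xy, sensor_xy); blocked: parallel list of bools.
    Returns the grid cells that no unblocked beam passed through."""
    hull = {(cx, cy) for cx in range(GRID_W) for cy in range(GRID_H)}
    for (p0, p1), is_blocked in zip(beams, blocked):
        if not is_blocked:             # light got through, so this path is empty
            hull -= cells_on_beam(p0, p1)
    return hull

def connected_objects(hull):
    """Group hull cells into blobs (one per finger/object) with a simple BFS."""
    remaining, blobs = set(hull), []
    while remaining:
        seed = remaining.pop()
        blob, queue = {seed}, deque([seed])
        while queue:
            cx, cy = queue.popleft()
            for nbr in ((cx + 1, cy), (cx - 1, cy), (cx, cy + 1), (cx, cy - 1)):
                if nbr in remaining:
                    remaining.remove(nbr)
                    blob.add(nbr)
                    queue.append(nbr)
        blobs.append(blob)
    return blobs   # blob size ~ object area, mean cell position ~ location

Carving away the cells crossed by unblocked beams, rather than marking the cells under blocked ones, gives a conservative estimate: the real objects always lie inside the reported hull, which is what lets the approach recover approximate size and shape as well as position.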

A diagram showing how objects are registered using ZeroTouch's point-to-point visual hull sensing technology (Image: Texas A&M)

The Texas A&M team demonstrated three ZeroTouch applications in Vancouver: intangibleCanvas, in which users can "paint" pictures using their elbows, arms, head and fingers; Hand + Pen in Hand Command, a real-time strategy game played via multi-touch and stylus; and ArtPiles, a curatorial tool that allows museums and art galleries to organize large collections.

5 comments
dwreid
The bad news here is that this is not really a big breakthrough. All that the designers have done is take technology that was invented in the 1970s and has been in use on thousands of devices, including computer monitors, kiosks and in cars, and tweak it to allow it to do multi-touch. So, basically, they've copied someone else's hardware concept and added someone else's concept for multi-touch and called it original. Hardly.
VadimR
How much would a frame like that cost? It could be used as a "cheap" 3D scanner. Pass the object through the frame and you can get a rough contour of the object. Not perfect, but could be handy for DIY-type projects...
Matt Rings
If you've ever watched the historical documentary TV series called "Connections" about technological advances through the centuries... each "new" invention has been a stepping-stone from another one before it, and usually incorporating a "copy" of something already invented... fascinating, and still going on today with technological evolution.
Gregg Eshelman
It's an evolution of the IR touchscreen system Hewlett-Packard had on some monitors in the late '70s and early '80s.
Neil Underwood
Use many more sensors, including optical ones for texture rendering.
Now make a collapsible locking frame and a Bluetooth connection, with a gyroscope and accelerometer for providing positional data. A pocket HD 3D scanner for anything that fits inside. Not bad. I'd have a use for one.