Haptic feedback has become a common feature of modern devices, but such systems usually rely on direct mechanical or acoustic vibration applied to part of the user’s body. A new technique being developed by researchers at the University of Bristol promises to change all of this by using projected ultrasound to create floating, 3D shapes that can be seen and felt in mid-air.
Building on previous work at the university, the researchers have used an array of ultrasonic transducers to create and focus compound patterns of ultrasound, shaping the air at the points where the waves converge. To make these shapes visible, the manipulated air was directed through a thin curtain of oil, which was then illuminated by a lamp. According to the researchers, the resulting shapes are accurate and identifiable enough that users can readily match an image of a 3D object to the shape rendered by the prototype ultrasound system.
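The focusing principle at work here is that of a phased array: if each transducer is driven with a phase offset that compensates for its distance to a target point, the emitted waves all arrive at that point in step and reinforce one another. The following Python sketch illustrates the idea only; it is not the researchers' code, and the array geometry, spacing, and focal point are illustrative assumptions.

```python
# Minimal phased-array sketch: compute per-transducer drive phases so that
# all emissions arrive in phase (constructive interference) at a focal point.
import numpy as np

SPEED_OF_SOUND = 343.0  # m/s in air at roughly room temperature
FREQUENCY = 40_000.0    # 40 kHz carrier, as reported for the system
WAVELENGTH = SPEED_OF_SOUND / FREQUENCY

def focus_phases(transducer_positions, focal_point):
    """Return the drive phase (radians) for each transducer such that its
    wave arrives at focal_point in step with all the others."""
    distances = np.linalg.norm(transducer_positions - focal_point, axis=1)
    # Elements farther from the focus are advanced in phase to compensate.
    return (-2.0 * np.pi * distances / WAVELENGTH) % (2.0 * np.pi)

# Illustrative 4 x 4 grid of transducers at 10 mm pitch, focused on a point
# 150 mm above the centre of the array (all values assumptions, not specs).
xs, ys = np.meshgrid(np.arange(4) * 0.01, np.arange(4) * 0.01)
positions = np.column_stack([xs.ravel(), ys.ravel(), np.zeros(16)])
print(np.round(focus_phases(positions, np.array([0.015, 0.015, 0.15])), 2))
```

Steering such focal points around, or producing several at once, is what allows the array to trace out a recognizable 3D shape in the air.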
"Touchable holograms, immersive virtual reality that you can feel and complex touchable controls in free space, are all possible ways of using this system," said Dr Ben Long, Research Assistant from the Bristol Interaction and Graphics (BIG) department at the University of Bristol. "In the future, people could feel holograms of objects that would not otherwise be touchable, such as feeling the differences between materials in a CT scan or understanding the shapes of artefacts in a museum."
The system does not rely on the ultrasound frequency itself (around 40 kHz, far above anything the skin can sense) to stimulate the skin when the haptic object is touched. Instead, vibrations are set up in the air at the points where the array is focused, producing sensations that oscillate anywhere from around 0.4 Hz to 500 Hz. In this way, when the various patterns are produced by the ultrasonic array, the user is able to discern the shape of an object much as when feeling a solid article.
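One straightforward way to obtain such low-frequency sensations from an ultrasonic carrier is amplitude modulation: the 40 kHz signal is scaled at a rate the skin's mechanoreceptors can follow. The Python sketch below is a rough illustration of that idea, not the team's implementation; the 200 Hz modulation rate is an arbitrary value picked from within the reported range.

```python
# Minimal amplitude-modulation sketch: an inaudible 40 kHz carrier whose
# envelope oscillates at 200 Hz, a frequency the skin can actually feel.
import numpy as np

SAMPLE_RATE = 1_000_000  # 1 MHz sampling, comfortably above the carrier
CARRIER_HZ = 40_000.0    # ultrasonic carrier reported for the system
MODULATION_HZ = 200.0    # illustrative value within the 0.4-500 Hz range

t = np.arange(0, 0.02, 1.0 / SAMPLE_RATE)  # 20 ms of signal
envelope = 0.5 * (1.0 + np.sin(2.0 * np.pi * MODULATION_HZ * t))
signal = envelope * np.sin(2.0 * np.pi * CARRIER_HZ * t)
# The skin responds to the 200 Hz envelope, not the 40 kHz carrier itself.
```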
The researchers believe this new technology could revolutionize the way haptic feedback is used in various fields, including medicine, where a surgeon might “feel” a patient’s CT scan through haptic feedback to locate irregularities and identify diseases or tumors. More than this, the separation between visual displays and haptic feedback may one day be removed entirely, combining physical and visual interaction in a fully-immersive, floating, 3D system.
Led by Dr Ben Long and colleagues Professor Sriram Subramanian, Sue Ann Seah and Tom Carter from the University of Bristol’s Department of Computer Science, the research is published in the current issue of the journal ACM Transactions on Graphics and will be presented at the SIGGRAPH Asia 2014 conference.
No announcement has yet been made as to future applications or any commercial release of the system.
The video below demonstrates the prototype system in use.
Source: University of Bristol