Tactile feedback is nothing new. It's been used in telecommunications and in entertainment for decades, and it became a standard feature in the late 1990s in mobile phones and video games – where vibrations alert you to new messages or help you "feel" the forces exerted on your avatar. Haptic technology has been very much a bit player in the fields that it's infiltrated, though, and only now are we seeing it begin to take its place alongside visual and audio tech as a key element in human-computer interaction.
Smartwatches such as the upcoming Apple Watch are embracing haptics to give you turn-by-turn directions. Researchers, meanwhile, are experimenting with haptic cues built into the steering wheel of cars for enhanced safety, and with tactile feedback built into touchscreens and public maps for more natural-feeling interactions.
Haptics enable deafblind people to browse the web (thanks to Morse code) or even to play video games. In the gaming space, haptics is a fast-growing field thanks to the rise of virtual reality and the desire of players to feel their virtual environments just as viscerally as they see and hear them. Haptic technology is also helping to train the next generation of surgeons, and improving simulations in the industrial sector for pilots and large machine operators.
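Converting text into timed vibration pulses is simple enough to sketch. The snippet below is a minimal illustration of haptic Morse – the lookup table, helper name and timings are assumptions for this example, not any particular product's code – using the standard convention that a dash lasts three dot-lengths, with one dot-length between symbols and three between letters.

```python
# A minimal sketch of haptic Morse: text becomes (on, off) timing pairs that
# could drive a single actuator. All names here are illustrative only.

MORSE = {"c": "-.-.", "a": ".-", "t": "-"}  # small subset for the example

def to_pulses(text, dot_s=0.1):
    """Return (vibrate_seconds, pause_seconds) pairs for each Morse symbol.
    A dash lasts three dot-lengths; letters are separated by a longer gap."""
    pulses = []
    for letter in text.lower():
        for symbol in MORSE[letter]:
            on = dot_s if symbol == "." else 3 * dot_s
            pulses.append((on, dot_s))           # one-dot gap between symbols
        pulses[-1] = (pulses[-1][0], 3 * dot_s)  # three-dot gap between letters
    return pulses

print(to_pulses("cat"))  # starts [(0.3, 0.1), (0.1, 0.1), ...] for 'c' = -.-.
```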
The basics
Before we get into any of that, let's step back a moment and look at what haptic technology is. In most cases it uses a kind of motor called an actuator to convert electrical, hydraulic or pneumatic energy into vibrations, which software can shape by controlling their duration, frequency, and amplitude.
Smartphones typically use haptic technology for alerts and notifications, as well as for subtle feedback as you type messages or dial numbers on their touchscreens. Video game controllers use haptics now in much the same way that they did almost two decades ago with the Nintendo 64 Rumble Pak and PlayStation DualShock gamepad – to lend a tangibility, felt through your hands gripping the controller, to an explosion, a crash, or a rough surface that you're driving over.
But there's a lot more that haptics can do, both at the low resolutions of feedback offered by an Xbox or PlayStation controller (which amounts to little more than an on/off switch with a dimmer control for each of its two actuators) and at the higher resolutions that the latest haptic technology can provide (which allow feedback localized to specific coordinates and even, at the cutting edge, varied according to how hard you press on the surface).
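That low-resolution model is easy to see in code. Here is a minimal sketch, assuming a Windows machine with an XInput-compatible gamepad, of everything a program can say to an Xbox 360 controller's two motors: pick an intensity per motor, hold it, switch it off. The `rumble` helper and the chosen values are illustrative, but `XInputSetState` and the `XINPUT_VIBRATION` structure are the actual XInput API.

```python
import ctypes
import time

# The XINPUT_VIBRATION structure from Microsoft's XInput API: one 16-bit
# speed (0-65535) for each of the controller's two motors.
class XINPUT_VIBRATION(ctypes.Structure):
    _fields_ = [("wLeftMotorSpeed", ctypes.c_ushort),
                ("wRightMotorSpeed", ctypes.c_ushort)]

xinput = ctypes.windll.xinput1_4  # Windows 8+; older systems ship xinput9_1_0

def rumble(pad_index, left, right, seconds):
    """Hold each motor at a normalized 0.0-1.0 intensity, then stop."""
    state = XINPUT_VIBRATION(int(left * 65535), int(right * 65535))
    xinput.XInputSetState(pad_index, ctypes.byref(state))
    time.sleep(seconds)
    xinput.XInputSetState(pad_index, ctypes.byref(XINPUT_VIBRATION(0, 0)))

# A crude "explosion": heavy low-frequency motor on, light high-frequency tail.
rumble(0, left=1.0, right=0.25, seconds=0.3)
```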
Haptics look set to be the next big thing in our interactions with the digital world. We can cram surround-sound effects into headphones and record audio at high fidelity with our smartphones, and we can touch and pinch and zoom nearly everything that appears on a screen. But it all feels flat and lifeless, because we've put so little effort into making it anything but that. The human body has a highly sophisticated capacity for recognizing and responding to textures, vibrations, pokes, and all manner of other external forces that engage the receptors in our somatosensory system.
Improving tactile feedback in consumer technology will be crucial as we further our attempts to bridge the divide between all that is physical and tangible and all that is digital and virtual. Among those attempting to explore the possibility space is Mathias Nordvall, a cognitive scientist and game designer at Linköping University in Sweden.
Sightlence and haptic Pong
"A lot of culture today is either vision based or audio based," notes Nordvall. "You go to a concert or you listen to the radio or go to the movies." And if you lack one of those senses, you compensate with the other. Predominately visual interfaces like browsing the web are presented to visually-impaired people through screen readers that speak the text on a page, while TV shows and movies can be watched with closed captions for the hearing impaired.
One day Nordvall wondered: "What happens if you don't have access to either one of them?" How can deafblind people (people with little or no vision and little or no hearing) interact with the modern world? Many of those at the far end of the deafblind spectrum have never been able to enjoy television, the internet or video games. So Nordvall set out to see if a video game could be made that's "completely and only" based on haptic technology – a game that makes no use of graphics or audio whatsoever.
He sought to do it on the cheap, using only low-cost, off-the-shelf consumer technology, and quickly settled on Xbox 360 controllers, which have two motors in them (and, Nordvall notes, inner workings that are nearly identical to the Nintendo 64 Rumble Pak released a decade earlier, except for the addition of a second actuator). And after some thought he decided to attempt to translate an existing game, Pong, because that would ensure that any difficulty in playing stems from faults in the user interface and not in the game's ruleset.
The result does not feel at all like Pong, or at least not at first. Players hold one Xbox 360 controller in their hands, with the paddle's movement mapped to one of the joysticks, and they place a second in their lap or somewhere else on their body. Vibrations in each of the controllers indicate where the ball is, according to the following rules (sketched in code after the list):
- The gamepad in your hands will have a steady weak vibration if the ball is above your paddle, a strong vibration if the ball is below, and no vibration if they are at the same height
- The gamepad in your lap will progressively vibrate faster as the ball moves toward your paddle and slower as it moves away
- One or the other gamepad will pulse on impact with the walls or paddles, depending on what the ball bounces off
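To make that mapping concrete, here is a minimal sketch (not Nordvall's actual code) of the three rules as functions from game state to motor commands. The amplitude values, the dead zone, the y-increases-upward convention and which pad signals which bounce are all assumptions for illustration.

```python
def hand_pad_amplitude(ball_y, paddle_y, dead_zone=0.05):
    """Rule 1: weak buzz when the ball is above the paddle, strong when
    below, silence when they are level (y increases upward here)."""
    if abs(ball_y - paddle_y) < dead_zone:
        return 0.0
    return 0.3 if ball_y > paddle_y else 0.8

def lap_pad_pulse_rate(ball_x, paddle_x, field_width):
    """Rule 2: pulse faster as the ball nears your side, mapping distance
    to a rate between 1 Hz (far side) and 10 Hz (about to arrive)."""
    distance = abs(ball_x - paddle_x) / field_width  # normalized 0.0-1.0
    return 1.0 + 9.0 * (1.0 - distance)

def bounce_pulse(surface):
    """Rule 3: route one sharp pulse to a pad depending on what was hit."""
    target = "lap_pad" if surface == "wall" else "hand_pad"
    return (target, 1.0, 0.05)  # (device, amplitude, duration in seconds)
```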
It took my girlfriend and me around half an hour to get the hang of the game, as we started with the visual and sound aids turned on and then gradually eased our way into haptic-only play. That, Nordvall says, is about standard. "Whenever we test people on it, it's very hard in the beginning," he tells Gizmag. "But after like 20-25 minutes something clicks in your head and you start to realize that 'Oh wait, this signal is actually not at all the same as this other signal that I'm also feeling.' So you can start to tease those out."
Nordvall likens the pulses to a language. "Those pulses that you feel have very little correspondence to a ball bouncing off a paddle or a ball actually coming towards you," he says. "In that sense, it's more about a semantic language; here's an arbitrary signal that means that ball is coming towards you, just like it's arbitrary that we spell 'cat' c-a-t in the English language. But if you write it down on a piece of paper and show it to someone they can understand you."
Video game controllers come with a language of play and interaction that gamers have been developing for over 30 years, and people who have played games at any point during that period can learn the language relatively easily. But haptic interfaces have no comparable standard yet, and there hasn't been much experimentation with the idea of creating one. Enter Sightlence.
As an extension of the haptic Pong game, Nordvall and his colleagues have built Sightlence, a haptic editor that will be available soon and is meant to make it easier for anyone to design haptic output signals for games and other software. Nordvall likens it to a digital audio workstation, except that it's for creating haptics rather than music. You get boxes for individual signals and "samples" that you can create or import, mix and match, and then route to wherever the signal should go – an Xbox gamepad, an Android or iOS device, or perhaps a custom device that you built yourself. The editor is plugin-based, so even if there's no out-of-the-box support for a device, you can write a new plugin driver and jump into the tools.
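A plugin contract like the one Nordvall describes might look something like this hypothetical sketch – every name below is an assumption, not Sightlence's actual API: the editor mixes and sequences the signals, while a device plugin only has to expose its actuators and accept normalized amplitudes.

```python
from abc import ABC, abstractmethod

class HapticDevice(ABC):
    """Hypothetical driver-plugin contract for a Sightlence-style editor."""

    @abstractmethod
    def actuators(self):
        """Names of the motors this device exposes, e.g. ['left', 'right']."""

    @abstractmethod
    def set_amplitude(self, actuator, level):
        """Drive one actuator at a normalized 0.0 (off) to 1.0 (full) level."""

class ConsolePad(HapticDevice):
    """Stand-in driver: a real plugin would forward to XInput or a phone."""

    def actuators(self):
        return ["left_motor", "right_motor"]

    def set_amplitude(self, actuator, level):
        print(f"{actuator} -> {level:.2f}")

# The editor can then route any authored signal to any registered plugin.
pad = ConsolePad()
for actuator in pad.actuators():
    pad.set_amplitude(actuator, 0.5)
```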
The idea is that Sightlence will provide the tools for developers to establish a kind of scaffolding for haptic interfaces. "We're not used to using haptic interfaces at all," explains Nordvall. "Especially not for information that's only conveyed through haptics and nothing else."
Haptic growth
Nordvall's haptic editor could help kickstart the development of new haptic experiences – of haptic "languages" – at a moment when interest is spiking on three separate fronts: virtual reality, wearable computing, and touchscreen technology.
Sightlence may not help much with virtual reality, which leans more toward simulation in one way or another and thus could benefit from direct translation of recorded real-world forces into haptic signals. But wearables and touchscreens will need to be more abstract in their use of the technology, and that appears well-suited to a haptic editor that lets you tap or draw out signals and tweak them to your heart's content.
Apple touts its upcoming Watch as coming with a Taptic Engine that provides tactile sensations on your wrist that are "recognizably different for each kind of interaction." It will, the documentation suggests, allow users to send each other their heartbeats as measured by the heart rate sensor, and perhaps other customized haptic cues. The watch will also guide you in the right direction with a gentle buzz when you're navigating. Other smartwatches and their arrays of apps are trying similar things, and as new kinds of wearables emerge, we're likely to see haptics play a key role in how we interact with the technology we carry around with us.
Especially now that screens, too, are gaining improved haptic technology. Clever techniques such as pushing liquid into prearranged tactile pixels can provide the sensation of pressing physical buttons. Fujitsu last year showed off prototypes for haptic-enhanced tablets that go even further, offering users a feeling of texture beneath their fingers through ultrasonic vibrations of varying frequencies and amplitudes.
In the virtual reality space, haptic technology could be the missing puzzle piece that propels the likes of the Oculus Rift and Sony's Project Morpheus into the mainstream. The trouble is that virtual reality now looks and sounds so lifelike that you instinctively feel as though you're physically in the virtual space, which means you want to touch things and to feel things touching you. But without haptics (or a whole lot of props and careful planning), neither is possible.
That's why there's an arms race in the field. Control VR and PrioVR, among others, track your hands and fingers or your entire body in real time, and haptic feedback is gradually being explored in most of these systems.
Others don't worry about motion tracking, and focus only on the haptics. KOR-FX makes you feel every virtual impact – big or small – in your chest, as explosions seem to blow right through you, engines rumble deep beneath your skin, and you feel something akin to the real g-force of your car or spaceship turning.
Others still offer motion-tracked props with built-in haptic feedback, such as the Striker Virtual Recoil rifle or the Tactical Haptics Reactive Grip device.
And this is without even getting into the high-cost custom-designed haptic tools used in industrial, medical, and military simulations. Surgeons train or perform remote surgery with 3D glasses and haptic instruments, soldiers get a taste of battle using fake guns with realistic recoil and vests not unlike KOR-FX, while pilots and mining machine technicians also get tactile feedback from their training programs.
A vibrant future
In all of these applications of haptic technology, Nordvall suggests, the key is that tactile feedback be used either to augment an experience (to make it feel more real or satisfying, like subtle vibrations on a virtual keyboard or pulses when your virtual car crashes into a wall) or to replace elements of the interface entirely, such that the information haptics provide is not available through the soundscape or visually on the screen.
In a video game, for instance, "you could have some vibration telling you which way is north, so that you don't have to look at a mini-map all the time," says Nordvall. Neither method has been explored fully, and the latter especially is ripe for attention. You could have experiences that are only possible with haptics, Nordvall suggests, or reinvent existing ones, like the game of Pong or perhaps navigating through a maze.
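That north-vibration idea takes only a few lines to prototype. Below is a minimal sketch, with the motor assignment an assumed design choice: the motor on the side that north lies toward buzzes, harder the farther you are from facing it.

```python
def north_cue(heading_degrees):
    """Return (left, right) motor amplitudes for a haptic compass.
    heading_degrees: 0 = facing north, 90 = east, 180 = south, 270 = west."""
    # Signed smallest turn from the current heading to north, in -180..180;
    # negative means north is to the player's left.
    error = (-heading_degrees + 180.0) % 360.0 - 180.0
    strength = abs(error) / 180.0  # 0.0 facing north, 1.0 facing due south
    return (strength, 0.0) if error < 0 else (0.0, strength)

assert north_cue(0) == (0.0, 0.0)    # facing north: silence
assert north_cue(90) == (0.5, 0.0)   # facing east: north is to the left
assert north_cue(270) == (0.0, 0.5)  # facing west: north is to the right
```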
With recent research producing both new forms of haptic feedback (just last month Gizmag reported on holographic objects that can be seen and felt) as well as more sophisticated tactile displays and 3D maps, the future of haptics seems limited only by our imaginations. The technology is maturing rapidly, and it's making its way into the mass market on a scale worthy of developer attention. Now the interfaces that apply it need to catch up.
Imagine if both students and experts could not only see and hear simulations, but feel them as well. They would have the opportunity to physically try things that can't be attempted with real patients or real materials without risking harm or costly damage. This is already happening at the top level of many fields, and when the technology becomes cheaper and more widely available, its impact will be enormous.