Virtual and augmented reality technology has become remarkably good at fooling our brains into believing virtual objects are right in front of us. Motion tracking and direct sensors can make our virtual hands exactly mimic our real ones, allowing us to push and pull virtual objects. Unfortunately, if you did reach out to touch a virtual object, all you’d feel is thin air. When we consider the ways we sense the world around us, and virtual reality’s overall goal of replicating those experiences, it’s clear that our tactile senses are a crucial component in completing the illusion.
This is where the field of haptics comes into play. Haptics technology has the express purpose of simulating the sensation of touch through various mechanisms. This includes using touch as a feedback channel to communicate information to the user and to simulate virtual objects.
Achieving true full touch simulation is not as simple as one might think. Our sense of touch comes from a combination of different sensory organs. With our hands we can determine almost everything about an object that we could using sight (barring its colour), and we can tell things that our eyes cannot. We can judge hardness, geometry, temperature, texture and weight just by handling something. These sensory abilities are what let you locate an object in a bag amongst other objects without looking. As visually-centered creatures we often don’t stop to consider how incredible our sense of touch really is.
Although you might not know it explicitly, there’s a good chance you’ve encountered haptic technology in your daily life. Many smartphones with capacitive touch screens use vibration as a form of output. Unlike keys on a keypad that have a texture and shape, touchscreens are just flat plates of glass. The buttons that appear on them are virtual ones with no “click”, so the vibration function of the phone is used to simulate the tactile feel of buttons. Those same vibrations can be used to convey information. For example, some Android smartphones will detect when you pick them up and vibrate if there are any unread notifications for you.
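The idea of encoding information in vibrations can be sketched in a few lines. The function below is purely illustrative (not any phone vendor’s actual API): it builds a vibrate/pause pattern, in milliseconds, with one pulse per unread notification, capped so the pattern stays short.

```python
def notification_pattern(unread_count: int, pulse_ms: int = 150, gap_ms: int = 100) -> list[int]:
    """Build an alternating vibrate/pause pattern (milliseconds): one pulse
    per unread notification, capped at three so it never drones on.
    Hypothetical sketch; real phone APIs differ."""
    pulses = min(unread_count, 3)
    pattern: list[int] = []
    for i in range(pulses):
        pattern.append(pulse_ms)       # vibrate
        if i < pulses - 1:
            pattern.append(gap_ms)     # pause between pulses
    return pattern
```

Two unread notifications would yield `[150, 100, 150]` — vibrate, pause, vibrate — which a user quickly learns to read without ever looking at the screen.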
Video game controllers have long made use of haptics to enrich the gaming experience. A range of force feedback steering wheels and flight sticks give tactile feedback from the virtual game world. Active, motorized electronics make the steering wheel vibrate or jump in your hand to reflect the road surface, while a flight stick can provide the sort of “feel” you would get when flying a real helicopter or aeroplane.
Modern gamepads are also imbued with haptic feedback. You may feel the thudding steps of a monster, the kick of a firearm or the rumble of an earthquake, thanks to motors spinning offset weights inside the device.
The idea of haptics has been brought further into the mainstream thanks to new developments by Apple for its line of mobile devices and wearables. The “Taptic Engine” and “3D Touch” features make using devices such as the Apple Watch or the iPhone 6s a much more tactile experience. In the case of the watch, the Taptic Engine creates a strong tactile click along with a vibration; using these two effects, various sensations can be simulated. The 3D Touch feature on the iPhone 6s senses pressure and assigns different functions to buttons depending on how hard you press. Combined with haptic feedback, this creates the illusion of a screen with depth.
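Pressure-tiered input like this boils down to mapping a force reading onto discrete action levels. The sketch below shows the idea with made-up thresholds (these are not Apple’s actual calibration values):

```python
def press_action(pressure: float) -> str:
    """Map a normalized pressure reading (0.0-1.0) to an action tier.
    Thresholds are illustrative, not taken from any real device."""
    if pressure < 0.1:
        return "none"   # too light to register at all
    if pressure < 0.5:
        return "tap"    # an ordinary light tap
    if pressure < 0.8:
        return "peek"   # a firmer press: show a preview
    return "pop"        # the firmest press: open fully
```

Pairing each tier with a distinct haptic pulse is what makes the flat glass feel as if it has layers you can push through.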
How Do Haptics Work?
To give us sensations of solid objects or resistance, haptic devices employ a variety of technologies. They can apply force, pressure or resistance using electric actuators, pneumatics and hydraulics. Devices such as gamepads and force feedback peripherals use electric motors, but more exotic haptic systems (many of which are specific to virtual reality) use hydraulic or pneumatic solutions. For example, some data gloves both track hand motion and use air bladders that harden to restrict your grip, so you can feel an object like a ball in virtual reality. Then there are high-end solutions such as the CyberGrasp or Hiro III.
The CyberGrasp is a wearable exoskeleton that uses tendons and actuators to apply resistance to each finger individually. The Hiro III is a robotic hand that transmits touch information to the fingertips of the user.
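The per-finger resistance applied by devices like these can be understood with a simple penalty (spring) model, a common starting point in haptic rendering: while the tracked fingertip is in free air no force is applied, but once it penetrates a virtual surface the actuator pushes back in proportion to the penetration depth. A minimal sketch, with an illustrative stiffness value:

```python
def resistive_force(finger_pos: float, surface_pos: float, stiffness: float = 500.0) -> float:
    """Penalty-based haptic rendering sketch: once the fingertip passes the
    virtual surface, push back with a spring force F = k * depth.
    Positions in metres, stiffness in N/m, force in newtons."""
    depth = finger_pos - surface_pos   # penetration past the surface
    if depth <= 0.0:
        return 0.0                     # finger still in free air: no force
    return stiffness * depth           # Hooke's-law restoring force
```

A fingertip 1 cm into the surface would feel `500 * 0.01 = 5` newtons of resistance; the deeper you squeeze a virtual ball, the harder the exoskeleton pushes back, which is exactly what makes it feel solid.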
The haptic exoskeleton concept can be extended to the entire body, as can pneumatic suits or body-worn vibration packs that simulate, for example, impacts to the body.
Haptic suits, as the name implies, cover more than just the hands. Since large pneumatic, hydraulic and electromechanical haptic systems aren’t practical for mainstream use, current attempts at haptic suits use neuromuscular stimulation similar to the technology used in physical therapy. The Teslasuit intends to use this approach to add full-body touch, conveying sensations of impact, heat, cold and so on, while also providing full-body motion tracking, fulfilling several functions at once.
One major gap in the touch puzzle remains, however: the issue of texture. Interestingly, the Walt Disney Company has been working on this problem for some time and has come up with “textured” touchscreens. Rather than using a programmable material that physically changes shape, the screen uses electro-vibrations and a very clever algorithm to fool the brain into perceiving texture.
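The trick behind approaches like this is that your brain infers bumps from changing friction as your finger slides. One simplified way to model it (this is a loose illustration, not Disney’s published algorithm) is to drive the electro-vibration amplitude from the local slope of a virtual height map, scaled by finger speed, so a faster finger over a steeper virtual bump produces a stronger sensation:

```python
def texture_amplitude(heights: list[float], index: int, finger_speed: float, gain: float = 1.0) -> float:
    """Illustrative electro-vibration rendering: modulate output amplitude by
    the virtual surface's local slope times finger speed. The glass never
    moves; only the perceived friction under the sliding finger changes."""
    if index <= 0 or index >= len(heights) - 1:
        return 0.0                                         # no neighbours: treat as flat
    slope = (heights[index + 1] - heights[index - 1]) / 2  # central-difference slope
    return abs(slope) * abs(finger_speed) * gain
```

On flat stretches of the height map the amplitude drops to zero, and it peaks on the steep sides of a virtual ridge — which is roughly where a real fingertip would feel the most resistance.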
Haptics are sure to be a major part of both future virtual reality experiences and everyday consumer electronics. Wearable technology, since it is in contact with our skin, is the most likely place to deploy these solutions. Haptics is a fast-evolving area, and now that other virtual reality technologies are maturing, demand for consumer-grade haptics is likely to intensify as well.