Virtual reality is an immersive experience, but it’s missing an element of realism. Although users can see, hear, smell, and interact with the virtual world, it’s difficult to replicate touch.
That’s a missed opportunity.
Recreating the sense of touch and feel adds an additional layer of realism to a VR experience, further bridging the gap between the virtual and the real. For instance, imagine a VR experience simulating a beach. You see the sun reflecting on the water, you hear the seagulls crying, and you smell the beach-goers’ sunblock. But if the simulation’s temperature isn’t any warmer than the building you’re in, or you feel the room’s carpet on your soles instead of sand between your toes, that’s not very convincing. Would you really feel like you were at the beach?
This is where haptic technology comes in. At the consumer level, nearly everyone with a smartphone is already familiar with haptics: a light press on the screen triggers a vibration that gives the satisfying feel of pressing a button, and on recent iPhones the home “button” isn’t a mechanical button at all but a haptic-enabled touch point. Gamers have known haptic technology for even longer. Something as simple as the rumble of a controller when, say, one car hits another in a racing game is a small gesture that reinforces the experience’s illusion of reality.
But haptic tech is moving beyond the consumer level and becoming more sophisticated than controllers and vibrating buttons. Haptics is making its way into virtual reality and the enterprise community, to the point that users can feel objects that exist only in software. Everything from controllers to wearables (like the full-body Teslasuit) to VR treadmills and other peripherals is being developed at a breakneck pace. One recent wearable even “hacks” users’ muscles to make them believe they are truly feeling the digital environment.
Researchers are already studying how haptic feedback can improve accuracy and reduce task time in VR simulations. At the foundation of this research is stimulating our muscles to match the pressure, weight, temperature, and other sensations we expect when interacting with an environment. Surgical students, for example, are seeing clinically relevant improvements from haptics in VR simulators; the precision is nearly fine enough to build legitimate surgical skills. Similarly, midwifery students can simulate deliveries under conditions ranging from a nervous father-to-be in the room to a full hospital emergency.
Imagine what this technology could do for operator training on expensive equipment, hazardous situations, digital twins, and much more. The possibilities are endless, and we’ve only just begun to explore them.
The latest piece in the ever-growing VR puzzle, haptic-enabled virtual reality has the potential to be a leading enterprise technology. Airbus, the aerospace manufacturer, is already implementing a haptics system to help engineers navigate potential tool issues, an early peek into how the technology can make operations more efficient.
Product designers, for instance, will be able to get early feedback on ergonomics and the user experience. Consider a new infotainment system in a car. With haptics, designers can review how the controls feel and evaluate their overall usability before bringing the product to market. Further down the product pipeline, sales teams will be able to offer customers a more robust virtual experience before they make a purchase (as luxury carmaker Lexus is already doing), creating the potential to shorten the sales cycle.
Digital twins offer another example of haptic technology’s potential. Using digital twins, researchers can already look inside a running turbine, color-coded by working condition, heat, and other factors. Now imagine being able to feel a loose bolt or wear and tear on the engine as it runs in real time. That would let people in industries such as manufacturing and engineering truly simulate their work environment in a virtual setting, providing a hazard-free space for training.
Of course, more robust technology costs businesses more. But haptics adds a sensory layer that in many cases makes training more realistic, and therefore more effective, with a meaningful impact on workers’ productivity. Kinesthetic learning (learning by touching and doing) is one of the three major learning styles, alongside visual and auditory: some people learn by watching, others by hearing, and still others by doing. The most robust training system appeals to all three, and that’s where haptics in VR can help.
That is especially true when the work involves hazardous manual labor. One report on assembly planning, for instance, shows how haptic-enabled VR let users instantly modify features such as a virtual object’s weight, allowing them to test, efficiently and without risk, a range of scenarios they would encounter at work. Training experiences like this reinforce the correct behaviors and responses employees should have when they face similar situations on the job.
As for when these two technologies will intersect in the everyday workforce, the potential already lies in our hands. The examples above show that while the technology is not yet at peak performance (gestures such as handshakes still need perfecting), it is advanced enough to be incorporated into a business’s operations, be it for training, product design, sales, or something else.
While virtual reality has not yet realized its potential in disrupting the traditional flat screen with consumers, its true purpose is as an enterprise disruptor. Investments in VR aren’t tied to entertainment issues such as whether the PlayStation VR outsells the Nintendo Switch; success in VR is about creating new skills and opportunities in the tech marketplace as a whole. Everything from training and interfacing with machines to shopping online and in person will one day be done virtually.
But in order for that experience to realize its potential without unnecessary stumbling, tech entities must tackle that missing element so that users can truly “feel” the virtual world around them — and that’s where haptics takes over.
K.R. Sanjiv is the chief technology officer for Wipro Limited, a global information technology, consulting, and business process services company, where he’s overseen the development of Wipro HOLMES™, the company’s proprietary AI system.