With the extended reality (XR) revolution already underway, it’s easy to envision a future in which the lines between the real world and the virtual world become even more blurred than they are today. In this article, I look at the technological advances coming our way in virtual reality (VR) and augmented reality (AR) and what these might mean for everyday life in the future.
Rapid XR advances are on the horizon
In the future, it’s likely we’ll experience XR in ways we can’t yet imagine. But, for now, there are plenty of imminent tech advances to look forward to. We’ll have faster, lighter, more affordable VR technology, while advances in smartphone technology (such as better cameras and processors) will deliver slicker AR and VR experiences on our phones. And with 5G wireless networks, we’ll be able to enjoy those experiences wherever we are in the world.
Here are some of the key advances in XR tech that are just around the corner:
- LiDAR will bring more realistic AR creations to our phones. The iPhone 12 Pro and iPad Pro are now equipped with LiDAR technology, and it’s reasonable to expect other devices will follow suit in due course. LiDAR (Light Detection and Ranging) uses laser pulses to measure distances and build a 3D map of the surroundings, which can seriously boost a device’s AR capabilities. It gives AR creations a genuine sense of depth, rather than leaving them looking like flat graphics. It also enables occlusion, where a real physical object positioned in front of an AR object blocks the view of it – for example, people’s legs blocking out a Pokémon GO character on the street. This is vital for making AR creations appear rooted in the real world and for avoiding clunky AR experiences.
- VR headsets will get smaller, lighter, and incorporate more features. Hand tracking and eye tracking are two prominent examples of the built-in technology that will increasingly be incorporated into VR headsets. Because hand tracking allows VR users to control movements without clunky controllers, users can be more expressive in VR and connect with their game or VR experience on a deeper level. And the inclusion of eye-tracking technology allows the system to render the highest resolution and image quality only on the parts of the image the user is actually looking at, much as the human eye does – a technique known as foveated rendering. This taxes the system less, reduces lag, and lowers the risk of nausea.
- We’ll have new XR accessories to deepen the experience further. One of my favourite examples is robotic boots. Startup Ekto VR has created wearable robotic boots that provide the sensation of walking, to match your movement in the headset, even though you’re actually standing still. The Ekto One robotic boots look a bit like futuristic roller skates except, instead of wheels, they have rotating discs on the bottom, which move to match the direction of the wearer’s movements. In the future, accessories like this may be considered a normal part of the VR experience.
- We’ll even have full-body haptic suits. We already have things like haptic gloves, which simulate the feeling of touch through vibrations. But what about full-body suits? In fact, they’re already available – the TESLASUIT being one example – though they aren’t exactly affordable for everyday VR users. In time, they will probably become more affordable, mainstream, and effective, providing yet another leap forward for VR.
Merging the human body with XR technologies?
Looking beyond these external accessories and devices, we may see XR technologies begin to integrate more seamlessly with the human body. One way is through AR contact lenses. While it’s true that AR glasses will get better, cheaper, and more comfortable, in the future they may also become obsolete as AR lenses take over. Such lenses are already in development; in 2020, California-based startup Mojo Vision revealed it was developing AR contact lenses with micro-LED displays that place information inside the wearer’s eyes.
Imagine the uses for such AR lenses. For now, Mojo says its first priority is to help people struggling with poor vision (by providing better contrast or the ability to zoom in on objects). But the intention is that the lenses will eventually be made available to everyday consumers, and could be used to project things like health-tracking stats and other useful data. Indeed, when demonstrating the prototype to journalists, the lenses displayed pre-loaded information like text messages and the weather report, indicating that AR lenses could help us consume content in new ways. The lenses could also enhance our sight in low-light conditions (even if our vision is otherwise trouble-free), or even serve as a teleprompter for speaking events.
Eventually, AR lenses could potentially be used to augment the world around us, so we could see whatever we wanted. Let’s say you hate the garish paint job your neighbours have done on the exterior of their home. In the future, your lenses could change it for you, and you’ll see whatever colour house you choose. Or let’s say you see an impressive building and want to know who designed it and when it was built. Your lenses could overlay the information directly in front of your eyes. All of which would further blur the boundaries between the real world and the virtual one.
Keeping the benefits of XR in mind
It would be easy to paint all this with a dystopian flair – a slippery slope that starts with playing Pokémon GO and ends up with humans permanently wired up to a virtual world. But I feel hugely positive about the future of XR. At the end of the day, XR is about turning information into experiences, and this can make so many aspects of our lives richer and more fulfilling.
Yes, there are pitfalls to overcome (individual privacy, ethics, and so on). Yet the potential benefits of XR far outweigh the challenges. For businesses in particular, XR offers huge scope to drive success, whether that means engaging more deeply with customers, creating immersive training solutions, streamlining processes such as manufacturing and maintenance, or generally offering customers innovative solutions to their problems.