A recent report by Accenture highlighted the growing number of companies taking advantage of “extended reality” (XR) technology, which includes virtual and augmented reality (VR/AR).
Titled ‘Waking Up to a New Reality: Building a Responsible Future for Immersive Technologies,’ the report highlights the need to consider not just the opportunities these technologies present but also the dangers they could pose. The authors argue that as well as threats to privacy, these could include risks to mental health.
Like other technologies driving the current phase of digital revolution – such as artificial intelligence and cloud computing – XR experiences provide value by gathering and interpreting data. With XR, however, the data is of the most personal variety imaginable.
When using AR applications in the real world, we can expose vast amounts of data about ourselves. Highly popular AR apps of the type that power the filters used by Snapchat and Facebook can gather intimate biometric data when we show them our face. This includes facial expressions, speech data, and even retina patterns that can be used to uniquely identify us.
Other AR apps – those that involve headsets such as Google Glass or Microsoft’s HoloLens – have the potential to capture everything we are seeing and hearing.
Is every parent who lets their child play freely with these fun applications fully aware of the implications? It’s highly unlikely – and while social media companies and smartphone app publishers may do their best to assure us everything is secure and our privacy is respected, there’s no way of knowing for sure who is accessing this information and what they are doing with it.
It’s an unfortunate fact that data breaches are an increasingly common occurrence, and even the seemingly most secure platforms are at risk. Facebook – which purchased the VR company Oculus and whose social media platform Instagram makes use of AR – is a significant player in this field. Last year, 50 million of its users’ accounts were compromised through a single attack.
I spoke to Rori Duboff, head of content innovation at Accenture Interactive, who told me “With AR applications like the cosmetics apps that let you try on any shade of lipstick, people think ‘hey this is pretty exciting’ – and suddenly you’re letting people see your whole face … so the question becomes, what data is stored?
“Should the data, when it becomes that personal and sensitive, just be used to optimise and enhance the current experience at hand, or does it get stored anywhere? Figuring that out is very important because I think people will share this data as long as they feel they are getting value.”
This certainly fits in with other patterns of acceptance we’ve seen developing around technology in general. How many of us who initially turned off location services on our phones, because we felt uneasy about our location being constantly exposed, now happily leave them running because it makes it so much easier to call a taxi, get directions or find a nearby restaurant?
When we use VR, we create and often share data about our behaviour and movement in virtual environments that could one day be used to imitate us and steal our virtual identities or real-world assets. While most people’s experience of VR today is still within the realm of gaming – generally offering solo experiences within controlled arenas – work is ongoing on creating online, shared VR spaces for socialising.
When the avatars we develop to represent ourselves within these virtual spaces inevitably become tied to our real-life personas – to allow us to make payments, for example – there will be even more opportunity for those with malicious intent. On top of this, VR and AR are increasingly commonplace in industry, where they are used for training as well as monitoring on-the-job performance, further blurring the boundaries between the virtual and “real” worlds.
Duboff says, “There’s a prevailing sense that these technologies are still just in the gaming world … at a consumer level it’s some kids fooling around with image filters, and at an enterprise level, most people are not seeing what’s happening day-to-day inside companies, where a lot of this growth is happening. Most people are not even aware that when you turn on the phone and give it camera permission, and it’s putting something over them or their kids – that’s AR.”
Another risk lies in the potential damage that frequent and prolonged exposure to XR environments could do to our mental health. Unfortunately, there has been very little research done in this field. But we know that internet addiction has become a very real problem for a small number of people, and widespread adoption of social media, particularly among young people, has caused its own set of problems.
Could the merging of social media and XR – which potentially means far more immersive experiences and the possibility of sharing more of our lives online – exacerbate the situation for people who use the internet as a refuge from reality, or define their worth by the number of “likes” and “follows” they can attract?
And what about the dangers of online radicalisation? A growing body of evidence shows that the content a person is exposed to online can contribute to fostering extremist political views or even terrorist actions.
“There’s a subtle, subversive world that’s being unlocked through AR,” Duboff tells me. “We see the world in a one-dimensional way now – what’s in front of us. But once we start having these XR technologies, the world around us becomes much more complicated.
“You can go to this location, or this party, and turn on your phone and uncover things, and a lot of it is fun and exciting – but what if there are things that you don’t want to see?”
There’s certainly a danger here of sounding like I am scare-mongering – my regular readers will know that generally, I think technology is a wonderful thing that can have a very positive, transformative effect across not just business but wider society, too. But this optimism should always be tempered by giving careful consideration to the risks, as well as the opportunities.
What’s important is that as these technologies are developed, we ensure they are introduced with due consideration for the impact they could have on our lives and wellbeing – and that this isn’t done merely as an afterthought to mitigate the potential damage, but is baked in as an integral part of the service.
Ultimately it will be down to the tech companies themselves to ensure that this is done responsibly so that new developments such as XR and AI don’t cause new problems for society, even as they solve old ones.
The full report, Waking Up to a New Reality: Building a Responsible Future for Immersive Technologies, can be read here.