Experts conclude that more attention – or at least some consideration – needs to be given to protecting privacy in the promised metaverse of connected 3D virtual reality worlds.
In a paper distributed via ArXiv entitled "Exploring the Unprecedented Privacy Risks of the Metaverse", experts from UC Berkeley in the US and the Technical University of Munich in Germany used an "escape room" virtual reality (VR) game to better understand how much data a potential attacker could access.
Through a 30-person study of VR usage, the researchers – Vivek Nair (UCB), Gonzalo Munilla Garrido (TUM) and Dawn Song (UCB) – created a framework for assessing and analyzing potential privacy threats. They identified more than 25 examples of private data attributes available to potential attackers, some of which would be difficult or impossible to obtain using traditional mobile or web applications.
The wealth of information available through Augmented Reality (AR) and VR hardware and software has been well known for years. For example, a 2012 article in New Scientist described Ingress, an AR game from Google spin-off Niantic Labs, as a "data gold mine". That's why data-monetization firms like Meta are willing to invest billions to make the market for head-mounted hardware and AR/VR apps more than just a curiosity for enthusiasts.
Similarly, the trust and security issues of online social interaction have plagued online services since the days of dial-up modems and bulletin boards, before web browsers even existed. And now, as Apple, Google, Microsoft, Meta, and other players see an opportunity to reinvent Second Life under their own watch, management consultancies are once again reminding their clients that privacy will be an issue.
“Advanced technologies, particularly in VR headsets and smart glasses, will track behavioral and biometric information at record volumes,” explains the Everest Group in its most recent report, Taming the Hydra: Trust and Safety in the Metaverse.
“Currently, digital technologies can collect data about facial expressions, hand movements and gestures. Therefore, personal and sensitive information that will permeate the metaverse in the future will include real-world information about user habits and physiological traits.”
Data protection is not the only unsolved metaverse problem; hardware security also leaves a lot to be desired. A related recent study of AR/VR hardware, "Security and Privacy Evaluation of Popular Augmented and Virtual Reality Technologies," found that vendor websites are rife with potential security vulnerabilities, that their hardware and software do not support multifactor authentication, and that their privacy policies are obtuse.
The Escape Room study lists the specific data points available to attackers of various types – hardware, client, server and user adversaries. It’s worth noting that “attackers,” as defined by the researchers, includes not only external threat actors, but also attendees and the companies running the show.
Potential data points identified by the researchers include: geospatial telemetry (height, arm length, eye relief, and room dimensions); device specifications (refresh rate, tracking rate, resolution, field of view, GPU, and CPU); network (bandwidth, proximity); and behavioral observations (languages, handedness, voice, reaction time, near vision, distance vision, color vision, cognitive acuity, and fitness).
Various inferences can be drawn from these metrics about a VR participant’s gender, wealth, ethnicity, age, and disabilities.
“The alarming accuracy and stealth of these attacks, and the foray into metaverse technologies by data-hungry enterprises, suggest that data collection and inference practices in VR environments will soon become ubiquitous in our daily lives,” the paper concludes.
"We would like to start by saying that these 'attacks' are theoretical and we have no evidence that anyone is actually using them at this time, although it would be quite difficult to know if they were," Nair and Munilla Garrido wrote in an email to The Register. "Also, we use 'attacks' as a term, but in reality, if this data collection were happening, the consent would probably have been buried in an agreement somewhere and be technically completely above board."
If a company wants to collect data, they could get a lot more information about users in VR than in mobile apps… In this context, a move to VR would make perfect sense
However, the two researchers say there is reason to believe that companies investing in the metaverse are doing so, at least in part, in the expectation that post-sales advertising will offset losses such as the $12.5 billion Meta's Reality Labs group spent last year to earn just $2.3 billion in revenue.
"Assuming a company of this size knows how to calculate a BOM, this loss-making approach has to be a strategic decision that they believe will ultimately pay off," argued Nair and Munilla Garrido. "And when we look at who these companies are and what revenue methods they've already perfected, we suspect it will be at least somewhat tempting to use the same methods to offset hardware losses. But this too is speculative."
“All of our research shows that if a company wanted to do data harvesting, they could get a lot more information about users in VR than, for example, from mobile apps, and that a move to VR would make perfect sense in this context.”
When asked if existing privacy rules adequately address the collection of Metaverse data, the two eggheads responded that they believe so, except when those rules only pertain to mobile apps.
“But we have a unique challenge with metaverse apps as there is a plausible reason to push that data to central servers,” they explained. “Basically, Metaverse applications work by tracking all of your body movements and streaming all of that data to a server so that a representation of yourself can be rendered for other users around the world.
“For example, while a company would have trouble arguing that tracking your movements is required for their mobile app, it is actually an integral part of the Metaverse experience! And at this point it’s much easier to argue that logs about it are needed for troubleshooting and so on. In theory, even if the same privacy laws apply, they could be interpreted in dramatically different ways because the fundamental data requirements of the platform are so different.”
Nair and Munilla Garrido acknowledged that some of the approximately 25 collectible attributes they identified in their research may be obtainable through cell phones or other online interactions. But Metaverse apps provide a one-stop shop for data.
“We have a situation where all of these categories of information can be gathered at once in a matter of minutes,” they explained.
"And because you need to combine multiple attributes to make inferences (e.g., height and voice to infer gender), having all of these data collection methods in the same place at the same time makes VR a unique risk in terms of being able to very accurately infer user data attributes."
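The combination risk the researchers describe can be sketched as a toy rule-based classifier. Everything below is illustrative: the field names and thresholds are invented for this sketch, and the study's actual inference models are statistical rather than simple cut-offs. The point is only that two individually ambiguous signals (height, voice pitch) become far more revealing when collected together.

```python
def infer_profile(telemetry: dict) -> dict:
    """Toy illustration of attribute inference from combined VR telemetry.

    Thresholds and keys are hypothetical, chosen only to show how
    co-located measurements compound into demographic guesses.
    """
    inferred = {}

    # Height alone is ambiguous, and so is voice pitch; combined,
    # the guess becomes much more confident -- the "same place, same
    # time" risk the researchers highlight.
    if telemetry["height_m"] > 1.75 and telemetry["voice_pitch_hz"] < 150:
        inferred["gender_guess"] = "male"
    elif telemetry["height_m"] < 1.65 and telemetry["voice_pitch_hz"] > 180:
        inferred["gender_guess"] = "female"
    else:
        inferred["gender_guess"] = "uncertain"

    # Slow reaction time plus reduced near vision might hint at age.
    if telemetry["reaction_ms"] > 350 and telemetry["near_vision_ok"] is False:
        inferred["age_guess"] = "older adult"

    return inferred
```

For instance, a session reporting a 1.82 m tracked height and a 110 Hz voice pitch would yield a confident gender guess that neither measurement supports on its own.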
The sheer amount of information available in the metaverse is enough to de-anonymize any VR user, they claimed; this, they argue, is not the case with conventional apps or websites.
The purpose of their paper, they told The Register, is to shed light on the far-reaching privacy risks of AR/VR and to encourage other researchers to seek solutions.
Screenshot of MetaGuard in the VR world
They already have one thing in mind: a plugin for the Unity game engine called MetaGuard. The name makes the source of the privacy threat clear.
"Think of it as 'incognito mode for VR,'" wrote Nair and Munilla Garrido. "It works by adding noise to certain VR tracking measurements using a statistical technique known as differential privacy, making them no longer accurate enough to identify users but without significantly degrading the user experience. Like incognito mode in browsers, it's something users can toggle on and off and customize as they see fit, depending on their environment and trust level."
We hope that privacy in the Metaverse will be that easy. ®