The metaverse as a concept has gained traction in the real world, with individuals and organisations engaging within its digital boundaries. This makes it important to ask tough questions about privacy, security and legal expectations.

Despite the recent reversal of its hype, the metaverse has moved from a quirky suggestion made by an out-of-touch billionaire to a promising environment expected to attract significant investment and growth over the next few years. As the technology that enables and empowers the metaverse becomes more accessible and cost-effective, the innovations that thrive within it become more engaging and immersive.
According to research by Citi, the market for the metaverse could be valued at anywhere from $8 trillion to $13 trillion within the next seven years. It is a view shared by McKinsey, though its estimate is somewhat more conservative at a “mere” $5 trillion by 2030. It is, as the firm accurately points out, the very real business of the virtual world.
“It is a space that has immense potential across different users, platforms, markets and industries, and is the next evolutionary step up from the current Web2 internet,” says Anna Collard, SVP Content Strategy & Evangelist at KnowBe4 AFRICA.
“However, it is as important to look at the security safeguards and privacy laws that shape our digital lives today and tomorrow as it is to look at the technologies that drive it. There are a lot of questions about how people can protect their privacy in a virtual world. Which jurisdictions apply in a virtual world where citizens from different countries interact with each other in an immersive fashion? How will different data protection regulations step in to protect information and individuals? We have not even solved these challenges on the current internet, and we will face new ones that come with the advancement of new technologies.”
Regulation and protection
It is a hazy space. Should data protection regulation recognise the metaverse as a singular country of its own, governed by its own laws and regulations? How do legal firms unpack this complexity amidst a raft of unknowns? According to OneTrust DataGuidance, a regulatory research platform, the data-driven nature of the metaverse introduces risk around the processing of personal data. It also notes that the application of regulations such as the GDPR depends on the location of the end user – not their residence or citizenship. But what is the location within the metaverse? Is it the physical location of the end user, the location of their avatar or the location of the relevant server?
It is a view shared by the World Economic Forum (WEF), which recently discussed the importance of protecting virtual reality (VR) user privacy. In an article, the organisation asked what would happen if the data gathered from VR experiences and behaviours were used to determine an individual’s financial future.
If, for example, data from a game were sold to a financial firm that then denied a person access to a policy based on certain tracking points within that data, this would violate the person’s privacy and have a serious impact on their future – especially considering that data collected within the VR realm has been used to identify individuals with 90% accuracy, even when the data is anonymised.
“It is a truly concerning situation, especially for individuals who may find out deeply personal information or discover painful health realities without any kind of medical diagnosis or support,” says Collard. “Suddenly, they are wrenched from this immersive world into the real world and there are no protections in place to support them or to challenge the decisions made by big businesses on entirely unethical foundations.”
The ramifications extend into advertising and even more nefarious use cases within the metaverse: information collected about individual behaviours in this environment could be sold on to target advertising or misinformation campaigns more effectively.
What people like, and even what they think, in their virtual world could be obtained and analysed from their bio-neuro feedback markers. The decisions they make could be forecast and persuasively influenced, or even manipulated, by interactive and generative AI that responds to this input in real time.
“This goes beyond what we can do today and has the potential to manipulate people so extensively that we are, even now, unable to fully assess the impact of the damage. I am especially concerned about the ethical questions around the use of generative and multimodal AI in online advertising. Conversational chatbots powered by multimodal machine learning can react in real time to the input they receive and use it to persuade users towards the advertiser’s or political party’s goals, which may or may not be in the consumer’s best interest,” concludes Collard.
“We need to rigorously define privacy within the metaverse and how information is collected and sold. We need to protect our children and our families, and prevent platform providers from taking whatever they need to turn a profit. While the metaverse is rich with potential and incredible experiences, all of this can be tarnished if the data collected adversely affects people’s lives.”