Future-proofing the Metaverse
Can we future-proof the metaverse? It is possible, but only through whole-of-society participation. This means individuals, families, institutions (private, public, and non-profit), and governments working together across borders. Consider this: there are over 800 AI policies and strategies worldwide, yet we still seem unsure of how the opportunities we are presented with can be governed for the greater good. Last year, ChatGPT grabbed the spotlight for governance and market exploitation. This year, the metaverse is back (with Apple Vision Pro and the recent Mark Zuckerberg interview with Lex Fridman). Estimates of the expected market size of the metaverse vary widely: US$800 billion, US$5 trillion, or as high as US$8-13 trillion. The current world economy is about US$100 trillion and may reach US$150-200 trillion by 2030. These projections put the metaverse (an AI-driven technology) at around the US$13-15 trillion mark, overlapping with much of AI technology[1], which is forecast to hit US$1,345 billion by 2030.
What??? The metaverse could be worth more than AI? How come?
First of all, these predictions are speculative. The problem stems from the fact that the industry spills across multiple sectors. Take the humble silicon chip, a key component of AI: the latest estimate is that it is embedded in 169 industries. AI is hardware, software, and data - each a giant industry in its own right - so I would accept the higher valuation (10% of total global GDP). That, in my opinion, would be conservative. With the internet of all things, we will require more standards for interoperability and more data deal flows, and there will be greater adoption across sectors to feed these hungry platforms.
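As a rough back-of-envelope check (using only the figures quoted above - the GDP projection and the 10% benchmark are this post's own assumptions, not a forecast), here is how the various metaverse estimates compare as shares of current and projected global GDP:

```python
# Back-of-envelope comparison (illustrative only) of the metaverse market
# estimates quoted above against current and projected global GDP.
# All figures are in US$ trillions and are the ones cited in this post.

metaverse_estimates = {"low": 0.8, "mid": 5.0, "high": 13.0}
gdp_today = 100.0                 # approximate current world economy
gdp_projection = (150.0, 200.0)   # projected range cited above

for label, size in metaverse_estimates.items():
    share_today = size / gdp_today * 100
    share_low = size / gdp_projection[1] * 100   # against the larger projected economy
    share_high = size / gdp_projection[0] * 100  # against the smaller projected economy
    print(f"{label:>4}: US${size}T = {share_today:.1f}% of GDP today, "
          f"{share_low:.1f}-{share_high:.1f}% of projected GDP")
```

Even the highest estimate sits below the 10% benchmark, which is why I read that figure as conservative once spillovers across sectors are counted.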
Second, there is no single definition of AI, let alone of the metaverse. This lack of a common understanding is a barrier, because we are unable to capture what we mean by the metaverse. In gaming virtual worlds, for example, predictive analytics is commonly used to manage latency (the lag in the transmission of data signals). We also need prescriptive analytics and generative AI to generate customer experiences, a core part of the metaverse. So AI feeds into the metaverse.
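To make the latency point concrete, here is a minimal, purely illustrative sketch - not any platform's actual implementation - of predictive analytics applied to latency: an exponential moving average forecasts the next round-trip time so a client can act before the lag becomes visible. The threshold and sample values are hypothetical.

```python
# Illustrative sketch: predicting latency in a virtual world with an
# exponential moving average (EMA) so the client can adapt pre-emptively.

def predict_next_latency(samples_ms, alpha=0.3):
    """Forecast the next latency value from recent round-trip-time samples."""
    estimate = samples_ms[0]
    for sample in samples_ms[1:]:
        estimate = alpha * sample + (1 - alpha) * estimate  # EMA update
    return estimate

recent_rtt_ms = [42, 45, 51, 48, 60, 72]   # hypothetical measurements
forecast = predict_next_latency(recent_rtt_ms)

# A client might use the forecast to act before lag becomes noticeable,
# e.g. lowering streaming quality or pre-fetching assets.
if forecast > 55:
    print(f"Predicted latency {forecast:.0f} ms - degrade quality / pre-fetch assets")
else:
    print(f"Predicted latency {forecast:.0f} ms - no action needed")
```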
Governing the Metaverse
The challenge in governing the metaverse is that disparate technologies come together in a global marketplace across industries. Many of these technologies begin as open source, crowd-sourced, or in the hands of the private sector. Governments cannot predict when and what will emerge (since the private sector rarely publishes its research in peer-reviewed journals), and once these systems are released, market adoption determines whether regulations are needed. Keep in mind that chatbots were classified as limited-risk in the EU AI Act until ChatGPT came out.
Hence, the first item on the agenda must be to understand what the metaverse is and is not (and, for that matter, AI) if we are to govern it better. Industry estimates suggest the full realization of the metaverse concept will take about 5-10 years.
Second, since the time lag between regulation and adoption is vast, we must encourage the private sector to self-govern. This means incentivizing companies to do the right thing rather than focusing only on penalties. Take tax havens, for example: more than US$1 trillion is “parked” offshore to avoid taxes, which companies evidently treat as a penalty.
Third, we need transparency, since the metaverse and AI run on data (and energy, water, and resources). What happens to data when an app or a system shuts down? How do you know the data collected has been destroyed? What about an M&A that was not disclosed before you signed up for the app or software? Why should future M&As be covered by default in a blanket legal contract over personal data signed now? What is the energy consumption of these systems? We need not just the offset figures but the actual energy, resource, and water consumption. Currently, much of this is self-disclosed or not transparent at all. For example, did you know that Google’s data centers consumed 5.6 billion gallons of water in 2022 (Google 2023)? “Clean” or renewable energy is often not so clean, carrying a toxic environmental footprint before operation and at end-of-life (e.g., solar and wind). An offset number or a phrase like “carbon neutral” may be misleading: a seedling needs 5-10 years before it can absorb carbon. What common standards need to evolve, and are they inclusive (with the Global North and South in dialogue)?
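For a sense of scale of that disclosed water figure, here is a quick unit conversion using the standard US gallon-to-liter factor; the daily average is a simple division of mine, not a number Google reports.

```python
# Quick unit check on the water figure cited above (Google 2023: 5.6 billion
# gallons consumed by Google's data centers in 2022).

US_GALLON_IN_LITERS = 3.78541

gallons_2022 = 5.6e9
liters_2022 = gallons_2022 * US_GALLON_IN_LITERS
liters_per_day = liters_2022 / 365

print(f"{liters_2022 / 1e9:.1f} billion liters in 2022")        # ~21.2 billion liters
print(f"~{liters_per_day / 1e6:.0f} million liters per day")    # ~58 million liters/day
```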
Fourth, we need societal education - of parents, caregivers, educators, and policymakers. Somehow, this has become the task of popular media or consultants. As the equipment needed to onboard into the metaverse becomes cheaper, facilitating large-scale adoption, we must make sure that the people who sell and purchase this equipment (and give it as a gift, or require employees to use it) understand the consequences of these choices (the harvesting of data), and that human agency is not lost, especially in a gamified environment.
Given the diversity of intents (public and private sector) at the global level, we have a Herculean task ahead. What can be used for good can just as easily be used by other actors for not-so-good purposes.
More blogs: www.melodena.com
[1] See PwC analysis for AI: https://www.pwc.com/gx/en/issues/data-and-analytics/publications/artificial-intelligence-study.html