The Layers of the Metaverse
Let’s talk about the value chain of the Metaverse: the different layers that add value and make it worth the consumer’s while to interact, from the experiences consumers want down to the supporting technology that makes them possible. More significantly, let’s talk about the vision for a future Metaverse based on decentralization and driven by creators.
Whether the future that emerges is one that delivers the widest range of experiences, driven by artists who can make a livelihood doing it, or one characterized by the next wave of gatekeepers and rent-takers, will be determined by the investments and decisions made today.
We are currently heading in the direction of the former, and I hope the trend continues. Here are the seven layers of the Metaverse:
- Experience
- Discovery
- Creator economy
- Spatial computing
- Decentralization
- Human interface
- Infrastructure
Many people imagine the Metaverse as a three-dimensional realm that will surround us. But the Metaverse is about the relentless dematerialization of physical space, distance, and objects; it is not only 3D, or 2D, or even necessarily graphical. It includes 3D games such as Fortnite on consoles, Beat Saber on VR headsets, and Roblox on PCs. Alexa in your house, Clubhouse on your phone, and Zoom when you work remotely are all part of it.
What happens when physical space is dematerialized? Experiences that were previously scarce may become abundant. Games show the way forward: in a game you can picture yourself as a rock star, a Jedi, a racing driver, or anything else you can imagine. Consider what happens when this is applied to more everyday situations. A physical concert can sell only a few front-row seats, but a virtual concert can create a tailored plane of existence around each individual, ensuring that everyone has the best seat in the house. Cue Fox’s Alter Ego, a singing competition in which contestants perform as avatars.
Music concerts and immersive theatre, which have previously appeared in Fortnite, Roblox, and Rec Room, will develop to include more events that are informed by live entertainment. Social entertainment will be added to esports and online communities. Meanwhile, established businesses such as tourism, education, and live performance will be remade to fit the virtual economy of abundance and game-thinking.
Customers are no longer just consumers of content; they are also its creators and amplifiers. In the past, “user-generated content” described commonplace services like blog comments or video uploads. Now content doesn’t only come from deliberate acts of creation; it also arises from people’s interactions and flows into the substance of their communities’ conversations. Content creates more content, events, and social engagement in a virtuous flywheel. When we talk about “immersion” in the future, we’ll mean social immersion as well as immersion inside a graphical space or a story-world, and how it inspires engagement and drives more content.
The discovery layer is the push and pull that introduces people to new experiences. It is a huge ecosystem, and one of the most profitable for many companies, including some of the world’s biggest. Most discovery systems can be characterized as either inbound (the person is actively seeking information) or outbound (marketing the person did not specifically ask for).
Inbound discovery includes:

- Real-time presence
- Community-driven content
- App stores (along with reviews, rating systems, and categorization/tagging)
- Curation, via featured application listings in stores, taste-makers, and “influencers”
- Search engines
- Earned media

Outbound discovery includes:

- Display advertising
- Spam (email, LinkedIn, Discord)
- Notifications
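The inbound/outbound split can be modeled as a tiny taxonomy over channels. A minimal illustrative sketch (channel names taken from the lists above; everything else is hypothetical):

```python
from enum import Enum

class Direction(Enum):
    INBOUND = "inbound"    # the person seeks out the content
    OUTBOUND = "outbound"  # the content seeks out the person

# Illustrative registry of the discovery channels listed above.
DISCOVERY_CHANNELS = {
    "real-time presence": Direction.INBOUND,
    "community-driven content": Direction.INBOUND,
    "app stores": Direction.INBOUND,
    "curation": Direction.INBOUND,
    "search engines": Direction.INBOUND,
    "earned media": Direction.INBOUND,
    "display advertising": Direction.OUTBOUND,
    "spam": Direction.OUTBOUND,
    "notifications": Direction.OUTBOUND,
}

def channels(direction: Direction) -> list[str]:
    """Return all channels flowing in the given direction."""
    return [name for name, d in DISCOVERY_CHANNELS.items() if d is direction]

print(channels(Direction.OUTBOUND))
```

The point of the classification is practical: inbound channels reward investment in content and community, while outbound channels are paid for per impression.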
The internet is already familiar with most of the elements mentioned above. Let’s concentrate on what is more relevant to the Metaverse.
First, unlike other types of promotion, community-driven content is a considerably more cost-effective source of discovery. People spread the word when they are passionate about the content or the events they attend. The content itself becomes a marketing asset as it gets easier to trade, barter, and share across Metaverse contexts. The relative ease with which content can be listed on decentralized exchanges, and the economics that encourage more direct creator-community participation, are two developments already underway. Content marketplaces will emerge as a viable alternative to application stores.
Real-time presence features are a form of community surfacing. Rather than focusing on what people like, they focus on what people are doing right now. This matters especially in a Metaverse where so much of the value comes from sharing experiences with friends.
Real-time presence is already well used inside the walled gardens of particular platforms: sign in to Steam, Battle.net, Xbox, or PlayStation and you can see what games your friends are playing right now. Clubhouse demonstrates the potential of this structure outside of games: choosing which room to join is driven primarily by the list of people you follow.
One of the most exciting options for producers is real-time presence recognition that encompasses a wide range of activities in the Metaverse. Discord has a presence-detection SDK that works in a variety of gaming contexts; if that (or something similar) becomes more widely accepted and visible, we’ll see a shift away from asynchronous “social networking” and toward real-time “social participation.” Experiences that provide community leaders with the tools they need to start activities that people want to participate in will pave the way.
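As a thought experiment, presence can be reduced to a small registry that maps users to what they are doing right now. A minimal in-memory sketch (all names are hypothetical; this is not Discord’s actual SDK):

```python
import time
from dataclasses import dataclass, field

@dataclass
class Presence:
    """What a user is doing right now, and since when."""
    activity: str
    since: float = field(default_factory=time.time)

class PresenceRegistry:
    """Toy in-memory presence service: who is doing what, right now."""

    def __init__(self) -> None:
        self._presence: dict[str, Presence] = {}

    def update(self, user: str, activity: str) -> None:
        self._presence[user] = Presence(activity)

    def clear(self, user: str) -> None:
        self._presence.pop(user, None)

    def friends_in(self, activity: str, friends: set[str]) -> list[str]:
        """Which of my friends are in this activity right now?"""
        return [u for u in friends
                if u in self._presence and self._presence[u].activity == activity]

registry = PresenceRegistry()
registry.update("alice", "Beat Saber")
registry.update("bob", "Fortnite")
registry.update("carol", "Fortnite")
print(registry.friends_in("Fortnite", {"alice", "bob", "carol"}))
```

The shift the text describes is from querying an archive of past posts (“social networking”) to querying a live table like this one (“social participation”): the answer is only meaningful at the moment you ask.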
Not only are Metaverse experiences becoming more immersive, social, and real-time; the number of creators who build them is also growing exponentially. This layer houses all of the technology that creators use every day to craft the experiences consumers enjoy.
Whether in the Metaverse, gaming, online development, or e-commerce, previous creator economies followed a predictable pattern:
Pioneer Era: The first people to build experiences on a new technology have no tools, so they start from scratch. The earliest websites were hand-written in HTML, early e-commerce sites required hand-built shopping carts, and early game programmers wrote directly to the graphics hardware.
Engineering Era: Following early successes in a creative market, the number of people on teams explodes. Building from scratch becomes too slow and costly to meet demand, and workflows grow more complicated. Early tooling in a market tends to relieve overworked developers with SDKs and middleware that save them time. Ruby on Rails, for example, made it easier to build data-driven websites (along with a slew of other application-server stacks). In games, graphics libraries such as OpenGL and DirectX let programmers produce 3D graphics without writing much low-level code.
Creator Era: Designers and creators don’t want coding bottlenecks to slow them down, and developers would rather contribute their skills to a project’s distinctive qualities. The number of creators is growing dramatically in this era. Creators get content-creation tools, templates, and marketplaces that reorient development from a bottom-up, code-centered approach to a top-down, creatively focused one.
You can now launch an e-commerce website with Shopify in minutes without understanding a single line of code. Wix and Squarespace make it just as easy to build and maintain websites. In game engines like Unity and Unreal, 3D experiences can be built through visual interfaces in their editors, without ever touching the lower-level rendering APIs.
The Metaverse’s experiences will become ever more dynamic, social, and constantly updated. Until now, the Metaverse’s creator-driven experiences have centered on centrally managed platforms like Roblox, Rec Room, and Manticore, where a full suite of integrated tooling, discovery, social networking, and monetization functions has enabled an unprecedented number of people to craft experiences for others.
Spatial computing proposes hybrid real/virtual computation that erodes the barriers between the physical and the ideal worlds. … Wherever possible the machine in space and space in the machine should be allowed to bleed into each other. Sometimes this means bringing space into the computer, sometimes this means injecting computation into objects. Mostly it means designing systems that push through the traditional boundaries of screen and keyboard without getting hung up there and melting into the interface or meek simulation.
— Simon Greenwold, Spatial Computing
Spatial computing has grown into a broad category of technology that lets us explore and manipulate 3D spaces, and enrich the physical world with additional data and experiences. The spatial computing software layer should be distinguished from the enabling hardware layer, which we cover in depth in the Human Interface section below. The software layer comprises these important capabilities:
- Mapping and interpreting the inside and the outside world — geospatial mapping (Niantic Planet-Scale AR and Cesium) and object recognition
- 3D engines to display geometry and animation (Unity and Unreal)
- Data integration from devices (Internet of Things) and biometrics from people (for identification purposes as well as quantified self applications in health/fitness)
- Voice and gesture recognition
- Next-generation user interfaces to support concurrent information streams and analysis
The ideal Metaverse organization is the polar opposite of Ready Player One’s OASIS, which was ruled by a single entity. When options are maximized, when systems are interoperable and built within competitive marketplaces, and when creators control their own data and creations, experimentation and growth skyrocket.
The Domain Name System (DNS), which maps human-readable names to IP addresses so you don’t have to type in a number every time you want to go somewhere online, is the most basic example of decentralization.
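The idea can be made concrete with a toy lookup table. This is a sketch of name resolution only; real DNS is hierarchical and distributed, and the names and addresses below are made up (addresses use the 203.0.113.0/24 documentation range):

```python
# Toy "zone file": human-readable name -> IPv4 address (illustrative only).
TOY_DNS = {
    "example-world.metaverse": "203.0.113.7",
    "example-market.metaverse": "203.0.113.42",
}

def resolve(name: str) -> str:
    """Map a human-readable name to the address machines actually use."""
    try:
        return TOY_DNS[name]
    except KeyError:
        raise LookupError(f"NXDOMAIN: {name}") from None

print(resolve("example-world.metaverse"))  # → 203.0.113.7
```

The decentralized part is what this sketch leaves out: no single table holds every name; authority over each zone is delegated to whoever operates it.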
Distributed computing and microservices enable a scalable environment for developers to access online capabilities — everything from commerce systems to specialized AI to a variety of game systems — without having to worry about constructing or integrating back-end capabilities.
Blockchain technology liberates financial assets from centralized control and custody, and we’re already seeing financial Legos being snapped together to build novel applications in decentralized finance (DeFi). A wave of innovation is coming around decentralized marketplaces and applications for game assets. NFTs, and blockchains suited to the kinds of microtransactions that games and Metaverse experiences require, are already emerging.
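The property that lets a ledger resist tampering without a central custodian can be sketched in a few lines: each record commits to the hash of the record before it. This is a simplified illustration only; a real blockchain adds consensus, signatures, and networking:

```python
import hashlib
import json

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list[dict], tx: dict) -> None:
    """Append a microtransaction, linking it to the previous block's hash."""
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"prev": prev, "tx": tx})

def verify(chain: list[dict]) -> bool:
    """Recompute every link; editing any earlier block breaks the chain."""
    return all(chain[i]["prev"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

chain: list[dict] = []
append_block(chain, {"from": "alice", "to": "game-studio", "amount": 250})
append_block(chain, {"from": "bob", "to": "alice", "item": "sword-0042"})
print(verify(chain))            # → True: chain is intact
chain[0]["tx"]["amount"] = 1    # tamper with history...
print(verify(chain))            # → False: verification fails
```

Because anyone holding a copy of the chain can run `verify`, no single custodian needs to be trusted to certify the history, which is the foundation DeFi and NFT marketplaces build on.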
“Far edge” computing will bring the cloud closer to our homes — and even into our cars — allowing sophisticated programs to run at low latency without putting all of the work on our devices. Computing power will become more like a grid utility (similar to electricity) than a data center.
The closer proximity of computer equipment to our bodies is changing us into cyborgs.
Smartphones have outgrown the name: they are extremely portable, always-connected, powerful computers with a phone app preinstalled. They will absorb an increasing number of Metaverse apps and experiences as miniaturization, the right sensors, embedded AI, and low-latency access to powerful edge-computing systems improve.
If you think about it, the Oculus Quest is simply a smartphone repurposed as a virtual-reality device; that gives us a glimpse of the future.
Smartglasses with all of the features of a smartphone, as well as AR and VR applications, will be available soon.
There are further ways to bring machines closer to our bodies, collectively known as wearable technology:
- 3D-printed wearables incorporated into fashion and clothing
- Biosensors so small they can be printed on the skin
- Perhaps even consumer neural interfaces
The infrastructure layer is responsible for enabling our devices, connecting them to the network, and delivering content.
5G networks will increase capacity while lowering network congestion and latency. Speeds will be increased by another order of magnitude with 6G.
The next generation of mobile devices, smart glasses, and wearables will demand untethered functionality, high performance, and miniaturization. That makes ever smaller, more powerful hardware a necessity: semiconductors moving to 3nm processes and beyond, microelectromechanical systems (MEMS) enabling tiny sensors, and compact, long-lasting batteries.