Overview of the Metaverse Ecosystem: Avatars and Virtual Worlds

Learn about the foundational categories of the metaverse, including interchangeable tools, immersive hardware, and browser-based XR experiences.

Blocks of the metaverse

The envisioned metaverse is a virtual 3D space containing various virtual worlds. In each world, users are represented as avatars (virtual characters).

The experience is intended to be mostly immersive, but to this day, some popular metaverses are accessible via non-immersive devices such as mobile phones and desktops. The shared space would be a lifelike replica where people could play games together and shop from various brands. The economy involved would be virtual and mostly blockchain-driven.

Lastly, user identity would persist across the metaverse. Users would be able to move between virtual spaces within it, much like moving between tabs in a browser, and their identity would remain consistent across sessions. Unlike today, where YouTube and Instagram accounts are separate IDs, a single metaverse identity would carry across spaces: if a user earns currency in one space, they could spend it on items in another. While this idea may seem utopian, progress is being made step by step, and it continues to evolve.

According to Matthew Ball (Ball, M. The Metaverse Primer, 2021. https://www.matthewball.vc/the-metaverse-primer), the metaverse can be divided into eight core categories. We'll group these categories into three blocks.

  • Basic block (interchangeable tools, standards, payments): To achieve interoperability across virtual worlds in the metaverse, a few standard protocols need to be established. For instance, HTTP is the standard protocol for data communication across the internet; similar protocols need to be devised for virtual worlds to build on. This is just one example of standardization. A lot of standardization is also needed for payment gateways and user identities.

  • Intermediate block (hardware, network, compute): To create HD-rendered 3D simulations involving complex physics and other compute-heavy functions, high-performance computers are required. Additionally, to provide real-time, persistent data across a network of devices within the metaverse, a fast internet connection such as 5G (wireless technology designed to deliver multi-Gbps peak data speeds, ultra-low latency, greater reliability, and massive network capacity) and edge computing (computing done on site or near a particular data source, minimizing the need to process data in a remote data center) would be useful. Finally, advancements are needed to process real-time simulations on consumer-facing hardware, such as VR headsets and AR-supported mobile phones.

  • Top-level block (metaverse content, services, assets, and user behaviors): This is the application block, comprising the development platforms on which developers build virtual spaces and creators enhance them.

Three blocks of Metaverse

Interface (consumer-facing) hardware

When it comes to metaverse technologies, consumer-facing hardware or interfaces can be divided into two types: flat screens (such as web/desktop/mobile phones) and immersive screens (such as AR/VR headsets). Each interface can be placed on a spectrum based on its level of immersion. For instance, VR devices offer a highly immersive experience, AR devices provide an intermediate level of immersion, and web/mobile-based (non-AR) interfaces offer no immersion at all. Although numerous existing screens let us join virtual spaces, true immersion is necessary to fully realize their potential.

The levels of immersion

Still, advancements are being made in the interface hardware space. Let's look at a number of technologies that help us join the ecosystem.


Web browsers

Rendering 2D/3D graphics within web browsers requires a GPU. The development of WebGL, a JavaScript API, has enabled developers to create interactive 2D/3D graphics and animations in the browser by leveraging the user's graphics card. This was a huge milestone for browser-based 2D/3D platforms. When we see 3D animations in the browser, that's WebGL at work.
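As a minimal sketch of what leveraging the GPU through WebGL looks like, the snippet below requests a WebGL context from a canvas and issues a first GPU command (clearing the screen). The pixel-to-clip-space helper illustrates a conversion every 2D WebGL app performs; the browser-only part is guarded so the sketch can also be read and loaded outside a browser.

```javascript
// Minimal WebGL setup sketch (the specific canvas is an assumption;
// any <canvas> element works).
function initWebGL(canvas) {
  // Ask the browser for a GPU-backed WebGL rendering context.
  const gl = canvas.getContext("webgl");
  if (!gl) {
    throw new Error("WebGL is not supported in this browser");
  }
  // Clear the drawing surface to a solid color as a first GPU command.
  gl.clearColor(0.1, 0.1, 0.1, 1.0);
  gl.clear(gl.COLOR_BUFFER_BIT);
  return gl;
}

// WebGL draws in "clip space," where both axes run from -1 to 1.
// This helper converts pixel coordinates (origin at the top-left)
// into clip-space coordinates.
function toClipSpace(x, y, width, height) {
  return [(x / width) * 2 - 1, 1 - (y / height) * 2];
}

// Browser-only entry point, guarded for non-browser environments.
if (typeof document !== "undefined") {
  initWebGL(document.querySelector("canvas"));
}
```

In practice, real scenes add shaders and vertex buffers on top of this setup, which is exactly the boilerplate libraries like three.js hide.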


Developers can now create web-based VR experiences using open-source frameworks. One of the most popular is A-Frame, an open-source web framework for building virtual reality experiences on the web. A-Frame uses the three.js library, a cross-browser JavaScript toolkit for creating and displaying animated 3D computer graphics with WebGL in a web browser.
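A-Frame scenes are written declaratively as HTML. Below is a minimal "hello world" scene under assumed defaults; the CDN version (1.5.0) is an assumption, so pin whichever release you target.

```html
<!-- Minimal A-Frame scene; the script version is an assumption. -->
<html>
  <head>
    <script src="https://aframe.io/releases/1.5.0/aframe.min.js"></script>
  </head>
  <body>
    <a-scene>
      <!-- Primitive entities: a box, a sphere, a ground plane, and a sky -->
      <a-box position="-1 0.5 -3" rotation="0 45 0" color="#4CC3D9"></a-box>
      <a-sphere position="0 1.25 -5" radius="1.25" color="#EF2D5E"></a-sphere>
      <a-plane position="0 0 -4" rotation="-90 0 0" width="4" height="4" color="#7BC8A4"></a-plane>
      <a-sky color="#ECECEC"></a-sky>
    </a-scene>
  </body>
</html>
```

Opening this page on a WebXR-capable browser shows an "Enter VR" button; on a flat screen, the same scene renders as an interactive 3D view.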

Additionally, various browsers support WebXR technologies and offer developer tooling. Mozilla also developed the WebXR Emulator extension, which allows users and developers to test WebXR content in desktop browsers without needing an XR device. The number of people using VR through browsers is growing, and a major shift toward supporting WebXR experiences is underway.

Later, the development of WebXR, a web API that exposes AR/VR device capabilities to the browser, allowed developers to create experiences that run directly on AR/VR devices. Babylon.js is another popular library for building WebXR experiences.
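Before starting a session, a WebXR app typically checks which modes the device supports. The sketch below wraps that check; passing the XR system in as a parameter (instead of reading `navigator.xr` directly) is a choice made here so the logic can also be exercised with a stub.

```javascript
// Sketch of WebXR feature detection. The xr parameter stands in for
// navigator.xr so the logic can be tested without a headset.
async function detectXRModes(xr) {
  // Older browsers expose no WebXR object at all.
  if (!xr) {
    return { vr: false, ar: false };
  }
  // isSessionSupported resolves to true/false for each session mode.
  const [vr, ar] = await Promise.all([
    xr.isSessionSupported("immersive-vr"),
    xr.isSessionSupported("immersive-ar"),
  ]);
  return { vr, ar };
}

// In a browser: detectXRModes(navigator.xr).then(...)
```

If a mode is supported, the app would then call `navigator.xr.requestSession("immersive-vr")` (or `"immersive-ar"`) to enter it.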

Here’s an example of a WebXR-based augmented reality experience.

Experience the solar system in augmented reality (source: immersive-web.github.io/webxr-samples)
Educative’s XR widget: Our unique widget allows in-browser VR interactivity, enabling you to launch AR/VR experiences directly onto your headset. Educative is the only platform to offer a seamless AR/VR development experience via our XR widget.


Various challenges remain in making WebXR experiences run across all browsers and devices. Nevertheless, such experiences already engage a large number of users, and many metaverses and virtual spaces, such as Decentraland, are browser-based.


Desktop

On the desktop, 2D/3D games can take advantage of computing and graphics power to a greater extent than WebGL. Multiple graphics APIs are available, such as DirectX and Vulkan; however, developers rarely need to deal with them directly because most game engines, such as Unreal and Unity, handle that layer. While flat screens can support many virtual spaces that require high computation power, such as Fortnite and Minecraft, they can't provide immersive experiences.


AR devices

Augmented reality (AR) is the fusion of the virtual and physical worlds. AR-based apps overlay information and data on physical objects. The most well-known examples of AR are Snapchat and Instagram filters. Do we need a specific device to access AR? Mobile phones are the most common AR-capable devices; for example, people used AR-supported mobile phones to access the virtual world of Pokémon Go. Advancements have also been made in dedicated AR devices. For instance, Microsoft's HoloLens overlays holographic content on the physical world, and Ray-Ban smart glasses, developed in collaboration with Meta, look just like regular glasses. In terms of immersion, AR devices lie somewhere in the middle.

VR headsets

VR headsets offer a fully immersive experience, placing users inside their virtual environments. To get started, users wear a head-mounted display (HMD), which tracks their head movements and renders the environment accordingly. Users can interact with the environment in various ways, most commonly with controllers, which nearly all VR devices ship with. Some devices, such as the Quest 2, also support hand tracking, which enables users to interact within the immersive space with some limitations. VR devices vary in display resolution, controllers, field of view, and added features like haptic feedback, eye tracking, and hand tracking.

Tethered VR headsets serve as the display for a powerful, graphics-heavy machine connected via HDMI or other cables. These headsets can deliver a highly immersive and realistic experience; however, unlike untethered headsets, they restrict movement to a specific range. Examples include the Oculus Rift, HTC Vive, and Varjo headsets.

Untethered VR headsets are wireless and don't require a PC or external sensors to deliver a VR experience. Examples include the Meta Quest 2/3 and Pico Neo 2.

Mobile VR is another cost-effective way to experience XR. Most modern smartphones support VR: we can fit a smartphone into a Google Cardboard or Gear VR enclosure to turn it into a VR headset. The gyroscope sensors in modern smartphones enable immersive experiences like Google Earth and 360-degree YouTube videos. Mobile VR is less immersive and realistic than dedicated high-end VR devices, but it's useful when we want a VR experience to be scalable.
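To illustrate how a mobile VR viewer uses the gyroscope, here is a sketch under the assumption of a browser-based viewer: the browser fires `deviceorientation` events with angles in degrees, and a renderer needs them in radians. The conversion is split out into a pure function; the event wiring is browser-only and guarded.

```javascript
// Convert a deviceorientation event's angles (degrees) to radians.
function orientationToRadians({ alpha, beta, gamma }) {
  const toRad = Math.PI / 180;
  return {
    yaw: alpha * toRad,   // rotation around the vertical axis
    pitch: beta * toRad,  // front-to-back tilt
    roll: gamma * toRad,  // left-to-right tilt
  };
}

// Browser-only wiring, guarded so the sketch also loads outside a browser.
if (typeof window !== "undefined") {
  window.addEventListener("deviceorientation", (event) => {
    const { yaw, pitch, roll } = orientationToRadians(event);
    // A real viewer would feed these angles into its camera every frame.
    console.log(yaw, pitch, roll);
  });
}
```

A Cardboard-style viewer runs this loop twice, once per eye, with a slight horizontal offset to produce the stereo effect.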

Whether it's adding VR support to virtual spaces like Roblox or making browsers XR-compatible, ongoing technological innovations are making the XR ecosystem more amenable to wide-scale adoption. With tech giants like Apple throwing their weight behind the metaverse ecosystem by announcing the Vision Pro, the field clearly has vast growth potential.