I recently set out to build a decentralised metaverse in order to establish a framework for future, truly African, virtual worlds. This version of the metaverse does not rely on any proprietary software development kits and is accessible to users who may not be able to afford the hardware typically required for virtual reality experiences. The framework I built allows people of colour to exist in virtual worlds using avatars that are genuinely reflective of their physical appearance.


First, I used machine learning to transform a photo of myself into a 3D model.

2D image to 3D model translation using machine learning


I then used Blender, a free and open-source 3D computer graphics toolset, to clean up artefacts and to add a texture to the model. Next, I headed over to Mixamo, a free online tool that uses machine learning to automate much of the character animation process, to rig my model and add movement to it.

Metaverse avatar editing using Blender and Mixamo

Afterwards, I created a 3D scan of my living room using my iPhone’s lidar scanner and imported it into Blender.

Metaverse real-world environment scanning

To make the experience accessible to others, I used three.js to bring it to life in any modern web browser. Three.js is an open-source, cross-browser JavaScript library and application programming interface for creating and displaying animated 3D computer graphics in a web browser using WebGL. Thereafter, I built movement controls for avatars and added the ability for avatars to communicate with each other using socket.io, a free JavaScript library for real-time web applications.
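The movement controls described above can be sketched as a small per-frame update. This is a minimal illustration, not the project's actual source: the function and constant names (`moveAvatar`, `SPEED`, `TURN_RATE`) are my own, and the update is written as a pure function so that it could drive any three.js `Object3D` pose.

```javascript
// Hedged sketch of keyboard-driven avatar movement. The avatar object
// holds a position (x, z) and a heading (rotationY), matching the
// three.js convention that an unrotated object faces the -z axis.

const SPEED = 2.0;     // forward speed in metres per second (assumed)
const TURN_RATE = 1.5; // turn speed in radians per second (assumed)

// keys: booleans toggled by keydown/keyup listeners in the browser,
// e.g. { forward: true, backward: false, left: false, right: false }
function moveAvatar(avatar, keys, dt) {
  if (keys.left)  avatar.rotationY += TURN_RATE * dt;
  if (keys.right) avatar.rotationY -= TURN_RATE * dt;

  // +1 when moving forward, -1 backward, 0 when neither (or both) held
  const step = (keys.forward ? 1 : 0) - (keys.backward ? 1 : 0);
  avatar.x -= Math.sin(avatar.rotationY) * step * SPEED * dt;
  avatar.z -= Math.cos(avatar.rotationY) * step * SPEED * dt;
  return avatar;
}

// In the render loop, the new pose would then be mirrored to other
// clients over socket.io along these lines (event name illustrative):
//   socket.emit('move', { x: avatar.x, z: avatar.z, rotationY: avatar.rotationY });
// while a matching 'move' listener applies incoming poses to remote avatars.
```

Keeping the movement maths separate from the rendering and networking layers makes it easy to apply the same update locally for your own avatar and remotely for poses arriving over the socket.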

Completed metaverse framework


In collaboration with The Hmm, affect lab, and MU Hybrid Art House, this metaverse framework has been used to host hybrid dance events as part of a broader project called the Toolkit for the Inbetween. We collaboratively crafted an immersive experience that brought together individuals both in person and virtually.

Credit: MU Hybrid Art House

Ahead of the event, we invited our guests to create their personalised avatars at our in-person or online workshops. Alternatively, they could use our comprehensive four-step guide to make their own unique avatar. Guests were able to connect with each other, whether in person, virtually, or a combination of both, through a “language of motion” developed specifically for the environment. Users form connections in real time through dance and movement, no matter where they are in the world.

Credit: MU Hybrid Art House

The most important conversation that this metaverse experiment explored was human-to-avatar representation. It was important for me to define what that means for a platform developed specifically for people in under-resourced and digitally marginalised parts of the world, for whom online representation remains an unfulfilled dream. Representation latency is political because a handful of people in the Global North choose what the internet should look like, leaving billions in the Global South unseen and yet still exploited.

