I recently set out to build a decentralised metaverse as a framework for future, truly African virtual worlds. This version of the metaverse uses no proprietary software development kits and is accessible to users who cannot afford the hardware typically required for virtual reality experiences. The framework I built allows people of colour to exist in virtual worlds using avatars that genuinely reflect their physical appearance.



Firstly, I used machine learning to transform a photo of myself into a 3D model.

2D image to 3D model translation using machine learning


I then used Blender, a free and open-source 3D computer graphics toolset, to clean up artefacts and add a texture to the model. Next, I headed over to Mixamo, a free online tool that uses machine learning to automate the character animation process, to rig my model and add movement to it.

Metaverse avatar editing using Blender and Mixamo

Afterwards, I created a 3D scan of my living room using my iPhone’s LiDAR scanner and imported it into Blender.

Metaverse real-world environment scanning

To make the experience accessible to others, I used three.js to bring it to life in any modern web browser. Three.js is an open-source, cross-browser JavaScript library and application programming interface for creating and displaying animated 3D computer graphics in the browser using WebGL. I then built movement controls for avatars and added avatar-to-avatar communication using Socket.IO, a free JavaScript library for realtime web applications.
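The movement controls come down to updating the avatar's position once per frame based on which keys are held down. A minimal sketch of that idea, with positions as plain objects so the logic stands on its own (the function name, key choices, and speed constant are my own illustrations, not taken from the original code):

```javascript
// Advance an avatar's position by one frame, given the current key state.
// Positions are plain {x, z} objects so the logic is shown without three.js;
// in the real scene the result would be copied onto avatar.position inside
// the requestAnimationFrame render loop.
const SPEED = 0.1; // world units per frame (illustrative value)

function stepAvatar(position, keys) {
  const next = { x: position.x, z: position.z };
  if (keys.ArrowUp) next.z -= SPEED;    // walk forward
  if (keys.ArrowDown) next.z += SPEED;  // walk backward
  if (keys.ArrowLeft) next.x -= SPEED;  // strafe left
  if (keys.ArrowRight) next.x += SPEED; // strafe right
  return next;
}
```

In a three.js setup, the key state would be tracked with `keydown`/`keyup` listeners and `stepAvatar` called each frame before rendering the scene.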
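The communication between avatars follows a relay pattern: each browser sends its avatar's state to the server over Socket.IO, and every other client applies the incoming messages to its local copy of the remote avatars. A sketch of that client-side message handling, written as a pure function so it can run without a network (the message shape `{type, id, x, z}` is my own assumption, not the original protocol):

```javascript
// Apply one incoming message to the map of remote avatars.
// In the real app these messages arrive via Socket.IO event handlers;
// here the handler is a pure function so the relay logic is explicit.
function applyMessage(avatars, msg) {
  const next = new Map(avatars);
  if (msg.type === "move") {
    // Another client's avatar moved: record its latest position.
    next.set(msg.id, { x: msg.x, z: msg.z });
  } else if (msg.type === "leave") {
    // A client disconnected: remove its avatar from the scene state.
    next.delete(msg.id);
  }
  return next;
}
```

On the wire, this would be wired up with Socket.IO's standard event API, e.g. `socket.on("move", msg => { remotes = applyMessage(remotes, { type: "move", ...msg }); })`, with the server rebroadcasting each client's updates to everyone else via `socket.broadcast.emit`.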

Completed metaverse framework


The experiment is still ongoing, and the work will continue to evolve as I explore the implications and complexities of extended existence. Watch my Instagram tutorial on how to build your own metaverse experience.