"They actually f***ing did it. How mental is that?" were the words of a CGI artist streaming a sample of Unreal Engine's new MetaHuman Creator platform when the announcement hit.

Last Wednesday, Unreal Engine, owned by the renowned Epic Games, unveiled a new platform for digital human creation, one which everyone in the 3D computer graphics industry has only theorized about until now: a simple, browser-based Graphical User Interface (GUI) for creating high-fidelity, real-time, fully-rigged, diverse, portable, 3D human characters.

This platform changes the game at multiple levels, both from a technical and a societal standpoint, even in ways one would not expect at first glance.

Take in the tagline, if you haven't already:

"You create the narrative. I am MetaHuman."

What a time to be alive. To help you fully understand the complexity and potential of this platform, I'll break it down on multiple fronts and guide you through what matters, while revealing details you may not have even considered.

Are you strapped for time? Before you head down a YouTube rabbit hole on this subject, here's the lead in simplest terms: through a series of strategic acquisitions, Epic has created an insanely time-saving tool for 3D character creators.

An output which historically depended on well-researched, well-practiced human execution is now "platformized" into a GUI accessible to anyone with a web browser. The software's output, though, will still require a professional with an intimate understanding of 3D character creation in order to take advantage of the time savings.

In other words, to benefit from the magic of these time savings, one must still be able to work with the files this software outputs. For the average person, the utility of the software currently stalls at the boundaries of the GUI. The true beneficiaries of this software include, but are not limited to...

  • Video game devs in need of NPCs
  • Devs in need of a character customization engine
  • Avatar-based social media apps, like ZEPETO
  • CGI artists in the film industry
  • Chatbot developers in need of humanization
  • Creators of virtual influencers
  • Use cases yet to be fathomed, previously limited only by the impossibility of such a platform

Note the common thread: each beneficiary is a developer, a graphic designer, or a technical creator of some sort, and each holds an in-depth understanding of 3D creation.

The ultimate social impact of this software lies within the creativity of its users. MetaHuman Creator will also spark massive growth in the virtual influencer space by drastically reducing the cost of virtual influencer creation.

The next step? Unreal Engine says it best: "You create the narrative."

Now, let's get into the details of what makes MetaHuman Creator so monumental.

MetaHuman Creator browser view


The web-based Graphical User Interface (GUI) is the secret hero of Unreal Engine’s announcement. Note that you cannot power the creation of a digital human in Unreal Engine from a web browser alone, since such a feat requires immense computing power to accomplish (read: expensive graphics cards).

So, how did Unreal do it?

This web-based experience relies on the same technical magic underpinning, for example, Google Stadia or NVIDIA GeForce NOW: cloud computing, with frames rendered on a remote machine in a GPU farm and streamed to your device with next-to-no latency.

NVIDIA Server Farm Wall

With the eventual release of MetaHuman Creator, Unreal Engine will offer a web portal to a virtual machine via its lesser-known service, Pixel Streaming, a streaming plugin that has been operating in beta since November 2018. MetaHuman Creator's low latency is enabled by WebRTC, a peer-to-peer communication framework.
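To make the "low latency" claim concrete, here is a minimal sketch of the end-to-end delay budget in a pixel-streamed session. The stages are real parts of any remote-rendering pipeline, but the millisecond figures are assumed, illustrative values, not numbers published by Epic Games.

```python
# Illustrative latency budget for a pixel-streamed editing session.
# Stage timings are rough, assumed values for discussion -- they are
# not figures published by Epic Games.

STAGES_MS = {
    "input_upstream": 10,      # your mouse/keyboard event reaches the server
    "render": 16,              # remote GPU renders the next frame (~60 FPS)
    "encode": 5,               # frame is compressed into a video stream
    "network_downstream": 15,  # WebRTC delivers the frame to your browser
    "decode_display": 8,       # browser decodes and presents the frame
}

def total_latency_ms(stages: dict) -> int:
    """Sum the per-stage delays into an end-to-end 'glass-to-glass' figure."""
    return sum(stages.values())

print(total_latency_ms(STAGES_MS))  # 54
```

Anything in the tens of milliseconds feels instantaneous to a user dragging a facial-feature slider, which is why WebRTC's low-overhead transport matters here.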

Projects which would previously have taken hundreds of hours offline on a beefed-up PC can now take anyone with the right skills mere minutes.

MetaHuman Creator's sheer accessibility and ease of use will cement Unreal Engine as the character creation platform of choice. So many industries need to implement convincing digital humans into games, campaigns, storylines, and other 3D projects; by solving that increasingly common time-constraint problem, one which has been brewing for years, Epic unlocks a new growth vector.

High Fidelity

The most visually striking aspect of the MetaHuman Creator announcement is the sheer realism of the digital humans generated. Noteworthy here is that Unreal Engine already supported digital human creation at this level of fidelity.

This announcement does not represent a leap or bound in the quality one can create in Unreal Engine, but rather a leap and bound through every step in the process toward that desired output.

While realism paired with motion-capture demos (more on this later) drove hyper-strong PR around Unreal's announcement, as it does in the virtual influencer industry at large, the quality demoed was already attainable by certain highly skilled graphic artists, such as a personal favorite, Hadi Karimi.

Digital Human Audrey Hepburn by Hadi Karimi

Now, with this new tech, your average designer has access to an output that artists like Hadi Karimi previously spent tens to hundreds of hours generating, after years of practice.

There's no indication that MetaHuman Creator will allow for perfect replication of any living human's face in the way an artist like Hadi may execute his work. This sets the stage for an exciting new norm: some of the highest fidelity, most stand-out digital humans will likely start their life in MetaHuman Creator, but the true design value will show in the personal touch executed by artists upon exporting the digital human and going to work with their traditional methods and preferences in mind.

In the example of an artist like Hadi, the hundreds of hours of time savings on the front-end will now allow Hadi to master details in ways the MetaHuman Creator does not currently touch. That is, assuming Hadi even wants to use MetaHuman to kick things off.

In a constant push to stay ahead and stand out on the quality front, designers will face a sort of "Law of Saturation": countless artists and developers will gain access to the baseline quality MetaHuman Creator provides. This saturation of the supply side will push artists to get more detailed, more realistic, and to reach for even higher fidelity than the starting point MetaHuman Creator will hand so many new artists.

Famous virtual influencers @KnoxFrost and @LilMiquela

While MetaHuman Creator will immediately democratize access to high-fidelity digital humans, over time, the platform will elevate the world's top designers even further by eliminating time constraints outright, allowing them to focus on the details of, and their personal influence over, their digital humans' image.

Beyond design alone, a similar effect will occur at massive scale on the narrative front, a never-ending battle fought within the video game industry for decades now: "Who can tell the most captivating, engaging story at the cutting edge of digital experiences?"


Real-Time Rendering

Real-time rendering is quite like it sounds: when a digital object is rendered with strong enough computing power at a fast enough rate, the resulting images arrive in sequence in real time. Think: 30 FPS, 60 FPS, etc. Frames are delivered, fully rendered, at immense speed thanks to the power of GPUs.
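Those frame rates translate directly into a per-frame time budget, which is what makes real-time rendering hard: the GPU must finish every frame inside that window. A quick sketch of the arithmetic:

```python
def frame_budget_ms(fps: int) -> float:
    """Milliseconds available to render a single frame at a target frame rate."""
    return 1000.0 / fps

# At 30 FPS the renderer has ~33.3 ms per frame; at 60 FPS, only ~16.7 ms.
print(frame_budget_ms(30))
print(frame_budget_ms(60))
```

Offline renderers can happily spend minutes or hours per frame; a real-time engine like Unreal must deliver comparable imagery tens of times per second.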


While real-time rendering has technically been around since the days of Pong or prior, Unreal Engine pushes the boundaries of what's possible in the real-time rendering space today.

A typical digital human creation pipeline involves rendering a human in a design tool's modeling view, with the resulting image appearing after a wait that depends on one's local computing power. Rendering tools such as Arnold in Maya or Eevee in Blender allow for near-real-time preview, enabling quick visualization, just not true real-time, full-fidelity rendering.

Unreal Engine, though, has made real-time rendering a cornerstone to the design experience, which you will enjoy taking a moment to witness here in the Unreal Engine 5 Reveal from last summer:

Now with MetaHuman Creator, designers can enjoy remote access to the computing power they need to render a digital human in full fidelity, in real-time. In other words, this means every edit you make to the human's face is instantly applied—with no delay to confirm how your latest design tweak looks. Hence, real-time.

Fully Rigged

Coming right off the real-time conversation, let's open an even more exciting doorway: fully rigged digital humans. MetaHuman Creator generates human assets that are fully rigged and ready for use in Unreal Engine. Right out of the gate, real-time motion capture data can drive animations through Unreal Engine's Live Link, whether from a high-tech capture rig or even your iPhone via a mobile app.

This means that as soon as you export your digital human, you can link it to mocap data. That capability is not new for Unreal Engine, but it is significant in the context of MetaHumans.

MetaHumans Rig Details

In a semi-related development, the inclusion of Level of Detail data, or LODs, in the resultant model means Unreal Engine's MetaHuman Creator produces game-ready characters, optimized for consumption on all devices.

"MetaHuman characters run smoothly on PC and console, and we are optimizing the mobile experience." -Unreal on the benefits of LODs

When you are near a digital human created with the software, the level of detail will max out for full realism. Likewise, when you move some distance from the human in a digital environment, the level of detail will drop to allow for performance optimization. Truly game-ready.
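The distance-based switching described above can be sketched in a few lines. The thresholds and mesh names here are invented for illustration; in practice, Unreal Engine performs this swap automatically, typically keyed on the character's size on screen rather than raw distance.

```python
# A minimal sketch of distance-based LOD selection, the idea behind the
# LODs bundled with a MetaHuman. Thresholds and mesh names are invented
# for illustration; Unreal Engine handles this switching automatically,
# usually based on screen-size metrics rather than raw distance.

LOD_THRESHOLDS = [
    (5.0, "LOD0_full_detail"),  # close-up: full facial rig, dense mesh
    (20.0, "LOD1_medium"),      # mid-range: reduced bones and polygons
    (100.0, "LOD2_low"),        # far away: coarse mesh
]

def select_lod(distance_m: float) -> str:
    """Pick the cheapest mesh that still looks right at this distance."""
    for max_distance, mesh in LOD_THRESHOLDS:
        if distance_m <= max_distance:
            return mesh
    return "LOD3_billboard"  # beyond all thresholds: flattest representation

print(select_lod(2.0))   # LOD0_full_detail
print(select_lod(50.0))  # LOD2_low
```

Shipping those LOD tiers with every exported character is what makes a MetaHuman usable in a performance-sensitive game, not just a cinematic.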


Diverse

The future will be synthesized, and the synthesized future will be diverse. Character creation screens, for decades, have made a point to feature diverse character selection mechanics. As the humans featured within these screens become more and more realistic, so too persists the emphasis on design customization along a spectrum of tones, shapes, and sizes.

MetaHumans, by nature of being powered by Unreal Engine, give us vast access to diverse character creation options. The inclusion and consideration of diversity on the creation side of the virtual world is crucial for virtual representation as digital humans, like MetaHumans, reach people from every corner of the world.

As my colleague Lebo shares on the subject of diversity, "every continent and every country on earth will be equally represented in the virtual world. This expansion will give fans unique and relatable perspectives into the lives of their favourite virtual humans."

"Human beings from all over the world need to make room for this new generation of beings, as they are here to stay."


Portable

The portability of the digital humans generated by MetaHuman Creator will make or break the software's societal impact at large. Some important questions are yet to be answered, from copyright ownership, to compatibility with design software beyond Unreal Engine, to developer-friendliness when it comes to calling a MetaHuman into one's digital experience on the internet at large. How restrictive will Unreal Engine be?

Portability benefits everyone in the end. Website visitors want humans to be the face of their chat experience, Zoom users want digital humans to stand in for them, pseudonymous social media users want to express their personality without revealing their true identity, and smart assistant developers want to show more personality.

The internet needs better humans.

If Unreal follows up the MetaHuman Creator launch with a renewed, specific focus on developer friendliness around the platform's output, the virtual human industry at large will explode, with Unreal Engine permeating culture (not that it hasn't already via Fortnite). Unreal needs to wrap MetaHumans in a convenient API call, then charge accordingly.

Hyprmeet by Hyprsense (acquired November 2020 by Epic Games) alludes to MetaHumans' inevitable role as video avatars

In an orchestra, we recognize beautiful arrangements of sound. In a gallery, we recognize inspirational collections of art. In technology, we should recognize Unreal Engine for their skillful arrangement of Pixel Streaming (launched November 2018), 3Lateral (acquired January 2019), Quixel (acquired November 2019), Cubic Motion (acquired March 2020), and so much more unique technology in the cracks that will make MetaHuman Creator a world-changing piece of software for the graphics industry, and potentially the internet, at large.

In moments like these, I wish Epic Games were a publicly traded company, as they are completely redefining how humans interface with each other on the internet.

Watch this company.
