In the weeks leading up to CES, Samsung hyped a groundbreaking artificial human company called NEON. According to the marketing website, a "NEON is a computationally created virtual being that looks and behaves like a real human, with the ability to show emotions and intelligence." NEON is the name of both the company and the artificial humans themselves.


The promise? A virtual companion who looks human, responds in real time, gets to know us, understands emotions, remembers conversations, relates to us, and, above all, befriends us. A human who works for us, whatever that means to you.

A human dream.

"While I would love to be wrong and hope I am, this is going to be a huge let down or turn out to be a prank. There is no way Samsung AI and robotics has leaped so far ahead of everyone else to justify what is being implied here." –Rick Kahler

The peril? Turns out, the "artificial humans" presented at CES were fictionalized. Instead of creating artificial, intelligent, immersive humans as the promotional materials led the world to believe – think digital Westworld – NEON motion-captured actors and deepfaked their faces in a controlled environment.


According to The Verge, NEON clarified "scenarios shown at our CES Booth and in our promotional content are fictionalized and simulated for illustrative purposes only." Oof, ouch, analysis over?


No, not quite – while the hype may have been overblown, there's so much one can learn about artificial humans from NEON... especially why you should create them, too.

The most immersive and defining element of a virtual human is its appearance. In NEON's videos, we see extremely lifelike humans. How? At the very least, they're simply videos of real humans. At most, they're videos of real humans with re-generated faces, despite CEO Pranav Mistry insisting it is "entirely computer-generated footage, albeit pre-rendered rather than captured in real time."


NEON employs talent from STARS Agency and has them perform complex hand contact, tricky limb & clothing interaction, and flawless object handling (0:07 – 0:17) for prolonged periods of time. We can't believe our eyes! As Doug Roble of Digital Domain said when presenting DigiDoug, "In visual effects, one of the hardest things to do is create believable digital humans the audience accepts as real. People are just really good at recognizing other people." In the case of NEON, you can believe your eyes – they're humans. Here's the talent who plays the news reporter and the one who plays the photographer.

Now, for the actual demo and the magic that should command our attention: NEON takes straight-on videos of some of its talent, such as "Monica", and leverages Generative Adversarial Networks (GANs – a form of machine learning) to manipulate the eyes, eyebrows, and mouth, driven by a reference human acting as a remote controller – a practice popularized as deepfaking.
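Curious what that GAN-driven puppeteering looks like in practice? Here's a minimal, illustrative sketch of a reference-driven face reenactment loop: a generator re-renders the source identity using the driving actor's facial-landmark motion, while a discriminator pushes the output toward photorealism. Every name, network shape, and loss below is an assumption for illustration, not NEON's actual pipeline.

```python
# Toy sketch of GAN-based face reenactment (illustrative assumptions only).
# A "driving" frame supplies facial-landmark motion; a generator re-renders the
# source identity with that motion; a discriminator scores real vs. generated.
import torch
import torch.nn as nn

IMG_CH, LANDMARK_CH = 3, 1  # RGB source frame + rasterized landmark heatmap

class Generator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(IMG_CH + LANDMARK_CH, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
            nn.Conv2d(64, IMG_CH, 3, padding=1), nn.Tanh(),
        )

    def forward(self, source_img, driving_landmarks):
        # Condition the re-rendered face on the driving actor's landmark motion.
        x = torch.cat([source_img, driving_landmarks], dim=1)
        return self.net(x)

class Discriminator(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(IMG_CH, 64, 4, stride=2, padding=1), nn.LeakyReLU(0.2),
            nn.Conv2d(64, 1, 4, stride=2, padding=1),
        )

    def forward(self, img):
        return self.net(img)  # patch-level "real vs. fake" scores

G, D = Generator(), Discriminator()
opt_g = torch.optim.Adam(G.parameters(), lr=2e-4)
opt_d = torch.optim.Adam(D.parameters(), lr=2e-4)
bce = nn.BCEWithLogitsLoss()

# One adversarial training step on dummy tensors (stand-ins for video frames).
source = torch.rand(1, IMG_CH, 128, 128)          # still of the source identity
landmarks = torch.rand(1, LANDMARK_CH, 128, 128)  # driving actor's eyes/brows/mouth
real = torch.rand(1, IMG_CH, 128, 128)            # ground-truth frame of the source

# Discriminator step: push real frames toward 1, generated frames toward 0.
fake = G(source, landmarks).detach()
d_real, d_fake = D(real), D(fake)
d_loss = bce(d_real, torch.ones_like(d_real)) + bce(d_fake, torch.zeros_like(d_fake))
opt_d.zero_grad(); d_loss.backward(); opt_d.step()

# Generator step: fool the discriminator while staying close to the real frame.
fake = G(source, landmarks)
d_out = D(fake)
g_loss = bce(d_out, torch.ones_like(d_out)) + nn.functional.l1_loss(fake, real)
opt_g.zero_grad(); g_loss.backward(); opt_g.step()
```

Production face-swapping systems typically stack far deeper networks and extra losses (identity, perceptual, temporal consistency) on top, but the adversarial core looks much like this.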

From photogrammetry rigs to facial motion capture to Apple's TrueDepth sensors and more, several paths exist for creating and manipulating digital beings' faces.

Teams are making significant strides down these paths, such as the aforementioned Doug Roble, whom I had the pleasure of meeting at the Virtual Beings Summit. I also met Christine Marzano, who is building a platform and toolkit for democratized avatar content creation at DNABlock. Look to Unreal Engine, too, which is responsible for the full-bodied digital being Siren, on a mission "to challenge our idea of what a synthetic human could be." Virtual humans are here.


NEON, though, is one of the first to demo deepfaking as a service, tied to a vision that those virtual human faces may one day express artificial emotions of their own – without the need for a human reference. That is, if NEON can simultaneously master artificial voice and conversation as well.

See, a key ingredient of artificial human interaction is the human's ability to hold a convincing conversation. NEON's demos reveal impossibly human-like sound, tonality, and cadence from the first human. The second human proceeds to speak in multiple languages, yet each in an entirely different voice, indicating NEONs aren't as capable at voice skinning as we are led to believe – if at all. The total obfuscation of the actual tech demo with pre-recorded human content makes NEON's progress nearly impossible to pin down, further fueling the hype.

In this vein, though, notable companies are developing complete voice creation, replication, and skinning solutions. Crypton Future Media, creator of Hatsune Miku, has over a decade of progress generating digital singing voices via Vocaloid. Descript (Lyrebird), Resemble.ai, and Modulate.ai are working on voice skinning technologies that let you create a digital voice that sounds like you from just a small audio sample.

These deepfakes of the audio industry will empower artificial humans to sound even friendlier and more human than Alexa one day.
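How does "a digital voice that sounds like you from just a small audio sample" work? A common research approach is speaker-embedding-conditioned synthesis: compress a short clip into a fixed-size voice fingerprint, then feed that fingerprint into a text-to-spectrogram model so the output carries the target timbre. The toy networks, names, and shapes below are assumptions for illustration, not the actual pipelines of Descript, Resemble.ai, or Modulate.ai.

```python
# Toy sketch of speaker-embedding voice cloning (illustrative assumptions only).
import torch
import torch.nn as nn
import torchaudio

N_MELS, EMBED_DIM, CHAR_VOCAB = 80, 256, 128

class SpeakerEncoder(nn.Module):
    """Maps a mel spectrogram of any length to one fixed-size embedding."""
    def __init__(self):
        super().__init__()
        self.rnn = nn.GRU(N_MELS, EMBED_DIM, batch_first=True)

    def forward(self, mel):                 # mel: (batch, frames, n_mels)
        _, hidden = self.rnn(mel)
        return hidden[-1]                   # (batch, embed_dim) "voice fingerprint"

class Synthesizer(nn.Module):
    """Toy text-to-mel decoder conditioned on the speaker embedding."""
    def __init__(self):
        super().__init__()
        self.char_embed = nn.Embedding(CHAR_VOCAB, EMBED_DIM)
        self.decoder = nn.GRU(EMBED_DIM * 2, EMBED_DIM, batch_first=True)
        self.to_mel = nn.Linear(EMBED_DIM, N_MELS)

    def forward(self, text_ids, speaker_embed):
        chars = self.char_embed(text_ids)                      # (b, chars, d)
        speaker = speaker_embed.unsqueeze(1).expand_as(chars)  # broadcast per char
        out, _ = self.decoder(torch.cat([chars, speaker], dim=-1))
        return self.to_mel(out)             # predicted mel frames in the target voice

# "Small audio sample" of the target voice: here, 3 seconds of dummy audio.
sample_rate = 16_000
clip = torch.randn(1, sample_rate * 3)
mel = torchaudio.transforms.MelSpectrogram(sample_rate, n_mels=N_MELS)(clip)
mel = mel.transpose(1, 2)                   # -> (batch, frames, n_mels)

embedding = SpeakerEncoder()(mel)                    # one reusable voice fingerprint
text = torch.randint(0, CHAR_VOCAB, (1, 20))         # 20 character ids of input text
mel_out = Synthesizer()(text, embedding)
print(mel_out.shape)                                 # torch.Size([1, 20, 80])
```

A neural vocoder would then turn the predicted mel frames into audio; the key design choice is that one small clip yields an embedding reusable for any text in that voice.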

Tangential to the subject at hand, yet interesting: NEON says in a press release, "there are millions of species on our planet and we hope to add one more," overlooking the iconic legacy of Lil Miquela and her predecessors, Ami Yamato, Hatsune Miku, and others. The same Lil Miquela with whom Samsung, NEON's parent company, signed a massive, prolonged sponsorship, as evidenced by multiple sponsored videos and posts throughout 2019.

Looking back – NEON successfully blended tech, a new trend, and hype to tell an amazing story and fuel the media leading into CES, garnering strong attention for the company while drawing newfound attention to the virtual human industry as a whole. A fictionalized production that drew a massive crowd.

Despite what the "simulated for illustrative purposes" content led the media to believe, the promised Westworld-like land of artificially intelligent humans is still quite far off. Pranav Mistry acknowledges how far away we really are, saying "AI has many years of development to go before science fiction becomes reality." We'll have to wait and see what NEON brings to CES 2021.

In the meantime, we're all on the cusp of a new opportunity: take any software, any product, or any message and encase it in a virtual human storyline, making it highly eye-catching, engaging, and impactful. The media's reaction to NEON's sci-fi, fictionalized CES showcase is the best recent example of how powerful virtual human storytelling can be.


See through the hype and distill the "why?" to reveal the paramount lesson from NEON's CES demo:


Virtual humans are the next massive storytelling opportunity. Go create one.

To learn more about the emerging Virtual Human industry, explore our homepage.

