Metahuman Animator

Epic Games has unveiled Metahuman Animator, a new module of its Metahuman platform that represents one of the most significant advances we’ve seen in automated lip sync and facial animation.

The new tech eliminates the need for bespoke facial motion capture hardware, making it possible to capture a performance with only an iPhone. The unedited animation that Unreal Engine outputs is nuanced and polished, and it gives animators a solid foundation to tweak and refine further. For games and series, which require large volumes of cg animation, this tech could very well be a game changer.

The software was announced at the Game Developers Conference, currently taking place in San Francisco. Metahuman Animator is planned for release in the next few months and will be part of the Metahuman plugin for Unreal Engine.

Here’s a look at what Metahuman Animator generates:

And here’s the five-minute real-time demo from the State of Unreal keynote at GDC that explains how it all works:

Some key highlights of the tech:

  • Facial performance can be reproduced from a capture made with an iPhone (11 or later) mounted on a tripod, or with a stereo helmet-mounted camera.
  • Metahuman Animator starts by creating a “Metahuman Identity,” a 3d mesh matching the live performer’s facial topology. The mesh can be built in a few minutes from just three photos, and the step needs to be done only once per performer.
  • This Metahuman Identity is then used to interpret the live performance and determine how it should map to the target cg character. The result, according to Epic Games, is “that every subtle expression is accurately recreated on your Metahuman target character, regardless of the differences between the actor and the Metahuman’s features.”
  • “Another benefit,” adds Epic Games, “is that the animation data is clean; the control curves are semantically correct, that is, they are where you would expect them to be on the facial rig controls—just as they would be if they had been created by a human animator—making them easy to adjust if required for artistic purposes.”
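To make the “semantically correct control curves” point a bit more concrete, here is a minimal, hypothetical sketch in Python. The ControlCurve class and the control names are illustrative stand-ins, not Epic’s actual data structures or the Metahuman rig’s API; the idea is simply that animation delivered as named curves on the expected rig controls can be adjusted per control, the way a human animator would adjust it.

```python
# Illustrative only: a toy stand-in for animation delivered as named control curves.
# The ControlCurve class and the control names below (styled loosely after facial
# rig controls) are hypothetical, not Epic's data structures or the Metahuman API.

from dataclasses import dataclass, field

@dataclass
class ControlCurve:
    """A named animation curve: (time, value) keys on one facial rig control."""
    name: str
    keys: list[tuple[float, float]] = field(default_factory=list)

    def scaled(self, factor: float) -> "ControlCurve":
        """Return a copy with every key value scaled, e.g. to soften or exaggerate a motion."""
        return ControlCurve(self.name, [(t, v * factor) for t, v in self.keys])

# Because the solve lands on the controls an animator expects, an artistic tweak
# is a targeted edit to one named curve, not a cleanup pass over raw capture data.
performance = {
    "jaw_open": ControlCurve("jaw_open", [(0.0, 0.0), (0.5, 0.8), (1.0, 0.1)]),
    "brow_raise_l": ControlCurve("brow_raise_l", [(0.0, 0.2), (0.5, 0.6), (1.0, 0.2)]),
}

# Exaggerate the left brow raise by 20% while leaving the jaw exactly as captured.
performance["brow_raise_l"] = performance["brow_raise_l"].scaled(1.2)

for curve in performance.values():
    print(curve.name, curve.keys)
```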