
Overview

Runtime MetaHuman Lip Sync Documentation

Runtime MetaHuman Lip Sync is a plugin that enables real-time, offline, and cross-platform lip sync for both MetaHuman and custom characters. It animates a character's lips in response to audio input from a variety of sources, such as real-time microphone capture and pre-recorded audio files.

The plugin internally generates visemes (visual representations of phonemes) based on the audio input and performs lip sync animation using a predefined pose asset.

Character Compatibility

Despite its name, Runtime MetaHuman Lip Sync works with a wide range of characters beyond just MetaHumans:

  • Daz Genesis 8/9 characters
  • Reallusion Character Creator 3/4 (CC3/CC4) characters
  • Mixamo characters
  • ReadyPlayerMe avatars

Animation Standards Support

  • FACS-based blendshape systems
  • Apple ARKit blendshape standard
  • Preston Blair phoneme sets
  • 3ds Max phoneme systems
  • Any character with custom morph targets for facial expressions
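To illustrate how a viseme-driven system can target one of the standards above, here is a minimal C++ sketch that maps viseme names to Apple ARKit blendshape weights. The viseme names and weight values are hypothetical and purely illustrative; they are not the plugin's actual tables.

```cpp
#include <map>
#include <string>

// A set of ARKit blendshape weights for one viseme.
using BlendshapeWeights = std::map<std::string, float>;

// Hypothetical viseme-to-ARKit table; real mappings are character-specific.
std::map<std::string, BlendshapeWeights> MakeVisemeToArkitTable() {
    return {
        // "aa" as in "father": open jaw
        {"aa", {{"jawOpen", 0.7f}}},
        // "ou" as in "boot": rounded, funneled lips
        {"ou", {{"mouthFunnel", 0.8f}, {"jawOpen", 0.2f}}},
        // "pp" as in "put": closed lips
        {"pp", {{"mouthClose", 1.0f}}},
    };
}
```

In practice each entry would be tuned per character, since the same viseme can look different on different face rigs.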

For detailed instructions on using the plugin with non-MetaHuman characters, see the Custom Character Setup Guide.

Animation Preview

Check out these short animations to see the quality of real-time lip sync animation produced by the plugin. The animation can be applied to any supported character, whether it's MetaHuman or a custom character from any of the supported systems.

[Lip sync example animations]

Key Features

  • Real-time lip sync from microphone input
  • Offline audio processing support
  • Cross-platform compatibility: Windows, Android, Meta Quest
  • Support for multiple character systems and animation standards
  • Flexible viseme mapping for custom characters
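Flexible viseme mapping generally amounts to a lookup from a standard viseme set to whatever morph targets a custom character exposes. The following C++ sketch shows one way such a remap could work, with a safe fallback when a character has no equivalent shape; all names here are hypothetical, not part of the plugin's API.

```cpp
#include <map>
#include <optional>
#include <string>

// Hypothetical remap from a standard viseme name to a custom character's
// morph target name. Returns no value when the character lacks a match,
// so the caller can simply skip that viseme.
std::optional<std::string> RemapViseme(
    const std::map<std::string, std::string>& Table,
    const std::string& Viseme) {
    auto It = Table.find(Viseme);
    if (It == Table.end()) return std::nullopt;
    return It->second;
}
```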

How It Works

The plugin processes audio input in the following way:

  1. Audio data is received as float PCM format with specified channels and sample rate
  2. The plugin processes the audio to generate visemes
  3. These visemes drive the lip sync animation using the character's pose asset
  4. The animation is applied to the character in real-time
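The steps above can be sketched in plain C++. This toy stand-in derives a single mouth-open weight from the RMS amplitude of an interleaved float PCM buffer (step 1); the plugin's actual viseme generation is far more sophisticated, and every name and constant below is illustrative only.

```cpp
#include <cmath>
#include <vector>

// Interleaved float PCM with explicit channel count and sample rate,
// as described in step 1.
struct AudioChunk {
    std::vector<float> Samples; // Channels * FrameCount interleaved values
    int Channels = 1;
    int SampleRate = 16000;
};

// Toy stand-in for viseme generation (step 2): loudness -> one weight in [0, 1].
float ComputeMouthOpenWeight(const AudioChunk& Chunk) {
    if (Chunk.Samples.empty()) return 0.0f;
    double SumSquares = 0.0;
    for (float Sample : Chunk.Samples) {
        SumSquares += double(Sample) * Sample;
    }
    double Rms = std::sqrt(SumSquares / Chunk.Samples.size());
    // Arbitrary illustrative gain, then clamp into [0, 1].
    double Weight = Rms * 4.0;
    return float(Weight > 1.0 ? 1.0 : Weight);
}
```

A real implementation would classify the audio into many visemes per frame and feed the resulting weights into the character's pose asset (steps 3 and 4) rather than driving a single shape.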

Quick Start

Here's a basic setup for enabling lip sync on your character:

  1. For MetaHuman characters, follow the MetaHuman Setup Guide
  2. For custom characters, follow the Custom Character Setup Guide
  3. Set up audio input processing (such as in the Event Graph)
  4. Connect the Blend Runtime MetaHuman Lip Sync node in the Anim Graph
  5. Play audio and see your character speak!

Additional Resources