Overview
Runtime MetaHuman Lip Sync is a plugin that enables real-time, offline, and cross-platform lip sync for both MetaHuman and custom characters. It allows you to animate a character's lips in response to audio input from various sources, including:
- Microphone input via Runtime Audio Importer's capturable sound wave
- Synthesized speech from Runtime Text To Speech
- Any audio data in float PCM format (an array of floating-point samples)
The plugin internally generates visemes (visual representations of phonemes) based on the audio input and performs lip sync animation using a predefined pose asset.
Character Compatibility
Despite its name, Runtime MetaHuman Lip Sync works with a wide range of characters beyond just MetaHumans:
Popular Commercial Character Systems
- Daz Genesis 8/9 characters
- Reallusion Character Creator 3/4 (CC3/CC4) characters
- Mixamo characters
- ReadyPlayerMe avatars
Animation Standards Support
- FACS-based blendshape systems
- Apple ARKit blendshape standard
- Preston Blair phoneme sets
- 3ds Max phoneme systems
- Any character with custom morph targets for facial expressions
For detailed instructions on using the plugin with non-MetaHuman characters, see the Custom Character Setup Guide.
Animation Preview
Check out these short animations to see the quality of real-time lip sync animation produced by the plugin. The animation can be applied to any supported character, whether it's MetaHuman or a custom character from any of the supported systems.
Key Features
- Real-time lip sync from microphone input
- Offline audio processing support
- Cross-platform compatibility: Windows, Android, Meta Quest
- Support for multiple character systems and animation standards
- Flexible viseme mapping for custom characters
How It Works
The plugin processes audio input in the following way:
- Audio data is received as float PCM format with specified channels and sample rate
- The plugin processes the audio to generate visemes
- These visemes drive the lip sync animation using the character's pose asset
- The animation is applied to the character in real time
Quick Start
Here's a basic setup for enabling lip sync on your character:
- For MetaHuman characters, follow the MetaHuman Setup Guide
- For custom characters, follow the Custom Character Setup Guide
- Set up audio input processing (such as in the Event Graph)
- Connect the Blend Runtime MetaHuman Lip Sync node in the Anim Graph
- Play audio and see your character speak!
Additional Resources
- Get it on Fab
- Download Demo (Windows)
- Download Demo source files (UE 5.5)
- Discord support server
- Video tutorial
- Custom Development: [email protected] (tailored solutions for teams & organizations)