Setup Guide

This guide walks you through the basic setup process for Runtime MetaHuman Lip Sync with your MetaHuman characters.

Note: Runtime MetaHuman Lip Sync works with both MetaHuman and custom characters. For detailed instructions on setting up custom characters, see the Custom Character Setup Guide.

Prerequisites

Before getting started, ensure:

  1. The MetaHuman plugin is enabled in your project (Note: Starting from UE 5.6, this step is no longer required as MetaHuman functionality is integrated directly into the engine)
  2. You have at least one MetaHuman character downloaded and available in your project
  3. The Runtime MetaHuman Lip Sync plugin is installed

Standard Model Extension Plugin

If you plan to use the Standard Model, you'll need to install the extension plugin:

  1. Download the Standard Lip Sync Extension plugin from Google Drive
  2. Extract the folder from the downloaded archive into the Plugins folder of your project, creating the Plugins folder if it doesn't exist (see the layout sketch after this list)
  3. Ensure your project is set up as a C++ project (even if you don't have any C++ code)
  4. Rebuild your project
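
After extraction, the plugin should sit directly under your project's Plugins folder, roughly as sketched below. The extension's folder and .uplugin file names here are illustrative; keep whatever names the downloaded archive actually contains.

    YourProject/
    ├── YourProject.uproject
    ├── Source/
    └── Plugins/
        └── StandardLipSyncExtension/            (assumed folder name)
            └── StandardLipSyncExtension.uplugin
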
Note:
  • This extension is only required if you want to use the Standard Model. If you only need the Realistic Models, you can skip this step.
  • For more information on how to build plugins manually, see the Building Plugins tutorial.

Additional Plugins

Depending on how you plan to feed audio to your lip sync generator (microphone capture, text-to-speech, or audio file playback), you may need additional plugins; these are covered in the Audio Processing Guide.

Animation Blueprint Setup

Step 1: Locate and modify the face animation Blueprint

You need to modify an Animation Blueprint that will be used for your MetaHuman character's facial animations. The default MetaHuman face Animation Blueprint is located at:

Content/MetaHumans/Common/Face/Face_AnimBP

[Image: Face Animation Blueprint]

You have several options for implementing the lip sync functionality. The most direct approach is to open the default Face_AnimBP and make your modifications there.

Note: This approach is convenient, but any changes will affect all MetaHuman characters that use the default Animation Blueprint.

Step 2: Event Graph setup

Open your Face Animation Blueprint and switch to the Event Graph. You'll need to create a generator that processes audio data and generates lip sync animation (a C++ sketch of the equivalent setup follows the list below).

  1. Add the Event Blueprint Begin Play node if it doesn't exist already
  2. Add the Create Runtime Viseme Generator node and connect it to the Begin Play event
  3. Save the output as a variable (e.g. "VisemeGenerator") for use in other parts of the graph
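
If you'd rather drive this setup from C++ instead of Blueprint nodes, the equivalent might look like the minimal sketch below. Note that the URuntimeVisemeGenerator type, its header path, and the CreateRuntimeVisemeGenerator factory function are assumptions inferred from the Blueprint node name; verify them against the plugin's source before relying on this.

    // FaceLipSyncAnimInstance.h: a hedged sketch, not the plugin's confirmed API.
    #pragma once

    #include "CoreMinimal.h"
    #include "Animation/AnimInstance.h"
    #include "RuntimeVisemeGenerator.h" // assumed plugin header name
    #include "FaceLipSyncAnimInstance.generated.h"

    UCLASS()
    class UFaceLipSyncAnimInstance : public UAnimInstance
    {
        GENERATED_BODY()

    public:
        // Mirrors saving the node output into a "VisemeGenerator" variable
        UPROPERTY(BlueprintReadOnly, Category = "Lip Sync")
        TObjectPtr<URuntimeVisemeGenerator> VisemeGenerator;

        virtual void NativeBeginPlay() override
        {
            Super::NativeBeginPlay();

            // Assumed factory call mirroring the "Create Runtime Viseme Generator" node
            VisemeGenerator = URuntimeVisemeGenerator::CreateRuntimeVisemeGenerator(this);
        }
    };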

[Image: Creating Runtime Viseme Generator]

For detailed configuration options, see the Standard Model Configuration section.

Step 3: Anim Graph setup

After setting up the Event Graph, switch to the Anim Graph to connect the generator to the character's animation:

  1. Locate the pose that contains the MetaHuman face (typically from Use cached pose 'Body Pose')
  2. Add the Blend Runtime MetaHuman Lip Sync node
  3. Connect the pose to the Source Pose of the Blend Runtime MetaHuman Lip Sync node
  4. Connect your VisemeGenerator variable to the Viseme Generator pin
  5. Connect the output of the Blend Runtime MetaHuman Lip Sync node to the Result pin of the Output Pose

[Image: Blend Runtime MetaHuman Lip Sync]

Next Steps

Now that you have the basic Animation Blueprint setup complete, you'll need to configure audio input processing to feed audio data to your lip sync generator.
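
As a preview of what that wiring looks like, the sketch below pushes a chunk of captured PCM audio into the generator. The ProcessAudioData method name and signature are assumptions for illustration only; the plugin's actual audio entry points are covered in the Audio Processing Guide.

    // A hedged sketch of feeding raw PCM audio to the viseme generator.
    // NOTE: ProcessAudioData is an assumed method name and signature; see the
    // Audio Processing Guide for the plugin's actual audio entry points.
    void FeedAudioChunk(URuntimeVisemeGenerator* VisemeGenerator,
                        const TArray<float>& PCMData, // interleaved float samples
                        int32 SampleRate,             // e.g. 48000
                        int32 NumChannels)            // e.g. 1 for a mono microphone
    {
        if (VisemeGenerator)
        {
            // The generator turns audio into viseme weights that the
            // Blend Runtime MetaHuman Lip Sync node consumes in the Anim Graph.
            VisemeGenerator->ProcessAudioData(PCMData, SampleRate, NumChannels);
        }
    }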

Continue to the Audio Processing Guide to learn how to set up different audio input methods including microphone capture, text-to-speech, and audio file processing.

For advanced configuration options and fine-tuning, see the Configuration Guide.