Runtime MetaHuman Lip Sync

Real-time, offline, and cross-platform lip sync for MetaHuman and custom characters. Animate character lips in response to audio from various sources.

UE 5.0 - 5.5
Blueprints & C++
Windows, Android, Meta Quest
Multiple character systems

Lip Sync Animation Preview

See the quality of real-time lip sync animation produced by the plugin. These animations can be applied to any supported character, whether it's a MetaHuman or a custom character.

MetaHuman Character
Custom Character with a Different Viseme System

Real-Time Character Lip Sync Made Simple

Runtime MetaHuman Lip Sync provides a comprehensive system for dynamic lip sync animation, enabling characters to speak naturally in response to audio input from various sources. Despite its name, the plugin works with a wide range of characters beyond just MetaHumans.

Real-Time & Offline Processing

Generate lip sync animations in real-time from microphone input or pre-recorded audio

Universal Character Compatibility

Works with MetaHuman, Daz Genesis 8/9, Reallusion CC3/CC4, Mixamo, ReadyPlayerMe, and more

Multiple Animation Standards

Supports FACS-based systems, Apple ARKit, Preston Blair phoneme sets, and 3ds Max phoneme systems

Flexible Audio Input

Process audio from microphones, imported files, text-to-speech, or any PCM audio data

Video Tutorial
Complete tutorial for setting up lip sync
Demo Project Walkthrough

Key Features

Easy Blueprint Integration

Set up lip sync animation with just a few Blueprint nodes. The plugin provides a clean, easy-to-use interface for implementing lip sync in your characters.

Create a Runtime Viseme Generator in the Event Graph
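
The same setup can also be done from C++. The sketch below assumes the plugin exposes a UObject class named URuntimeVisemeGenerator with a factory function mirroring the Blueprint node above; the class, header, and function names are assumptions, so verify them against the plugin's own headers.

```cpp
// TalkingCharacter.h -- minimal sketch, assuming the plugin exposes a
// "URuntimeVisemeGenerator" class with a factory that mirrors the Blueprint
// node above. Verify the exact names in the plugin headers before using this.
#pragma once

#include "CoreMinimal.h"
#include "GameFramework/Character.h"
#include "RuntimeVisemeGenerator.h" // assumed plugin header -- adjust to the real path
#include "TalkingCharacter.generated.h"

UCLASS()
class ATalkingCharacter : public ACharacter
{
    GENERATED_BODY()

public:
    virtual void BeginPlay() override
    {
        Super::BeginPlay();

        // Create the generator once and keep it in a UPROPERTY so it is not
        // garbage-collected while the character is alive.
        VisemeGenerator = URuntimeVisemeGenerator::CreateRuntimeVisemeGenerator();
    }

    // Exposed so the Animation Blueprint can pass it to the Blend node.
    UPROPERTY(BlueprintReadOnly, Category = "Lip Sync")
    TObjectPtr<URuntimeVisemeGenerator> VisemeGenerator;
};
```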

Animation Graph Setup

Integrate lip sync into your character's animation blueprint with the dedicated blending nodes.

Connect the Blend Runtime MetaHuman Lip Sync node in the Anim Graph

Customizable Parameters

Fine-tune the lip sync behavior with adjustable parameters for interpolation speed and reset timing.

Interpolation Speed

Controls how quickly lip movements transition between visemes. Higher values produce faster, snappier transitions; lower values give smoother, more gradual blending.

Reset Time

The duration in seconds after which lip sync is reset when audio stops, preventing lingering mouth positions.
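
To make the behavior of these two parameters concrete, here is an illustration using standard engine helpers (FMath::FInterpTo and a timer). It is not the plugin's internal implementation, only a sketch of what the parameters control.

```cpp
// Illustration only -- not the plugin's internal code.
#include "Engine/World.h"
#include "Math/UnrealMathUtility.h"
#include "TimerManager.h"

static float CurrentVisemeWeight = 0.f;

void UpdateVisemeWeight(float TargetWeight, float DeltaTime, float InterpolationSpeed)
{
    // Higher InterpolationSpeed -> the weight reaches its target sooner,
    // giving snappier mouth movements; lower values blend more gradually.
    CurrentVisemeWeight = FMath::FInterpTo(CurrentVisemeWeight, TargetWeight,
                                           DeltaTime, InterpolationSpeed);
}

void ScheduleLipSyncReset(UWorld* World, FTimerHandle& ResetHandle, float ResetTime)
{
    // Once audio stops, wait ResetTime seconds and then return the mouth to
    // neutral so no viseme lingers on the character's face.
    World->GetTimerManager().SetTimer(ResetHandle,
        [] { CurrentVisemeWeight = 0.f; }, ResetTime, /*bLoop*/ false);
}
```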

Technical Details

Viseme System

The plugin uses a standard set of visemes (visual phonemes) that are mapped to your character's morph targets:

Sil (Silence)
PP (p, b, m)
FF (f, v)
TH (th)
DD (t, d)
KK (k, g)
CH (ch, j)
SS (s, z)
NN (n)
RR (r)
AA (aa)
E (e)
IH (ih)
OH (oh)
OU (ou)
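
For custom characters, these visemes ultimately map onto the mesh's morph targets. A minimal sketch of such a mapping follows; the morph target names on the right are placeholders for whatever your mesh actually exposes.

```cpp
// Sketch: mapping the viseme names listed above to a custom character's
// morph targets and applying a weight. The morph target names are
// placeholders -- substitute the curves your mesh actually exposes.
#include "Components/SkeletalMeshComponent.h"
#include "Containers/Map.h"
#include "UObject/NameTypes.h"

static TMap<FName, FName> MakeVisemeTable()
{
    TMap<FName, FName> Table;
    Table.Add(TEXT("Sil"), TEXT("V_None"));       // Silence
    Table.Add(TEXT("PP"),  TEXT("V_Explosive"));  // p, b, m
    Table.Add(TEXT("FF"),  TEXT("V_Dental_Lip")); // f, v
    Table.Add(TEXT("AA"),  TEXT("V_Open"));       // aa
    // ... remaining visemes follow the same pattern
    return Table;
}

void ApplyViseme(USkeletalMeshComponent* Mesh, FName Viseme, float Weight)
{
    static const TMap<FName, FName> VisemeToMorphTarget = MakeVisemeTable();

    if (const FName* MorphTarget = VisemeToMorphTarget.Find(Viseme))
    {
        // Drives the morph target curve on the mesh directly; the plugin's
        // Blend node accomplishes the same thing through the pose asset.
        Mesh->SetMorphTarget(*MorphTarget, Weight);
    }
}
```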

How It Works

1. Audio Input
Audio data is received as float PCM with a specified channel count and sample rate

2. Viseme Generation
The plugin analyzes the audio and generates visemes (visual phonemes)

3. Animation Driving
The visemes drive the lip sync animation through the character's pose asset

4. Real-Time Application
The resulting animation is applied to the character in real time
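
In (hypothetical) C++, the pipeline boils down to forwarding PCM chunks to the viseme generator. The function name ProcessAudioData and its signature below are assumptions inferred from the steps above; confirm them against the plugin's API.

```cpp
// Continues the ATalkingCharacter sketch from the Key Features section
// (OnAudioChunk would be declared as a member of that class).
// "ProcessAudioData" and its parameters are assumptions -- check the plugin.
void ATalkingCharacter::OnAudioChunk(const TArray<float>& PCMData,
                                     int32 SampleRate, int32 NumChannels)
{
    if (!VisemeGenerator || PCMData.Num() == 0)
    {
        return; // nothing to analyze yet
    }

    // Steps 1-2: hand the float PCM samples to the generator, which produces
    // the viseme weights. Steps 3-4 happen in the Anim Graph, where the
    // Blend Runtime MetaHuman Lip Sync node reads those weights every frame.
    VisemeGenerator->ProcessAudioData(PCMData, SampleRate, NumChannels);
}
```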

Platform Support

Windows
Android
Meta Quest
More platforms coming soon

Requirements

Unreal Engine

UE 5.0 - 5.5

For MetaHuman Characters

MetaHuman plugin enabled in your project

For Audio Capture

Runtime Audio Importer plugin

Powerful Integrations

Runtime MetaHuman Lip Sync works seamlessly with other plugins to create complete audio, speech, and animation solutions for your Unreal Engine projects.

Runtime Audio Importer

Import, stream, and capture audio at runtime to drive lip sync animations. Process audio from files, memory, or microphone input.
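
As a hedged sketch of wiring microphone capture into the lip sync pipeline: the class and delegate names below (UCapturableSoundWave, CreateCapturableSoundWave, OnPopulateAudioData, StartCapture) are assumptions about Runtime Audio Importer's API, so confirm them in that plugin's documentation.

```cpp
// Hedged sketch: capture microphone audio with Runtime Audio Importer and
// feed it into the viseme generator. All Runtime Audio Importer names used
// here are assumptions -- verify them against that plugin's documentation.
void ATalkingCharacter::StartMicrophoneLipSync()
{
    UCapturableSoundWave* CaptureWave = UCapturableSoundWave::CreateCapturableSoundWave();

    // Forward every captured PCM chunk straight into the lip sync pipeline.
    CaptureWave->OnPopulateAudioData.AddDynamic(this, &ATalkingCharacter::OnMicAudio);

    CaptureWave->StartCapture(/*DeviceId*/ 0);
}

void ATalkingCharacter::OnMicAudio(const TArray<float>& PCMData)
{
    // Placeholder sample rate / channel count -- in practice, read them from
    // the capture device or the sound wave itself.
    OnAudioChunk(PCMData, /*SampleRate*/ 48000, /*NumChannels*/ 1);
}
```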

Runtime Speech Recognizer

Add speech recognition to create interactive characters that respond to voice commands while animating with lip sync.

Runtime Text To Speech

Generate realistic speech from text offline with 900+ voices and animate character lips in response to the synthesized audio.

Runtime AI Chatbot Integrator

Create AI-powered talking characters that respond to user input with natural language and realistic lip sync.

Complete Interactive Character Solution

Combine Runtime MetaHuman Lip Sync with our other plugins to create fully interactive characters that can listen, understand, speak, and animate naturally. From voice commands to AI-driven conversations, our plugin ecosystem provides everything you need for next-generation character interaction.

Custom Character Support

Despite its name, Runtime MetaHuman Lip Sync works with a wide range of characters beyond just MetaHumans. Here's how to set up lip sync for various character systems.

Flexible Viseme Mapping

The plugin supports mapping between different viseme standards, allowing you to use characters with various facial animation systems:

Apple ARKit

Map ARKit blendshapes to visemes

FACS-Based Systems

Map Action Units to visemes

Preston Blair System

Classic animation mouth shapes

3ds Max Phoneme System

Standard 3ds Max phonemes

Custom Character Examples
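
As an illustration of targeting one of these standards, the sketch below pairs a few of the plugin's visemes with Apple ARKit blendshape curves for a character rigged with the standard 52 ARKit shapes. The pairings and weights are rough, hand-picked placeholders, not an official mapping table.

```cpp
// Illustrative only: a partial viseme-to-ARKit-blendshape table. Tune the
// pairings and weights to your own character.
#include "Containers/Map.h"
#include "UObject/NameTypes.h"

struct FArkitVisemePose
{
    FName Blendshape;
    float Weight = 1.f;
};

static TMultiMap<FName, FArkitVisemePose> MakeArkitVisemeTable()
{
    TMultiMap<FName, FArkitVisemePose> Table;
    Table.Add(TEXT("PP"), { TEXT("mouthPressLeft"),  0.7f }); // p, b, m: lips pressed
    Table.Add(TEXT("PP"), { TEXT("mouthPressRight"), 0.7f });
    Table.Add(TEXT("FF"), { TEXT("mouthRollLower"),  0.6f }); // f, v: lower lip under teeth
    Table.Add(TEXT("AA"), { TEXT("jawOpen"),         0.8f }); // open vowel
    Table.Add(TEXT("OU"), { TEXT("mouthPucker"),     0.9f }); // rounded vowel
    // ... remaining visemes follow the same pattern
    return Table;
}
```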

Documentation & Support

Get started quickly with our detailed documentation and receive support through multiple channels. From basic setup to advanced character configuration, we're here to help you succeed.

Comprehensive Documentation

Step-by-step guides for MetaHuman and custom characters

Video Tutorials

Visual guides for setup and configuration

Discord Community

Get real-time help from developers and users

Custom Development

Contact solutions@georgy.dev for tailored solutions

Add Realistic Lip Sync to Your Characters

Bring your MetaHuman and custom characters to life with real-time lip sync animation. Create more immersive and engaging experiences with characters that speak naturally in response to audio input.