
Creating and Driving a Digital Human with Unreal Engine MetaHuman and AI Face Swapping

This guide walks through building a 3D digital human using Unreal Engine’s MetaHuman Creator, driving it with live facial capture from an iPhone, and applying AI‑based face swapping (roop) to replace the character’s face with Mr. Bean, covering all required tools, setup, and export steps.


Overview

This article introduces how to create a digital human from scratch, drive it with live facial capture, and finally apply AI face swapping to achieve a virtual production effect.

Digital Human Overview

A digital human is a computer‑generated character that closely resembles a real person. This tutorial uses a 3D MetaHuman created with MetaHuman Creator and driven by real‑time facial capture.

AI Face Swap Overview

AI face swapping replaces the face in a video or image with another face using artificial‑intelligence techniques. The guide uses the open‑source roop tool to replace the digital human’s face with that of Mr. Bean, mentioning alternatives such as DeepFaceLab and SimSwap.

Implementation

Preparation

Download Unreal Engine from the official site and install the MetaHuman plugin. Install the Live Link Face app on an ARKit‑capable iPhone (one with a TrueDepth front camera) for real‑time facial capture. Set up a Python environment for the roop face‑swap tool, which is available on GitHub.
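The roop setup can be sketched as follows. This is a minimal sketch that assumes a CUDA‑capable GPU; exact package versions and model downloads follow the roop README, so treat the GPU runtime swap at the end as an assumption to verify against that README.

```shell
# Clone the roop repository (linked in the references) and install its
# Python dependencies into a fresh virtual environment.
git clone https://github.com/s0md3v/roop.git
cd roop

python -m venv .venv
source .venv/bin/activate        # on Windows: .venv\Scripts\activate

pip install -r requirements.txt

# For GPU-accelerated inference, the CUDA build of ONNX Runtime is used
# in place of the CPU build (exact versions depend on your CUDA install).
pip uninstall -y onnxruntime
pip install onnxruntime-gpu
```

A virtual environment keeps roop's pinned dependencies from clashing with other Python tools on the capture PC.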

Digital Human Creation

Open Unreal Engine and create a new UE5 project. Import a cube model and a reference photo (e.g., a picture of Mr. Bean). Create a material from the photo, apply it to the cube, and enable the MetaHuman plugin. In the Content Browser, create a MetaHuman Identity asset, import the cube as a custom mesh, adjust the facial key points, and upload the model to MetaHuman Creator for fine‑tuning.

After adjustments, download the refined model via Quixel Bridge and add it to the Unreal project.

Digital Human Driving

Use the iPhone’s Live Link Face app (ARKit mode) to stream facial deformation data to a PC on the same LAN. In Unreal, enable Live Link, connect to the phone’s IP address, and configure the Face_AnimBP animation blueprint to record the facial (mouth) animation. Add the digital human to a level sequence, attach the recorded animation, and render the result as a JPG image sequence rather than AVI (AVI export may cause issues in the later face‑swap step).
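If you later need the exported frames as a single video (for preview, or as a video target for the swap), ffmpeg can assemble them. The frame‑name pattern, directory, and frame rate below are assumptions; match them to what Unreal's render settings actually produced.

```shell
# Assemble the exported JPG sequence into an H.264 preview video.
# render_output/frame.%04d.jpg is a hypothetical pattern: frame.0001.jpg,
# frame.0002.jpg, ... -- adjust it to your actual output file names.
ffmpeg -framerate 30 -i render_output/frame.%04d.jpg \
       -c:v libx264 -pix_fmt yuv420p preview.mp4
```

`-pix_fmt yuv420p` keeps the output playable in common players, which otherwise may reject the pixel format ffmpeg infers from JPG input.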

AI Face Swapping

Open a command line in the roop directory and run `python run.py --execution-provider cuda` to launch the UI. Select a photo of Mr. Bean as the source face, choose the folder containing the exported JPG frames as the target, disable "Keep audio", and start the swapping process to generate a video in which the digital human's face is replaced with Mr. Bean's.
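The same swap can also be run headlessly from the command line. The flag names below follow the roop CLI at the time of writing; the file names are placeholders, and `python run.py --help` on your checkout is the authoritative reference.

```shell
# Headless roop invocation (no UI):
#   -s  source face image (the Mr. Bean photo)
#   -t  target video built from the exported JPG frames
#   -o  output path for the swapped video
# --skip-audio mirrors disabling "Keep audio" in the UI.
python run.py --execution-provider cuda \
  -s mr_bean.jpg -t digital_human.mp4 -o swapped.mp4 --skip-audio
```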

References

https://github.com/s0md3v/roop
https://www.bilibili.com/video/BV1vW4y1k73u/

Digital Human · Virtual Production · Unreal Engine · AI Face Swap · MetaHuman
Written by

政采云技术 (ZCY Technology Team)

ZCY Technology Team (Zero), based in Hangzhou, is a growth-oriented team passionate about technology and craftsmanship. With around 500 members, we are building comprehensive engineering, project management, and talent development systems. We are committed to innovation and creating a cloud service ecosystem for government and enterprise procurement. We look forward to your joining us.
