How BinaryVR Built the Real-Time 3D Character “Merry” for AR/VR Experiences
This article walks through BinaryVR’s art director Yoon’s end‑to‑end process for creating the real‑time 3D character Merry—from concept design and high‑poly sculpting to low‑poly retopology, UV mapping, texture painting, facial blendshape animation, Unity integration, and HyprFace SDK testing—showcasing techniques for AR/VR character pipelines.
Art Director Yoon Introduction
Yoon, art director at BinaryVR, has extensive experience in games and film; he was previously a senior artist at Lucasfilm and art lead at Industrial Light & Magic's xLAB. His work spans real-time animation, modeling, texturing, and visual development, and he has collaborated with Magic Leap, Google, Epic Games, Oculus, and The VOID.
Notable projects include "Star Wars: Secrets of the Empire" (winner of a 2018 VR award), for which he created characters including K-2SO, Darth Vader, and Imperial Stormtroopers as well as environments, and "Ralph Breaks VR," which explored a Disney-style aesthetic.
Creating Merry – A Real‑Time Rendered 3D Model
The goal of the Merry project is to showcase BinaryVR’s HyprFace facial‑capture technology with a cute, universally appealing character. The design team chose a Pomeranian puppy with a rabbit‑ear hood, defining five basic facial expressions.
1. Concept Design
References were gathered on Pinterest, leading to a Disney/Pixar‑style puppy concept with a rabbit‑ear hood and a library of expressions for joy, sadness, surprise, anger, and fear.
2. High‑Poly Sculpting
Artists sculpted a high‑poly model in ZBrush without worrying about polygon count, focusing on correct facial proportions and creating expression variants.
3. Low‑Poly Retopology
The high‑poly was retopologized into a low‑poly mesh suitable for real‑time animation, using Maya’s polygon tools for creative control over edge flow.
4. UV Unwrapping
The low‑poly mesh was unwrapped into UV islands, allocating larger texture space to the eyes, nose, and mouth, and smaller areas to the head and ears.
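The payoff of uneven UV allocation is texel density: small, expressive features get more texture pixels per unit of surface area than large, flat ones. A minimal sketch of that comparison (the function and numbers are illustrative, not measurements from the Merry asset):

```python
import math

def texel_density(uv_area_fraction, surface_area, tex_res=2048):
    """Texels per world unit for a UV island.

    uv_area_fraction : fraction of the tex_res x tex_res texture the island covers
    surface_area     : world-space area (units^2) the island is mapped onto
    """
    texels = uv_area_fraction * tex_res * tex_res
    return math.sqrt(texels / surface_area)

# Hypothetical split: eyes get 10% of the map over a tiny surface,
# the head gets 40% over a much larger one -- the eyes end up far denser.
eyes = texel_density(0.10, 0.01)
head = texel_density(0.40, 1.0)
```

Here the eye islands come out several times denser than the head, which is exactly why close-up features like eyes, nose, and mouth are given disproportionate UV space.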
5. Texture Painting
Using Substance Painter, artists painted diffuse, specular, roughness, normal, and ambient-occlusion maps for Merry's UV layout.
6. Facial Blendshape Animation
Thirty-three facial blendshapes and six tongue blendshapes were created to cover the five-emotion set, with ZBrush used to fine-tune each shape.
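Under the hood, a blendshape rig evaluates as a linear combination of per-vertex offsets from the neutral mesh: each shape stores its difference from the base, and weights mix them. A minimal NumPy sketch (vertex and shape counts are toy values, not Merry's actual 33+6 shapes):

```python
import numpy as np

def apply_blendshapes(base, deltas, weights):
    """Blend a neutral mesh with weighted per-shape vertex offsets.

    base    : (V, 3) neutral-pose vertex positions
    deltas  : (S, V, 3) per-shape offsets (shape minus neutral)
    weights : (S,) blendshape weights, typically in [0, 1]
    """
    return base + np.einsum("s,svc->vc", weights, deltas)

# Toy example: one vertex, two shapes, half and quarter weight.
base = np.zeros((1, 3))
deltas = np.array([[[1.0, 0.0, 0.0]],
                   [[0.0, 2.0, 0.0]]])
out = apply_blendshapes(base, deltas, np.array([0.5, 0.25]))
# out -> [[0.5, 0.5, 0.0]]
```

Because the evaluation is linear, shapes sculpted in ZBrush combine predictably at runtime, which is what makes per-shape fine-tuning worthwhile.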
7. Import into Unity
In Unity, the team added dynamic bone simulation for the rabbit‑ear hood, built a two‑layer eye material (cornea and eyeball), and set up HDRI skybox lighting with post‑processing for atmosphere, depth of field, and color grading.
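Dynamic bone simulation gives the hood's rabbit ears lag and overshoot by treating each bone point as a damped spring chasing its animated rest pose. A rough single-axis sketch of that idea (the coefficients are illustrative, not the parameters of Unity's Dynamic Bone asset):

```python
def step_dynamic_bone(pos, vel, target, stiffness=0.2, damping=0.85, dt=1.0):
    """One integration step of a damped spring pulling a bone point toward
    its rest target -- the essence of secondary 'dynamic bone' motion."""
    vel = (vel + stiffness * (target - pos) * dt) * damping
    pos = pos + vel * dt
    return pos, vel

# An ear tip lags behind a sudden head turn (target jumps 0 -> 1),
# oscillates briefly, then settles at the new rest pose.
pos, vel = 0.0, 0.0
for _ in range(200):
    pos, vel = step_dynamic_bone(pos, vel, target=1.0)
```

Higher stiffness makes the ears follow the head more tightly; lower damping lets them wobble longer — the same trade-off the team tunes in Unity.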
8. HyprFace SDK Integration
The final step integrated HyprFace SDK to drive Merry’s facial animation from live capture, testing blendshape values and adjusting features such as nose size for optimal cuteness.
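Driving the character amounts to mapping the tracker's per-frame expression coefficients onto Merry's blendshape weights, with per-shape gain and clamping so expressions can be exaggerated for appeal without breaking the rig. A hedged sketch — the coefficient names below are invented for illustration, not actual HyprFace SDK identifiers:

```python
def remap_coefficients(tracked, gain=None, clamp=(0.0, 1.0)):
    """Map tracked facial coefficients (name -> value) to character
    blendshape weights, with optional per-shape gain and clamping.

    tracked : dict of coefficient name -> tracked value
    gain    : dict of per-shape multipliers (default 1.0)
    clamp   : (lo, hi) range the rig accepts
    """
    gain = gain or {}
    lo, hi = clamp
    return {name: min(hi, max(lo, value * gain.get(name, 1.0)))
            for name, value in tracked.items()}

# Exaggerate the smile 1.5x; the clamp keeps the rig in range.
tracked = {"jawOpen": 0.4, "smileLeft": 0.7}
weights = remap_coefficients(tracked, gain={"smileLeft": 1.5})
# weights -> {"jawOpen": 0.4, "smileLeft": 1.0}
```

Tuning this mapping layer, rather than resculpting shapes, is a cheap way to iterate on the character's on-screen personality during live testing.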
The entire pipeline was completed in four weeks, resulting in a demo shown at CVPR 2018 and SIGGRAPH 2018 and demonstrating a practical workflow for AR/VR character creation.
We-Design
Tencent WeChat Design Center, handling design and UX research for WeChat products.