Unity UI Performance Testing Framework: Concepts, Architecture, Data Collection, and Visualization
This article introduces a Unity-based UI performance testing framework. It covers essential Unity concepts (AssetBundle loading and unloading, memory management, used vs. reserved memory), the framework's architecture, data-collection methods such as panel screenshots and opening latency, and visualization techniques for analyzing draw calls, memory trends, and performance regressions.
01 Unity-Related Concepts
The UI performance test framework is built on Unity, so some Unity fundamentals are essential: AssetBundle loading and unloading (via Object.Destroy() and AssetBundle.Unload()), the distinction between native game memory and Mono (managed) memory, and the difference between used and reserved memory.
Mono automatically grows its heap and performs garbage collection, relieving developers of manual memory management. Used memory is what live allocations currently occupy; reserved memory is what the runtime has claimed from the operating system, so reserved is always at least as large as used, and in Unity the Mono heap generally does not return reserved memory to the OS once it has grown.
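To make the used-vs-reserved distinction concrete, here is a minimal Python model (illustrative only, not Unity's actual allocator): the runtime reserves memory from the OS in fixed blocks, live objects consume "used" bytes inside those blocks, and freeing objects lowers used memory while reserved memory stays at its high-water mark.

```python
# Toy model of a "used vs reserved" heap. The runtime reserves memory from
# the OS in fixed-size blocks; freeing objects reduces used memory, but the
# reserved pool is never shrunk.
BLOCK = 1024  # bytes reserved from the OS per expansion

class ModelHeap:
    def __init__(self):
        self.used = 0
        self.reserved = 0

    def alloc(self, size):
        self.used += size
        while self.reserved < self.used:   # grow the reserved pool block by block
            self.reserved += BLOCK

    def free(self, size):
        self.used -= size                  # reserved stays at its peak

heap = ModelHeap()
heap.alloc(3000)   # used = 3000, reserved grows to 3072 (three blocks)
heap.free(2500)    # used = 500, reserved still 3072
print(heap.used, heap.reserved)  # → 500 3072
```

This is why a memory chart can show flat reserved memory while used memory rises and falls with each panel open/close.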
02 UI Performance Test Framework Structure
The framework consists of three main parts: process control, workflow description, and key data acquisition.
The external control module launches Unity with the specified project according to the test configuration; it then connects to the test devices, notifies the performance data recording module, and attaches to the game client's console.
An automation module repeatedly opens each UI panel while the data recorder captures metrics.
After the run, the recorded data is compressed, uploaded to the UI test platform, and displayed there.
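The control flow above can be sketched as follows. The Unity command-line flags (`-projectPath`, `-batchmode`, `-executeMethod`) are real Unity options, but the configuration keys, paths, and entry-method name are hypothetical placeholders, and the actual framework's launcher is not shown in the article.

```python
# Sketch of the external control module's launch step, under assumed
# configuration keys. Only the Unity command-line flags are real; paths and
# the entry method name are placeholders.
import subprocess

def build_unity_command(config):
    """Build the command line used to launch Unity for one test run."""
    return [
        config["unity_path"],                      # Unity editor binary (placeholder path)
        "-projectPath", config["project_path"],    # project under test
        "-batchmode",                              # run without the editor UI
        "-executeMethod", config["entry_method"],  # hypothetical automation entry point
    ]

def run_test(config):
    # Launch Unity; the automation module then drives the panels while the
    # recorder captures metrics, and results are compressed and uploaded.
    proc = subprocess.Popen(build_unity_command(config))
    proc.wait()

cmd = build_unity_command({
    "unity_path": "/opt/unity/Editor/Unity",
    "project_path": "/work/MyGame",
    "entry_method": "UITest.Entry.Run",
})
print(cmd[1], cmd[2])  # → -projectPath /work/MyGame
```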
Key data collected includes:
(1) Panel Screenshot – The automation module captures a screenshot of the panel, encodes it as a base64 stream, and sends it to the external control module for logging.
(2) Opening Latency – Timestamps are recorded at the start and end of panel rendering; the difference is stored as the opening time.
(3) General Client Performance Metrics – Using Unity's ProfilerDriver API (GetOverviewText), the framework retrieves CPU time, draw calls, and memory usage, plus custom data when the client exposes the appropriate interfaces.
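The three collection steps above can be sketched on the platform side in Python. The message format and the overview-text line format (`"Draw Calls: 120"`) are assumptions for illustration — the real GetOverviewText output varies by Unity version — while the base64/JSON handling and the timing logic are generic.

```python
# Sketch of the three data-collection steps: screenshot encoding, opening
# latency, and parsing of profiler overview text. Formats are assumed.
import base64, json, re, time

def screenshot_message(panel, png_bytes):
    """Step (1): wrap a captured screenshot as a base64 log message."""
    data = base64.b64encode(png_bytes).decode("ascii")
    return json.dumps({"type": "screenshot", "panel": panel, "data": data})

class LatencyRecorder:
    """Step (2): timestamp panel-open start/end and store the difference."""
    def __init__(self):
        self.latencies = {}   # panel name -> opening latency in ms
        self._starts = {}
    def begin(self, panel):
        self._starts[panel] = time.perf_counter()
    def end(self, panel):
        self.latencies[panel] = (time.perf_counter() - self._starts.pop(panel)) * 1000.0

def parse_overview(text):
    """Step (3): pull numeric metrics out of profiler overview text."""
    return {m.group(1).strip(): float(m.group(2))
            for m in re.finditer(r"^([A-Za-z ]+):\s*([\d.]+)", text, re.MULTILINE)}

msg = screenshot_message("HeroDrawCard", b"\x89PNG...")
rec = LatencyRecorder()
rec.begin("Shop"); time.sleep(0.05); rec.end("Shop")   # sleep stands in for rendering
metrics = parse_overview("CPU Time: 12.4\nDraw Calls: 120\nUsed Memory: 356.7\n")
print(metrics["Draw Calls"], rec.latencies["Shop"] >= 50)  # → 120.0 True
```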
03 Visualization
The platform visualizes three primary indicators: opening latency, memory changes, and draw calls.
1. Platform Result Display
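As a minimal sketch of how one indicator might be displayed, here is a text bar chart of draw calls per panel. The panel names and values are made up for illustration; the actual platform renders richer interactive charts.

```python
# Render one indicator (draw calls per panel) as a text bar chart, with the
# longest bar scaled to a fixed width. Panel data is illustrative only.
def bar_chart(values, width=40):
    peak = max(values.values())
    lines = []
    for name, v in values.items():
        bar = "#" * max(1, round(v / peak * width))
        lines.append(f"{name:>14} | {bar} {v}")
    return "\n".join(lines)

draw_calls = {"Shop": 42, "HeroDrawCard": 95, "Bag": 48}
print(bar_chart(draw_calls))
```

Even this crude view makes the outlier panel obvious at a glance, which is the point of the platform's per-indicator displays.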
2. Data Analysis
When the panel count is low, manual analysis is feasible; for large panel counts, automated analysis identifies abnormal growth, trending panels, and per-segment statistics.
Examples include comparing current data with previous runs to find top percentage increases, calculating trend‑line slopes to flag rising panels, and counting panels per metric interval (e.g., draw calls in [40,50] range).
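The three automated analyses just listed can be sketched as follows. The data values are made up for illustration; only the percentage-change, least-squares-slope, and interval-count logic is generic.

```python
# Sketch of the three automated analyses: percentage increase vs the previous
# run, least-squares trend slope across runs, and panels per metric interval.
def pct_increases(prev, curr):
    """Per-panel percentage change between two runs (panel -> metric)."""
    return {p: (curr[p] - prev[p]) / prev[p] * 100.0
            for p in curr if p in prev and prev[p]}

def trend_slope(values):
    """Least-squares slope of one panel's metric across consecutive runs."""
    n = len(values)
    mean_x, mean_y = (n - 1) / 2.0, sum(values) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(values))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def count_in_range(panels, lo, hi):
    """Number of panels whose metric falls in [lo, hi]."""
    return sum(1 for v in panels.values() if lo <= v <= hi)

draws = {"Shop": 42, "HeroDrawCard": 95, "Bag": 48}
print(count_in_range(draws, 40, 50))   # → 2
print(trend_slope([40, 44, 48, 52]))   # → 4.0
```

A positive slope over several runs flags a steadily rising panel even when no single run's increase looks alarming.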
3. Specific Panel Data Examples
Draw Call
The “Hero Draw Card” panel shows significantly higher draw calls and batches, requiring optimization.
Memory
Unity's Memory Profiler provides detailed snapshots; a lighter-weight approach extracts selected entries from the detailed view and compares before/after snapshots to detect leaks.
Comparing snapshot A (taken before opening the panel) with snapshot C (taken after closing it) reveals additional Texture2D resources and unreleased GUI assets, indicating potential memory leaks.
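The A-vs-C comparison amounts to diffing two asset summaries: anything whose count is higher after the panel is closed than before it was opened is a leak candidate. A minimal sketch, with made-up snapshot contents:

```python
# Diff two asset summaries (asset name -> instance count): snapshot A taken
# before opening the panel, snapshot C after closing it. Entries that grew
# are leak candidates. Snapshot contents are illustrative only.
def leak_candidates(snapshot_a, snapshot_c):
    return {name: snapshot_c[name] - snapshot_a.get(name, 0)
            for name in snapshot_c
            if snapshot_c[name] > snapshot_a.get(name, 0)}

before = {"Texture2D": 120, "Mesh": 30}
after_close = {"Texture2D": 135, "Mesh": 30, "GUI/HeroCardAtlas": 1}
print(leak_candidates(before, after_close))
# → {'Texture2D': 15, 'GUI/HeroCardAtlas': 1}
```

Comparing counts (or sizes) per asset type keeps the check cheap enough to run for every panel, unlike a full Memory Profiler snapshot diff.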
04 Notes
Testing tips: run the tests in an empty scene on a real device to obtain accurate data, and avoid relying on editor-only simulations.
Reference links:
ProfilerDriver API
GitHub – Unity Profiler Editor
NetEase LeiHuo Testing Center
LeiHuo Testing Center provides high-quality, efficient QA services, striving to become a leading testing team in China.