
Designing High‑Coverage Test Cases for UI‑Intensive Games from a Player Perspective

This article presents a comprehensive methodology for creating high‑coverage test cases for UI‑driven games, covering evaluation criteria, user‑centric design steps, entry‑point analysis, UI element partitioning, static and dynamic checks, business‑flow testing, data‑storage validation, and additional considerations such as network conditions, compatibility, performance, and UX.

NetEase LeiHuo Testing Center

Test cases are one of the most important daily outputs for testers, typically evaluated on three dimensions: clear structure, strong executability, and high coverage. Among these, high coverage is the most critical for quality assurance.

This article introduces a design approach that starts from the user's perspective to achieve high‑coverage test cases, considering five analysis angles:

User angle – covering all user scenarios based on documentation.

Implementation angle – supplementing or removing cases according to the program's implementation mechanisms.

Testing angle – detailing cases using common test‑case design methods.

General angle – including data persistence, exception handling, and cross‑feature interactions.

Error‑book angle – filling gaps based on accumulated mistakes and pitfalls.

01 Introduction to UI‑Driven Games

UI‑driven games consist of many full‑screen UI screens, where all functional flows are driven by player interactions on the interface, and all data is displayed on the UI. The main game screen is composed of functional buttons and player data displays, and player actions start from here.

Since players interact heavily with the UI, test cases should also be designed from the player's perspective, following the player's step‑by‑step interactions. While interaction documents provide static UI references, we must ensure test cases do not merely cover static displays but also drive functional flows.

02 Test‑Case Design Thinking

The functional flow can be visualized as a house (the feature) and a person (the player): the two objects interact, and their interactions cause data changes. The design process follows three major stages:

System Creation

Server start‑up or a specific time point (e.g., feature activation) triggers processing logic for the feature. Even if the requirement document mentions the feature only briefly, many test points need attention.
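Time‑triggered activation is easy to under‑test because the requirement document rarely spells it out. A minimal Python sketch of the idea (the function name and half‑open window are assumptions, not from the article): the server should re‑evaluate the activation window on every start‑up, so a restart inside the window still activates the feature, and the window boundaries themselves become test points.

```python
from datetime import datetime

def feature_active(now: datetime, start: datetime, end: datetime) -> bool:
    """A feature is active inside a half-open window [start, end).
    Test points: before start, exactly at start, inside, exactly at end."""
    return start <= now < end
```

Checking both boundary instants (exactly `start`, exactly `end`) is where off‑by‑one activation bugs typically hide.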

Entry Points

Every feature can have one or more entry points, which are documented in the interaction spec. Entry types include:

Permanent entry – always present on the main screen without conditions.

Conditionally unlocked entry – appears only after unlocking conditions are met.

Timed entry – visible only within a specific time window (e.g., limited‑time events).

Stateful entry – can show different states (e.g., completed vs. not completed).

Intrusive entry – may pop up on any screen, requiring analysis of trigger conditions and potential conflicts.

For each entry, test cases should consider both the logic that makes the entry visible and the logic that handles the click to enter the feature.
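The two‑sided rule above (visibility logic plus click handling) can be sketched in Python. All names and fields here are illustrative assumptions, not from any real codebase; the point is that the click handler must re‑check the condition, because the state may have changed after the entry was drawn (e.g., a timed event just closed).

```python
from dataclasses import dataclass

@dataclass
class EntryPoint:
    kind: str              # "permanent", "unlock", or "timed" (illustrative)
    unlock_level: int = 0  # minimum player level, for "unlock" entries
    open_hour: int = 0     # event window start, for "timed" entries
    close_hour: int = 24   # event window end, for "timed" entries

def entry_visible(entry: EntryPoint, player_level: int, hour: int) -> bool:
    """Visibility logic: each entry kind has its own display condition."""
    if entry.kind == "permanent":
        return True
    if entry.kind == "unlock":
        return player_level >= entry.unlock_level
    if entry.kind == "timed":
        return entry.open_hour <= hour < entry.close_hour
    return False

def try_enter(entry: EntryPoint, player_level: int, hour: int) -> str:
    """Click handling: re-check the condition at click time, since it may
    have changed since the entry was rendered."""
    if not entry_visible(entry, player_level, hour):
        return "rejected"
    return "entered"
```

A useful test case is the stale‑entry race: the entry was visible when rendered, but the window closed before the click lands.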

UI Interaction Driven

After designing the entry, the next layer is the first interface the player sees (usually the main UI of the system or the first step of a workflow). To avoid missing any element, the UI should be partitioned by information zones, either by layout position or functional blocks, and further subdivided to the smallest granularity.

Each UI element is then examined from four dimensions:

Static display – verify all possible data states (e.g., visible/invisible, different positions).

Dynamic change – test transitions between static states triggered by business flows.

Executable operation business flow – design test points for operations such as click, long‑press, swipe, drag, and multi‑finger gestures.

Data storage – ensure that data changes are persisted correctly (client files, server DB, server files, third‑party storage).
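One way to guarantee no element/dimension pair is missed is to generate the checklist mechanically as a cross product. A small Python sketch (the element names are hypothetical placeholders for a real partitioned UI):

```python
import itertools

# Hypothetical element names from a partitioned UI; the four dimensions
# are the ones listed above.
ELEMENTS = ["live2d_bubble", "bouquet_slot", "message_line"]
DIMENSIONS = ["static_display", "dynamic_change", "operation_flow", "data_storage"]

def checklist(elements, dimensions):
    """Every element is paired with every dimension, so gaps are visible
    by construction rather than left to memory."""
    return [f"{e}:{d}" for e, d in itertools.product(elements, dimensions)]
```

Individual pairs may turn out to be empty (an element with no operations, say), but an explicit "not applicable" entry is safer than a silent omission.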

Static Display Example

| UI Element | Data States |
| --- | --- |
| Live2D and bubble | Show / Hide |
| Right‑side bouquet position | Show bouquet / Show small fridge / Hide |
| Bottom message line | Morning / Evening / Other time slots; Remaining time >1h / <1h |

Dynamic Change Example

| UI Element | Business Flow Causing State Change |
| --- | --- |
| Live2D and bubble | Transition from claimable to non‑claimable time and vice versa |
| Right‑side bouquet position | Trigger compensation / compensation / no compensation |
| Bottom message line | Various time‑slot transitions (e.g., morning to end, evening start to end) |

Operation Business Flow

Mobile UI operations can be categorized as click, long‑press, swipe, drag, and multi‑finger gestures such as two‑finger rotate. For each operation, select static‑display states using equivalence‑class or boundary‑value techniques; the dynamic judgment and post‑operation handling then become additional business‑flow test points.
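Boundary‑value selection applies directly to the "remaining time >1h / <1h" state in the static‑display table. A minimal Python sketch (the function names and the assumption that exactly one hour counts as the "long" class are illustrative):

```python
def remaining_time_state(remaining_seconds: int) -> str:
    """Equivalence classes for a countdown display: expired, under one
    hour, and one hour or more. Assumption: exactly 3600 s is 'long'."""
    if remaining_seconds <= 0:
        return "expired"
    return "long" if remaining_seconds >= 3600 else "short"

def boundary_values(boundary: int, delta: int = 1):
    """Classic boundary-value picks: just below, at, and just above
    the threshold."""
    return [boundary - delta, boundary, boundary + delta]
```

Testing the three values around 3600 s catches the common off‑by‑one where the display flips at the wrong instant.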

Data Storage Validation

Game data is stored in four carriers: client local files, server databases, server local files, and third‑party storage. Testers must identify which data needs storage verification and design test cases for each carrier, covering personal attributes, personal progression, personal gameplay data, guild data, and system‑level data.
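Cross‑carrier consistency is a recurring check: after a data change, every carrier holding a copy must agree. A minimal Python sketch, with the carriers stubbed as dicts and the server database treated as the source of truth (both assumptions for illustration; in a real project the readers would query the actual save file, database, and so on):

```python
def verify_consistency(carriers: dict) -> list:
    """Compare each carrier's copy of a logical record against the
    server DB and return the names of carriers that disagree."""
    truth = carriers["server_db"]
    return [name for name, data in carriers.items() if data != truth]
```

An empty result means every carrier matches; a non‑empty result names exactly which storage layer missed the update.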

Gameplay Flow

From the player's viewpoint, a feature's lifecycle can be split into three major blocks: entering the gameplay, gameplay process driving, and settlement. A fourth, fill‑table‑driven pattern applies to configuration‑heavy features.

1. Enter Gameplay

Test points include entry‑gate condition checks (e.g., feature open, ticket availability, qualification, opponent matching, daily limit, server load) and normal entry handling on both server and client sides.
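Entry‑gate checks are easiest to test when each gate produces a distinct rejection reason. A Python sketch of that idea (the gate names and player fields are hypothetical; the article's list also includes opponent matching and server load, omitted here for brevity):

```python
def check_entry_gates(player: dict) -> list:
    """Evaluate every gate and collect all failures, so a test case can
    assert the precise rejection reason rather than a generic error."""
    gates = {
        "feature_closed": not player.get("feature_open", False),
        "no_ticket": player.get("tickets", 0) <= 0,
        "unqualified": player.get("level", 0) < player.get("required_level", 0),
        "daily_limit": player.get("entries_today", 0) >= player.get("daily_cap", 1),
    }
    return [name for name, failed in gates.items() if failed]
```

Test cases then cover each gate failing alone, several failing together, and the fully open path, on both the server (authoritative check) and the client (prompt shown to the player).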

2. Gameplay Process Driving

Each step (entry, interaction, next step or exit) should be mapped to a flow diagram and treated as an independent test point. Passive exits and interruptions from other features must also be considered.
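Mapping the flow diagram to a state machine makes each transition (including passive exits and interruptions) an explicit, independently testable point. A minimal Python sketch with hypothetical states and events:

```python
# Each (state, event) -> state pair is one test point from the flow diagram.
TRANSITIONS = {
    ("lobby", "enter"): "playing",
    ("playing", "next_step"): "settlement",
    ("playing", "disconnect"): "lobby",    # passive exit
    ("playing", "interrupted"): "paused",  # interruption by another feature
    ("paused", "resume"): "playing",
}

def step(state: str, event: str) -> str:
    """Apply one event; events with no defined transition leave the
    state unchanged (itself a behavior worth a test case)."""
    return TRANSITIONS.get((state, event), state)
```

Coverage here means exercising every edge in the table, plus at least one undefined (state, event) pair to confirm nothing breaks on unexpected input.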

3. Settlement

Settlement involves calculating scores, updating rankings, awarding rewards, recording participation data, synchronizing results to other modules, and client‑side presentation.
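Because settlement is a multi‑step pipeline, a useful test asserts not only the final result but also that every step actually ran. A Python sketch with an explicit step log (the reward formula and step names are invented for illustration):

```python
def settle(scores: dict, log: list) -> dict:
    """Rank players by score, grant rewards, and record each pipeline
    step in `log` so a test can verify none was skipped."""
    ranking = sorted(scores, key=scores.get, reverse=True)
    log.append("ranking_updated")
    rewards = {p: 100 // (i + 1) for i, p in enumerate(ranking)}  # illustrative formula
    log.append("rewards_granted")
    log.append("results_synced")  # stand-in for syncing to other modules
    return {"ranking": ranking, "rewards": rewards}
```

Edge cases worth separate test points: ties in score, an empty participant list, and settlement firing while a player is offline.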

4. Fill‑Table Driven

Some features (e.g., story or skill modules) are driven by large configuration tables; test cases should follow the table order and validate each row’s effect.
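Row‑by‑row table validation can itself be automated, so the human pass focuses on in‑game effects. A Python sketch with an invented three‑column table and two example rules (the column names and rules are assumptions, not from the article):

```python
import csv
import io

def validate_rows(table_text: str) -> list:
    """Walk the configuration table in order and return (row_id, problem)
    pairs; an empty list means every row passed the checks."""
    problems = []
    last_level = -1
    for row in csv.DictReader(io.StringIO(table_text)):
        level = int(row["unlock_level"])
        if int(row["reward"]) <= 0:
            problems.append((row["id"], "non-positive reward"))
        if level < last_level:
            problems.append((row["id"], "unlock_level not ascending"))
        last_level = level
    return problems
```

Following the table order, as the article recommends, also makes it easy to spot which specific row introduced a regression.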

03 Test‑Case Completion

Combine the mind‑map of test cases derived from documentation with the above design thinking to achieve comprehensive coverage. Additional dimensions to consider include cross‑module interactions, legacy account compatibility, server merge data conflicts, weak‑network behavior, reconnection and account switching, platform‑specific compatibility, performance testing, user‑experience cost, log completeness, configuration checks, and RPC interface robustness.

04 Summary

By approaching test‑case design from the player’s interaction angle and systematically expanding coverage through data and flow analysis, testers can ensure that every visible UI element, operation, and data change is accounted for, thereby guaranteeing high coverage and robust quality for UI‑intensive games.

For further reading, see the linked articles on script‑generated test cases, game product quality management (theory and practice), and platform development experiences.

Tags: UI design, coverage, test case, game testing, mobile games, player perspective
Written by NetEase LeiHuo Testing Center

LeiHuo Testing Center provides high-quality, efficient QA services, striving to become a leading testing team in China.
