
Best Practices for Test Data Management and Usage

This guide outlines principles for generating, using, and cleaning test data across development, pre-production, and production environments. It emphasizes data independence, realism, security, permission control, and systematic synchronization to keep testing reliable and safe.

Software Development Quality

Test Data Management and Usage Guidelines

1. Test Data Plan

1.1 Data Generation Principles

Test data should be created through standard methods such as business entry points or data tools, avoiding direct database manipulation.

Each tester's data should be independent and clearly labeled (e.g., by account or activity name).
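The labeling rule above can be sketched as a small helper that prefixes each piece of test data with its owner; the naming convention here is purely illustrative, not a mandated format:

```python
from datetime import datetime, timezone

def label_test_data(owner: str, kind: str, name: str) -> str:
    """Build an independent, traceable label for a piece of test data.

    The owner prefix keeps each tester's data separate, and the date
    stamp makes the record easy to find and clean up later.
    """
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d")
    return f"test_{owner}_{kind}_{name}_{stamp}"

# Example: an activity created by tester "alice"
label = label_test_data("alice", "activity", "spring_sale")
```

A consistent prefix like this also makes later cleanup trivial: anything matching `test_<owner>_*` can be found and removed in one pass.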

Test data must account for its relationships with external systems (for example, upstream and downstream data dependencies).

Test data should closely resemble real data.

If the source is production data, it must be desensitized before use.

Reference: Data Management Policy V1.0.
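A minimal sketch of the desensitization step, assuming the source rows are plain dictionaries; the field names and masking rules below are illustrative and are not taken from the Data Management Policy:

```python
import hashlib

def desensitize(row: dict) -> dict:
    """Mask personally identifiable fields before a production row is used in test."""
    masked = dict(row)
    if "phone" in masked:
        # Keep only the last 4 digits so the format stays realistic.
        masked["phone"] = "*" * 7 + masked["phone"][-4:]
    if "email" in masked:
        local, _, domain = masked["email"].partition("@")
        masked["email"] = local[:1] + "***@" + domain
    if "id_number" in masked:
        # Replace with a truncated hash so joins across tables still line up.
        masked["id_number"] = hashlib.sha256(masked["id_number"].encode()).hexdigest()[:16]
    return masked
```

Hashing identifiers (rather than blanking them) preserves referential integrity across tables, which keeps the desensitized data close to real data in shape.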

Performance test data should match production scale (or be scaled proportionally) and follow the production data distribution.
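Scaling a performance data set while preserving the production distribution can be sketched as proportional sampling; the category names and counts below are hypothetical:

```python
def scale_dataset(rows_by_category: dict, factor: float) -> dict:
    """Scale per-category row counts so the mix matches production.

    `rows_by_category` maps category -> production row count; the result
    keeps the same proportions at roughly `factor` times the total size
    (with at least one row per category).
    """
    return {cat: max(1, round(n * factor)) for cat, n in rows_by_category.items()}

# Production has 90% small orders and 10% large orders;
# a 1% performance data set keeps that same mix.
prod = {"small_order": 900_000, "large_order": 100_000}
perf = scale_dataset(prod, 0.01)  # {'small_order': 9000, 'large_order': 1000}
```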

1.2 Data Usage Principles

Avoid modifying unfamiliar data to prevent impact on others' tests.

When modifying others' data, obtain prior confirmation.

Assess overall impact before changing foundational data and communicate broadly.

Automation test data should be created, used, and cleaned by the automation program itself, ensuring a closed data loop; others should not interfere.

1.3 Data Cleanup Principles

Functional test data generally does not require cleanup.

After testing, promptly clean data that could affect other tests.

Automation test data should have cleanup scripts designed from the start.
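The create-use-clean loop for automation data can be sketched as a context manager whose teardown always runs, even when the test fails; the in-memory `store` stands in for whatever database or API holds the real data:

```python
from contextlib import contextmanager

@contextmanager
def managed_test_account(store: dict, owner: str):
    """Create labeled test data, hand it to the test, then always clean it up.

    `store` is a stand-in for the system under test; in a real suite this
    would be a database handle or an API client.
    """
    account = f"test_{owner}_auto_account"
    store[account] = {"owner": owner}   # create
    try:
        yield account                   # use
    finally:
        store.pop(account, None)        # clean, even if the test raised

# Usage: the automation program owns the whole data lifecycle.
store = {}
with managed_test_account(store, "ci_bot") as acct:
    assert acct in store
assert store == {}  # nothing left behind for other testers to trip over
```

Writing the cleanup into the same construct that creates the data is what makes the loop "closed": no separate manual step can be forgotten.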

2. Production Environment Test Data Plan

2.1 Basic Principles

Do not modify foundational production data or others' test data arbitrarily.

Test data must not be exposed to real users.

Cleanable test data should be cleared promptly.

Manage test data tags to minimize impact on BI data.

Accurately assess the scope of production environment testing and limit data generation to necessary tests.

Test data should be as realistic as possible; avoid generating meaningless data.
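One common way to honor the tagging principle above is to carry an explicit flag on every production-test record so that downstream BI queries can exclude it; the `is_test_data` field name is an assumption, not a mandated schema:

```python
def exclude_test_rows(rows: list) -> list:
    """Filter tagged test records out before they reach BI aggregation."""
    return [r for r in rows if not r.get("is_test_data", False)]

orders = [
    {"id": 1, "amount": 100},
    {"id": 2, "amount": 1, "is_test_data": True},  # order placed during production testing
]
real_orders = exclude_test_rows(orders)  # only order 1 remains
```

The same flag also makes the "cleanable test data should be cleared promptly" rule easy to enforce, since tagged rows can be located in one query.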

2.2 Test Account Management

2.2.1 Personal Accounts

Control account permissions strictly: restrict access to what testing requires, remove unnecessary rights, and request any special permissions via email.

2.2.2 Permission (Role) Control

Online user permissions are managed by dedicated personnel who create and assign real roles.

Testing personnel permissions are centralized in a fixed "test" role, assigned and reclaimed by a designated person.

3. Pre‑Production Environment Usage Guidelines

3.1 Data Synchronization Standards

Synchronization frequency: once on Wednesday of the first week of each sprint.

Synchronization timing: chosen to avoid business impact, evaluated by development, testing, and DBA.

Synchronization scope: full‑volume (full table structure + all data).

Synchronization request: submitted by SDM, executed by DBA, with notifications to relevant development and testing teams.

3.2 Recommended Scenarios for Using the Pre‑Production Environment

Tests that depend on real data should be validated in the pre environment.

Data migration or cleansing projects should rehearse with production‑real data in pre.

Scenarios involving multiple related systems should use the pre environment with real data.

High‑priority test cases for upcoming releases must be regression‑tested in pre before deployment.

Tests that cannot be regressed online (e.g., write‑to‑DB or actual transaction scenarios) must be validated in pre.

For the current release, run core regression cases in pre using real requests captured from production logs to ensure no side effects.
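The log-replay step above might look like the following minimal sketch: parse captured access-log lines and re-issue the read-only requests against the pre environment. The log format, the base URL, and the injected `send` callable are all assumptions:

```python
import re

LOG_LINE = re.compile(r'"(?P<method>GET|POST) (?P<path>\S+) HTTP')

def parse_requests(log_lines):
    """Extract (method, path) pairs from access-log lines (common log format assumed)."""
    out = []
    for line in log_lines:
        m = LOG_LINE.search(line)
        if m:
            out.append((m.group("method"), m.group("path")))
    return out

def replay(captured, send, base_url="https://pre.example.internal"):
    """Re-issue captured GET requests against the pre environment.

    `send` is injected (e.g. a thin wrapper over an HTTP client) so the
    replay logic stays testable; writes are skipped here because replaying
    POSTs safely requires idempotency handling.
    """
    for method, path in captured:
        if method == "GET":
            send(base_url + path)

# Example with a fake sender that just records the URLs it would hit.
captured = ['10.0.0.1 - - "GET /api/orders?id=7 HTTP/1.1" 200']
sent = []
replay(parse_requests(captured), sent.append)
```

Replaying real production requests rather than hand-written cases is what gives this regression its "no side effects" guarantee: the traffic shape is exactly what the release will face.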

Tags: operations, software testing, Data Management, Test Data
Written by Software Development Quality

Discussions on software development quality, R&D efficiency, high availability, technical quality, quality systems, quality assurance, architecture design, tool platforms, test development, continuous delivery, continuous testing, and more. Contact the author with any questions about this article.
