How Traffic Recording & Replay Boosts Test Efficiency at the Agricultural Bank of China
This article examines the testing challenges of a rapidly evolving banking system, surveys three recording techniques (web server, network stack, and application level), and introduces a traffic recording and replay platform that automates test case generation, improves scenario coverage, and reduces manual effort.
Introduction
With the continuous evolution of the Agricultural Bank of China's technical architecture and business scenarios, testing faces problems such as complex data construction, high case‑maintenance cost, and uneven quality. Four main challenges are identified: extensive regression testing after system upgrades, time‑consuming regression for complex business logic, high manual effort for test case creation and maintenance, and the need for rapid issue reproduction in production anomalies.
Traffic Recording & Replay Overview
Traffic recording and replay technology quickly captures interface traffic generated during application execution according to configurable filtering rules. Recorded data includes both request and response messages: the requests can be replayed in designated environments for functional regression testing or version verification, with the recorded responses serving as the expected baseline. This simplifies test case writing, improves efficiency, and enhances scenario coverage.
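As an illustration of rule-driven capture, the sketch below shows a hypothetical filter that records only requests whose path matches a configured pattern, down-sampled to a fixed rate. The class name, URL pattern, and sampling field are assumptions for illustration; the article does not describe the platform's actual rule format.

```java
import java.util.List;
import java.util.concurrent.ThreadLocalRandom;
import java.util.regex.Pattern;

// Hypothetical recording filter: capture only requests whose path matches a
// configured pattern, down-sampled to a rate that bounds storage volume.
public class RecordingFilter {
    private final List<Pattern> pathPatterns;
    private final double sampleRate; // fraction of matching requests to record

    public RecordingFilter(List<Pattern> pathPatterns, double sampleRate) {
        this.pathPatterns = pathPatterns;
        this.sampleRate = sampleRate;
    }

    public boolean shouldRecord(String requestPath) {
        boolean matched = pathPatterns.stream()
                .anyMatch(p -> p.matcher(requestPath).matches());
        return matched && ThreadLocalRandom.current().nextDouble() < sampleRate;
    }

    public static void main(String[] args) {
        RecordingFilter filter = new RecordingFilter(
                List.of(Pattern.compile("/api/transfer/.*")), 0.10);
        System.out.println(filter.shouldRecord("/api/transfer/submit")); // true ~10% of the time
        System.out.println(filter.shouldRecord("/health"));              // always false
    }
}
```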
Classification of Recording Techniques
Web-server based recording: captures requests by customizing the operating system and web server; supports diverse request types but carries high development and maintenance costs.
Network-stack based recording: listens on network ports and copies packets; has minimal impact on the application itself but may affect network performance.
Application-level recording: intercepts requests at runtime using AOP; requires no code changes to the application, though it consumes some server resources (a minimal sketch follows this list).
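The sketch below shows application-level recording with Spring AOP, assuming a Spring application with AspectJ auto-proxying enabled. The pointcut expression and the TrafficStore sink are hypothetical; the article confirms only that AOP interception is used.

```java
import org.aspectj.lang.ProceedingJoinPoint;
import org.aspectj.lang.annotation.Around;
import org.aspectj.lang.annotation.Aspect;
import org.springframework.stereotype.Component;

// Application-level recording via an around advice: the request arguments and
// the response are captured, while the real call proceeds unchanged.
@Aspect
@Component
public class TrafficRecordingAspect {

    // Hypothetical pointcut; the bank's actual join points are not published.
    @Around("execution(* com.example.api..*Controller.*(..))")
    public Object record(ProceedingJoinPoint pjp) throws Throwable {
        Object[] request = pjp.getArgs();      // request message
        Object response = pjp.proceed();       // run the real business logic
        TrafficStore.save(pjp.getSignature().toShortString(), request, response);
        return response;                       // recording never alters the result
    }
}

// Hypothetical sink; in the real platform this would forward to the proxy module.
class TrafficStore {
    static void save(String endpoint, Object[] request, Object response) {
        System.out.printf("recorded %s: %d args -> %s%n",
                endpoint, request.length, response);
    }
}
```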
Design Solution
In July 2022, the Tianjin R&D testing team of the Agricultural Bank established a special project to build a traffic recording and replay tool using open‑source and self‑developed components. The platform provides configuration management, node monitoring, traffic management, replay execution, batch execution, and result analysis.
Architecture
The tool follows a front-end/back-end separated architecture. A probe installed alongside the application records traffic, while a proxy module listens for recorded data and replay requests, processes them, and forwards the information to a management module responsible for storage and configuration. Users interact via a unified web UI to create configurations, record traffic, and query replay results.
Data Layer Strategy
To support fast queries on massive recorded traffic, the tool stores online traffic and replay results in Elasticsearch, achieving millisecond‑level response for millions of records. Smaller management data resides in MySQL, and Redis queues handle ordered batch replay.
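The ordered-replay role of Redis can be sketched with a plain Redis list used as a FIFO queue (here via the Jedis client). The key name and payload format are assumptions; the article states only that Redis queues keep batch replay in the original call order.

```java
import redis.clients.jedis.Jedis;

// A Redis list as a FIFO replay queue: RPUSH preserves insertion order, so a
// worker popping from the head replays requests in the recorded sequence.
public class ReplayQueue {
    public static void main(String[] args) {
        try (Jedis jedis = new Jedis("localhost", 6379)) {
            String key = "replay:batch:42"; // hypothetical batch key
            jedis.rpush(key, "traffic-001", "traffic-002", "traffic-003");
            String next;
            while ((next = jedis.lpop(key)) != null) {
                System.out.println("replaying " + next); // fires in original order
            }
        }
    }
}
```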
Batch Replay & Result Query
Recorded traffic can be manually filtered into batches and replayed in the original call order to simulate mixed transaction scenarios. Batch execution reports result statistics and success-rate trends, helping testers monitor interface performance and adjust cases promptly.
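A toy illustration of the per-batch statistics described above: the success rate is simply the fraction of replayed requests whose checkpoints passed. The ReplayResult shape is an assumption for illustration.

```java
import java.util.List;

// Per-batch success rate: the fraction of replayed requests whose
// checkpoints all passed. The ReplayResult shape is illustrative.
public class BatchStats {
    record ReplayResult(String trafficId, boolean checkpointsPassed) {}

    static double successRate(List<ReplayResult> results) {
        if (results.isEmpty()) return 0.0;
        long passed = results.stream().filter(ReplayResult::checkpointsPassed).count();
        return (double) passed / results.size();
    }

    public static void main(String[] args) {
        List<ReplayResult> batch = List.of(
                new ReplayResult("traffic-001", true),
                new ReplayResult("traffic-002", true),
                new ReplayResult("traffic-003", false));
        System.out.printf("batch success rate: %.0f%%%n", successRate(batch) * 100);
    }
}
```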
Message Comparison Checkpoints
After replay, the tool compares the recorded response message with the current response at configured checkpoints (such as response codes), ignoring noise fields like timestamps and random numbers. When all checkpoints match, the replay is judged successful.
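A sketch of this comparison logic, assuming JSON messages and the Jackson library: both responses are parsed, configured noise fields are stripped recursively, and the remaining trees are compared. The field names are illustrative, not the platform's actual configuration.

```java
import com.fasterxml.jackson.databind.JsonNode;
import com.fasterxml.jackson.databind.ObjectMapper;
import com.fasterxml.jackson.databind.node.ObjectNode;
import java.util.Set;

// Checkpoint comparison: strip configured noise fields from both the recorded
// and replayed responses, then compare the remaining JSON trees for equality.
public class CheckpointComparator {
    private static final ObjectMapper MAPPER = new ObjectMapper();

    static boolean matches(String recorded, String replayed, Set<String> noiseFields)
            throws Exception {
        JsonNode a = strip(MAPPER.readTree(recorded), noiseFields);
        JsonNode b = strip(MAPPER.readTree(replayed), noiseFields);
        return a.equals(b); // Jackson node equality is deep
    }

    private static JsonNode strip(JsonNode node, Set<String> noiseFields) {
        if (node instanceof ObjectNode obj) {
            noiseFields.forEach(obj::remove);
            obj.fields().forEachRemaining(e -> strip(e.getValue(), noiseFields));
        } else if (node.isArray()) {
            node.forEach(child -> strip(child, noiseFields));
        }
        return node;
    }

    public static void main(String[] args) throws Exception {
        String recorded = "{\"respCode\":\"0000\",\"timestamp\":\"2022-07-01T10:00:00\"}";
        String replayed = "{\"respCode\":\"0000\",\"timestamp\":\"2024-01-15T09:30:00\"}";
        // Timestamps differ but are configured as noise, so the replay passes.
        System.out.println(matches(recorded, replayed, Set.of("timestamp"))); // true
    }
}
```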
Header Replacement
To handle session expiration during web‑application replay, the tool can replace header parameters in real time, ensuring continuous replay of recorded traffic.
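A minimal sketch of the idea: before a request is re-sent, fresh header values (for example, a live session cookie) override the stale recorded ones. The header names here are assumptions.

```java
import java.util.HashMap;
import java.util.Map;

// Real-time header replacement during replay: stale recorded headers (e.g.,
// an expired session cookie) are swapped for fresh values before re-sending.
public class HeaderRewriter {
    static Map<String, String> rewrite(Map<String, String> recordedHeaders,
                                       Map<String, String> replacements) {
        Map<String, String> out = new HashMap<>(recordedHeaders);
        out.putAll(replacements); // fresh values win over recorded ones
        return out;
    }

    public static void main(String[] args) {
        Map<String, String> recorded = Map.of("Cookie", "JSESSIONID=expired123",
                                              "Content-Type", "application/json");
        Map<String, String> fresh = Map.of("Cookie", "JSESSIONID=live456");
        System.out.println(rewrite(recorded, fresh));
        // e.g. {Content-Type=application/json, Cookie=JSESSIONID=live456}
    }
}
```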
Case Export
Recorded traffic cases can be exported as Collection V2 JSON files, compatible with Postman, Postwoman, and other API testing tools, enabling reuse without manual message assembly.
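The export target can be sketched as a minimal Postman Collection v2.1 document (one concrete version of the V2 format mentioned above), built here with Jackson. Only the fields needed for a valid import are shown; the batch name, endpoint, and body are hypothetical.

```java
import com.fasterxml.jackson.databind.ObjectMapper;
import java.util.List;
import java.util.Map;

// Export one recorded request as a minimal Postman Collection v2.1 document.
public class CollectionExporter {
    public static void main(String[] args) throws Exception {
        Map<String, Object> collection = Map.of(
            "info", Map.of(
                "name", "recorded-traffic-batch-42", // hypothetical batch name
                "schema", "https://schema.getpostman.com/json/collection/v2.1.0/collection.json"),
            "item", List.of(Map.of(
                "name", "POST /api/transfer/submit",
                "request", Map.of(
                    "method", "POST",
                    "url", Map.of("raw", "https://test.example.com/api/transfer/submit"),
                    "body", Map.of("mode", "raw", "raw", "{\"amount\":100}")))));
        System.out.println(new ObjectMapper()
                .writerWithDefaultPrettyPrinter()
                .writeValueAsString(collection));
    }
}
```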
Lifecycle Management
Users can configure traffic data lifecycle policies per application and environment. Traffic that was never added to a batch is periodically cleaned up, while batch-included traffic can be retained permanently for later replay.
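One plausible shape for the cleanup pass, assuming the Elasticsearch storage described earlier: a _delete_by_query request that removes traffic older than a TTL and never assigned to a batch. The index name, field names, and 30-day window are assumptions.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

// Lifecycle cleanup: delete recorded traffic past its TTL that was never
// added to a batch, via Elasticsearch's _delete_by_query API.
public class TrafficCleanup {
    public static void main(String[] args) throws Exception {
        String query = """
            {
              "query": {
                "bool": {
                  "must":     [ { "range": { "recordedAt": { "lt": "now-30d" } } } ],
                  "must_not": [ { "exists": { "field": "batchId" } } ]
                }
              }
            }""";
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("http://localhost:9200/recorded-traffic/_delete_by_query"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(query))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.body()); // e.g. {"deleted": 1287, ...}
    }
}
```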
Application Effect
The platform has been gradually rolled out within the Tianjin R&D department, with over 20 modules integrated (67% web, 13% online, 20% mobile). A total of 4.06 million interface requests have been recorded, and batch replay success rates exceed 85% after process optimization.
Conclusion
Traffic recording and replay provides a novel approach to automated interface testing, applicable to version verification and regression testing. By converting live traffic into test cases, it dramatically reduces manual effort in data preparation and script writing, offers more realistic coverage of business scenarios, and enhances defect detection, thereby improving testing efficiency and lowering costs.
Software Development Quality
Discussions on software development quality, R&D efficiency, high availability, technical quality, quality systems, assurance, architecture design, tool platforms, test development, continuous delivery, continuous testing, and more. Contact me with any questions about these articles.
