
How a Six-Month Multi-System Project Achieved 99.9% Transaction Success: A Deep Dive into Test Summaries and Metrics

This article presents a comprehensive test summary of a six-month, eight-application project: test objectives, planning, execution rounds, case counts, defect statistics, quality metrics, and the management lessons behind a 99.9% transaction pass rate, with practical insights for test leaders.

Project Background

The project ran for six months and covered eight application systems. The test phase itself lasted three months, targeting an overall transaction success rate of 99.9% through end-to-end, functional, and non-functional testing while monitoring case execution and pass rates.

Test Objectives

The primary system is newly built and interacts with seven peripheral systems. Objectives include verifying new functional points, assessing the impact of peripheral changes, and maintaining a 99.9% transaction success rate.

Test Plan & Strategy

Round 1 – Iteration 1 (7 days) : SIT of the main transaction flow, environment setup and connectivity verification for peripheral systems.

Round 2 – Peripheral 1 (15 days) : Full‑function transaction verification for the new system, including boundary and exception cases; peripheral‑system modifications verified.

Round 3 – Peripheral 2 (21 days) : Full‑transaction verification for the new system; full‑function verification for all peripherals.

Round 1 Regression (21 days) : Full‑case regression for the new system; second‑round peripheral verification.

Round 2 Regression (15 days) : Full‑case regression for the new system; first‑round peripheral regression.

Round 3 Regression (11 days) : Targeted regression on high‑severity, repeatedly failing modules and all AB‑level defects from SIT/UAT.

Test Scope

New System

UI testing – all screens, boundary values, workflow validation.

Interface validation – each interface and connectivity with peripherals.

End‑to‑end flow validation across all related systems.

Compatibility testing.

Security testing.

Peripheral Systems

Interface validation with the new system and legacy interfaces.

Full‑process functional validation.

Testing of system modifications.

Test Case Statistics

Cases were built using a two‑dimensional functional matrix; early phases emphasized functional point verification, mid‑phase enriched cases, and later phases focused on coverage execution.

New System – Round 1: 940 cases, 86 passed (9.1 % pass).

New System – Round 2: 3,051 cases, 2,844 passed (93.2 % pass).

New System – Round 3: 3,206 cases, 3,084 passed (96.2 % pass).

Peripheral 1 – Round 1: 485 cases, 356 passed (73.4 % pass).

Peripheral 2 – Round 2: 997 cases, 862 passed (86.5 % pass).
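The per-round pass rates above are easy to reproduce; a minimal Python sketch using the case counts listed in this section:

```python
# Pass rate = passed cases / total cases, per round.
# (system/round, total cases, passed cases) taken from the statistics above.
rounds = [
    ("New System - Round 1", 940, 86),
    ("New System - Round 2", 3051, 2844),
    ("New System - Round 3", 3206, 3084),
    ("Peripheral 1 - Round 1", 485, 356),
    ("Peripheral 2 - Round 2", 997, 862),
]

for name, total, passed in rounds:
    print(f"{name}: {passed / total:.1%} pass")
```

Tracking the same table per round makes the convergence from 9 % to 96 % on the new system visible at a glance.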

Release Frequency

The new system averaged 2.2 releases per day (≈3 releases on working days). Peripheral 2 had fewer releases and defects than Peripheral 1 despite a larger workload.

Defect Statistics

Total defects across eight systems: 1,791 (defect detection rate 18.8 %). Defect trend follows a typical discovery curve (50 %, 30 %, 15 %, 5 %). AB‑level defects account for 46 % of total defects.

New System: 81 A, 208 B, 309 C, 63 D (total 661).

Peripheral 1: 23 A, 84 B, 101 C, 27 D (total 235).

Peripheral 2: 15 A, 35 B, 69 C, 10 D (total 129).

Peripheral 3: 16 A, 44 B, 30 C, 11 D (total 101).

Peripheral 4: 46 A, 116 B, 121 C, 17 D (total 300).

Peripheral 5: 12 A, 31 B, 56 C, 14 D (total 113).

Peripheral 6: 18 A, 52 B, 63 C, 9 D (total 142).

Peripheral 7: 8 A, 27 B, 54 C, 5 D (total 94).
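The 46 % AB-level share can be checked against the per-system counts above. Note that the listed rows sum to 1,775, slightly under the stated 1,791 grand total, so the published total may include defects not broken out here; a quick sketch:

```python
# (A, B, C, D) defect counts per system, from the list above.
defects = {
    "New System":   (81, 208, 309, 63),
    "Peripheral 1": (23, 84, 101, 27),
    "Peripheral 2": (15, 35, 69, 10),
    "Peripheral 3": (16, 44, 30, 11),
    "Peripheral 4": (46, 116, 121, 17),
    "Peripheral 5": (12, 31, 56, 14),
    "Peripheral 6": (18, 52, 63, 9),
    "Peripheral 7": (8, 27, 54, 5),
}

total = sum(sum(counts) for counts in defects.values())
ab = sum(a + b for a, b, _, _ in defects.values())
print(f"listed total: {total}, AB-level: {ab} ({ab / total:.0%})")
```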

Quality Metrics

Transaction coverage: 100 % for both new and migrated data.

Transaction pass rate: 100 % for both data sets.

Defect convergence: the share of defects raised per phase fell from 49 % (first three rounds) to 40 % (first regression) and 12 % (second regression).

Daily defect count stayed below 20 per round, decreasing during regression.

Interface functional coverage: 100 % for normal and abnormal interfaces.

File handling coverage: 100 % across all eight systems, including empty, duplicate, and dirty files.
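Convergence here means each phase contributes a smaller share of defects than the one before; a trivial check against the phase figures above (49 %, 40 %, 12 %):

```python
# Share of defects raised in each phase, from the convergence figures above:
# first three rounds, first regression, second regression.
rates = [49, 40, 12]

# Convergence: every successive phase raises proportionally fewer defects.
converging = all(earlier > later for earlier, later in zip(rates, rates[1:]))
print("converging" if converging else "not converging")
```

The same one-liner works as a release gate: if a regression round raises a larger share than its predecessor, the defect trend has stopped converging.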

Missed Defect Analysis

New System: 3 missed issues (miss rate 0.45 %).

Peripheral 1: 1 missed issue (0.43 %).

Peripheral 3: 1 missed issue (0.99 %).

Peripheral 4: 2 missed issues (0.67 %).
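The miss rates above appear to divide post-release (missed) defects by the defects found during testing for that system; this is an inference from the published numbers, not stated explicitly in the article:

```python
# Miss rate = missed defects / defects found during testing, per system.
# Counts combine the missed-defect list with the defect statistics above.
missed_vs_found = {
    "New System":   (3, 661),
    "Peripheral 1": (1, 235),
    "Peripheral 3": (1, 101),
    "Peripheral 4": (2, 300),
}

for system, (missed, found) in missed_vs_found.items():
    print(f"{system}: {missed / found:.2%} miss rate")
```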

Root causes of missed defects:

Test analysis omission – 3 cases (37.5 %).

Incomplete business requirements – 2 cases (25 %).

Test case omission – 1 case (12.5 %).

Test execution omission – 1 case (12.5 %).

Test environment errors – 1 case (12.5 %).

Management Experience Summary

Progress Management

The test manager defined the overall schedule, milestones, and per-round strategies.

Test plans and goals were decomposed weekly for tracking.

Progress Monitoring

Integrated test plans into risk management; regular progress checks and risk identification.

Test plans were reviewed alongside solution reviews to confirm they were realistic.

Future Improvement Directions

Continue building a test knowledge base and basic case library; update standards to maintain team capability and mitigate turnover impact.

Explore diverse testing modes in iterative and agile environments; increase early test involvement and tighter collaboration with development.

[Figure: Defect trend chart]

[Figure: Release frequency chart]
Tags: testing, process improvement, software testing, test management, quality metrics, defect analysis
Written by FunTester