
How AutoV IT Transformed Shopping Cart Test Automation and Boosted Efficiency by 30%

This article details how the AutoV IT framework addressed low test coverage, high maintenance costs, and data dependencies in a shopping‑cart checkout system, introducing embedded service startup, mock services, Excel‑driven test cases, and parallel execution to achieve up to 30% faster testing cycles and significantly higher confidence in releases.

Vipshop Quality Engineering

Background

Before mid‑2016, the shopping‑cart checkout and special‑event testing team had very limited automated test coverage, relying on plain Java TestNG suites for a few domains. Even where coverage existed, maintenance costs were high, test‑case design was inefficient, test data was static, and pass rates depended heavily on stable data and environments: the classic failure modes of test automation.

To improve product testing quality and release confidence amid rapid version iterations, the team launched a new automation initiative based on the AutoV IT framework.

Automation with AutoV IT Framework

The new AutoV IT version addressed the aforementioned common automation issues.

Key features include:

No need to deploy the service under test (embedded service startup)

Fully automated without external dependencies except the database

Local testing for OSP/Saturn/Web

Mock services for OSP/HTTP

Mock Venus public services

Configurable mocks without any coding

Support for API testing, automation testing, and performance testing

Excel‑driven data management for test cases (AT)

Excel‑driven data management for mock data
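The Excel‑driven idea can be illustrated with a minimal sketch. The column layout below (caseId, endpoint, requestJson, expectedJson) is hypothetical, not AutoV IT's actual schema; it only shows how a spreadsheet row, exported as tab‑separated text, becomes a structured test case:

```java
// Hypothetical shape of one Excel test-case row exported as tab-separated text.
// Column names are illustrative, not the real AutoV IT sheet layout.
public class ExcelCaseRow {
    public final String caseId;
    public final String endpoint;
    public final String requestJson;
    public final String expectedJson;

    public ExcelCaseRow(String caseId, String endpoint,
                        String requestJson, String expectedJson) {
        this.caseId = caseId;
        this.endpoint = endpoint;
        this.requestJson = requestJson;
        this.expectedJson = expectedJson;
    }

    // Parse one tab-separated line into a test case.
    public static ExcelCaseRow parse(String line) {
        String[] cols = line.split("\t", -1);
        if (cols.length < 4) {
            throw new IllegalArgumentException("expected 4 columns, got " + cols.length);
        }
        return new ExcelCaseRow(cols[0], cols[1], cols[2], cols[3]);
    }
}
```

Because each case is just a row of data, testers can add or edit cases in the spreadsheet without touching framework code.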

Using the AutoV IT framework, the team built an automation engineering pipeline.

Example domain structure (the original diagram is not reproduced in this republication):

The team defined responsibilities for each testing stage and encapsulated corresponding methods.

Note: idMap stores initialized test data for later use and can be cleared after test execution to restore the environment.
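The idMap pattern described above can be sketched as follows. The method names and the fake order id are illustrative assumptions, not AutoV IT's actual API; the point is that every piece of initialized data is recorded so the final stage can delete it and restore the environment:

```java
import java.util.HashMap;
import java.util.Map;

// Minimal sketch of the idMap lifecycle: data created during initialization is
// keyed in the map, then removed during cleanup. Names are illustrative.
public class TestDataLifecycle {
    private final Map<String, Object> idMap = new HashMap<>();

    // Stage 1: create test data and remember its identifiers.
    public void initData() {
        long fakeOrderId = 42L;          // in practice, returned by an INSERT
        idMap.put("orderId", fakeOrderId);
    }

    public Object get(String key) {
        return idMap.get(key);
    }

    // Final stage: delete everything that was created, restoring the environment.
    public void cleanup() {
        idMap.keySet().forEach(k -> { /* DELETE the row identified by idMap.get(k) */ });
        idMap.clear();
    }

    public boolean isClean() {
        return idMap.isEmpty();
    }
}
```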

Maintaining AT test cases and mock data via Excel greatly simplified authoring and upkeep.

Additional modules developed to accelerate coding:

Test data initialization module

SQL operation module

Mock module (HTTP, OSP, ES, configuration center)

Utility module (JsonUtils, sharding, checker, etc.)

Excel‑driven management of AT test cases and mock data
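The "configurable mocks without any coding" feature can be sketched as a data‑driven registry: canned responses are keyed by service and method and loaded from configuration (in AutoV IT, the Excel mock sheets) rather than written as stub code. The class and key format below are assumptions for illustration:

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of a configuration-driven mock registry. Entry format is illustrative;
// AutoV IT's real mock configuration schema is not shown in the source.
public class MockRegistry {
    private final Map<String, String> responses = new HashMap<>();

    // In practice, entries would be loaded from the Excel-driven mock data sheet.
    public void register(String service, String method, String cannedJson) {
        responses.put(service + "#" + method, cannedJson);
    }

    // Dispatch an intercepted call to its canned response.
    public String dispatch(String service, String method) {
        String body = responses.get(service + "#" + method);
        if (body == null) {
            throw new IllegalStateException("no mock configured for " + service + "#" + method);
        }
        return body;
    }
}
```

Swapping a mock response then means editing a row of data, not redeploying test code.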

Challenges faced and solutions:

Initial lack of ready‑made utility methods – skilled developers first encapsulated basic methods for others to use, then optimized per scenario.

Difficulty creating test data for diverse scenarios – abstracted updateData/deleteData methods to adapt to any table, with dynamic field configuration in Excel.

Tested service versions could not be automatically packaged – used pre‑deployment shell scripts to compile, package, and upload the service with versioned identifiers.

Large test case repository leading to low execution efficiency – for data‑independent cases, employed concurrent execution to improve speed.
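The table‑agnostic updateData idea above can be sketched as a small SQL builder: the table name, the fields to set, and the predicate all come from configuration (e.g. the Excel sheet), so one method serves any table. The signature is an assumption; only the method name mirrors the article:

```java
import java.util.Map;
import java.util.stream.Collectors;

// Sketch of a generic update helper driven entirely by configuration.
// Signature and class name are illustrative, not AutoV IT's actual API.
public class SqlOps {
    public static String buildUpdateSql(String table, Map<String, Object> fields, String where) {
        // Build "col = ?" fragments for each dynamically configured field.
        String setClause = fields.keySet().stream()
                .map(col -> col + " = ?")
                .collect(Collectors.joining(", "));
        return "UPDATE " + table + " SET " + setClause + " WHERE " + where;
    }
}
```

The generated statement would then be executed as a prepared statement with the field values bound in order.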

Results after several months of effort:

Automation coverage and scope increased dramatically, enabling test‑shift‑left and higher release confidence.

Test efficiency improved by 30%.

Weekly interface regression, which previously took ~1.5 days, now runs with near‑zero manual intervention.

Test cycle shortened by 30%.

Eight months ago, similar optimization projects required 3 person‑days; now only 1 person‑day.

Parallel execution allowed 1,300+ test cases to finish in half an hour.

High maintainability: when an interface changes, only the corresponding test case needs updating, taking about 2 hours for the whole team.

Template‑based data initialization reduced code volume and duplicated effort.

Test data moved to the test‑case level, easing code reading and future maintenance.

Test cases can be quickly edited with a text editor, boosting authoring efficiency.

Test‑shift‑left achieved via API repository without relying on the service SDK (except for external mock services).

JSON strings used as comparison values, eliminating dependence on the service’s data structures.
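The parallel‑execution result above rests on running data‑independent cases concurrently. A minimal sketch of that idea, using a plain thread pool (pool size and case bodies are illustrative; the article does not describe AutoV IT's actual scheduler):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

// Sketch: submit independent test cases to a fixed thread pool and collect
// pass/fail results. A thrown exception is recorded as a failed case.
public class ParallelRunner {
    public static List<Boolean> runAll(List<Callable<Boolean>> cases, int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<Boolean>> futures = new ArrayList<>();
            for (Callable<Boolean> c : cases) {
                futures.add(pool.submit(c));
            }
            List<Boolean> results = new ArrayList<>();
            for (Future<Boolean> f : futures) {
                try {
                    results.add(f.get());
                } catch (Exception e) {
                    results.add(false);   // treat an exception as a failed case
                }
            }
            return results;
        } finally {
            pool.shutdown();
        }
    }
}
```

Only cases with no shared test data should be scheduled this way; data‑dependent cases still run serially.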

Remaining Issues and Future Direction

Although AutoV IT has been successful for the special‑event team, some problems remain that need further optimization in 2018 to support an even larger automation ecosystem.

We welcome feedback and discussion on better automation solutions.

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: efficiency, software testing, test automation, continuous integration, framework
Written by Vipshop Quality Engineering
Technology exchange and sharing for quality engineering