Innovations in Mobile Testing under Agile: Risk‑Based Testing, Shift‑Left Practices, and Automation Strategies
The article explores the challenges of mobile testing in fast‑paced agile environments and presents a comprehensive approach that combines risk‑based testing, shift‑left quality assurance, layered automation, service‑interface testing, UI automation, and supporting infrastructure to improve efficiency and product quality.
Author Luo Zhaojun, senior test manager at Ctrip Ticket, shares insights from a Ctrip technology salon on testing innovations for mobile applications.
1. Pain points of mobile testing under agile – Rapid product‑driven development, frequent sprint‑level changes, and compressed test windows increase risk, especially given the diversity of devices, OS versions, and hybrid technologies (Android, iOS, H5, RN). Test teams must balance coverage, speed, and quality.
Risk‑based testing is essential: define test scope based on risk, use gray‑release strategies, and expose quality issues throughout the development lifecycle rather than accumulating them for the final test phase.
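Defining test scope by risk can be as simple as ranking areas by likelihood times impact and spending the limited sprint test budget on the top of the list. A minimal sketch (the area names, scores, and `prioritize` helper are illustrative assumptions, not from the article):

```python
from dataclasses import dataclass

@dataclass
class TestArea:
    name: str
    failure_likelihood: int  # 1 (rarely breaks) .. 5 (changes every sprint)
    business_impact: int     # 1 (cosmetic) .. 5 (blocks booking/payment)

    @property
    def risk(self) -> int:
        # Classic risk score: likelihood x impact
        return self.failure_likelihood * self.business_impact

def prioritize(areas: list[TestArea], budget: int) -> list[TestArea]:
    """Return the highest-risk areas that fit in the test budget."""
    return sorted(areas, key=lambda a: a.risk, reverse=True)[:budget]

# Hypothetical scoring for three app areas
areas = [
    TestArea("booking flow", 4, 5),
    TestArea("settings screen", 2, 1),
    TestArea("payment", 3, 5),
]
print([a.name for a in prioritize(areas, 2)])  # ['booking flow', 'payment']
```

The same ranking can feed a gray-release plan: high-risk areas get both automated coverage and a staged rollout, low-risk areas may ship on automation alone.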
2. Test shift‑left (pre‑testing) – Shift‑left is not merely early test involvement; it establishes a full‑process quality assurance mindset, including PRD static testing, contract testing, static code analysis, unit testing, service‑interface testing, developer self‑testing, entry‑gate checks, and quality metrics collection.
Although shift‑left initially adds effort for developers and testers, it leads to higher product and code quality, reduced waste, and a virtuous cycle of efficiency.
Key points for successful shift‑left:
Leadership support and cross‑team alignment.
Accurate workload estimation for sprint tasks (UT, self‑testing, acceptance).
Optimized PRD and code structures for testability.
Robust automation frameworks, CI/CD pipelines, and data‑construction tools.
Testers act as coordinators, providing metrics such as static scan issues, UT coverage, self‑test pass rates, and entry‑gate timeliness.
Example data from a daily release illustrates the metrics collected.
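Metrics like these reduce to pass/total ratios reported per sprint or per daily release. A minimal sketch of such an aggregation (the metric names and sample numbers are invented for illustration):

```python
def quality_metrics(results: dict[str, tuple[int, int]]) -> dict[str, float]:
    """Summarize shift-left quality signals.

    `results` maps a metric name to (passed, total); the return value
    is the same metrics expressed as percentages.
    """
    return {name: round(passed / total * 100, 1)
            for name, (passed, total) in results.items()}

# Hypothetical figures from one daily release
daily = {
    "unit_test_coverage": (820, 1000),  # lines covered / total lines
    "self_test_pass_rate": (47, 50),    # dev self-test cases passed / run
    "entry_gate_on_time": (9, 10),      # builds that met the gate deadline
}
print(quality_metrics(daily))
```

Publishing these numbers is what lets testers act as coordinators rather than gatekeepers: the trend, not any single figure, drives the conversation with development.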
3. Test automation
3.1 Layered testing – Mobile apps sit at the top of a complex backend; testing must be layered and isolated. A typical call‑graph for Ctrip Ticket mobile is shown, followed by a layered testing model (unit, integration, service, UI, etc.) with target effort distribution.
Unit tests are developer‑driven; testers focus on static analysis, coverage tracking, and improvement.
3.2 Service‑interface testing – Using mature frameworks (e.g., NUnit) with data‑driven utilities, the key is business‑level coverage, high pass rates, and stable environments. Emphasis is placed on dynamic test data, handling dependencies, multi‑service chaining, performance, and validation methods.
Dynamic data generation avoids false failures caused by static fixtures.
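The classic false failure is a hardcoded date that silently expires. A minimal sketch of computing the fixture at run time instead (the route codes and payload shape are hypothetical, not Ctrip's actual API):

```python
import datetime as dt

def build_search_request(days_ahead: int = 7) -> dict:
    """Build a flight-search payload whose departure date is computed
    at execution time, so the fixture never goes stale."""
    depart = dt.date.today() + dt.timedelta(days=days_ahead)
    return {"from": "SHA", "to": "PEK", "depart": depart.isoformat()}

req = build_search_request()
print(req["depart"])  # always a valid future date
```

The same idea extends to seeding orders or users through a data factory just before the test runs, then restoring the database afterward.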
3.3 UI automation – Appium (supports XCUITest from 1.6.3) and alternatives like XCTestWD are discussed. UI automation boosts speed and coverage but suffers from high maintenance cost and lower stability compared to lower‑level tests; ROI considerations are critical.
Key focus areas for UI automation include low entry barriers, debugging tools, cross‑platform script reuse, dynamic scenario data, distributed execution platforms, and mock services.
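Cross‑platform script reuse usually means scripts refer to logical element names, with per‑platform locators resolved underneath. A minimal sketch of that indirection (the element names, resource IDs, and accessibility IDs are invented examples, not the app's real identifiers):

```python
# One logical element, two platform-specific locators
LOCATORS = {
    "search_button": {
        "android": ("id", "com.example:id/btn_search"),
        "ios": ("accessibility id", "searchButton"),
    },
}

def locator(element: str, platform: str) -> tuple[str, str]:
    """Resolve a logical element name to a (strategy, value) locator
    for the given platform, so one test script drives both apps."""
    strategy, value = LOCATORS[element][platform]
    return strategy, value

print(locator("search_button", "ios"))
```

An Appium test would then pass the resolved strategy and value to the driver, keeping the test body identical on Android and iOS.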
4. Automation support infrastructure
To address dynamic data, system dependencies, environment stability, and execution efficiency, a suite of supporting tools is built:
Dynamic test‑data factory (auto‑generation, DB polling, data restoration).
Mock platform (configurable request/response, production‑traffic import).
Configurable assertions, business‑message comparison, image diff.
CI environment for automated builds, test distribution, remote debugging.
Mock platform design principles: caller isolation, multi‑protocol support, chained mock scenarios, production‑traffic configuration, scenario‑based messages, and cross‑team sharing.
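Caller isolation and cross‑team sharing can coexist if every stub is keyed by both the caller and the endpoint. A minimal in‑memory sketch of that design principle (the class and payloads are illustrative, not the actual platform):

```python
class MockPlatform:
    """Toy mock service: stubs are keyed by (caller, endpoint) so teams
    sharing the platform never overwrite each other's responses."""

    def __init__(self) -> None:
        self._stubs: dict[tuple[str, str], object] = {}

    def stub(self, caller: str, endpoint: str, response) -> None:
        """Register a canned response (or a callable for dynamic ones)."""
        self._stubs[(caller, endpoint)] = response

    def handle(self, caller: str, endpoint: str, request: dict):
        key = (caller, endpoint)
        if key not in self._stubs:
            raise KeyError(f"no stub registered for {key}")
        resp = self._stubs[key]
        # Callable stubs enable scenario-based / chained mock responses
        return resp(request) if callable(resp) else resp

mock = MockPlatform()
mock.stub("ticket-app", "/price", {"price": 980, "currency": "CNY"})
print(mock.handle("ticket-app", "/price", {}))
```

Importing production traffic then amounts to recording real request/response pairs and registering them as stubs under the appropriate scenario.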
5. Fine‑grained fuzz testing
Fuzz testing injects massive random inputs; the proposed fine‑grained approach categorizes inputs by scenario to enable targeted testing. The Ctrip Ticket team uses the mock platform to generate scenario‑specific data, compare results against a baseline, and automate execution across two test environments.
Workflow steps include tagging code, pulling production messages via mock, constructing test cases, executing against both environments, and comparing responses.
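The comparison step reduces to diffing the two environments' responses while ignoring fields that legitimately differ, such as timestamps or trace IDs. A minimal sketch, with invented field names:

```python
def diff_responses(baseline: dict, candidate: dict,
                   ignore: tuple = ("timestamp", "trace_id")) -> dict:
    """Compare the baseline-environment response with the candidate's,
    skipping volatile fields; returns {field: (baseline, candidate)}."""
    diffs = {}
    for key in sorted(set(baseline) | set(candidate)):
        if key in ignore:
            continue
        if baseline.get(key) != candidate.get(key):
            diffs[key] = (baseline.get(key), candidate.get(key))
    return diffs

# Hypothetical responses to the same scenario-tagged fuzz input
base = {"price": 980, "currency": "CNY", "timestamp": 1}
cand = {"price": 990, "currency": "CNY", "timestamp": 2}
print(diff_responses(base, cand))  # {'price': (980, 990)}
```

An empty diff means the candidate build behaves like the baseline for that scenario; a non-empty one flags the scenario for human review.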
Conclusion
In agile development, risk‑based testing combined with comprehensive automation is essential to balance quality and speed. Automation spans multiple layers (unit, integration, UI, fuzz) and requires a full‑stack supporting ecosystem—from data factories and mock services to CI pipelines and detailed metrics—to achieve low‑cost, reliable, and maintainable testing.
Ctrip Technology
Official Ctrip Technology account, sharing and discussing growth.