
How to Guarantee Quality in Massive Frontend Projects – Lessons from CloudMap

This article reflects on the quality assurance challenges and solutions of the large‑scale CloudMap redesign project, covering core difficulties, QA responsibilities, testing strategies, performance monitoring, and post‑release optimization to ensure stable delivery for thousands of users.

Qunhe Technology Quality Tech

Reviewing Quality Assurance for Massive Projects Using CloudMap as an Example

Background: CloudMap was a major redesign of a design tool involving over 20 agile teams, more than 100 participants, and an 11‑month timeline, with six months from kickoff to release and five months for user migration.

Core Challenges

1. Extensive core infrastructure changes made testing coverage difficult.

Previously, each business owned separate front‑end modules with independent entry points, which avoided coupling but led to duplicated basic interactions (undo/redo, selection, dragging) and inconsistent UI. To reduce development cost and improve reusability, a common application framework was built, providing unified data management, standard component libraries, and shared utilities.

Key questions then arose: Are the framework’s capabilities clearly bounded? How to ensure framework upgrades remain transparent to business modules? How to guarantee data correctness across modules? How to assess the impact of framework changes? All these require robust testing measures.

2. Merged tool modules increased resource consumption, threatening performance.

Integrating tool modules into one application reduces the number of context switches designers make, boosting efficiency, but it also concentrates resource consumption in a single page, raising performance demands the development team must address.

3. Interaction and layout improvements inevitably alter user habits, requiring a balance between new experiences and existing expectations.

Despite careful product discussions, predicting how users will react to many simultaneous changes remains difficult; the migration originally planned for March was only completed in September, after continuous product iteration and operational adjustments to help users adapt.

Solutions

What responsibilities should QA assume in such a project, and how can many agile teams collaborate efficiently?

Testing responsibilities are not as simple as they seem. Open questions include whether each module team only needs to ensure its own quality, whether issues originating from product decisions still require testing, and whether historical unresolved problems must be monitored. Clarifying these points up front ensures a unified execution plan across all teams.

Large projects often lack strict processes, leading to information asymmetry between teams. To minimize the impact on delivery quality, the team adopted the following practices:

1. Testing leads overall deployment and release. Controlling release reduces risk and allows integration of quality gates and continuous‑integration checks. In CloudMap, the team introduced a new front‑end release system, automated test cases, baseline configuration checks, and enterprise‑WeChat bots for real‑time alerts at key integration points.

2. Define quality assurance strategies, targets, and check mechanisms for each project milestone. Instead of a single final quality goal, the team set stage‑specific objectives for module testing, integration testing, and gray‑release phases, ensuring continuous attention to quality and enabling timely risk mitigation.

3. Focus on module boundaries and coupling. Since different groups own separate modules, unclear boundaries can cause hidden issues. Automated scripts were added to verify that changes in one module do not break another, though further safeguards are needed.

4. Leverage technical tools to surface problems early. The team built performance baselines, monitoring dashboards, alert systems, enterprise‑WeChat bots, multilingual detection tools, and began developing a Chrome plugin to automatically detect duplicate requests, error logs, and low‑frame‑rate scenarios during testing.
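To make point 1 concrete, a release-time quality gate can be reduced to a simple decision function over the automated results. This is a minimal sketch, not CloudMap's actual release system; the threshold, field names, and report shape are all assumptions for illustration.

```typescript
// Hypothetical pre-release quality gate: combine the automated test
// results with a baseline-config drift check and decide whether the
// release may proceed. Thresholds and fields are illustrative.
interface GateReport {
  passedCases: number;
  totalCases: number;
  baselineConfigDrift: string[]; // config keys that differ from the approved baseline
}

function canRelease(
  report: GateReport,
  minPassRate = 0.98
): { ok: boolean; reasons: string[] } {
  const reasons: string[] = [];
  const passRate =
    report.totalCases === 0 ? 0 : report.passedCases / report.totalCases;
  if (passRate < minPassRate) {
    reasons.push(
      `test pass rate ${(passRate * 100).toFixed(1)}% below ${minPassRate * 100}%`
    );
  }
  if (report.baselineConfigDrift.length > 0) {
    reasons.push(
      `baseline config drift: ${report.baselineConfigDrift.join(", ")}`
    );
  }
  return { ok: reasons.length === 0, reasons };
}
```

A gate like this is what the enterprise-WeChat bot would report on at each integration point: either "ok" or the list of blocking reasons.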
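The module-boundary checks in point 3 can likewise be sketched as a script that compares observed cross-module imports against a declared allow-list. The module names and dependency map below are hypothetical, not CloudMap's real architecture.

```typescript
// Hypothetical boundary check: a module may only import from the
// modules it has declared as dependencies. Any other cross-module
// import is reported as a violation.
type DependencyMap = Record<string, string[]>; // module -> allowed imports

function findBoundaryViolations(
  imports: Array<{ from: string; to: string }>,
  allowed: DependencyMap
): string[] {
  return imports
    .filter(({ from, to }) => from !== to && !(allowed[from] ?? []).includes(to))
    .map(({ from, to }) => `${from} -> ${to}`);
}
```

Running such a check in CI turns an unclear boundary into an explicit, reviewable failure instead of a hidden runtime issue.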
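One of the detections the Chrome plugin in point 4 performs, flagging duplicate requests, can be illustrated with the core logic alone. The request shape and one-second window below are assumptions; a real extension would feed this from the DevTools network events.

```typescript
// Sketch of duplicate-request detection: flag any method+URL pair
// repeated within a short window, a common symptom of redundant
// fetches during page interactions.
interface RequestLog {
  method: string;
  url: string;
  timestampMs: number;
}

function findDuplicateRequests(logs: RequestLog[], windowMs = 1000): string[] {
  const lastSeen = new Map<string, number>();
  const duplicates = new Set<string>();
  const ordered = [...logs].sort((a, b) => a.timestampMs - b.timestampMs);
  for (const { method, url, timestampMs } of ordered) {
    const key = `${method} ${url}`;
    const prev = lastSeen.get(key);
    if (prev !== undefined && timestampMs - prev <= windowMs) duplicates.add(key);
    lastSeen.set(key, timestampMs);
  }
  return [...duplicates];
}
```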

Finally, release is not the endpoint; stable delivery to users completes the project lifecycle.

After release, CloudMap underwent a five‑month user migration with iterative traffic pulling, feedback collection, and optimization. A well‑designed traffic‑pull strategy limited impact while providing rapid feedback. However, performance issues persisted due to insufficient online performance metrics during early testing, highlighting the need for robust measurement mechanisms beyond pre‑release test samples.
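The measurement gap described above is largely an aggregation problem: field samples must be summarized into stable percentiles rather than judged from a handful of pre-release runs. As a minimal, illustrative sketch (the nearest-rank method below is one common convention, not a claim about CloudMap's metrics pipeline):

```typescript
// Nearest-rank percentile over field performance samples
// (e.g. page-load times in milliseconds collected from real users).
function percentile(samples: number[], p: number): number {
  if (samples.length === 0) throw new Error("no samples");
  const sorted = [...samples].sort((a, b) => a - b);
  const idx = Math.min(
    sorted.length - 1,
    Math.ceil((p / 100) * sorted.length) - 1
  );
  return sorted[Math.max(0, idx)];
}
```

Tracking p95 rather than the average is what surfaces the slow tail that small pre-release test samples tend to miss.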

Key takeaways: Do not rely solely on test‑phase results to judge user readiness. Establish comprehensive metrics, combine traffic‑pull feedback with product data (e.g., rollback rates), and set up rapid data‑collection and response processes to iterate quickly.

Tags: Performance Testing · Quality Assurance · Frontend Testing · Large-Scale Project · QA Strategy