How to Evaluate Design Impact and Iterate Products with the “Find‑Split‑Test” Method

This article explains a data‑and‑experience‑driven framework—Find, Split, Test—for assessing design effectiveness and iterating a group‑buying product, illustrating each step with concrete metrics, flow analysis, A/B experiments, and measurable conversion improvements.

JD.com Experience Design Center

1. Find – Identify Core Product Data Metrics

Every product has a final outcome metric such as e‑commerce sales, subscription renewal rate, or TV viewership. For the "Pai Pian Yi" group‑buying app, core metrics were derived from basic e‑commerce indicators and the product’s specific characteristics.

2. Split – Dissect the Product Flow and Analyze Metric Influencers

The app's flow consists of six core stages. During iteration we examined each stage for data-driven improvement opportunities, illustrated below with case studies.
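One way to make the "Split" step concrete is to decompose the overall conversion metric into per-stage rates, so the weakest link in the funnel stands out. A minimal sketch follows; the stage names mirror the flow discussed in this article, but the counts are invented for illustration:

```python
# Illustrative funnel decomposition. Stage names follow the article's
# flow (entry -> list -> group page -> order confirmation); the event
# counts are hypothetical.
funnel = [
    ("entry", 100_000),
    ("list_page", 40_000),
    ("group_page", 8_000),
    ("order_confirmation", 1_200),
]

# Conversion rate between each adjacent pair of stages.
for (stage, n), (next_stage, next_n) in zip(funnel, funnel[1:]):
    print(f"{stage} -> {next_stage}: {next_n / n:.1%}")

# End-to-end conversion: last stage over first stage.
overall = funnel[-1][1] / funnel[0][1]
print(f"overall conversion: {overall:.2%}")
```

Reading the per-stage rates side by side shows which transition drags the overall number down and therefore where a design intervention is most likely to pay off.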

3. Test – Propose Solutions, Validate Online, and Capture Learnings

We prioritize single-variable A/B tests: changing several variables at once makes it impossible to attribute an observed effect to any one change, so no reusable insight accumulates.
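A single-variable A/B result is typically checked with a two-proportion z-test before declaring a winner. Here is a standard-library sketch; the traffic and conversion numbers are hypothetical, loosely echoing the 20% vs. 35% list-page result discussed later:

```python
# Minimal two-proportion z-test for an A/B experiment, stdlib only.
# All traffic numbers below are hypothetical.
from statistics import NormalDist

def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z, two-sided p-value) for the difference in conversion."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    # Pooled rate under the null hypothesis that A and B convert equally.
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = (p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b)) ** 0.5
    z = (p_b - p_a) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return z, p_value

# Control list page (A) vs. a variant with a richer SKU display (B).
z, p = two_proportion_z(conv_a=2_000, n_a=10_000, conv_b=2_300, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With only one variable changed between A and B, a significant result can be attributed directly to that change, which is exactly what lets learnings accumulate across iterations.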

Entry → List Page

Analysis: The entry’s primary role is to increase traffic, making more users aware of the app.

Attempt: Because the product received strong platform exposure, we secured prominent entry positions across QQ, WeChat, H5, and PC sites, obtained seller support for high‑quality platform activities, and purchased external traffic. Emphasis was placed on traffic quality, as low‑quality inflow caused conversion drops.

List → Group Page

Problem: Click‑through rate was flat with no noticeable rise.

Analysis: Users felt the product catalog was limited (only about ten items at launch), making it hard to find appealing goods.

Attempt: A) Enrich the product offering by onboarding more merchants and expanding the catalog. B) Optimize the SKU display: highlight the product and price, surface the group-buying rules, and add countdown timers and crowd-psychology cues (how many people have already joined). This information hierarchy lifted click conversion from 20% to 35%.

Group Page → Order Confirmation Page

Problem: Need to raise group‑joining rate and formation rate.

Analysis: This page is the conversion hub; it must address two scenarios: a) users interested in the product and ready to order, b) users uninterested and needing redirection.

Attempt A – Interested Users: Research revealed needs for more product details and difficulty forming a group. Solutions included a prominent "view details" link, a new "ongoing groups" feature, and psychological triggers—urgency, herd mentality, and incentive—to stimulate ordering.

Attempt B – Uninterested Users: Implement a "you may also like" recommendation module, relying on precise BI targeting.

Iteration Effect: Group‑joining rate rose from 5% to 15%; formation rate jumped from 20% to 90% while total group count remained stable.
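Assuming the two rates compose multiplicatively (a visitor must first join a group, and the group must then form), the before/after figures above imply a large compound lift in completed group orders per group-page visitor. A back-of-the-envelope sketch:

```python
# Compound effect of the two reported improvements, assuming
# completed orders per visitor = join rate x formation rate.
before = 0.05 * 0.20   # 5% join rate, 20% formation rate
after  = 0.15 * 0.90   # 15% join rate, 90% formation rate
print(f"before: {before:.3f}, after: {after:.3f}, lift: {after / before:.1f}x")
```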

Adding Shopping Atmosphere

Problem: The experience lacked the lively, crowd-participation feel of group buying.

Analysis: The product resembles an offline group‑buying shout‑out scenario; a simple, fun "cheer" page can simulate that atmosphere.

Attempt: Introduce a cheer page where users select or edit recommended cheer messages.

Result: The intended lively atmosphere was achieved without noticeable impact on conversion; conversion rates stayed stable across entry‑to‑order and group‑to‑order flows.

In summary, the "Find, Split, Test" framework consists of:

Find: Locate core data metrics.

Split: Break down metrics and study influencing factors.

Test: From a design perspective, identify solutions that affect metric factors and validate them.

Note: Choose stable data periods, avoiding large promotions or new‑launch volatility.

Common question – how to assess improvements driven by design?

In a stable project environment (no changes in the product's strategic priority, functionality, or operations staffing), improvements can be attributed to design optimization.

Use comparative testing: present identical content in different styles and measure how the effect changes.

Design influences not only visual style but also interaction patterns; evaluate contribution throughout the process, not merely data uplift.

This article commemorates the now‑closed "Pai Pai" platform.

Tags: Product Design, A/B testing, design metrics, UX evaluation, data-driven iteration, group buying
Written by

JD.com Experience Design Center

Professional, creative, passionate about design. The JD.com User Experience Design Department is committed to creating better e-commerce shopping experiences.
