What Tiny Design Changes Reveal About Purchase Intent and User Engagement

This article explains how subtle AB‑test variations—such as rounding prices or adjusting font size—can dramatically affect purchase intent and comment participation, backed by two real‑world case studies and practical takeaways for designers and product managers.

Suning Design

AB testing is an optimization method that splits traffic between two versions (A and B) with a deliberate difference, collects data, and draws statistically significant conclusions.
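The traffic split described above is usually done deterministically, so the same user always sees the same version. A minimal sketch (function and experiment names are hypothetical, not from the article):

```python
import hashlib

def assign_variant(user_id: str, experiment: str = "price-format") -> str:
    """Deterministically assign a user to variant A or B.

    Hashing the user id together with the experiment name gives a
    stable, roughly 50/50 split without storing assignment state.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    bucket = int(digest, 16) % 100  # 0..99
    return "A" if bucket < 50 else "B"

print(assign_variant("user-42"))  # same answer on every call for this user
```

Because assignment depends only on the hash, no database lookup is needed, and adding a new experiment name reshuffles users independently of other experiments.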

Designers often create large visual differences, assuming they yield clearer results, but even single‑element tweaks can produce notable metric improvements. The following two classic cases from BEHAVE.org illustrate this point.

Case 1: Which version boosts purchase intent?

A company tested whether a shortened price format ($19) versus a full‑decimal price ($19.00) would affect buying intent, measured by add‑to‑cart clicks and detail‑view clicks. Over one week, more than 20,000 users were randomly assigned to the two versions. The rounded‑price version (A) increased add‑to‑cart behavior by 9.3% and detail views by 29%, a difference significant at the 99.99% confidence level.
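Significance in a test like this is typically checked with a two‑proportion z‑test. The sketch below uses hypothetical per‑arm counts (the article reports only the total sample and the relative lifts, so the baseline rates here are invented for illustration):

```python
from math import sqrt, erf

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int):
    """Two-sided two-proportion z-test on conversion counts."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)          # pooled rate under H0
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # normal CDF
    return z, p_value

# Hypothetical: 10,000 users per arm; A converts at 12.0% vs
# 10.98% for B, i.e. roughly the 9.3% relative lift reported.
z, p = two_proportion_z(1200, 10000, 1098, 10000)
print(f"z={z:.2f}, p={p:.4f}")
```

With these made-up counts the lift is significant at the 95% level but not at 99.99%; reaching the confidence the article cites requires either a larger effect or a larger sample, which is exactly why the raw counts matter when judging a result.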

The analysis concluded that price perception changes: a clean, rounded price feels cheaper and requires less cognitive effort, especially for emotional shoppers, while the decimal version appears more complex and reduces confidence.

For emotional consumers, rounded prices can raise purchase intent; for rational consumers, non‑rounded prices may work better.

Price length shapes perceived magnitude: removing decimals makes a price look shorter and feel smaller, whereas showing decimals can be useful when the goal is to emphasize precision or savings, as with a discounted price.
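The two tested formats amount to a one-line display rule. A minimal sketch (the function name is hypothetical; real pricing code should use `decimal.Decimal` rather than floats):

```python
def display_price(amount: float, rounded: bool = True) -> str:
    """Format a price as a clean integer ("$19") or with explicit
    cents ("$19.00"), matching the two tested variants."""
    if rounded and amount == int(amount):
        return f"${int(amount)}"
    return f"${amount:.2f}"

print(display_price(19.00))                 # $19
print(display_price(19.00, rounded=False))  # $19.00
print(display_price(19.50))                 # $19.50 (non-integer keeps cents)
```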

Case 2: Which version raises user engagement?

A Japanese blog, concerned about declining comment participation, hypothesized that a larger font size would improve readability and thus engagement. Version A used 13 px body text, while version B used 12 px. Contrary to expectations, the smaller‑font version increased engagement by 29.2%.

The result suggests that slightly smaller text can be scanned faster, allowing users to consume more content quickly and feel more inclined to comment.

Selecting the optimal font size is a crucial design decision that significantly influences reader participation.

Continuous testing is needed to balance content presentation with readability.

Font size choices should consider design goals, language, and cultural context; the research cited in this case suggests 12 px often works well for paragraph text across various languages and screen sizes.
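A clean font-size test changes only the one property under study and holds everything else constant. A minimal sketch of such a variant configuration (the structure and the line-height value are assumptions for illustration, not from the article):

```python
# Hypothetical variant configuration: only font-size differs between
# arms, so any metric difference can be attributed to that change.
FONT_EXPERIMENT = {
    "A": {"font-size": "13px", "line-height": "1.6"},
    "B": {"font-size": "12px", "line-height": "1.6"},
}

def body_css(variant: str) -> str:
    """Render the variant's style as a CSS declaration list."""
    rules = FONT_EXPERIMENT[variant]
    return "; ".join(f"{k}: {v}" for k, v in rules.items())

print(body_css("B"))  # font-size: 12px; line-height: 1.6
```

Keeping all shared properties identical in both arms is what makes the 29.2% result interpretable as a font-size effect rather than a layout effect.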

Although these findings may contradict conventional design wisdom, they demonstrate that systematic testing can uncover counter‑intuitive insights. Designers should remember that observed correlations are not proven causations, that a better solution under specific conditions is not necessarily the ultimate optimum, and that results may not transfer unchanged to different environments. Ongoing experimentation and careful analysis are essential for precise design decisions.

Tags: AB testing, user engagement, conversion optimization, UX design, price perception
Written by

Suning Design

Suning Design is the official platform of Suning UED, dedicated to promoting exchange and knowledge sharing in the user experience industry. Here you'll find valuable insights from 200+ UX designers across Suning's eight major businesses: e-commerce, logistics, finance, technology, sports, cultural and creative, real estate, and investment.
