Improving New User Retention in a Video App through A/B Testing: A Case Study
This article presents a detailed case study of how a video app team ran two rounds of A/B tests on different swipe‑up guide designs to diagnose a retention problem, refine new‑user onboarding, and ultimately achieve significant improvements in retention and engagement metrics.
Product growth traditionally follows the AARRR funnel, emphasizing acquisition, but as traffic bonuses wane, the RARRA model shifts focus to retention as the primary driver of sustainable growth.
The article examines a real‑world example from a video app similar to Douyin, where the product team conducted two rounds of A/B experiments using the in‑house DataTester platform to improve the onboarding swipe‑up guide for new users.
In the first experiment, a static hand‑icon guide was replaced with a semi‑dynamic guide featuring a scrolling indicator. The test allocated 10% of traffic (5% each to control and variant) for one month, targeting new users of the latest app version. Core metrics included new‑user retention, correct swipe‑up operation rate, and error operation rates.
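DataTester's internal bucketing logic is not described in the article, but the traffic split above (5% control, 5% variant, 90% untouched) is typically implemented by hashing a stable user identifier so each user lands in the same bucket on every session. The sketch below is a generic illustration of that idea, not DataTester's actual implementation; the function name, salt, and bucket labels are hypothetical.

```python
import hashlib

def assign_bucket(user_id: str, salt: str = "swipe_guide_exp") -> str:
    """Deterministically map a user to an experiment bucket.

    The same user always lands in the same bucket, and the salt keeps
    this experiment's split independent of any other experiment.
    (Illustrative sketch; not DataTester's real bucketing logic.)
    """
    digest = hashlib.md5(f"{salt}:{user_id}".encode()).hexdigest()
    slot = int(digest, 16) % 100  # roughly uniform slot in [0, 100)
    if slot < 5:
        return "control"   # 5% of traffic: original static guide
    if slot < 10:
        return "variant"   # 5% of traffic: new guide
    return "holdout"       # remaining 90%: outside the experiment
```

Because assignment depends only on the hashed ID, a user who reinstalls mid‑experiment or opens the app on a new day still sees the same guide, which keeps the control and variant populations clean.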
The results showed no significant lift in retention; the correct swipe‑up rate actually fell by about 1% (statistically significant), and the error operation rates moved inconsistently, so the team concluded the change was ineffective and rolled it back.
Learning from the failure, the team designed a fully dynamic guide with an animated overlay demonstrating the swipe‑up motion and updated copy. A second A/B test, using the same traffic split and duration, compared the original guide (control) with the new dynamic guide (variant).
The second experiment yielded clear positive outcomes: new‑user retention increased by approximately 1%–1.8% (significant), average session duration and video plays rose markedly, and the correct swipe‑up operation rate improved by about 1.5% (significant). Regional analysis revealed stronger effects in economically developed areas.
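The "(significant)" labels above mean the observed lifts cleared a statistical test, not just that the raw numbers moved. For a rate metric like the correct swipe‑up operation rate, a standard choice is a two‑proportion z‑test; the article does not say which test DataTester applies, so the sketch below is a generic illustration with made‑up sample numbers.

```python
import math

def two_proportion_z_test(x1: int, n1: int, x2: int, n2: int) -> tuple[float, float]:
    """Two-sided z-test for the difference between two proportions.

    x1/n1: successes and sample size in the control group,
    x2/n2: successes and sample size in the variant group.
    Returns (z statistic, two-sided p-value).
    """
    p1, p2 = x1 / n1, x2 / n2
    p_pool = (x1 + x2) / (n1 + n2)                      # pooled rate under H0
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n1 + 1 / n2))
    z = (p2 - p1) / se
    p_value = math.erfc(abs(z) / math.sqrt(2))          # two-sided tail probability
    return z, p_value

# Illustrative (made-up) numbers: 40.0% vs 41.5% correct swipe-up rate,
# 50,000 new users per arm. With samples this large, a 1.5-point lift
# is highly significant (p well below 0.05).
z, p = two_proportion_z_test(20000, 50000, 20750, 50000)
```

The same lift observed on a few hundred users per arm would not reach significance, which is one reason the experiment ran for a full month before a ship decision was made.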
Based on these results, the dynamic guide was rolled out to all users, though ongoing monitoring identified a subset of users still not performing the swipe‑up, especially in developed regions, prompting further investigation.
The case illustrates a complete A/B testing lifecycle—from hypothesis and design, through execution and multi‑dimensional analysis, to decision making—and highlights the importance of iterative experimentation for product growth.
DataTester, the proprietary A/B testing platform used in the study, has facilitated over 1.5 million tests within ByteDance and is now offered externally via Volcano Engine.
DataFunTalk
Dedicated to sharing and discussing big data and AI technology applications, aiming to empower a million data scientists. Regularly hosts live tech talks and curates articles on big data, recommendation/search algorithms, advertising algorithms, NLP, intelligent risk control, autonomous driving, and machine learning/deep learning.