Surviving a Pandemic-Scale Backend: Lessons from Tencent’s 2020 Video Red‑Packet Project

This article recounts the technical and organizational challenges of building Tencent WeSee's 2020 Spring video red‑packet system, detailing high‑traffic spikes, consistency and security demands, TRPC adoption risks, performance bottlenecks, and the 33 robustness principles that guided the subsequent architecture overhaul.

Tech Architecture Stories

In November 2019, our smart‑hardware product, unable to match competitors' subsidized sub‑hundred‑yuan devices without offering costly subsidies of its own, was cancelled despite a good reputation.

Our team was reassigned to WeSee, where we handled volatile workloads and intense, deadline‑driven development, focusing on UGC video/image upload, effects, and editing.

During the 2020 Spring Festival, we invested heavily in large‑scale activities such as a card‑collecting game and video red‑packet feature, leveraging a special WeChat video‑red‑packet release channel.

The backend development adopted Tencent’s new TRPC framework, presenting several technical challenges:

Unpredictable traffic spikes on New Year’s Eve.

Complex video‑red‑packet functionality requiring strong consistency and financial security.

High cash flow demanding robust anti‑fraud and system stability measures.

Instability and heterogeneity introduced by the nascent TRPC framework, with multiple RPC implementations and poor service discovery.

Performance‑critical core modules needing extensive resources and rapid scaling of naming services and underlying platforms.

Amid the COVID‑19 pandemic, the team continued to work under intense pressure, coordinating with growth, SRE, and other teams to mitigate risks, secure resources, and evaluate technical solutions.

“If something can go wrong, it will.” — Murphy’s Law

On New Year’s Eve, record DAU drove a sudden surge of load that collapsed the video‑red‑packet system; the resulting outages took hours to resolve.

Subsequent Lantern Festival activities suffered similar cache failures, compounding the issues.
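The article does not name the root cause of those cache failures, but a common pattern behind such incidents is a cache stampede: on a miss for a hot key, many concurrent requests hammer the backend at once. A minimal Go sketch of request coalescing, modeled on the well-known singleflight pattern (the `Group`/`call` types and all names here are illustrative, not taken from the WeSee codebase):

```go
package main

import (
	"fmt"
	"sync"
	"sync/atomic"
)

// call tracks one in-flight backend load.
type call struct {
	wg  sync.WaitGroup
	val string
	err error
}

// Group coalesces concurrent lookups for the same key: on a cache
// miss, only the first caller runs the loader; the rest wait for
// and share its result, so the backend sees one load, not N.
type Group struct {
	mu    sync.Mutex
	calls map[string]*call
}

func (g *Group) Do(key string, load func() (string, error)) (string, error) {
	g.mu.Lock()
	if g.calls == nil {
		g.calls = make(map[string]*call)
	}
	if c, ok := g.calls[key]; ok {
		// Another goroutine is already loading this key: wait for it.
		g.mu.Unlock()
		c.wg.Wait()
		return c.val, c.err
	}
	c := new(call)
	c.wg.Add(1)
	g.calls[key] = c
	g.mu.Unlock()

	c.val, c.err = load()
	c.wg.Done()

	g.mu.Lock()
	delete(g.calls, key) // allow a fresh load next time
	g.mu.Unlock()
	return c.val, c.err
}

func main() {
	var loads int64
	var g Group
	var wg sync.WaitGroup
	for i := 0; i < 8; i++ {
		wg.Add(1)
		go func() {
			defer wg.Done()
			g.Do("feed:hot", func() (string, error) {
				atomic.AddInt64(&loads, 1) // one simulated expensive backend read
				return "payload", nil
			})
		}()
	}
	wg.Wait()
	fmt.Println("coalesced backend loads:", atomic.LoadInt64(&loads))
}
```

In production the same idea is usually combined with jittered expiry times, so that a whole family of hot keys does not expire in the same instant.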

In early 2020, I rebuilt the backend team, expanding from 20 to about 80 engineers across Beijing and Shenzhen, and led a comprehensive refactor of foundational systems and the video‑red‑packet service.

We established 33 robustness guidelines covering four key areas: observability, performance & capacity, disaster recovery & resilience, and call‑graph & upstream‑downstream relationships. Each subsystem underwent at least one rigorous robustness review.

The video‑red‑packet architecture was redesigned with domain‑layer separation—business, routing, transaction, accounting, and persistence layers—using a saga + TCC hybrid transaction model, supporting multi‑currency and multi‑period accounting, and incorporating full‑link stress testing.

Despite relentless effort, the pandemic’s financial strain and unsustainable subsidy costs led to a contraction in late 2021, prompting a company‑wide “cost‑reduction and efficiency‑increase” initiative, where I again faced the challenge of cutting machine and bandwidth expenses while keeping the service alive.

Republication Notice

This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact admin@besthub.dev and we will review it promptly.

Tags: Backend, System Architecture, Scalability, tRPC, video red packet
Written by

Tech Architecture Stories

Internet tech practitioner sharing insights on business architecture, technology, and a lifelong love of tech.
