Technical Architecture and High‑Concurrency Solutions for Shopee Shake During Major Promotions

Shopee Shake’s architecture separates admin and user sides into three layers—access, application, and resource—and uses horizontal scaling, bucketed Redis coin pools, multi‑level caching, asynchronous message queues, precise capacity formulas, and comprehensive monitoring and chaos‑engineered runbooks to reliably handle over 300,000 QPS during major promotional events.

Shopee Tech Team

Background: Shopee runs several large‑scale promotional events each year. The marketing mini‑game "Shopee Shake" is the most frequently used game during these events, attracting massive concurrent traffic.

Game and Promotion: The game is a shake‑to‑win experience where users shake their phones to earn virtual coins. It has three UI stages – pre‑heat, gameplay, and result – and can generate over 300,000 QPS during peak moments (e.g., the 2021 May 5th promotion).

Technical Challenges: The backend must handle (1) extremely high instantaneous concurrency, (2) very short game rounds (10–30 seconds each), and (3) concurrent deduction from a shared coin pool, which can become a single‑point bottleneck.
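To illustrate challenge (3): if the check and the decrement of the shared coin pool are separate steps, two concurrent shakes can both pass the check and drive the balance negative. A minimal sketch of the atomic fix follows; the `CoinPool` class and its names are hypothetical, and the in‑process lock stands in for the atomicity a real deployment would get from Redis (e.g., a Lua script).

```python
import threading

class CoinPool:
    """A shared coin pool. deduct() must be check-and-decrement in one
    atomic step, or concurrent shakes can over-deduct the balance."""

    def __init__(self, total: int):
        self._remaining = total
        # Stand-in for server-side atomicity (e.g., a Redis Lua script).
        self._lock = threading.Lock()

    def deduct(self, amount: int) -> int:
        """Atomically deduct up to `amount`; returns coins actually granted."""
        with self._lock:
            granted = min(amount, self._remaining)
            self._remaining -= granted
            return granted

pool = CoinPool(total=100)
print(pool.deduct(30))  # 30
print(pool.deduct(90))  # 70 (only 70 coins were left)
print(pool.deduct(10))  # 0 (pool exhausted, never goes negative)
```

The key property is that the balance can reach zero but never go below it, regardless of interleaving.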

Architecture Design: The system is split into an Admin side for operators and a User side for players. It follows a three‑layer model:

Access layer – request entry, authentication, protocol conversion.

Application layer – core business logic, activity configuration, inventory, ranking, and various micro‑services.

Resource layer – MySQL, Codis (Redis cluster), and Shopee middle‑services (notification, coin transfer, chat).

High‑Concurrency Techniques:

Horizontal Scaling

The system is built to scale horizontally. Stateless services in the access and application layers can be replicated, while the storage layer uses Codis, which allows adding more Redis instances to increase capacity.

To avoid a single‑key hotspot, the coin pool is sharded into multiple buckets, so the effective throughput becomes: total OPS = N × per‑instance Redis OPS, where N is the number of buckets.
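The bucketing idea can be sketched as a deterministic mapping from user to bucket key, so load spreads across N keys instead of hammering one. The key format, bucket count, and function names below are illustrative assumptions, not Shopee's actual scheme.

```python
import hashlib

NUM_BUCKETS = 64  # hypothetical bucket count N

def bucket_key(user_id: str, num_buckets: int = NUM_BUCKETS) -> str:
    """Map a user to one of N coin-pool buckets. A stable hash keeps
    each user on the same bucket while spreading load across all N,
    so no single Redis key becomes a hotspot."""
    h = int(hashlib.md5(user_id.encode()).hexdigest(), 16)
    return f"coin_pool:bucket:{h % num_buckets}"

# The same user always lands in the same bucket:
assert bucket_key("user-42") == bucket_key("user-42")
```

Each bucket holds roughly 1/N of the total coin inventory; a round's deduction touches only the caller's bucket, which is what lets total OPS scale with N.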

Caching

Read‑heavy, write‑light data (game config, static assets) is cached at multiple levels: CDN, web cache, process cache, and Codis. A cache‑aside pattern with a rebuild lock prevents a thundering herd of requests from hitting the database; if the lock cannot be obtained within a timeout, the system falls back to stale cache data.

Asynchronous Processing

Time‑consuming operations (ranking writes, coin distribution, award notifications, analytics) are off‑loaded to a message queue (e.g., Kafka), which reduces the latency of the critical "game‑end" API. Globally unique request IDs make processing idempotent, and the design also guards against message loss.

Capacity Planning

Capacity planning uses formulas such as: number of containers = peak QPS / per‑container capacity, and Codis OPS = peak QPS × Codis operations per request. These calculations guide the number of containers, Codis instances, and downstream service capacities required for a promotion.
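Those two formulas are simple enough to check with a worked example; the specific per‑container capacity and operations‑per‑request figures below are illustrative assumptions, not Shopee's actual numbers.

```python
def containers_needed(peak_qps: int, per_container_qps: int) -> int:
    """Number of containers = peak QPS / per-container capacity,
    rounded up (a fractional container must be a whole one)."""
    return -(-peak_qps // per_container_qps)  # ceiling division

def codis_ops(peak_qps: int, ops_per_request: int) -> int:
    """Codis OPS = peak QPS * Codis operations per request."""
    return peak_qps * ops_per_request

# Illustrative: 300,000 peak QPS, 2,000 QPS per container,
# 3 Codis operations per request.
print(containers_needed(300_000, 2_000))  # 150 containers
print(codis_ops(300_000, 3))              # 900,000 Codis OPS
```

In practice a headroom factor (e.g., planning for 1.5× the projected peak) would be applied on top of these raw numbers.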

Monitoring and Incident Planning: A multi‑layer monitoring system covers the access, application, resource, and hardware layers, with detailed metrics defined for each. Pre‑incident, emergency, and recovery runbooks are prepared for scenarios such as configuration errors, service instability, and middleware failures.

Fault Drills: Regular chaos‑engineering exercises simulate failures (e.g., fault injection, traffic shadowing) to validate the runbooks, improve team response, and uncover hidden issues.

Conclusion: By combining horizontal scaling, bucketed inventory, extensive caching, asynchronous processing, rigorous capacity planning, and comprehensive monitoring, Shopee Shake can sustain high traffic during major promotions while maintaining stability and a good user experience.

Tags: distributed systems, caching, high concurrency, capacity planning, asynchronous processing, Shopee Shake