Preventing Cache Breakdown in High‑Concurrency Systems
This article explains what cache breakdown is in high‑traffic scenarios, why it can overload databases, and presents practical solutions such as pre‑warming hot data, scheduled cache refreshes, and multi‑level caching with code examples in Java and Redis.
Cache breakdown occurs when a hot key suddenly expires (due to timeout or manual removal), causing a flood of concurrent requests to bypass the cache and hit the database, which can instantly overload or crash the DB. This problem is common in flash‑sale, hot‑article, live‑stream, and popular‑product scenarios.
Solution 1: Pre‑heat Hot Data
Load likely hot data into the cache before a traffic peak so that the first request does not trigger a cache miss.
@PostConstruct
public void preheatHotData() {
    List<String> hotKeys = getHotKeysFromConfig(); // fetch hot keys from the config center
    for (String key : hotKeys) {
        String data = db.get(key);
        redis.setex(key, 3600, data); // warm for 1 hour
    }
}

For extremely hot items (e.g., user profiles, product details) you may omit the expiration time and rely on logical expiration or asynchronous background updates to keep the data fresh.
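The logical-expiration idea mentioned above can be sketched as follows. This is a minimal illustration, not the article's original code: `LogicalExpireCache` is a hypothetical class, and a `ConcurrentHashMap` stands in for Redis so the sketch is self-contained. The key point is that entries never physically expire; a timestamp inside the value decides when to rebuild in the background, so readers are never blocked by a miss on a hot key.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;

// Sketch of logical expiration: values carry their own expiry time and are
// refreshed asynchronously, so a hot key never disappears from the cache.
public class LogicalExpireCache {
    // Wrapper carrying the payload and its logical expiry time
    static class Entry {
        final String data;
        final long expireAtMillis;
        Entry(String data, long expireAtMillis) {
            this.data = data;
            this.expireAtMillis = expireAtMillis;
        }
    }

    private final Map<String, Entry> store = new ConcurrentHashMap<>(); // stand-in for Redis
    private final ExecutorService refresher =
        Executors.newSingleThreadExecutor(r -> {
            Thread t = new Thread(r);
            t.setDaemon(true); // background refreshes should not keep the JVM alive
            return t;
        });

    public void put(String key, String data, long ttlMillis) {
        store.put(key, new Entry(data, System.currentTimeMillis() + ttlMillis));
    }

    // Returns possibly stale data immediately; rebuilds asynchronously once
    // the entry is logically expired.
    public String get(String key, java.util.function.Function<String, String> loader) {
        Entry e = store.get(key);
        if (e == null) {                       // true miss: load synchronously once
            String fresh = loader.apply(key);
            put(key, fresh, 3_600_000L);
            return fresh;
        }
        if (System.currentTimeMillis() > e.expireAtMillis) {
            refresher.submit(() -> put(key, loader.apply(key), 3_600_000L)); // async rebuild
        }
        return e.data;                         // never block callers on a rebuild
    }
}
```

The trade-off matches the one discussed later for scheduled refreshes: callers may briefly see stale data while the rebuild runs, which the business must be able to tolerate.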
Solution 2: Periodic Refresh of Hot Cache
Use a scheduled task to refresh the most frequently accessed keys at regular intervals. Because the values are rewritten before they can expire, this avoids expiry-driven breakdown for those keys, at the cost of possibly serving stale data.
@Scheduled(cron = "0 */5 * * * ?") // every 5 minutes
public void refreshHotCache() {
    List<String> top100Keys = getTopHotKeys(); // obtain top-100 hot keys from the monitoring system
    for (String key : top100Keys) {
        refreshKeyAsync(key); // refresh asynchronously
    }
}

Be aware that this approach may introduce data inconsistency; the business must tolerate temporarily stale data.
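One way the asynchronous refresh could be implemented is with a bounded thread pool, so a single slow database read cannot stall the whole batch. This is a self-contained sketch under stated assumptions: `HotCacheRefresher` is a hypothetical class, and plain maps stand in for Redis and the database.

```java
import java.util.List;
import java.util.Map;
import java.util.concurrent.*;

// Sketch of the refresh loop behind refreshKeyAsync: each hot key is
// reloaded from the source of truth on a small worker pool.
public class HotCacheRefresher {
    private final ExecutorService pool = Executors.newFixedThreadPool(4, r -> {
        Thread t = new Thread(r);
        t.setDaemon(true); // worker threads should not keep the JVM alive
        return t;
    });
    private final Map<String, String> redis = new ConcurrentHashMap<>(); // Redis stand-in
    private final Map<String, String> db;                                // database stand-in

    public HotCacheRefresher(Map<String, String> db) {
        this.db = db;
    }

    // In the article's example this would be driven by the @Scheduled method.
    public void refreshAll(List<String> hotKeys) {
        CountDownLatch done = new CountDownLatch(hotKeys.size());
        for (String key : hotKeys) {
            pool.submit(() -> {
                try {
                    String fresh = db.get(key);               // reload from the source of truth
                    if (fresh != null) redis.put(key, fresh); // overwrite before any expiry
                } finally {
                    done.countDown();
                }
            });
        }
        try {
            done.await(); // demo only: the real scheduled task can fire and forget
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
    }

    public String cached(String key) {
        return redis.get(key);
    }
}
```

Bounding the pool size also protects the database: even if the hot-key list grows, at most a fixed number of reload queries run concurrently.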
Solution 3: Multi‑Level Cache + Logical Expiration
Add a local in-process cache (e.g., Caffeine or Guava) in front of Redis. Even if a hot key expires in Redis, the local cache can absorb most of the traffic.
Typical flow: request → local Caffeine cache (15 s TTL) → Redis cache (130 min TTL) → database.
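The three-tier lookup above can be sketched as follows. This is an illustrative, self-contained version: `TwoLevelCache` is a hypothetical class, and `ConcurrentHashMap`s stand in for Caffeine, Redis, and the database; in production the local tier would be a real Caffeine cache with a short TTL.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;

// Two-level lookup sketch: local cache -> shared cache -> database,
// backfilling the faster tiers on the way out.
public class TwoLevelCache {
    static class Timed {
        final String data;
        final long expireAt;
        Timed(String data, long expireAt) {
            this.data = data;
            this.expireAt = expireAt;
        }
    }

    private final Map<String, Timed> local = new ConcurrentHashMap<>();   // short-TTL local tier
    private final Map<String, String> shared = new ConcurrentHashMap<>(); // Redis stand-in
    private final Map<String, String> db;                                 // database stand-in

    public TwoLevelCache(Map<String, String> db) {
        this.db = db;
    }

    public String get(String key) {
        Timed t = local.get(key);
        if (t != null && System.currentTimeMillis() < t.expireAt) {
            return t.data;                                    // level 1 hit
        }
        String v = shared.get(key);                           // level 2: shared cache
        if (v == null) {
            v = db.get(key);                                  // level 3: database
            if (v != null) shared.put(key, v);                // backfill the shared tier
        }
        if (v != null) {
            local.put(key, new Timed(v, System.currentTimeMillis() + 15_000L)); // ~15 s local TTL
        }
        return v;
    }
}
```

The short local TTL is the design lever here: it caps how long a stale value can be served from memory while still shielding Redis and the database from the bulk of repeated reads on a hot key.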
These strategies—pre‑warming, scheduled refresh, and multi‑level caching—provide strong defenses against cache breakdown in high‑concurrency environments, though each comes with trade‑offs regarding data freshness and complexity.
Mike Chen's Internet Architecture
Over ten years of BAT architecture experience, shared generously!
