Understanding Big Keys in Redis: Causes, Impacts, Detection, and Optimization
The article explains what constitutes a big key in Redis, why they arise, the performance and blocking issues they cause, and provides multiple practical methods—including built‑in commands, SCAN, MEMORY USAGE, RDB analysis tools, and cloud services—to detect and mitigate big keys effectively.
What Is a Big Key?
A big key (bigkey) is a Redis key whose associated value occupies a large amount of memory. Rough guidelines consider a string value larger than 1 MB or a composite type (list, hash, set, sorted set, etc.) with more than 5 000 elements as a big key.
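These thresholds are easy to encode in a helper. A minimal Python sketch (the 1 MB and 5,000-element cutoffs follow the guideline above; the function name is our own):

```python
# Rough big-key classifier based on the guideline above:
# strings larger than 1 MB, or composite types (list, hash,
# set, sorted set) with more than 5,000 elements.

STRING_BYTES_LIMIT = 1 * 1024 * 1024   # 1 MB
COLLECTION_ELEMENTS_LIMIT = 5_000

def is_big_key(key_type: str, size: int) -> bool:
    """size is bytes for strings, element count for composite types."""
    if key_type == "string":
        return size > STRING_BYTES_LIMIT
    return size > COLLECTION_ELEMENTS_LIMIT

print(is_big_key("string", 4437))    # small string -> False
print(is_big_key("hash", 200_000))   # oversized hash -> True
```

In practice the cutoffs depend on your latency budget and hardware; treat them as starting points, not hard rules.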
How Big Keys Are Created and Their Harms
Big keys typically arise from poor program design (e.g., storing large binary files in a string), inadequate data‑scale planning, or failure to clean up obsolete data. They consume extra memory and bandwidth and can cause severe performance problems, including:
Client‑side timeout due to Redis’s single‑threaded command execution being blocked by time‑consuming operations on big keys.
Network blockage because large keys generate massive traffic (e.g., a 1 MB key accessed 1 000 times per second creates 1 GB/s traffic).
Worker‑thread blockage, because deleting a big key with a synchronous DEL can stall the server while its memory is reclaimed.
These issues also affect replication and cluster scaling.
How to Detect Big Keys
1. Use Redis’s --bigkeys option
# redis-cli -p 6379 --bigkeys
# Scanning the entire keyspace to find biggest keys as well as
# average sizes per key type. You can use -i 0.1 to sleep 0.1 sec
# per 100 SCAN commands (not usually needed).
[00.00%] Biggest string found so far "ballcat:oauth:refresh_auth:f6cdb384-9a9d-4f2f-af01-dc3f28057c20" with 4437 bytes
[00.00%] Biggest list found so far "my-list" with 17 items
-------- summary -------
Sampled 5 keys in the keyspace!
Total key length in bytes is 264 (avg len 52.80)
Biggest list found "my-list" has 17 items
Biggest string found "ballcat:oauth:refresh_auth:f6cdb384-9a9d-4f2f-af01-dc3f28057c20" has 4437 bytes
1 lists with 17 items (20.00% of keys, avg size 17.00)
0 hashs with 0 fields (00.00% of keys, avg size 0.00)
4 strings with 4831 bytes (80.00% of keys, avg size 1207.75)
Running this command scans the entire keyspace, so it may impact performance; use the -i flag to throttle the scan (e.g., redis-cli -p 6379 --bigkeys -i 3 sleeps 3 seconds per 100 SCAN commands).
2. Use the SCAN command combined with type‑specific length commands such as STRLEN, HLEN, LLEN, SCARD, and ZCARD to evaluate key sizes.
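The approach in step 2 boils down to a cursor loop. The snippet below is only a sketch: FakeRedis is a dict-backed stand-in for a real client (with redis-py you would call r.scan, r.type, and STRLEN/HLEN/LLEN/SCARD/ZCARD as appropriate against a live server), and its single-batch scan collapses those per-type length commands into one size_of helper.

```python
# Sketch of big-key detection with SCAN plus type-specific length
# commands. FakeRedis stands in for a real Redis client (e.g. redis-py);
# only the cursor-traversal logic is the point here.

STRING_BYTES_LIMIT = 1 * 1024 * 1024
COLLECTION_ELEMENTS_LIMIT = 5_000

class FakeRedis:
    """Dict-backed stand-in exposing just the commands the loop needs."""
    def __init__(self, data):
        self.data = data                  # key -> (type, value)
    def scan(self, cursor, count=100):
        return 0, list(self.data)         # one batch: cursor 0 means done
    def type(self, key):
        return self.data[key][0]
    def size_of(self, key):
        # Real clients use STRLEN for strings and HLEN/LLEN/SCARD/ZCARD
        # for composite types; here len() covers both cases.
        return len(self.data[key][1])

def find_big_keys(r):
    big, cursor = [], None
    while cursor != 0:
        cursor, keys = r.scan(cursor or 0, count=100)
        for key in keys:
            t, size = r.type(key), r.size_of(key)
            limit = STRING_BYTES_LIMIT if t == "string" else COLLECTION_ELEMENTS_LIMIT
            if size > limit:
                big.append((key, t, size))
    return big

r = FakeRedis({
    "small": ("string", b"x" * 64),
    "big-hash": ("hash", {i: i for i in range(6_000)}),
})
print(find_big_keys(r))   # only big-hash exceeds its limit
```

Because SCAN is incremental, a real scanner should also sleep briefly between batches so it does not compete with production traffic.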
3. Analyze RDB snapshot files with open‑source tools like redis‑rdb‑tools (Python) or rdb_bigkeys (Go) when Redis uses RDB persistence.
4. Leverage cloud‑provider Redis analysis services, such as Alibaba Cloud’s real‑time key statistics feature.
How to Handle and Optimize Big Keys
Split big keys: Divide a large hash or other composite structure into multiple smaller keys.
Manual cleanup: Use UNLINK (Redis 4.0+) for asynchronous deletion, or combine SCAN with DEL for batch removal.
Choose appropriate data structures: Pick types suited to the workload (e.g., HyperLogLog for UV counts, Bitmaps for boolean flags) instead of packing large payloads or binary files into a single string.
Enable lazy‑free: From Redis 4.0 onward, lazy‑free allows asynchronous memory release, preventing main‑thread blocking during deletions.
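Cleaning up a big hash can follow the same cursor pattern as detection: remove fields in small batches so no single command blocks the server. A hedged sketch, with a plain dict standing in for the Redis hash (the function name and the batch size of 100 are our choices; with a real client you would fetch each batch via HSCAN, delete it via HDEL, or simply UNLINK the whole key on Redis 4.0+):

```python
# Sketch of progressive big-hash cleanup: delete fields in small
# batches instead of issuing one blocking DEL. The dict stands in
# for a Redis hash; batch size 100 is an arbitrary choice.

def delete_hash_in_batches(hash_data: dict, batch_size: int = 100) -> int:
    """Remove fields batch by batch; returns the number of batches used."""
    batches = 0
    while hash_data:
        # In real Redis: fields = one HSCAN batch, then HDEL key *fields
        fields = list(hash_data)[:batch_size]
        for f in fields:
            del hash_data[f]
        batches += 1
        # A real job would sleep briefly here to yield to other clients.
    return batches

big_hash = {f"field:{i}": i for i in range(250)}
print(delete_hash_in_batches(big_hash, batch_size=100))  # 3 batches
print(len(big_hash))                                     # 0
```

The same batching idea applies to lists (LPOP/RPOP in chunks) and sorted sets (ZREMRANGEBYRANK in slices).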
References
[1] Redis Common Blocking Problems Summary: https://javaguide.cn/database/redis/redis-common-blocking-problems-summary.html
[2] redis‑rdb‑tools: https://github.com/sripathikrishnan/redis-rdb-tools
[3] rdb_bigkeys: https://github.com/weiyanwei412/rdb_bigkeys