
Configuring Redis Memory Size and Eviction Policies, Including LRU and LFU Algorithms

This article explains how to size Redis memory, how to configure maxmemory and maxmemory‑policy, and surveys the available eviction strategies—both non‑eviction and eviction policies—while describing how Redis's approximate LRU and LFU algorithms work and their practical trade‑offs.

IT Architects Alliance

When a cache grows larger than its allocated memory, Redis must evict less important data; selecting an appropriate eviction strategy frees space for new entries.

A common heuristic is to set Redis's maxmemory to 15‑30% of the total dataset size—often around 20%, following the 80/20 rule that a small fraction of keys serves most accesses. For example, to allocate 5 GB you can run:

config set maxmemory 5gb

You can verify the setting with:

config get maxmemory

Redis provides eight eviction policies, divided into two groups: non‑eviction (noeviction, the default, which removes nothing and instead returns an error on writes that would exceed maxmemory) and seven eviction policies:

allkeys‑random

allkeys‑lru

allkeys‑lfu

volatile‑random

volatile‑ttl

volatile‑lru

volatile‑lfu


Policies prefixed with volatile only consider keys that have an expiration set, while allkeys policies consider every key regardless of TTL.
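The distinction between the two policy families can be sketched in a few lines of Python (an illustrative toy model, not Redis internals):

```python
# Toy model of eviction candidate selection (illustrative only, not Redis source).
# Each key maps to an optional TTL in seconds; None means no expiration is set.
keys = {"a": None, "b": 60, "c": None, "d": 120}

def candidates(policy):
    """Return the keys eligible for eviction under the given policy family."""
    if policy.startswith("volatile-"):
        # volatile-* only ever evicts keys that carry an expiration
        return {k for k, ttl in keys.items() if ttl is not None}
    # allkeys-* considers every key, regardless of TTL
    return set(keys)

candidates("volatile-lru")  # {"b", "d"}: only the keys with a TTL
candidates("allkeys-lru")   # all four keys
```

Note that if no key has a TTL, a volatile‑* policy has no candidates, and Redis behaves like noeviction: writes fail once memory is full.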

To change the eviction policy, use the config set maxmemory-policy command, for example:

config set maxmemory-policy allkeys-lru
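Changes made with config set apply at runtime only and are lost on restart. To persist them, place the equivalent directives in redis.conf:

```conf
# redis.conf — persistent equivalents of the runtime commands above
maxmemory 5gb
maxmemory-policy allkeys-lru
```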

Redis implements LRU (Least Recently Used) approximately: each key carries a 24‑bit last‑access clock, and rather than maintaining a full linked list of all keys, Redis samples a small set of keys (maxmemory-samples, default 5) when eviction is needed and removes the one that has been idle longest. Sampling keeps per‑operation overhead low at the cost of occasionally evicting a slightly suboptimal key.
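The sampling idea can be sketched in Python (a toy model using an integer tick as the clock, not the Redis implementation):

```python
import random
from itertools import count

_clock = count()   # monotonically increasing tick, standing in for the LRU clock
last_access = {}   # key -> tick of its most recent access

def touch(key):
    """Record an access, as Redis updates a key's 24-bit LRU clock on each use."""
    last_access[key] = next(_clock)

def evict_one(samples=5):
    """Approximate LRU: sample up to `samples` keys (cf. maxmemory-samples)
    and evict the one that has been idle longest."""
    pool = random.sample(list(last_access), min(samples, len(last_access)))
    victim = min(pool, key=last_access.__getitem__)
    del last_access[victim]
    return victim

for k in "abcde":
    touch(k)
touch("a")             # "a" is now the most recently used key
evict_one(samples=5)   # samples all 5 keys, so "b" (the oldest) is evicted
```

With fewer samples than keys, eviction is only probably optimal—which is exactly the trade‑off Redis makes; raising maxmemory-samples brings the behavior closer to exact LRU at higher CPU cost.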

The LFU (Least Frequently Used) algorithm, added in Redis 4.0, splits the same 24‑bit per‑key field into a 16‑bit timestamp (ldt) and an 8‑bit access counter. The counter rises logarithmically with accesses and is decayed, using ldt, for keys that go cold; when evicting, Redis samples keys and removes the one with the lowest counter.
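The counter logic can be sketched in Python—a model of the behavior described above, not the Redis source, though lfu-log-factor and lfu-decay-time are real Redis configuration options:

```python
import random

LFU_INIT_VAL = 5  # counters start above zero so new keys aren't evicted instantly

def lfu_log_incr(counter, lfu_log_factor=10):
    """Probabilistic logarithmic increment: the higher the counter already is,
    the less likely another access is to raise it."""
    if counter == 255:              # 8-bit counter saturates
        return counter
    baseval = max(counter - LFU_INIT_VAL, 0)
    p = 1.0 / (baseval * lfu_log_factor + 1)
    return counter + 1 if random.random() < p else counter

def lfu_decay(counter, elapsed_minutes, lfu_decay_time=1):
    """Time-based decay: the counter drops by one for every lfu-decay-time
    minutes since the key was last touched, so cold keys sink toward zero."""
    periods = elapsed_minutes // lfu_decay_time if lfu_decay_time else 0
    return max(counter - periods, 0)

c = LFU_INIT_VAL
for _ in range(100_000):
    c = lfu_log_incr(c)   # even 100k hits move the counter far less than 100k
```

The logarithmic increment is what lets an 8‑bit counter rank keys whose hit counts differ by orders of magnitude, while the decay prevents a formerly hot key from being protected forever.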

Both LRU and LFU are provided because some workloads contain many keys accessed only once; LFU can more effectively evict such rarely‑used keys, while LRU is better for workloads where recent access patterns dominate.

Memory Management · Redis · Caching · LRU · LFU · Eviction Policy
Written by

IT Architects Alliance

Discussion and exchange on system, internet, large‑scale distributed, high‑availability, and high‑performance architectures, as well as big data, machine learning, AI, and architecture adjustments with internet technologies. Includes real‑world large‑scale architecture case studies. Open to architects who have ideas and enjoy sharing.
