Backend Development · 7 min read

Cache Update Strategies: LRU/LFU/FIFO, Timeout Eviction, and Proactive Refresh

The article reviews three cache update strategies—algorithmic eviction (LRU/LFU/FIFO), timeout‑based eviction, and proactive updates—analyzing their consistency guarantees and maintenance costs, and then proposes a combined best‑practice approach for reliable backend caching.


Cache update strategies can be divided into three main categories: algorithmic eviction (such as LRU, LFU, and FIFO), timeout‑based eviction, and proactive (active) updates. This article briefly examines each method from the perspectives of data consistency and maintenance cost.

1. Algorithmic Eviction (LRU/LFU/FIFO)

Use case: When cache usage exceeds the configured maximum size, an eviction algorithm decides which entries to remove. For example, FIFO removes the oldest entries (first in, first out), while LRU removes the least recently used ones.

Common implementations include Memcached (which uses LRU) and Redis, where the maxmemory-policy setting controls the eviction behavior.
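To make the LRU idea concrete, here is a minimal sketch of an LRU cache built on Python's `OrderedDict` (the class name and API are illustrative, not taken from any particular library):

```python
from collections import OrderedDict


class LRUCache:
    """Minimal LRU cache: evicts the least recently used entry when full."""

    def __init__(self, max_size):
        self.max_size = max_size
        self._data = OrderedDict()  # insertion order doubles as recency order

    def get(self, key):
        if key not in self._data:
            return None
        self._data.move_to_end(key)  # mark as most recently used
        return self._data[key]

    def put(self, key, value):
        if key in self._data:
            self._data.move_to_end(key)
        self._data[key] = value
        if len(self._data) > self.max_size:
            self._data.popitem(last=False)  # evict the least recently used entry
```

A production cache server does the same bookkeeping (approximately, in Redis's case) on every access, which is why the application has no say in which specific keys disappear.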

| Strategy | Consistency | Maintenance Cost |
| --- | --- | --- |
| LRU/LFU/FIFO eviction | Worst | Low |
| Timeout eviction | Poor | Low to medium |
| Proactive update | Strong | High |

Redis configuration relevant to eviction:

| Configuration | Description | Default |
| --- | --- | --- |
| maxmemory | Maximum memory that Redis may use | Unlimited (no limit) |
| maxmemory-policy | Eviction policy when memory is exhausted | volatile-lru |

Values for maxmemory-policy:

- volatile-lru → evict using LRU among keys with an expiration set
- allkeys-lru → evict using LRU among all keys
- volatile-random → randomly evict keys with an expiration set
- allkeys-random → randomly evict any key
- volatile-ttl → evict the keys with the nearest expiration time
- noeviction → never evict; write commands return an error when memory is full

Consistency is generally poor because the eviction decision is made by the cache server, not the application, and developers have limited control over which specific data is removed.

Maintenance cost is low: developers only need to set maxmemory and choose an appropriate policy.
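As a concrete illustration, a redis.conf fragment enforcing these two settings might look like the following (the values are illustrative, not recommendations):

```conf
# Cap Redis memory usage so the OS never kills the process
maxmemory 2gb
# Under memory pressure, evict via approximate LRU across all keys
maxmemory-policy allkeys-lru
```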

2. Timeout Eviction

Use case: Set an expiration time for cached entries (e.g., with Redis EXPIRE or Memcached's expiration parameter). This tolerates temporary inconsistency between the cache and the source database, which is acceptable for many read‑heavy scenarios.

When the TTL expires, the entry is removed and the next request will fetch fresh data from the primary store and repopulate the cache.

Consistency: Data may be stale for the duration of the TTL.

Maintenance cost: Minimal; only the expiration time needs to be configured.
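The expire-then-repopulate cycle can be sketched as a small TTL cache; the lazy eviction on read mirrors how Redis removes expired keys on access (the clock parameter is an assumption added here to make the sketch testable):

```python
import time


class TTLCache:
    """Minimal timeout-eviction cache: entries expire ttl seconds after insertion."""

    def __init__(self, ttl, clock=time.monotonic):
        self.ttl = ttl
        self._clock = clock          # injectable for testing
        self._data = {}              # key -> (value, expires_at)

    def put(self, key, value):
        self._data[key] = (value, self._clock() + self.ttl)

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if self._clock() >= expires_at:
            del self._data[key]      # lazy eviction: expired entries vanish on read
            return None              # caller falls back to the primary store
        return value
```

On a `None` result the application reads the primary store and calls `put` again, which is exactly the "fetch fresh data and repopulate" step described above.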

3. Proactive (Active) Update

Background: Some applications require near‑strong consistency, needing the cache to be refreshed immediately after the underlying data changes.

Typical implementations use message queues, database triggers, or change‑data‑capture listeners to push update notifications to the cache.

Consistency: Highest among the three strategies, but if the update mechanism fails, stale data may persist until a timeout eviction occurs.

Maintenance cost: High, because developers must write and maintain the update logic and monitoring.
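A proactive update boils down to one rule: every write to the source of truth also pushes the new value into the cache. The sketch below collapses the message-queue or CDC hop into a direct method call, and both stores are plain dicts standing in for the database and Redis/Memcached:

```python
class ProactiveCache:
    """Sketch of proactive refresh: writes immediately push the new value
    into the cache instead of waiting for a TTL to expire."""

    def __init__(self):
        self.db = {}      # stand-in for the primary store
        self.cache = {}   # stand-in for Redis/Memcached

    def write(self, key, value):
        self.db[key] = value      # 1. update the source of truth
        self._notify(key, value)  # 2. push the change to the cache

    def _notify(self, key, value):
        # In production this hop is a message queue, database trigger,
        # or change-data-capture listener; here it is a direct call.
        self.cache[key] = value

    def read(self, key):
        if key in self.cache:
            return self.cache[key]
        value = self.db.get(key)  # cache miss: fall back to the DB
        if value is not None:
            self.cache[key] = value
        return value
```

The maintenance cost lives in `_notify`: in a real system that path can fail or lag, which is why the article recommends pairing it with a TTL safety net.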

4. Best Practice

The recommended approach is to combine strategies:

Configure an eviction policy (e.g., LRU) and a maximum memory limit to prevent out‑of‑memory crashes.

Use timeout eviction for data that can tolerate short periods of staleness.

Apply proactive updates for critical data that requires strong consistency, and rely on timeout eviction as a safety net if the proactive path fails.
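The combined recommendation can be sketched as a single bounded cache that layers a per-entry TTL on top of LRU eviction (class and parameter names are illustrative):

```python
import time
from collections import OrderedDict


class BoundedTTLCache:
    """Combined strategy sketch: LRU eviction enforces a size cap, while a
    per-entry TTL acts as a staleness safety net."""

    def __init__(self, max_size, ttl, clock=time.monotonic):
        self.max_size = max_size
        self.ttl = ttl
        self._clock = clock              # injectable for testing
        self._data = OrderedDict()       # key -> (value, expires_at)

    def put(self, key, value):
        self._data[key] = (value, self._clock() + self.ttl)
        self._data.move_to_end(key)
        if len(self._data) > self.max_size:
            self._data.popitem(last=False)  # LRU eviction under memory pressure

    def get(self, key):
        entry = self._data.get(key)
        if entry is None:
            return None
        value, expires_at = entry
        if self._clock() >= expires_at:
            del self._data[key]          # TTL safety net against stale data
            return None
        self._data.move_to_end(key)      # refresh recency on access
        return value
```

In a real deployment the size cap corresponds to maxmemory plus an eviction policy, the TTL to EXPIRE, and proactive updates would call `put` from the write path.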

Source: ITeye (original article: http://carlosfu.iteye.com/blog/2240426)

Tags: Backend · Caching · LRU · Timeout Eviction · Proactive Update
Written by Architect

Professional architect sharing high‑quality architecture insights. Topics include high‑availability, high‑performance, high‑stability architectures, big data, machine learning, Java, system and distributed architecture, AI, and practical large‑scale architecture case studies. Open to ideas‑driven architects who enjoy sharing and learning.