Master LRU Cache: Interview‑Ready Algorithms, Use Cases, and Python Implementation
This article explains the LRU eviction algorithm and its core principle, surveys real‑world applications in Redis and MySQL, compares the naive array approach with the linked‑list solution, and provides a complete O(1) doubly‑linked‑list‑plus‑hash‑map implementation in Python for interview preparation.
Why LRU matters in interviews
Interviewers at major tech companies often test candidates on fundamental caching algorithms, and LRU (Least Recently Used) is a high‑frequency topic that can differentiate strong candidates.
What is LRU?
LRU stands for Least Recently Used. It rests on the assumption that data which has not been accessed for a long time is unlikely to be needed soon, so when the cache is full the least recently accessed item is evicted. For example, in a capacity‑3 cache holding A, B, and C (accessed in that order), inserting D evicts A.
Application scenarios
LRU underpins Redis's memory‑eviction policies (working alongside the periodic and lazy deletion of expired keys) and the MySQL InnoDB Buffer Pool, which reduces disk I/O by keeping hot pages in memory and discarding the least recently used ones.
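In Redis, for instance, this behaviour is driven by two configuration directives; a minimal redis.conf sketch might look as follows (the 100mb cap is an arbitrary illustrative value):

# Cap Redis's memory, then evict (approximately) least‑recently‑used keys.
maxmemory 100mb
# allkeys-lru considers every key; volatile-lru only keys that have a TTL.
maxmemory-policy allkeys-lru

Strictly speaking, Redis implements an approximate LRU that samples a handful of keys per eviction rather than maintaining a global ordering.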
Naïve array implementation (O(n))
One can use a fixed‑size array where each element carries a counter recording how long it has gone untouched. On every access or insertion, the counters of all other elements are incremented and the touched element's counter is reset to zero; when the array is full, the element with the highest counter (the least recently used) is evicted. Because every operation has to update all n counters, and lookup itself is a linear scan, each operation costs O(n) time.
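As a rough sketch of that counter scheme (the NaiveLRU class and its list‑of‑slots layout are illustrative choices, not part of the solution developed later):

class NaiveLRU:
    """O(n) LRU: each slot carries a counter of 'age since last touch'."""

    def __init__(self, capacity):
        self.capacity = capacity
        self.slots = []  # list of [key, value, counter]

    def _touch(self, slot):
        # Age every entry, then mark this one as most recently used.
        for s in self.slots:          # O(n) scan on every operation
            s[2] += 1
        slot[2] = 0

    def get(self, key):
        for slot in self.slots:       # O(n) lookup
            if slot[0] == key:
                self._touch(slot)
                return slot[1]
        return -1

    def put(self, key, value):
        for slot in self.slots:
            if slot[0] == key:
                slot[1] = value
                self._touch(slot)
                return
        if len(self.slots) == self.capacity:
            # Evict the entry with the largest counter (least recently used).
            self.slots.remove(max(self.slots, key=lambda s: s[2]))
        slot = [key, value, 0]
        self.slots.append(slot)
        self._touch(slot)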
Doubly linked list + hash map (O(1))
To achieve O(1) for both lookup and update, combine a hash map (key → node) with a doubly linked list that maintains usage order: the node nearest the head is the least recently used, the node nearest the tail the most recently used. (The implementation below adds dummy head and tail sentinels so that no operation has to special‑case an empty list.) The operations are:
Lookup: the hash map locates any node in O(1).
Insert: attach a new node just before the tail; if the cache is already at capacity, first remove the node after the head along with its hash‑map entry.
Update/access: splice the accessed node out of its current position and reattach it before the tail by adjusting its prev/next pointers.
The doubly linked list is essential because each node carries a prev pointer: any node can be unlinked in O(1) without walking from the head to find its predecessor, which a singly linked list would force.
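To isolate that claim, here is a throwaway three‑node example (the _N class is a stand‑in, separate from the Node class in the next section):

class _N:
    def __init__(self, tag):
        self.tag, self.prev, self.next = tag, None, None

# Build a tiny list: a <-> b <-> c
a, b, c = _N("a"), _N("b"), _N("c")
a.next, b.prev, b.next, c.prev = b, a, c, b

# O(1) unlink of b: both neighbours are reachable from b itself.
b.prev.next = b.next   # a now points past b
b.next.prev = b.prev   # c now points back to a
assert a.next is c and c.prev is a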
Python implementation
class Node(object):
    """Doubly linked list node storing a key/value pair."""
    def __init__(self, key=None, value=None):
        self.key = key
        self.value = value
        self.next = None
        self.prev = None

class LRUCache:
    def __init__(self, capacity: int):
        self.capacity = capacity
        self.hashmap = {}  # key -> Node, for O(1) lookup
        # Dummy head/tail sentinels avoid edge-case checks when the
        # list is empty or a node sits at either end.
        self.head = Node()
        self.tail = Node()
        self.head.next = self.tail
        self.tail.prev = self.head

    def move_node_to_tail(self, key):
        """Mark a key as most recently used by moving its node to the tail."""
        if key not in self.hashmap:
            return
        node = self.hashmap[key]
        # Detach the node from its current position.
        node.prev.next = node.next
        node.next.prev = node.prev
        # Reattach it just before the tail sentinel.
        node.prev = self.tail.prev
        node.next = self.tail
        self.tail.prev.next = node
        self.tail.prev = node

    def get(self, key: int) -> int:
        if key in self.hashmap:
            self.move_node_to_tail(key)  # reading counts as a use
            return self.hashmap[key].value
        return -1

    def put(self, key: int, value: int) -> None:
        if key in self.hashmap:
            self.hashmap[key].value = value
            self.move_node_to_tail(key)
        else:
            if len(self.hashmap) == self.capacity:
                # Evict the least recently used node: the one right
                # after the head sentinel.
                lru = self.head.next
                self.hashmap.pop(lru.key)
                self.head.next = lru.next
                lru.next.prev = self.head
            new = Node(key, value)
            self.hashmap[key] = new
            new.prev = self.tail.prev
            new.next = self.tail
            self.tail.prev.next = new
            self.tail.prev = new

Practicing such implementations and understanding the underlying trade‑offs will help candidates confidently tackle LRU questions in technical interviews.
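To make that practice concrete, here is a quick sanity check (a hypothetical trace; the expected values in the comments follow directly from the class above):

cache = LRUCache(2)
cache.put(1, 1)          # cache holds {1}
cache.put(2, 2)          # cache holds {1, 2}
print(cache.get(1))      # 1; key 1 is now the most recently used
cache.put(3, 3)          # at capacity: evicts key 2, the LRU entry
print(cache.get(2))      # -1; key 2 was evicted
print(cache.get(3))      # 3

In production Python you would more likely reach for collections.OrderedDict (move_to_end plus popitem(last=False)) or functools.lru_cache, but interviewers usually want the pointer‑level version above.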
NiuNiu MaTe
Joined Tencent (nicknamed "Goose Factory") through campus recruitment from a second‑tier university. Career path: Tencent → foreign firm → ByteDance → Tencent. First served as an interviewer at the foreign firm and hopes to help other candidates prepare.