
Java Implementation of an O(1) LRU Cache Using a Doubly Linked List and HashMap

This article explains the design and Java implementation of an LRU (Least Recently Used) cache that achieves O(1) time complexity for get and put operations by combining a doubly linked list with a hash map, and provides full source code and usage examples.

Architect's Guide

LRU (Least Recently Used) is a common cache-eviction strategy: when the cache reaches its capacity, the least recently accessed item is removed to make room for the new one.

This article describes the design of an LRUCache class in Java with O(1) time complexity for both get and put operations, achieved by combining a doubly linked list with a hash map.

The doubly linked list maintains the order of usage, with the most recently used node at the head and the least recently used at the tail, while the hash map provides constant‑time lookup of nodes by key.

Key methods include:

addFirst(node): inserts a node right after the dummy head.

remove(node): detaches a node from the list.

moveToFirst(node): moves an existing node to the head.

removeLast(): removes the tail node (the least recently used).

get(key): returns the value if present and moves the accessed node to the head; otherwise returns -1.

put(key, value): updates an existing key or inserts a new one, first evicting the least recently used (tail) node when the cache is full.

The full Java implementation is provided below:

import java.util.HashMap;
import java.util.Map;

class LRUCache {
    class DLinkedNode {
        int key;
        int value;
        DLinkedNode pre;
        DLinkedNode next;
        public DLinkedNode() {}
        public DLinkedNode(int key, int value) {
            this.key = key;
            this.value = value;
        }
    }
    private Map<Integer, DLinkedNode> cache = new HashMap<>();
    private int size;
    private int capacity;
    private DLinkedNode head, tail;
    public LRUCache(int capacity) {
        this.size = 0;
        this.capacity = capacity;
        // Dummy head and tail sentinels eliminate null checks when adding/removing nodes.
        head = new DLinkedNode();
        tail = new DLinkedNode();
        head.next = tail;
        tail.pre = head;
    }
    public int get(int key) {
        DLinkedNode node = cache.get(key);
        if (node == null) return -1;
        moveToFirst(node);
        return node.value;
    }
    public void put(int key, int value) {
        DLinkedNode node = cache.get(key);
        if (node != null) {
            node.value = value;
            moveToFirst(node);
        } else {
            if (size == capacity) {
                // Cache is full: evict the least recently used entry.
                DLinkedNode removed = removeLast();
                cache.remove(removed.key);
                --size;
            }
            DLinkedNode newNode = new DLinkedNode(key, value);
            cache.put(key, newNode);
            addFirst(newNode);
            ++size;
        }
    }
    // Insert a node right after the dummy head (the most recently used position).
    private void addFirst(DLinkedNode node) {
        node.pre = head;
        node.next = head.next;
        head.next.pre = node;
        head.next = node;
    }
    // Move an existing node to the head after it has been accessed or updated.
    private void moveToFirst(DLinkedNode node) {
        remove(node);
        addFirst(node);
    }
    // Detach a node from the list by rewiring its neighbors.
    private void remove(DLinkedNode node) {
        node.pre.next = node.next;
        node.next.pre = node.pre;
    }
    // Remove and return the node just before the dummy tail (the least recently used).
    private DLinkedNode removeLast() {
        DLinkedNode ans = tail.pre;
        remove(ans);
        return ans;
    }
    public int getSize() {
        return size;
    }
}

The core insight is that the hash map supplies O(1) key lookup, while the doubly linked list supplies O(1) insertion, deletion, and ordering, fulfilling the requirements of an efficient LRU cache.
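For a quick cross-check of these semantics, the JDK's built-in LinkedHashMap can reproduce the same behavior: its access-order mode moves entries on every get/put, and overriding removeEldestEntry triggers eviction of the least recently used entry. This is a minimal sketch (the class name LinkedHashLRU and the capacity are illustrative), not a replacement for the hand-rolled implementation above:

```java
import java.util.LinkedHashMap;
import java.util.Map;

// A compact LRU cache built on LinkedHashMap's access-order mode.
// accessOrder = true makes every get()/put() move the entry to the most
// recently used position; removeEldestEntry evicts the least recently
// used entry once the size exceeds capacity.
class LinkedHashLRU extends LinkedHashMap<Integer, Integer> {
    private final int capacity;

    LinkedHashLRU(int capacity) {
        super(16, 0.75f, true); // true = iterate in access order, not insertion order
        this.capacity = capacity;
    }

    @Override
    protected boolean removeEldestEntry(Map.Entry<Integer, Integer> eldest) {
        return size() > capacity;
    }

    public static void main(String[] args) {
        LinkedHashLRU cache = new LinkedHashLRU(2);
        cache.put(1, 1);
        cache.put(2, 2);
        System.out.println(cache.get(1));              // 1 (marks key 1 as recently used)
        cache.put(3, 3);                               // evicts key 2, the least recently used
        System.out.println(cache.getOrDefault(2, -1)); // -1
        System.out.println(cache.getOrDefault(3, -1)); // 3
    }
}
```

The operation sequence here is the same one typically used to exercise an LRU cache: after get(1) promotes key 1, inserting key 3 into a full cache evicts key 2 rather than key 1.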

Tags: Java, Algorithm, Data Structure, lru-cache, O(1)
Written by

Architect's Guide

Dedicated to sharing programmer-architect skills—Java backend, system, microservice, and distributed architectures—to help you become a senior architect.
