Backend Development 9 min read

Master Node.js Caching with lru-cache: From Basics to Advanced Techniques

Learn how to boost Node.js application performance using the popular lru-cache package, covering the LRU algorithm fundamentals, basic installation and usage, advanced features like peek, fetchMethod, and custom disposal, plus an in‑depth look at its internal Map and doubly‑linked list implementation.

Code Mala Tang

What Is the LRU Algorithm?

LRU stands for Least Recently Used, a cache eviction policy that removes the least recently accessed items, keeping frequently accessed data in the cache to improve hit rate.
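The policy itself is simple enough to sketch in a few lines. A JavaScript Map iterates keys in insertion order, so re-inserting a key on every access keeps the least recently used key first; the hypothetical SimpleLRU class below (an illustration of the algorithm, not the lru-cache package) uses that to evict in O(1):

```javascript
// Minimal LRU sketch (not the lru-cache package): a Map keeps
// insertion order, so re-inserting on access keeps the oldest
// (least recently used) key first and makes eviction trivial.
class SimpleLRU {
  constructor(max) {
    this.max = max;
    this.map = new Map();
  }
  get(key) {
    if (!this.map.has(key)) return undefined;
    const value = this.map.get(key);
    this.map.delete(key);  // move to "most recent" position
    this.map.set(key, value);
    return value;
  }
  set(key, value) {
    if (this.map.has(key)) this.map.delete(key);
    this.map.set(key, value);
    if (this.map.size > this.max) {
      // first key in iteration order is the least recently used
      const oldest = this.map.keys().next().value;
      this.map.delete(oldest);
    }
  }
}

const lru = new SimpleLRU(2);
lru.set('a', 1);
lru.set('b', 2);
lru.get('a');    // 'a' is now most recently used
lru.set('c', 3); // evicts 'b', the least recently used
console.log([...lru.map.keys()]); // ['a', 'c']
```

The real package does the same bookkeeping with a doubly-linked list instead of Map re-insertion, as covered in the implementation section below.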

Basic Usage

Install the package with npm or yarn:

<code>npm install lru-cache</code>

Example:

<code>const { LRUCache } = require('lru-cache');

// Create a cache instance
const options = {
  max: 100,           // maximum number of entries
  ttl: 1000 * 60 * 5  // entry time-to-live in ms
};
const cache = new LRUCache(options);

// Set entries
cache.set('key1', 'value1');
cache.set('key2', 'value2');

// Get entry
console.log(cache.get('key1')); // 'value1'

// Delete entry
cache.delete('key2');

// Check existence
console.log(cache.has('key1')); // true
console.log(cache.has('key2')); // false

// Clear cache
cache.clear();</code>

Key options (v7+ names; older versions used maxAge, stale, and length):

max : maximum number of entries; when it is exceeded, the least recently used entries are evicted.

maxSize : maximum total size of all entries. Each entry must have a size, provided either by the sizeCalculation option or by a per-entry size passed to set().

ttl : time-to-live for each entry in milliseconds (called maxAge before v7).

sizeCalculation : function that computes an entry's size; used together with maxSize.

dispose : function called when an entry is removed, useful for cleanup.

allowStale : if true, get() returns an expired (stale) value once while the entry is being removed, instead of undefined.
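To make the ttl option less abstract, here is a minimal sketch of how time-based expiry can work: each entry stores an expiry timestamp, and reads treat expired entries as missing and evict them lazily. This is an illustrative pattern, not lru-cache's actual code:

```javascript
// Hypothetical TTL sketch: store each value with an expiry
// timestamp; get() treats expired entries as missing and
// deletes them lazily on access.
class TTLCache {
  constructor(ttl) {
    this.ttl = ttl;   // time-to-live in ms
    this.map = new Map();
  }
  set(key, value) {
    this.map.set(key, { value, expires: Date.now() + this.ttl });
  }
  get(key) {
    const entry = this.map.get(key);
    if (!entry) return undefined;
    if (Date.now() > entry.expires) {
      this.map.delete(key); // stale: evict lazily
      return undefined;
    }
    return entry.value;
  }
}

const ttlCache = new TTLCache(5000);
ttlCache.set('k', 'v');
console.log(ttlCache.get('k')); // 'v' while fresh
```

lru-cache's allowStale option softens exactly this behavior: instead of returning undefined, the stale value is handed back one last time while the entry is removed.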

Advanced Usage

Additional methods and options:

peek(key) : read a value without updating its recently-used order.

keys() and values() : iterate over all keys or values (these return iterators, not arrays).

fetchMethod : an async function used by cache.fetch(key) to automatically fetch and cache missing data.

dispose and disposeAfter : run custom logic when entries are removed; disposeAfter fires after the removal has fully completed.

Example of fetchMethod:

<code>const { LRUCache } = require('lru-cache');

const cache = new LRUCache({
  max: 100,
  // Called by cache.fetch() on a miss; the resolved value is cached
  fetchMethod: async (key) => {
    const data = await fetchDataFromAPI(key);
    return data;
  }
});

// Simulated slow data source
async function fetchDataFromAPI(key) {
  return new Promise((resolve) => setTimeout(() => resolve(`data for ${key}`), 100));
}

// Use fetch: a miss triggers fetchMethod, a hit returns the cached value
cache.fetch('key').then(data => console.log(data)); // 'data for key'
</code>
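A useful property of this pattern is deduplication: while a fetch for a key is in flight, further fetch() calls for the same key share the same pending promise rather than triggering a second request. A minimal sketch of that pattern (hypothetical FetchCache class, not the library's code):

```javascript
// Hypothetical sketch of fetch deduplication: concurrent
// fetches for the same key share one in-flight promise.
class FetchCache {
  constructor(fetchMethod) {
    this.fetchMethod = fetchMethod;
    this.values = new Map();   // resolved values
    this.pending = new Map();  // in-flight promises
  }
  fetch(key) {
    if (this.values.has(key)) return Promise.resolve(this.values.get(key));
    if (this.pending.has(key)) return this.pending.get(key); // dedup
    const p = Promise.resolve(this.fetchMethod(key)).then((value) => {
      this.values.set(key, value); // cache the result
      this.pending.delete(key);    // no longer in flight
      return value;
    });
    this.pending.set(key, p);
    return p;
  }
}

const fc = new FetchCache(async (key) => `data for ${key}`);
const p1 = fc.fetch('x');
const p2 = fc.fetch('x');
console.log(p1 === p2); // true: one underlying fetch, two callers
```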

Example of custom disposal:

<code>const { LRUCache } = require('lru-cache');

const cache = new LRUCache({
  max: 100,
  // Since v7, these callbacks receive (value, key, reason)
  dispose: (value, key, reason) => {
    console.log(`Disposed ${key}: ${value} (${reason})`);
  },
  disposeAfter: (value, key, reason) => {
    console.log(`Fully removed ${key}: ${value} (${reason})`);
  }
});

cache.set('key', 'value');
cache.delete('key'); // triggers dispose, then disposeAfter
</code>

Internal Implementation

The source code (node-lru-cache) keeps keys and values in parallel arrays, using a Map for O(1) look-ups, while an index-based doubly-linked list maintains usage order.

Map (keyMap) : maps each key to its slot index in the arrays, giving constant-time access.

Doubly-linked list (prev/next) : parallel index arrays that track recency of use, enabling fast moves to the tail.

Core functions include:

set : adds or updates an entry, moving it to the tail.

has : checks existence without moving the entry to the tail; with the updateAgeOnHas option it also refreshes the entry's TTL.

get : retrieves an entry and moves it to the tail.

moveToTail : moves a node to the end of the list.

When evicting, the cache removes the head of the list; removal from the map is O(1). The implementation also uses AbortController to handle fetch aborts, providing a polyfill for environments without native support.
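The linked-list trick above can be sketched concretely. Instead of allocating a node object per entry, prev and next are parallel integer arrays indexed by slot, so "moving a node" is just a few array writes. The UsageList class below is a simplified illustration of that layout, not the library's exact code:

```javascript
// Simplified sketch of an index-based doubly-linked list:
// prev/next are parallel arrays, so "nodes" are just integers.
class UsageList {
  constructor(max) {
    this.next = new Array(max).fill(0);
    this.prev = new Array(max).fill(0);
    this.head = 0; // least recently used slot (eviction candidate)
    this.tail = 0; // most recently used slot
    this.size = 0;
  }
  push(index) { // append a new slot at the tail
    if (this.size === 0) {
      this.head = this.tail = index;
    } else {
      this.next[this.tail] = index;
      this.prev[index] = this.tail;
      this.tail = index;
    }
    this.size++;
  }
  moveToTail(index) { // mark slot as most recently used
    if (index === this.tail) return;
    if (index === this.head) {
      this.head = this.next[index];
    } else {
      // unlink from the middle: O(1) pointer updates
      this.next[this.prev[index]] = this.next[index];
      this.prev[this.next[index]] = this.prev[index];
    }
    this.next[this.tail] = index;
    this.prev[index] = this.tail;
    this.tail = index;
  }
}

const list = new UsageList(3);
list.push(0); list.push(1); list.push(2); // usage order: 0, 1, 2
list.moveToTail(0);                       // usage order: 1, 2, 0
console.log(list.head); // 1 — the next eviction candidate
```

On a get(), the cache looks up the slot index in keyMap and calls moveToTail on it; on eviction it removes whatever slot head points at.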

Conclusion

lru-cache is a powerful, easy‑to‑use caching solution for Node.js that can dramatically improve performance by reducing redundant data fetches.

Backend · performance · JavaScript · Node.js · caching · LRU Cache
Written by

Code Mala Tang

Read source code together, write articles together, and enjoy spicy hot pot together.
