Mastering Backpressure in Node.js: Prevent Silent OOM Crashes

This article explains what backpressure is and why Node.js services silently run out of memory, then walks through practical streaming and bounded-concurrency patterns, with code examples, to keep memory usage stable under load.


What Is Backpressure?

Backpressure is a system's way of signaling "stop, I can't keep up." In Node.js it appears when a producer generates data faster than the consumer can process it, causing buffers to fill, memory usage to rise, and eventually an out-of-memory (OOM) crash.

Quick Mental Model

Imagine a coffee shop where orders arrive faster than the barista can make drinks; the counter fills up and chaos ensues. Backpressure is the barista shouting "pause new orders until the queue is cleared."

Why Node.js Is Prone to Silent Buffering

Node.js feels safe because it is event‑driven, but streams, HTTP request bodies, compression, JSON parsing, logging, and queues can all buffer data. Buffering looks helpful until it exhausts memory.

Typical OOM Pattern

1. You read data as fast as possible (network, file, or queue).

2. Your processing (DB writes, compression, downstream API calls) is slower.

3. Node buffers the difference.

4. Memory grows steadily.

5. The process crashes, restarts, and repeats.

Concrete Example: Upload Proxy Without Backpressure

The naive implementation reads the entire request into memory before forwarding it, which works for tiny payloads but OOMs on large files.

import http from "node:http";
http.createServer(async (req, res) => {
  const chunks = [];
  for await (const chunk of req) chunks.push(chunk);
  const body = Buffer.concat(chunks); // 🚨 unlimited memory use
  // ...forward body somewhere...
  res.end("ok");
}).listen(3000);

Backpressure‑Friendly Approach Using Streams

Pipe the request directly to a writable stream; the pipeline handles backpressure automatically.

import http from "node:http";
import fs from "node:fs";
import { pipeline } from "node:stream/promises";

http.createServer(async (req, res) => {
  try {
    const out = fs.createWriteStream("./upload.bin");
    await pipeline(req, out); // ✅ backpressure‑aware transfer
    res.end("uploaded");
  } catch (err) {
    res.statusCode = 500;
    res.end("failed");
  }
}).listen(3000);

Key Signals in Node Streams

writable.write(chunk) returns a boolean:

true → keep writing.

false → stop and wait for the 'drain' event before writing more.

The internal buffer threshold is controlled by highWaterMark; pipe() (and pipeline()) pause and resume the source automatically.
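
As a minimal sketch, the threshold can be tuned when a stream is constructed (the 1 MiB value here is an arbitrary illustration, not a recommendation):

import fs from "node:fs";

// highWaterMark sets the internal buffer threshold in bytes;
// write() starts returning false once buffered data exceeds it.
const out = fs.createWriteStream("./upload.bin", {
  highWaterMark: 1024 * 1024, // 1 MiB instead of the 16 KiB default
});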

Minimal Example Respecting .write() and drain

function writeWithBackpressure(writable, chunks) {
  return new Promise((resolve, reject) => {
    let i = 0;
    const writeMore = () => {
      while (i < chunks.length) {
        const ok = writable.write(chunks[i]);
        i++;
        if (!ok) {
          // buffer is full: stop and resume once it drains
          writable.once("drain", writeMore);
          return;
        }
      }
      writable.end();
    };
    writable.on("error", reject);
    writable.on("finish", resolve);
    writeMore();
  });
}
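
A hypothetical usage example, flushing a few in-memory buffers to a file (the path is illustrative):

import fs from "node:fs";

const chunks = [Buffer.from("a"), Buffer.from("b"), Buffer.from("c")];
await writeWithBackpressure(fs.createWriteStream("./out.bin"), chunks);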

Real‑World Bottlenecks That Trigger Backpressure

Slow database writes – the write buffer silently becomes a queue.

Compression/encryption streams – improper wrapping discards backpressure (see the sketch after this list).

Downstream API retries – an unbounded retry queue creates hidden buffering.

Logging under load – slow log sinks cause memory growth.
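
For the compression case specifically, pipeline() propagates backpressure through the transform, so a slow sink pauses the source. A minimal sketch (file paths are illustrative):

import fs from "node:fs";
import { createGzip } from "node:zlib";
import { pipeline } from "node:stream/promises";

// If the .gz sink is slow, gzip and the source are paused in turn,
// so nothing accumulates beyond the streams' highWaterMark buffers.
await pipeline(
  fs.createReadStream("./app.log"),
  createGzip(),
  fs.createWriteStream("./app.log.gz")
);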

Practical Pattern: Bounded Concurrency + Streaming

When each chunk requires asynchronous work, limit the number of in‑flight operations.

import { setTimeout as sleep } from "node:timers/promises";

async function processStreamWithLimit(readable, limit, handler) {
  const inFlight = new Set();
  for await (const chunk of readable) {
    // track the same promise we await, so a rejected handler is
    // never left as an unhandled rejection on a discarded promise
    const p = Promise.resolve()
      .then(() => handler(chunk))
      .finally(() => inFlight.delete(p));
    inFlight.add(p);
    if (inFlight.size >= limit) {
      await Promise.race(inFlight); // ✅ slow down when saturated
    }
  }
  await Promise.all(inFlight); // wait for the tail, surface errors
}

// Example handler: simulate a slow database write
async function fakeDbWrite(chunk) {
  await sleep(10); // pretend each chunk takes 10 ms to persist
}
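
Wiring it into a server looks like the earlier examples (the limit of 8 is an arbitrary illustration):

import http from "node:http";

http.createServer(async (req, res) => {
  try {
    // at most 8 fake DB writes in flight per request body
    await processStreamWithLimit(req, 8, fakeDbWrite);
    res.end("done");
  } catch {
    res.statusCode = 500;
    res.end("failed");
  }
}).listen(3000);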

Backpressure Checklist for Code Review

Avoid Buffer.concat(chunks) on untrusted or large inputs.

Prefer streaming JSON parsing instead of loading whole bodies into memory.

Respect the return value of write(); pause when it returns false.

Limit concurrency when processing each chunk asynchronously.

Ensure any in‑memory queues have size limits and are observable (see the sketch after this list).

Consider behavior when downstream services become slow for minutes.
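
For the queue item above, a minimal sketch of a bounded, observable queue (the class and limit are hypothetical, for illustration):

class BoundedQueue {
  constructor(limit) {
    this.limit = limit;
    this.items = [];
  }
  push(item) {
    // refuse instead of buffering without limit; the caller must
    // shed load, retry later, or propagate backpressure upstream
    if (this.items.length >= this.limit) return false;
    this.items.push(item);
    return true;
  }
  shift() {
    return this.items.shift();
  }
  get size() {
    return this.items.length; // cheap to export as a metric
  }
}

Rejecting at the boundary turns silent memory growth into an explicit, observable signal.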

Symptoms in Production

Memory climbs steadily while CPU stays normal.

P95 latency rises without obvious errors.

Increased garbage‑collection activity.

Restarts temporarily "fix" the issue until load returns.
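
If you suspect this pattern, a crude heap watcher can confirm it before the crash (the interval is arbitrary):

// Log heap usage every 30 seconds; a steady climb under constant
// load is the classic signature of unbounded buffering.
setInterval(() => {
  const { heapUsed, rss } = process.memoryUsage();
  console.log(
    `heapUsed=${(heapUsed / 1048576).toFixed(1)}MB rss=${(rss / 1048576).toFixed(1)}MB`
  );
}, 30_000);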

Conclusion

Backpressure is not a fancy optimization; it is an essential mechanism for building stable Node.js services that handle uploads, proxy request bodies, file streams, compression, database writes, or downstream API calls under load.
