Understanding Node.js HTTP Request Processing and Its Performance Overheads

This article explains how Node.js handles HTTP requests using an I/O multiplexing model, walks through a simple hello‑world server, examines connection and request events, discusses keep‑alive, Expect headers, proxying, and presents performance measurements comparing Node.js to a pure C implementation.

Hujiang Technology

Node.js processes network requests with an I/O multiplexing model that uses a single thread and leverages operating‑system asynchronous I/O, reducing context switches and resource consumption compared to traditional thread‑per‑connection models.

A minimal "hello world" HTTP server can be created in one line:

require('http').createServer((req, res) => { res.end('hello world'); }).listen(3333);

Internally, this creates an http.Server, which inherits from net.Server and is ultimately backed by a TCP object implemented in C++.

The createServer([requestListener]) method registers listeners for the request and connection events; the latter triggers when a new socket is accepted, wrapping the file descriptor in a net.Socket object.

On the C++ side, TCPWrap::Listen hands the listening socket to libuv (abridged from the Node.js source):

void TCPWrap::Listen(const FunctionCallbackInfo<Value>& args) {
  TCPWrap* wrap = Unwrap<TCPWrap>(args.Holder());
  int backlog = args[0]->Int32Value();
  // Hand the socket to libuv; OnConnection runs for every accepted connection
  int err = uv_listen(reinterpret_cast<uv_stream_t*>(&wrap->handle_),
                      backlog,
                      OnConnection);
  args.GetReturnValue().Set(err);
}

When data arrives on the socket, the http‑parser (a high‑performance C library) parses the request and invokes callbacks such as on_message_begin, on_url, on_headers_complete, on_body, and on_message_complete. These callbacks are wrapped by the JavaScript Parser class and expose five events to JavaScript: kOnHeaders, kOnHeadersComplete, kOnBody, kOnMessageComplete, and kOnExecute.

For keep‑alive connections, Node.js maintains separate incoming and outgoing queues per socket; after response.end() the finish event fires, and the server either closes the connection or emits another request event if more requests remain. The default idle timeout is two minutes and can be changed via http.Server.setTimeout.

The following server, for example, answers successive requests on a keep-alive connection with different delays, which makes the per-socket request queuing visible:

// Cycle through three response delays to exercise keep-alive queuing
var delay = [2000, 30, 500];
var i = 0;
require('http').createServer((req, res) => {
  setTimeout(() => { res.end('hello world'); }, delay[i]);
  i = (i + 1) % delay.length;
}).listen(3333, () => { console.log('listen at 3333'); });

When a client sends an Expect: 100-continue header (typically before a large POST body), Node.js replies with status 100 automatically unless a checkContinue listener is registered, in which case that listener decides whether to call response.writeContinue(); any other Expect value triggers checkExpectation and, by default, a 417 Expectation Failed response.

curl -vs --header "Expect: 100-continue" http://localhost:3333

A simple HTTP proxy can be built by listening to request, parsing the target URL, forwarding the request with http.request, and piping the response back to the client.

var http = require('http');
var url = require('url');

http.createServer((req, res) => {
  // Rebuild the outgoing request from the incoming one
  var urlObj = url.parse(req.url);
  var options = {
    hostname: urlObj.hostname,
    port: urlObj.port || 80,
    path: urlObj.path,
    method: req.method,
    headers: req.headers
  };
  var proxyRequest = http.request(options, (proxyResponse) => {
    // Relay status, headers, and body back to the original client
    res.writeHead(proxyResponse.statusCode, proxyResponse.headers);
    proxyResponse.pipe(res);
  }).on('error', () => { res.end(); });
  // Stream the request body straight through to the target
  req.pipe(proxyRequest);
}).listen(8089, '0.0.0.0');

Node.js includes several performance optimizations: a cache pool for http_parser objects (up to 1000 instances), a default limit of 32 header fields to avoid dynamic allocations, and the ability to set maxConnections for overload protection.

A comparative benchmark using a pure C HTTP server built on libuv shows that the C implementation handles 100 000 requests with 5000 concurrent connections in about 0.8 s and 0.6 MB memory, whereas the Node.js version takes roughly 5 s and 51 MB memory, illustrating the trade‑off between raw performance and development simplicity.

Overall, while Node.js introduces some overhead, its concise API, event‑driven architecture, and extensive ecosystem make it a practical choice for building HTTP services.

Tags: proxy, backend development, Node.js, TCP, HTTP, Event Loop