Performance Comparison of fetch and Undici HTTP Clients in JavaScript
This article examines the design goals, typical use cases, and performance differences between the browser‑oriented fetch API and the high‑throughput Undici library for Node.js, including a simple benchmark that shows Undici completing the same workload roughly twice as fast as fetch.
Introduction to fetch and Undici
In modern JavaScript applications, fetch and Undici are two common HTTP client tools; while both issue network requests, they differ significantly in design goals, suitable scenarios, and performance characteristics.
fetch Overview
fetch is the standard browser API for making network requests, designed to provide a simple, unified way to handle HTTP requests and responses. It is widely used in front‑end development for GET, POST, JSON retrieval, form submission, etc. Since Node.js 18, fetch is also available on the server side.
The main advantages of fetch are its concise Promise‑based API and ease of use, making it ideal for front‑end developers. However, in high‑concurrency, large‑payload, or long‑lived connection scenarios, fetch may be less efficient because each request often creates a new connection, adding overhead for high‑performance server applications.
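To make this concrete, here is a minimal sketch of a typical fetch call with cancellation via `AbortController` (the local throwaway server and 5-second timeout are illustrative choices, not from the original article; Node.js 18+ is assumed so that `fetch` is global):

```javascript
import { createServer } from 'node:http';

// Throwaway local server so the example is self-contained.
const server = createServer((req, res) => {
  res.setHeader('content-type', 'application/json');
  res.end(JSON.stringify({ ok: true }));
});
await new Promise(resolve => server.listen(0, resolve));
const url = `http://localhost:${server.address().port}`;

// A typical fetch call: Promise-based, cancellable via AbortController.
const controller = new AbortController();
const timer = setTimeout(() => controller.abort(), 5000); // give up after 5 s

const response = await fetch(url, { signal: controller.signal });
const data = await response.json();
clearTimeout(timer);

server.close();
server.closeAllConnections?.(); // drop idle keep-alive sockets (Node 18.2+)
console.log(data.ok); // true
```

The same shape works in the browser, minus the `node:http` scaffolding.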
Undici Overview
Undici is a high‑performance HTTP client built specifically for Node.js, targeting high‑concurrency and high‑throughput network demands. Compared with fetch, Undici focuses on performance optimization, especially in server‑side environments. Its name comes from the Italian word for “eleven,” referencing the HTTP/1.1 RFC.
Undici’s core strengths lie in efficient connection management via a built‑in connection pool that reuses HTTP connections, reducing the cost of establishing new connections. It fully supports HTTP/1.1 and HTTP/2, excels at stream handling for large data transfers, and offers robust error handling with automatic retries.
Comparison
fetch is a general‑purpose HTTP client suitable for browsers and simple server requests, whereas Undici is engineered for high‑performance, high‑concurrency Node.js server applications. Undici’s connection pooling and stream processing provide a noticeable performance boost in complex server scenarios.
The following table summarizes their features, performance, and typical use cases:
| Feature | fetch | Undici |
| --- | --- | --- |
| Target Environment | Primarily browsers; Node.js 18+ support | Designed for Node.js server-side applications |
| Design Goal | General HTTP client for simple requests | High-performance, low-overhead client for high concurrency |
| Performance | Moderate, suitable for small or ordinary requests | High performance, especially under heavy load |
| Connection Management | May create a new connection per request | Built-in connection pool with reuse |
| Async Support | Native Promise-based API | Optimized async handling using modern JS features |
| Stream Handling | Supports `ReadableStream` for streaming responses | Efficient streaming for large data transfers |
| Error Handling | Manual handling required | Built-in mechanisms with automatic retries |
| Request Interception | Can abort via `AbortController` | Provides internal interception for complex control |
| HTTP/2 Support | Not supported | Full support for HTTP/1.1 and HTTP/2 |
| File Upload | Uses `FormData` | Efficient handling of file uploads and large payloads |
| API Complexity | Simple and concise | Rich configuration options and features |
| Dependencies | No extra dependencies; native in Node.js | Installed from npm, allowing flexible upgrades |
| Typical Scenarios | Simple, generic HTTP requests, especially in browsers | High-performance, high-concurrency server apps and microservices |
| Extensibility | General-purpose; custom wrappers needed for complex cases | Highly extensible for complex request requirements |
| Ease of Use | Very easy for front-end developers | Steeper learning curve but significant performance gains |
| Additional Features | Built-in CORS support for cross-origin requests | Focus on performance optimization and resource management |
| Community Support | Widely supported browser API with extensive docs | Maintained by the Node.js team, gaining adoption |
Simple Test
The author ran a basic benchmark: 100,000 sequential requests against a local HTTP server, measuring total execution time for each client.
```js
import { request } from 'undici'; // swap in for the undici run

const start = Date.now();
for (let i = 0; i < 100000; i++) {
  // undici variant:
  // await request('http://localhost:8080').then(response => {
  //   response.body.text();
  // });
  await fetch('http://localhost:8080').then(response => {
    response.text();
  });
}
const duration = Date.now() - start;
console.log('cost time:', duration / 1000);
```

The local server responded instantly and could sustain over 100k QPS. With this rough timing, fetch took about 8 seconds while undici finished in roughly 4.2 seconds, approximately a two-fold improvement.
Undici also offers a `stream` method, a faster variant of `request`, though the author did not observe a noticeable difference in this test.
```ts
/** A faster version of `request`. */
declare function stream(
  url: string | URL | UrlObject,
  options: { dispatcher?: Dispatcher } & Omit<Dispatcher.RequestOptions, 'origin' | 'path'>,
  factory: Dispatcher.StreamFactory
): Promise<Dispatcher.StreamData>;
```

Root Cause Analysis
Undici’s superior performance over native fetch stems mainly from two areas:
Connection Management
Connection reuse: Undici’s built‑in pool reuses TCP connections, reducing the overhead of establishing and closing connections, which is especially beneficial under high concurrency.
Efficient concurrent handling: Its connection strategy significantly lowers latency when processing many simultaneous requests.
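One rough way to observe reuse is to count the TCP sockets a server accepts while a client issues many sequential requests (a sketch; the 20-request loop is arbitrary, and the exact socket count depends on the client's keep-alive behavior, so the example only bounds it):

```javascript
import { createServer } from 'node:http';

const server = createServer((req, res) => res.end('ok'));
let connections = 0;
server.on('connection', () => connections++); // fires once per TCP socket
await new Promise(resolve => server.listen(0, resolve));
const url = `http://localhost:${server.address().port}`;

// 20 sequential requests; with connection reuse the server accepts
// far fewer than 20 sockets, often just one.
for (let i = 0; i < 20; i++) {
  const res = await fetch(url);
  await res.text(); // drain the body so the socket can be reused
}

server.close();
server.closeAllConnections?.(); // drop idle keep-alive sockets (Node 18.2+)
console.log(`sockets accepted: ${connections}`);
```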
Performance Optimizations
Lean implementation: Undici minimizes abstraction layers and extra features, resulting in faster request processing.
Node.js‑specific design: It fully leverages Node.js’s asynchronous capabilities.
Reduced middleware: fetch’s general‑purpose nature introduces additional overhead that Undici avoids by focusing solely on server‑side use cases.
The current test covers a serial asynchronous scenario; further testing in fully parallel or limited‑concurrency contexts is pending.
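A sketch of what such a limited-concurrency run might look like, batching requests with `Promise.all` (the totals, the 50-request cap, and the self-contained local server are arbitrary choices, not measurements from the article):

```javascript
import { createServer } from 'node:http';

const server = createServer((req, res) => res.end('ok'));
await new Promise(resolve => server.listen(0, resolve));
const url = `http://localhost:${server.address().port}`;

const TOTAL = 1000;     // smaller than the article's 100k, for a quick run
const CONCURRENCY = 50; // arbitrary cap on in-flight requests

const start = Date.now();
let completed = 0;
for (let i = 0; i < TOTAL; i += CONCURRENCY) {
  // Launch one bounded batch and wait for it before starting the next.
  const batch = Array.from({ length: Math.min(CONCURRENCY, TOTAL - i) }, () =>
    fetch(url).then(res => res.text()).then(() => completed++)
  );
  await Promise.all(batch);
}
console.log(`completed ${completed} in ${(Date.now() - start) / 1000}s`);

server.close();
server.closeAllConnections?.(); // drop idle keep-alive sockets (Node 18.2+)
```

Note that batching is a coarse limiter: a true sliding window would start a new request the moment any one finishes.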
FunTester