Mastering ReadableStream: A Deep Dive into Web Streams API
This article introduces the concept of streams, explains the Web Streams API and its ReadableStream component, details constructors, methods, queuing strategies, back‑pressure handling, BYOB and byte streams, and provides practical code examples and usage scenarios for modern web development.
Most front‑end developers are familiar with XMLHttpRequest and Fetch, the main APIs for front‑end and back‑end communication, and with handling HTTP responses as plain text. But have you used the Web Streams API? It lets you operate on streams directly in the browser, and today we focus on ReadableStream.
01 What Is a Stream?
A stream, as the name suggests, continuously flows from one point to another, similar to water. In browsers, streams split resources into small chunks and process them piece by piece. Streams are everywhere: video playback can start before the entire file is downloaded, and images can gradually appear.
Historically, JavaScript could not handle streams directly; developers had to download the whole file, deserialize it, and then process the content. The Web Streams API began gaining browser support around 2017 and was standardized by WHATWG. By 2018 most modern browsers supported it, and later Node.js also incorporated the API, allowing unified stream handling on both client and server.
02 ReadableStream Overview
The constructor syntax is:
<code>new ReadableStream(underlyingSource, queuingStrategy)</code>
Both parameters are optional. underlyingSource defines the stream’s behavior and can implement the following methods:
start(controller) (optional) – Called immediately when the stream is constructed. If it performs asynchronous setup it can return a promise; a rejection errors the stream. The controller is a ReadableStreamDefaultController or ReadableByteStreamController that provides the close, enqueue, and error methods.
pull(controller) (optional) – Invoked repeatedly while the internal queue is not full. If it returns a promise, it is not called again until that promise settles.
cancel(reason) (optional) – Called when the consumer cancels the stream. It may return a promise and receives the reason passed to cancel().
underlyingSource also accepts optional properties:
type (optional) – If set to "bytes", the stream becomes a byte stream whose controller is a ReadableByteStreamController supporting BYOB (bring‑your‑own‑buffer) reads; otherwise it defaults to ReadableStreamDefaultController.
autoAllocateChunkSize (optional) – For byte streams, enables automatic allocation of an ArrayBuffer of the given size.
The queuingStrategy can define:
highWaterMark (optional) – A non‑negative number giving the total queue size (measured in the units returned by size()) at which back‑pressure is applied; with the default size(), it is effectively a chunk count.
size(chunk) (optional) – A function that returns the size of each chunk, in whatever unit you choose (often bytes), used to determine when the high water mark is reached.
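For instance, a custom strategy can measure chunks in bytes rather than counting them (a minimal sketch; the 65536‑byte threshold here is an arbitrary choice):

```javascript
// A custom queuing strategy: highWaterMark is measured in whatever
// unit size() returns, here bytes, so back-pressure begins once
// roughly 65536 bytes are buffered in the internal queue.
const byteCountingStrategy = {
  highWaterMark: 65536,
  size(chunk) {
    return chunk.byteLength;
  }
};

const stream = new ReadableStream({
  start(controller) {
    controller.enqueue(new Uint8Array(1024)); // counts as 1024, not 1
    controller.close();
  }
}, byteCountingStrategy);
```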
Example:
<code>// `randomChars`, `list1`, `button`, and `fetchStream` come from the
// surrounding page; `interval` must live in an enclosing scope so
// cancel() can clear it.
let interval;
const stream = new ReadableStream({
  start(controller) {
    interval = setInterval(() => {
      let string = randomChars();
      controller.enqueue(string);
      let listItem = document.createElement("li");
      listItem.textContent = string;
      list1.appendChild(listItem);
    }, 1000);
    button.addEventListener("click", function () {
      clearInterval(interval);
      fetchStream();
      controller.close();
    });
  },
  pull(controller) {
    // not needed in this example
  },
  cancel() {
    clearInterval(interval);
  },
});</code>
ReadableStream provides several instance methods and one static method:
from() (static) – Wraps an iterable or async iterable (an array, a generator or async generator, another ReadableStream, a Node.js readable stream, etc.) into a new ReadableStream.
cancel() – Terminates the stream, discarding any queued chunks.
getReader() – Returns a reader that locks the stream for exclusive reading.
pipeThrough() – Pipes the stream through a TransformStream, or any object exposing a writable/readable pair.
pipeTo() – Pipes the stream to a WritableStream and returns a promise that resolves when the transfer completes.
tee() – Splits the stream into two branches that receive the same sequence of chunks.
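As a quick sketch of tee(), splitting one stream lets two consumers read the same chunks independently, e.g. to render data while caching it in parallel:

```javascript
// tee() produces two branches that each receive every chunk.
const source = new ReadableStream({
  start(controller) {
    controller.enqueue('hello');
    controller.enqueue('world');
    controller.close();
  }
});

const [branchA, branchB] = source.tee();

// Each branch is locked and read on its own; reading one branch
// does not consume chunks from the other.
async function collect(stream) {
  const reader = stream.getReader();
  const chunks = [];
  for (;;) {
    const { value, done } = await reader.read();
    if (done) return chunks;
    chunks.push(value);
  }
}
```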
03 Back‑Pressure Explained
Back‑pressure is a mechanism that handles mismatched production and consumption rates. When producers generate data faster than consumers can process it, back‑pressure prevents overload by:
Rate control – Consumers signal the producer about the acceptable data rate.
Buffering – Excess data is temporarily stored in a buffer.
Pause/Resume – If the buffer fills, the producer pauses until space is available.
In the Web Streams API, back‑pressure is built in via highWaterMark, the optional size() function, and the controller’s desiredSize property. When desiredSize drops to zero or below, pull() is no longer called, and a well‑behaved producer stops pushing data until consumption brings desiredSize back above zero.
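This is easiest to see with a pull‑based producer (a minimal sketch): with highWaterMark: 1, pull() is only invoked while the queue has room, so at most one unread chunk is buffered at a time.

```javascript
// A pull-based producer automatically respects back-pressure:
// pull() runs only while controller.desiredSize is positive
// (desiredSize = highWaterMark minus the queued size).
const stream = new ReadableStream({
  start() {
    this.counter = 0;
  },
  pull(controller) {
    controller.enqueue(`chunk ${this.counter++}`);
    if (this.counter === 3) {
      controller.close();
    }
  }
}, { highWaterMark: 1 });
```

Each read() by the consumer frees queue space, which triggers the next pull() — production is paced exactly by consumption.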
04 When to Use start vs pull
Use start for one‑time initialization such as opening a network connection or preparing a data source.
Use pull for dynamic data sources where you want to generate data based on the consumer’s demand (pull‑based streams).
Example combining both:
<code>const stream = new ReadableStream({
  async start(controller) {
    // One-time setup: open the connection and keep the reader for pull().
    const response = await fetch('your-data-source');
    this.reader = response.body.getReader();
  },
  async pull(controller) {
    // Called whenever the internal queue has room: read one chunk on demand.
    const { value, done } = await this.reader.read();
    if (done) {
      controller.close();
    } else {
      controller.enqueue(value);
    }
  }
});</code>
05 Byte Streams vs Default Streams
Setting type: 'bytes' creates a byte stream. The controller becomes a ReadableByteStreamController, allowing BYOB (bring‑your‑own‑buffer) reads where the consumer supplies an ArrayBuffer. This offers fine‑grained memory control and performance benefits for large binary data.
Example BYOB usage:
<code>// BYOB mode
const byteStream = new ReadableStream({
  type: 'bytes',
  pull(controller) {
    // Fill the consumer-supplied buffer via controller.byobRequest,
    // then call controller.byobRequest.respond(bytesWritten).
  }
});
const byobReader = byteStream.getReader({ mode: 'byob' });
// read() takes a typed-array view over the buffer you supply.
byobReader.read(new Uint8Array(new ArrayBuffer(1024))).then(({ done, value }) => {
  // Process data
});

// Simple byte stream
const stream = new ReadableStream({
  type: 'bytes',
  start(controller) {
    controller.enqueue(new Uint8Array(/* ... */));
  }
});
const reader = stream.getReader();
reader.read().then(({ done, value }) => {
  // Process Uint8Array data
});</code>
06 Consuming a ReadableStream
After creation, obtain a reader via getReader() and read chunks recursively:
<code>const stream = new ReadableStream({
  start(controller) {
    fetch("xxxxxxxxxxxx").then(response => {
      if (!response.ok) throw new Error('Network response was not ok');
      const reader = response.body.getReader();
      return readFromStream(reader, controller);
    }).catch(error => {
      console.error('Fetch error:', error);
      controller.error(error);
    });
  },
  pull(controller) {
    // No special logic needed here for this example.
  }
});

function readFromStream(reader, controller) {
  return reader.read().then(({ value, done }) => {
    if (done) {
      controller.close();
      return;
    }
    controller.enqueue(value);
    return readFromStream(reader, controller);
  });
}

const reader = stream.getReader();

function readStreamData(reader) {
  reader.read().then(({ value, done }) => {
    if (done) {
      console.log('Stream has ended');
      return;
    }
    console.log('Stream data chunk:', value);
    readStreamData(reader);
  }).catch(error => {
    console.error('Error reading stream:', error);
  });
}

readStreamData(reader);</code>
Note that response.body itself is a ReadableStream, so you can pipe data from a fetch stream into a custom stream.
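Piping works in the other direction too: any ReadableStream (including response.body) can be sent through a TransformStream and into a WritableStream. A self‑contained sketch, with an illustrative uppercase transform standing in for real processing:

```javascript
// A source piped through a TransformStream into a WritableStream;
// response.body from fetch() could serve as the source the same way.
const source = new ReadableStream({
  start(controller) {
    controller.enqueue('stream');
    controller.enqueue('data');
    controller.close();
  }
});

const upperCase = new TransformStream({
  transform(chunk, controller) {
    controller.enqueue(chunk.toUpperCase());
  }
});

const received = [];
const sink = new WritableStream({
  write(chunk) {
    received.push(chunk);
  }
});

// pipeTo() returns a promise that resolves once every chunk is written.
const finished = source.pipeThrough(upperCase).pipeTo(sink);
```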
07 Real‑World Use Cases
Network request response handling – fetch returns a ReadableStream for incremental reading.
File upload and download – stream large files in chunks to reduce memory usage.
Real‑time data streams – e.g., stock ticker updates or live chat.
Video and audio streaming – progressive playback without full download.
Log file processing – read and analyze logs in real time.
Data conversion – transform CSV to JSON on the fly.
Dynamic image/chart generation – stream generated binary data to the client.
Database query results – stream large result sets to avoid loading everything at once.
WebGL / WebAssembly byte‑stream operations.
Server‑side pagination – stream paged data to the client.
API response streaming – send large payloads lazily.
Multi‑stage data processing pipelines – pass data between stages via streams.
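As a sketch of the data‑conversion case above, a TransformStream can turn CSV lines into objects on the fly (deliberately naive: the first chunk is assumed to be the header line, and no quoting or escaping is handled):

```javascript
// Convert CSV lines into row objects as they stream through.
const csvToJson = new TransformStream({
  start() {
    this.header = null;
  },
  transform(line, controller) {
    const cells = line.split(',');
    if (this.header === null) {
      this.header = cells; // first line carries the column names
      return;
    }
    const row = {};
    this.header.forEach((key, i) => { row[key] = cells[i]; });
    controller.enqueue(row);
  }
});

// A toy source emitting one CSV line per chunk.
const lines = new ReadableStream({
  start(controller) {
    controller.enqueue('name,age');
    controller.enqueue('Ada,36');
    controller.close();
  }
});

const rows = [];
const done = lines.pipeThrough(csvToJson).pipeTo(new WritableStream({
  write(row) { rows.push(row); }
}));
```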
The Web Streams API, especially ReadableStream, provides a powerful and flexible data‑processing solution for modern web development. It improves memory efficiency, optimizes asynchronous stream handling, and, as browser support matures, is poised to become an indispensable tool for future web applications.
Code Mala Tang
Read source code together, write articles together, and enjoy spicy hot pot together.