How to Control Concurrent Requests with a Custom Promise Limiter in JavaScript
This article explains why browsers limit concurrent HTTP requests, demonstrates the limitation with a test example, and provides a reusable PromiseLimiter class that queues and throttles asynchronous calls to keep the number of simultaneous requests under a defined threshold.
Sometimes we need to control the number of concurrent requests — for example, when writing download or crawler tools, because some servers throttle or block clients that open too many connections at once.
In browsers, most implementations cap HTTP/1.1 at six concurrent TCP connections per origin. This means that if you send more than six HTTP/1.1 requests to the same origin at once, the seventh request waits until one of the earlier ones finishes.
We can test this with a simple example. First, the client code:
<code>async function test() {
  await Promise.all(
    [...new Array(12)].map((_, i) => fetch(`http://127.0.0.1:3001/get/${i}`))
  );
}
</code>Next, a brief server implementation:
<code>// Helper used below to simulate work
const sleep = ms => new Promise(resolve => setTimeout(resolve, ms));

router.get('/get/:id', async ctx => {
  const order = Number(ctx.params.id);
  if (order % 2 === 0) {
    await sleep(2000);
  } else {
    await sleep(1000);
  }
  ctx.body = 'done';
});
</code>Of the first six requests, even‑numbered IDs respond after 2 seconds and odd‑numbered ones after 1 second; requests 7–12 queue until a connection frees up. Opening the Network tab in DevTools, the Time and Waterfall columns make the six‑connection limit visible: the seventh request's waterfall bar shows it stalled until the first odd‑numbered request completed.
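The expected waterfall can be reasoned about without a browser. The sketch below (an illustration, not part of the original article) simulates 12 requests flowing through 6 connection slots, using the same durations as the server code:

```javascript
// Simulate `total` requests through `slots` connection slots.
// Even ids take 2000 ms, odd ids take 1000 ms, as in the server code.
function simulate(total = 12, slots = 6) {
  const freeAt = new Array(slots).fill(0); // when each slot becomes free
  const timeline = [];
  for (let id = 0; id < total; id++) {
    // Pick the slot that frees up earliest.
    const slot = freeAt.indexOf(Math.min(...freeAt));
    const start = freeAt[slot];
    const duration = id % 2 === 0 ? 2000 : 1000;
    freeAt[slot] = start + duration;
    timeline.push({ id, start, end: start + duration });
  }
  return timeline;
}

console.log(simulate().map(t => t.start).join(','));
// → 0,0,0,0,0,0,1000,1000,1000,2000,2000,2000
```

In this model the seventh request (id 6) starts at 1000 ms — exactly when the first odd‑numbered request frees its connection, matching the waterfall seen in DevTools.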
You might wonder why we need to implement our own concurrency control when the browser already limits it. The reasons are:
Resource management – prioritize important requests.
Avoid long queues that cause timeouts.
Prevent server overload and improve response efficiency.
Make retries or error handling easier when a request fails.
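The first point above — prioritizing important requests — can be sketched with a priority-aware queue. This `PriorityQueue` is a hypothetical variant, not part of the limiter implemented below:

```javascript
// A queue variant where enqueue accepts a priority;
// higher-priority tasks are dequeued first (FIFO within equal priority).
class PriorityQueue {
  constructor() {
    this.tasks = [];
  }
  enqueue(task, priority = 0) {
    this.tasks.push({ task, priority });
    // Keep the highest priority at the front; Array.prototype.sort is stable.
    this.tasks.sort((a, b) => b.priority - a.priority);
  }
  dequeue() {
    return this.tasks.shift()?.task;
  }
  size() {
    return this.tasks.length;
  }
}

const q = new PriorityQueue();
q.enqueue(() => 'low', 0);
q.enqueue(() => 'high', 10);
console.log(q.dequeue()()); // → high
```

Swapping this in for the plain queue below would let critical requests jump ahead of bulk ones without changing the limiter itself.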
Here is a simple queue class to manage the request list:
<code>class Queue {
  constructor() {
    this.tasks = [];
  }
  enqueue(task) {
    this.tasks.push(task);
  }
  dequeue() {
    return this.tasks.shift();
  }
  clear() {
    this.tasks = [];
  }
  size() {
    return this.tasks.length;
  }
}
</code>The PromiseLimiter class schedules tasks, exposing a limit method that enqueues functions and runs them until the concurrency limit is reached; excess requests wait until a running task finishes.
<code>class PromiseLimiter {
  constructor(limitCount) {
    this.queue = new Queue();
    this.runningCount = 0;
    this.limitCount = limitCount;
  }
  get activeCount() {
    return this.runningCount;
  }
  get pendingCount() {
    return this.queue.size();
  }
  next() {
    // Start the next queued task only if a slot is free.
    if (this.runningCount < this.limitCount && this.queue.size() > 0) {
      this.queue.dequeue()?.();
    }
  }
  async run(fn) {
    this.runningCount++;
    try {
      // Promise.resolve guards against fn throwing synchronously
      // or returning a plain value instead of a promise.
      return await Promise.resolve(fn());
    } finally {
      this.runningCount--;
      this.next();
    }
  }
  limit(fn) {
    return new Promise((resolve, reject) => {
      this.queue.enqueue(() => {
        // Propagate both fulfillment and rejection to the caller,
        // so a failed request can be caught and retried.
        this.run(fn).then(resolve, reject);
      });
      this.next();
    });
  }
}
export default PromiseLimiter;
</code>Usage example with a limiter of 3 concurrent tasks:
<code>const limiter = new PromiseLimiter(3);
const mockPromise = (i) =>
  new Promise(resolve => setTimeout(() => resolve(i), 1000));

(async () => {
  const results = await Promise.allSettled(
    [...new Array(6)].map((_, i) => limiter.limit(() => mockPromise(i)))
  );
  console.log('results: ', results);
})();
</code>By employing this pattern you can manage request concurrency, prioritize critical calls, avoid timeouts, protect servers from overload, and simplify error handling.
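As an illustration of the last point — simpler error handling — a small retry wrapper can be combined with the limiter. The `withRetry` helper and the flaky task below are hypothetical additions, not part of the class above:

```javascript
// Run a task, retrying up to `retries` additional times on failure.
// Combined with the limiter, a failed request re-enters
// without blocking the other concurrent slots.
async function withRetry(task, retries = 2) {
  let lastError;
  for (let attempt = 0; attempt <= retries; attempt++) {
    try {
      return await task();
    } catch (err) {
      lastError = err;
    }
  }
  throw lastError;
}

// A task that fails twice, then succeeds (simulated, no network).
let calls = 0;
const flaky = async () => {
  calls++;
  if (calls < 3) throw new Error('temporary failure');
  return 'ok';
};

withRetry(flaky).then(result =>
  console.log(result, 'after', calls, 'attempts')
); // → ok after 3 attempts
```

With the limiter this would be invoked as `limiter.limit(() => withRetry(flaky))`, so retries still count against the concurrency budget.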
Code Mala Tang
Read source code together, write articles together, and enjoy spicy hot pot together.