How to Prevent Request Overload with a Simple Frontend Request Queue
When a page fires many requests at once, such as loading ten resources on mount or letting a user upload dozens of files, the browser can freeze and the server can buckle under the load. A request queue that limits concurrency keeps both sides responsive and stable.
Core idea: do not fire all requests at once; queue them so they run one by one or in small batches.
It’s like a supermarket with one checkout counter and a hundred customers—let them line up instead of all rushing in. Our “request queue” acts as the line manager.
Direct code: a plug‑and‑play request queue
Copy the RequestPool class into your project; it’s under 40 lines.
/**
* A simple request pool/queue to control concurrency
* @example
* const pool = new RequestPool(3); // limit concurrency to 3
* pool.add(() => myFetch('/api/1'));
* pool.add(() => myFetch('/api/2'));
*/
class RequestPool {
/**
* @param {number} limit - concurrency limit
*/
constructor(limit = 3) {
this.limit = limit; // concurrency limit
this.queue = []; // pending requests
this.running = 0; // currently running requests
}
/**
* Add a request to the pool
* @param {Function} requestFn - a function returning a Promise
* @returns {Promise}
*/
add(requestFn) {
return new Promise((resolve, reject) => {
this.queue.push({ requestFn, resolve, reject });
this._run(); // try to run after each addition
});
}
_run() {
// Run while there is room and pending tasks
while (this.running < this.limit && this.queue.length > 0) {
const { requestFn, resolve, reject } = this.queue.shift(); // take next task
this.running++;
requestFn()
.then(resolve)
.catch(reject)
.finally(() => {
this.running--; // free a slot
this._run(); // try to start next one
});
}
}
}

How to use it in three steps
Assume you have a request function mockApi that simulates a slow API.
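A minimal sketch of the three steps, assuming a mockApi helper that resolves after one second. The RequestPool class from above is repeated in condensed form here so the snippet runs on its own:

```javascript
// RequestPool from the article, condensed so this snippet is self-contained
class RequestPool {
  constructor(limit = 3) { this.limit = limit; this.queue = []; this.running = 0; }
  add(requestFn) {
    return new Promise((resolve, reject) => {
      this.queue.push({ requestFn, resolve, reject });
      this._run();
    });
  }
  _run() {
    while (this.running < this.limit && this.queue.length > 0) {
      const { requestFn, resolve, reject } = this.queue.shift();
      this.running++;
      requestFn().then(resolve).catch(reject)
        .finally(() => { this.running--; this._run(); });
    }
  }
}

// Step 1: a mock API that resolves after one second
function mockApi(i) {
  console.log(`[${i}] 🚀 Request started...`);
  return new Promise((resolve) => {
    setTimeout(() => {
      console.log(`[${i}] ✅ Request finished!`);
      resolve(`Result of task ${i}`);
    }, 1000);
  });
}

// Step 2: create a pool with a concurrency limit of 2
const pool = new RequestPool(2);

// Step 3: add six requests; only two ever run at the same time
for (let i = 1; i <= 6; i++) {
  pool.add(() => mockApi(i)).then((result) => {
    console.log(`[${i}] Received result: ${result}`);
  });
}
```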
What happens?

When you run the code you’ll see:

[1] and [2] start almost simultaneously.
[3], [4], [5], and [6] wait in line.
When either [1] or [2] finishes, the next queued request starts immediately.
The number of concurrent requests never exceeds 2.
Console output looks like this:
[1] 🚀 Request started...
[2] 🚀 Request started...
// (3, 4, 5, 6 are queued)
[1] ✅ Request finished!
[1] Received result: Result of task 1
[3] 🚀 Request started... // 1 finished, 3 starts
[2] ✅ Request finished!
[2] Received result: Result of task 2
[4] 🚀 Request started... // 2 finished, 4 starts
...

How it works
add(requestFn): you provide a “starter” function (e.g., () => mockApi(i)), which is placed into the queue array.
_run(): the manager checks whether there is a free slot (running < limit) and whether the queue is non-empty; if both are true, it shifts the first task off the queue, increments running, and executes the request.
.finally(): after a request settles, it decrements running and calls _run() again to possibly start the next queued request.
This creates an automated flow: as soon as one finishes, the next starts automatically.
In batch-request scenarios, avoid blasting the server by firing every request at once with Promise.all; instead, copy the tiny RequestPool into your project, set a reasonable concurrency limit (e.g., 2 or 3), and wrap your request functions with pool.add to dramatically reduce server load and keep the app smooth.
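As a sketch of that batch case: the Promises returned by pool.add can still be gathered with Promise.all, so you keep the “wait until everything is done” ergonomics while the pool staggers when each request starts. The uploadFile helper and file names below are hypothetical placeholders, and the RequestPool class is repeated in condensed form so the sketch runs on its own:

```javascript
// RequestPool from the article, condensed so this sketch is self-contained
class RequestPool {
  constructor(limit = 3) { this.limit = limit; this.queue = []; this.running = 0; }
  add(requestFn) {
    return new Promise((resolve, reject) => {
      this.queue.push({ requestFn, resolve, reject });
      this._run();
    });
  }
  _run() {
    while (this.running < this.limit && this.queue.length > 0) {
      const { requestFn, resolve, reject } = this.queue.shift();
      this.running++;
      requestFn().then(resolve).catch(reject)
        .finally(() => { this.running--; this._run(); });
    }
  }
}

// Hypothetical uploader: stands in for a real fetch/XHR upload call
function uploadFile(name) {
  return new Promise((resolve) => setTimeout(() => resolve(`${name} uploaded`), 50));
}

const files = Array.from({ length: 20 }, (_, i) => `photo-${i + 1}.jpg`);
const pool = new RequestPool(3); // at most 3 uploads in flight

// Promise.all still resolves once every upload settles; the pool just
// controls how many are in flight at any moment.
Promise.all(files.map((name) => pool.add(() => uploadFile(name))))
  .then((results) => console.log(`All ${results.length} uploads finished`));
```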
This is a simple, elegant, and highly effective frontend optimization technique.