How to Merge Similar Requests and Manage Concurrency in Frontend Apps
This article explains how to combine duplicate API calls, ensure only the latest request result is displayed, and control concurrent request limits in JavaScript/React applications, providing practical code examples and execution flow analysis.
Introduction
In everyday development we frequently interact with servers, and even simple requests raise interesting questions. Browsers cap concurrent HTTP/1.1 connections at roughly six per host, so we need strategies to merge similar requests.
Case Study
1. Merging Similar Requests
1.1 Background
During a login feature review, multiple components called usePromission to check permissions, resulting in many identical requests that could exceed the browser's concurrency limit. Merging these requests is necessary.
1.2 Implementation: Merging Similar Requests
Mock request function
function mockRequest(params) {
  return new Promise((resolve) => {
    setTimeout(() => {
      resolve(
        params.map((p) => {
          return {
            code: p,
            isAllow: Math.random() > 0.5,
          };
        })
      );
    }, 1000);
  });
}
Other code handling the request queue
// Use Promise.resolve() to create a resolved promise; callbacks chained onto it
// run in the micro-task queue after the current synchronous code finishes
let p = Promise.resolve();
let isFlushing = false;
let params = new Set();
function execute() {
  if (isFlushing) return;
  isFlushing = true;
  p = p.then(() => {
    // Snapshot the collected params and reset the flags so that
    // later calls start a fresh batch instead of reusing stale data
    const batch = [...params];
    params.clear();
    isFlushing = false;
    return mockRequest(batch);
  });
}
function print(param) {
  params.add(param);
  execute();
  return p.then((data) => {
    console.log('data', data);
    return data.find((item) => item.code === param).isAllow;
  });
}
function test() {
  for (let i = 0; i < 10; i++) {
    print(i).then((isAllow) => {
      console.log(isAllow);
    });
  }
}
test();
Execution order:
i = 0: call print, add param, start execute, set isFlushing true, promise pending.
i = 1‑9: print adds params, execute returns early because isFlushing is true.
After loop ends, micro‑task runs, resolves all requests and logs results.
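The same idea can be packaged as a reusable helper that works with any batched endpoint. The sketch below follows the flow described above; `createBatcher` and its parameter names are illustrative, not from the original code:

```javascript
// Reusable version of the merging pattern: collect every key requested
// during the current synchronous burst, then issue one combined call.
function createBatcher(requestFn) {
  let chain = Promise.resolve();
  let flushing = false;
  const keys = new Set();
  return function request(key) {
    keys.add(key);
    if (!flushing) {
      flushing = true;
      chain = chain.then(() => {
        const batch = [...keys]; // snapshot the keys gathered so far
        keys.clear();            // allow a fresh batch on the next burst
        flushing = false;
        return requestFn(batch); // one request for the whole batch
      });
    }
    // Every caller resolves with its own entry from the shared response
    return chain.then((results) => results.find((r) => r.key === key));
  };
}
```

Because the flush runs in a micro-task, any number of synchronous calls in the same tick collapse into a single `requestFn` invocation.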
2. Showing the Latest Result for Different Requests
2.1 Background
When a page has tabs that trigger different APIs, rapid switching may cause an older request to finish after a newer one, leading to mismatched UI state. We need a way to ensure only the latest request updates the view.
2.2 Implementation: Show Latest Result
2.3 Example
Two buttons request 3 or 6 items; the 6‑item request is deliberately slower. The UI should display the 3‑item result when it finishes later.
import { Button } from "antd";
import { useRef, useState } from "react";

const ClickCount = () => {
  const [data, setData] = useState([]);
  const fetch = (num) => {
    // Simulate a slower response for a larger num
    return new Promise((resolve) => {
      setTimeout(() => {
        resolve(Array(num).fill({}).map((item, index) => ({ index })));
      }, num * 500);
    });
  };
  const getData = (num) => {
    fetch(num).then((data) => {
      setData(data);
    });
  };
  return (
    <div>
      <Button onClick={() => { getData(3); }}>Click to get 3 items</Button>
      <Button onClick={() => { getData(6); }}>Click to get 6 items</Button>
      {data.map(({ index }) => (
        <div key={index}>Rendered data {index + 1}</div>
      ))}
    </div>
  );
};
export default ClickCount;
2.4 Solution
Use a useRef counter to track the latest request ID. Only update state when the response ID matches the current counter.
Global request counter: const requestPool = useRef(0); Increment on each request and capture the ID.
When the promise resolves, compare requestId === requestPool.current before setting state.
import { Button } from "antd";
import { useRef, useState } from "react";

const ClickCount = () => {
  const [data, setData] = useState([]);
  const requestPool = useRef(0); // tracks the latest request
  const fetch = (num) => {
    return new Promise((resolve) => {
      setTimeout(() => {
        resolve(Array(num).fill({}).map((item, index) => ({ index })));
      }, num * 500);
    });
  };
  const getData = (num) => {
    requestPool.current += 1;
    const requestId = requestPool.current;
    fetch(num).then((data) => {
      console.log('data', data);
      if (requestId === requestPool.current) {
        setData(data);
      }
    });
  };
  return (
    <div>
      <Button onClick={() => { getData(3); }}>Click to get 3 items</Button>
      <Button onClick={() => { getData(6); }}>Click to get 6 items</Button>
      {data.map(({ index }) => (
        <div key={index}>Rendered data {index + 1}</div>
      ))}
    </div>
  );
};
export default ClickCount;
Result: Even if the 6‑item request is slower, the UI only shows the data from the latest request.
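The counter check above is not tied to React; it can be extracted into a generic wrapper that drops stale results from any async function. A minimal sketch (`takeLatest` is an illustrative name, not an established API):

```javascript
// Wrap an async function so callers can tell whether their result
// is still the latest one; stale results are flagged for the caller to drop.
function takeLatest(fn) {
  let latestId = 0;
  return (...args) => {
    const id = ++latestId; // tag this call with a fresh id
    return fn(...args).then((result) => ({
      stale: id !== latestId, // a later call has superseded this one
      result,
    }));
  };
}

// Usage: a fake fetch whose delay is controllable
const load = takeLatest(
  (num, ms) => new Promise((resolve) => setTimeout(() => resolve(num), ms))
);
```

A UI layer would then update state only when `stale` is false, which is exactly the `requestId === requestPool.current` comparison in the component above.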
3. Different Requests with Concurrency Control
3.1 Background
Sometimes multiple APIs need to be called concurrently, such as loading several resources on page init or downloading multiple files, while respecting the browser's concurrency limit.
3.2 Implementation: Concurrent Requests
3.3 Example
A button triggers six mock requests, but only two should run at the same time.
import { Button } from "antd";

const Demo = () => {
  // Mock request: 1, 3, 5 resolve in 1 s; 2, 4, 6 resolve in 3 s
  const fetch = (num) => {
    return new Promise((resolve) => {
      setTimeout(() => {
        console.log('num', num);
        resolve(num);
      }, 1000 + (num % 2 === 0 ? 2000 : 0));
    });
  };
  const task1 = () => fetch(1);
  const task2 = () => fetch(2);
  const task3 = () => fetch(3);
  const task4 = () => fetch(4);
  const task5 = () => fetch(5);
  const task6 = () => fetch(6);
  const test = async () => {
    task1();
    task2();
    task3();
    task4();
    task5();
    task6();
  };
  return <Button onClick={test}>Test</Button>;
};
export default Demo;
Execution order: after 1 s, 1 3 5 print; 2 s later, 2 4 6 print.
3.4 Solution
Define a request queue that limits active requests to a maximum concurrency (2). When a request finishes, start the next pending one and resolve when all are done.
import { Button } from "antd";

const Demo = () => {
  const fetch = (num) => {
    return new Promise((resolve) => {
      setTimeout(() => {
        resolve(num);
      }, 1000 + (num % 2 === 0 ? 2000 : 0));
    });
  };
  // Creates a queue that runs at most `concurrency` tasks at a time
  const requestQueue = (concurrency) => {
    return (tasks) => {
      const result = [];
      let curIndex = 0;
      let activeCount = 0;
      return new Promise((resolve) => {
        if (tasks.length === 0) return resolve(result); // nothing to run
        const dequeue = (task, index) => {
          activeCount++;
          task()
            .then((data) => {
              result[index] = data; // keep results in task order
            })
            .finally(() => {
              activeCount--;
              if (curIndex < tasks.length) {
                // start the next pending task
                dequeue(tasks[curIndex], curIndex++);
              }
              if (activeCount === 0) {
                resolve(result); // all tasks finished
              }
            });
        };
        while (curIndex < tasks.length && activeCount < concurrency) {
          dequeue(tasks[curIndex], curIndex++);
        }
      });
    };
  };
  const enqueue = requestQueue(2);
  const task1 = () => fetch(1);
  const task2 = () => fetch(2);
  const task3 = () => fetch(3);
  const task4 = () => fetch(4);
  const task5 = () => fetch(5);
  const task6 = () => fetch(6);
  const test = async () => {
    const data = await enqueue([task1, task2, task3, task4, task5, task6]);
    console.log('data', data);
  };
  return <Button onClick={test}>Test</Button>;
};
export default Demo;
Execution order becomes 1 3 2 5 4 6, and the final result is [1, 2, 3, 4, 5, 6].
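The same pool can also be written with async/await: a fixed number of worker loops share a cursor into the task list, and each worker keeps claiming the next task until the list is exhausted. This is an equivalent sketch, not the article's implementation; `runPool` and `worker` are illustrative names:

```javascript
// Run tasks (functions returning promises) with a concurrency limit,
// preserving result order by task index.
async function runPool(tasks, concurrency) {
  const results = new Array(tasks.length);
  let cursor = 0;
  async function worker() {
    while (cursor < tasks.length) {
      // Claiming the index is synchronous, so two workers never take
      // the same task (JavaScript runs on a single thread)
      const index = cursor++;
      results[index] = await tasks[index]();
    }
  }
  // Spawn at most `concurrency` workers (fewer if there are fewer tasks)
  const workers = Array.from(
    { length: Math.min(concurrency, tasks.length) },
    worker
  );
  await Promise.all(workers);
  return results;
}
```

Either style works; the async/await version avoids the recursive `dequeue` call at the cost of a shared mutable cursor.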
Conclusion
Request handling may seem trivial, yet it offers many optimization opportunities: merging similar calls, ensuring the latest result is shown, and controlling concurrency. These patterns improve performance and user experience in front‑end development.
Goodme Frontend Team
Regularly sharing the team's insights and expertise in the frontend field