How React’s Fiber Scheduler Breaks Down Tasks for Smooth Rendering
This article explains React 16’s new Fiber architecture and its cooperative scheduling algorithm, showing how large diff tasks are split into small asynchronous units using priority queues and min‑heap structures, with code examples, performance visuals, and insights into real‑time and delayed task handling.
Weng Binbin, a front‑end engineer at WeDoctor Cloud, never stops improving as a programmer.
This analysis is based on React and ReactDOM version 16.13.1.
Preface
React 16 rewrote the core with a new fiber architecture, introducing a scheduling algorithm. In previous versions, the virtual DOM diff ran without interruption, consuming a lot of execution time and causing rendering delays and page jank. The new fiber architecture lets React break the diff work into many small asynchronous tasks, preventing UI blockage. This article dives into the inner workings of that scheduling algorithm.
How to Split a Giant Task?
Consider the following example that blocks the main thread for several seconds:
<script>
  function read(arr) {
    arr.forEach((_, i) => {
      console.log(i);
    });
  }
  const array = Array.from(Array(1000000)).fill(1);
  console.log('task start');
  read(array);
  console.log('task end');
</script>
<h1>hello world</h1>

The script first logs task start, then the tab freezes while the JavaScript runs. After a few seconds it logs task end and finally renders hello world. The blockage occurs because long-running JavaScript blocks the browser's render pipeline. The solution is to split the work.
Split into Small Tasks
We can group one million numbers into batches of 100, creating ten thousand tasks, and schedule each batch with a timer so it runs in the next event loop. React’s scheduler follows the same principle.
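As a quick sanity check on the arithmetic above (the numbers simply restate the batching described in the text):

```javascript
// Splitting one million items into batches of 100 yields ten thousand
// small tasks, each cheap enough to finish within one event-loop turn.
const total = 1000000;  // one million items
const batchSize = 100;  // items per batch
const batchCount = Math.ceil(total / batchSize);
console.log(batchCount); // 10000
```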
Chrome Performance screenshots confirm that the JavaScript tasks (yellow) are interleaved with layout (purple) and paint (green) phases after being split.
An improved version of the example looks like this:
const array = Array.from(Array(1000000)).map((_, index) => index);

function read(arr) {
  arr.forEach(function readCallback(item) {
    console.log(item);
  });
}

console.log('task start');

let i = 0;
const len = 100;

function task() {
  setTimeout(function setTimeoutCallback() {
    const results = array.slice(i, i + len);
    if (results.length === 0) {
      console.log('task end');
      return;
    }
    read(results);
    i += len;
    task();
  }, 0);
}

task();

Now the page renders instantly, without any noticeable lag.
Scheduling Concepts
Two common scheduling strategies exist: preemptive scheduling, where the operating system decides when to reclaim the CPU from a task, and cooperative scheduling, where tasks voluntarily yield after a short time slice. React uses the cooperative model.
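As an illustration (not React's actual code), cooperative scheduling can be sketched as a loop that checks a deadline before each unit of work; runWithTimeSlice and the injectable clock are inventions for this example:

```javascript
// Minimal sketch of cooperative scheduling: run tasks until the time
// slice expires, then voluntarily yield and hand back the remainder.
// `now` is injectable so the behavior can be tested with a fake clock.
function runWithTimeSlice(tasks, sliceMs, now = Date.now) {
  const deadline = now() + sliceMs;
  let index = 0;
  while (index < tasks.length) {
    if (now() >= deadline) break; // time slice used up: yield to the host
    tasks[index++]();
  }
  return tasks.slice(index); // remaining tasks, to be scheduled in a later slice
}
```

The caller would reschedule the returned remainder (for example via a timer or a message), which is exactly the loop React's scheduler builds around its time slice.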
Why Does React Need Scheduling?
Split diff work.
Prioritize tasks.
In React, a diff is triggered by updates such as ReactDOM.render, setState, forceUpdate, useState, or useReducer. Google’s RAIL model guides the prioritization:
Focus on user satisfaction.
Respond within 100 ms for discrete interactions.
Generate frames within 16 ms for animations.
Keep main‑thread work under 50 ms per task.
Deliver content within 1000 ms for first‑paint.
React classifies updates into priority levels:
// Immediate (expires immediately)
const ImmediatePriority = 1;
var IMMEDIATE_PRIORITY_TIMEOUT = -1;

// User-blocking (expires after 250 ms)
const UserBlockingPriority = 2;
var USER_BLOCKING_PRIORITY = 250;

// Normal (expires after 5000 ms)
const NormalPriority = 3;
var NORMAL_PRIORITY_TIMEOUT = 5000;

// Low (expires after 10000 ms)
const LowPriority = 4;
var LOW_PRIORITY_TIMEOUT = 10000;

// Idle (effectively never expires)
const IdlePriority = 5;
var IDLE_PRIORITY = 1073741823;

Triggering the Scheduler
render ultimately calls scheduleUpdateOnFiber. Both setState and forceUpdate eventually invoke scheduleUpdateOnFiber via enqueueSetState or enqueueReplaceState. useState is implemented on top of useReducer, which also calls scheduleUpdateOnFiber through dispatchAction.
The entry point is therefore scheduleUpdateOnFiber, which decides whether to schedule a synchronous or asynchronous callback.
export function scheduleUpdateOnFiber(fiber, expirationTime) {
  const root = markUpdateTimeFromFiberToRoot(fiber, expirationTime);
  if (root === null) {
    return;
  }
  // ...
  if (expirationTime === Sync) {
    ensureRootIsScheduled(root);
    schedulePendingInteractions(root, expirationTime);
    if (executionContext === NoContext) {
      flushSyncCallbackQueue();
    }
  } else {
    ensureRootIsScheduled(root);
    schedulePendingInteractions(root, expirationTime);
  }
}

function ensureRootIsScheduled(root) {
  // ...
  if (/* sync */) {
    callbackNode = scheduleSyncCallback(performSyncWorkOnRoot.bind(null, root));
  } else if (/* async */) {
    callbackNode = scheduleCallback(
      priorityLevel,
      performConcurrentWorkOnRoot.bind(null, root),
    );
  } else {
    callbackNode = scheduleCallback(
      priorityLevel,
      performConcurrentWorkOnRoot.bind(null, root),
      { timeout: expirationTimeToMs(expirationTime) - now() },
    );
  }
  root.callbackNode = callbackNode;
}

Task Heaps
React uses two min‑heaps: a real‑time task heap and a delayed task heap. Tasks without a delay go into the real‑time heap; delayed tasks go into the delayed heap, reducing unnecessary postMessage loops.
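React's actual heap implementation lives in the scheduler package; the sketch below is a simplified stand-in, keyed on each task's sortIndex, to show what the push, peek, and pop operations used throughout the scheduling code do:

```javascript
// Simplified min-heap keyed on task.sortIndex (illustrative; React's
// real implementation is equivalent but lives in its scheduler package).
function push(heap, task) {
  // append, then sift the new task up until the heap property holds
  heap.push(task);
  let i = heap.length - 1;
  while (i > 0) {
    const parent = (i - 1) >> 1;
    if (heap[parent].sortIndex <= heap[i].sortIndex) break;
    [heap[parent], heap[i]] = [heap[i], heap[parent]];
    i = parent;
  }
}

function peek(heap) {
  // the task with the smallest sortIndex is always at the root
  return heap.length === 0 ? null : heap[0];
}

function pop(heap) {
  if (heap.length === 0) return null;
  const first = heap[0];
  const last = heap.pop();
  if (heap.length > 0) {
    // move the last task to the root and sift it down
    heap[0] = last;
    let i = 0;
    for (;;) {
      const left = 2 * i + 1;
      const right = 2 * i + 2;
      let smallest = i;
      if (left < heap.length && heap[left].sortIndex < heap[smallest].sortIndex) smallest = left;
      if (right < heap.length && heap[right].sortIndex < heap[smallest].sortIndex) smallest = right;
      if (smallest === i) break;
      [heap[smallest], heap[i]] = [heap[i], heap[smallest]];
      i = smallest;
    }
  }
  return first;
}
```

Both push and pop are O(log n), which is why the scheduler can cheaply keep the most urgent task (smallest sortIndex) at the front of each queue.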
Scheduling Flow
let taskIdCounter = 1;
// delayed task heap
const timerQueue = [];
// real-time task heap
const taskQueue = [];

function unstable_scheduleCallback(priorityLevel, callback, options) {
  var currentTime = getCurrentTime();
  var startTime, timeout;
  if (typeof options === 'object' && options !== null) {
    var delay = options.delay;
    if (typeof delay === 'number' && delay > 0) {
      startTime = currentTime + delay;
    } else {
      startTime = currentTime;
    }
    timeout = typeof options.timeout === 'number'
      ? options.timeout
      : timeoutForPriorityLevel(priorityLevel);
  } else {
    timeout = timeoutForPriorityLevel(priorityLevel);
    startTime = currentTime;
  }
  var expirationTime = startTime + timeout;
  var newTask = {
    id: taskIdCounter++,
    callback,
    priorityLevel,
    startTime,
    expirationTime,
    sortIndex: -1,
  };
  if (startTime > currentTime) {
    // delayed task: sort by when it becomes runnable
    newTask.sortIndex = startTime;
    push(timerQueue, newTask);
    if (peek(taskQueue) === null && newTask === peek(timerQueue)) {
      requestHostTimeout(handleTimeout, startTime - currentTime);
    }
  } else {
    // real-time task: sort by urgency (expiration time)
    newTask.sortIndex = expirationTime;
    push(taskQueue, newTask);
    if (!isHostCallbackScheduled && !isPerformingWork) {
      isHostCallbackScheduled = true;
      requestHostCallback(flushWork);
    }
  }
  return newTask;
}

function workLoop(hasTimeRemaining, initialTime) {
  let currentTime = initialTime;
  // move expired delayed tasks onto the real-time heap
  advanceTimers(currentTime);
  currentTask = peek(taskQueue);
  while (currentTask !== null) {
    if (
      currentTask.expirationTime > currentTime &&
      (!hasTimeRemaining || shouldYieldToHost())
    ) {
      // the current task has not expired and the time slice is used up
      break;
    }
    const callback = currentTask.callback;
    if (callback !== null) {
      currentTask.callback = null;
      currentPriorityLevel = currentTask.priorityLevel;
      const didUserCallbackTimeout = currentTask.expirationTime <= currentTime;
      callback(didUserCallbackTimeout);
      currentTime = getCurrentTime();
      if (currentTask === peek(taskQueue)) {
        pop(taskQueue);
      }
      advanceTimers(currentTime);
    } else {
      // the task was cancelled: remove it from the heap
      pop(taskQueue);
    }
    currentTask = peek(taskQueue);
  }
  if (currentTask !== null) {
    return true; // more real-time work remains
  } else {
    const firstTimer = peek(timerQueue);
    if (firstTimer !== null) {
      requestHostTimeout(handleTimeout, firstTimer.startTime - currentTime);
    }
    return false; // all work done
  }
}

requestHostCallback Implementation
React uses a MessageChannel to post work to the next macro‑task. The callback runs for a configurable time slice (default 5 ms). If more work remains, the channel posts another message to continue processing.
const performance = window.performance;
getCurrentTime = () => performance.now();

let isMessageLoopRunning = false;
let scheduledHostCallback = null;
let yieldInterval = 5; // default time slice: 5 ms
let deadline = 0;

const shouldYieldToHost = function () {
  return getCurrentTime() >= deadline;
};

const performWorkUntilDeadline = () => {
  if (scheduledHostCallback !== null) {
    const currentTime = getCurrentTime();
    deadline = currentTime + yieldInterval;
    const hasTimeRemaining = true;
    try {
      const hasMoreWork = scheduledHostCallback(hasTimeRemaining, currentTime);
      if (!hasMoreWork) {
        isMessageLoopRunning = false;
        scheduledHostCallback = null;
      } else {
        // more work remains: post a message to continue in the next macro-task
        port.postMessage(null);
      }
    } catch (error) {
      // if the callback throws, keep the loop alive and rethrow
      port.postMessage(null);
      throw error;
    }
  } else {
    isMessageLoopRunning = false;
  }
};

const channel = new MessageChannel();
const port = channel.port2;
channel.port1.onmessage = performWorkUntilDeadline;

requestHostCallback = function (callback) {
  scheduledHostCallback = callback;
  if (!isMessageLoopRunning) {
    isMessageLoopRunning = true;
    port.postMessage(null);
  }
};

cancelHostCallback = function () {
  scheduledHostCallback = null;
};

Why Not Use requestIdleCallback?
Poor compatibility, especially on iOS.
Unstable invocation frequency: it only fires when the browser is idle, and the idle budget it reports is capped at 50 ms, which is too coarse and too unpredictable for React's scheduling needs.
React needs additional options such as priority level and delay, which native requestIdleCallback does not provide.
Current Production Scheduling Mode
Most projects still use ReactDOM.render, which schedules callbacks with requestHostCallback but runs all tasks in a single time slice, so rendering is not truly interruptible. To enable concurrent mode with interruptible rendering, developers must use the experimental ReactDOM.createRoot API.
Conclusion
By analyzing the React Scheduler, we see how small diff tasks are cooperatively scheduled across multiple time slices, avoiding main‑thread blockage and delivering a fluid user experience.
Further Reading
Deep dive: “Scheduling in React” – https://juejin.cn/post/6844903821433372680#heading-3
Discussion of React Scheduler task management – https://zhuanlan.zhihu.com/p/48254036
This article has been distilled and summarized from source material, then republished for learning and reference. If you believe it infringes your rights, please contact us and we will review it promptly.
WeDoctor Frontend Technology
Official WeDoctor Group frontend public account, sharing original tech articles, events, job postings, and occasional daily updates from our tech team.