How AsyncContext Enables Seamless Async Data Propagation in JavaScript
This article explains the TC39 Async Context proposal, demonstrates how it lets developers propagate identifiers and other values across synchronous and asynchronous JavaScript call stacks, and shows practical use cases such as tracing, task scheduling, and comparisons with thread‑local storage.
Background
Championed by Alibaba's TC39 representative, the Async Context proposal reached Stage 1 at TC39 in early February 2023. Its goal is to define a standard way to pass data through asynchronous JavaScript tasks.
Consider a simple npm library that provides log and run functions. Users pass a callback and an id to run, which invokes the callback and allows the callback to call log. The library automatically tags log output with the current id without the user having to thread the identifier through every function.
// my-awesome-library
let currentId = undefined;
export function log() {
if (currentId === undefined) throw new Error('must be inside a run call stack');
console.log(`[${currentId}]`, ...arguments);
}
export function run<T>(id: string, cb: () => T) {
let prevId = currentId;
try {
currentId = id;
return cb();
} finally {
currentId = prevId;
}
}

Usage example:
import { run, log } from 'my-awesome-library';
import { helper } from 'some-random-npm-library';
document.body.addEventListener('click', () => {
const id = nextId();
run(id, () => {
log('starting');
// helper may call doSomething.
helper(doSomething);
log('done');
});
});
function doSomething() {
log("did something");
}

For each click, the logs appear as:

[id1] starting
[id1] did something
[id1] done

This demonstrates an id-based mechanism that propagates through the synchronous call stack, similar to how React Context passes data through component trees.
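Because run saves the previous id and restores it in a finally block, run calls can nest: the inner id applies only inside the inner callback, and the outer id comes back afterwards. A self-contained sketch (restating the library's core so the snippet runs on its own):

```javascript
// Minimal restatement of the library above, to show nesting behavior.
let currentId;
function log(...args) { console.log(`[${currentId}]`, ...args); }
function run(id, cb) {
  const prevId = currentId;
  currentId = id;
  try { return cb(); } finally { currentId = prevId; }
}

run('outer', () => {
  log('a');                      // prints "[outer] a"
  run('inner', () => log('b'));  // prints "[inner] b"
  log('c');                      // prints "[outer] c" — the outer id is restored
});
```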
When asynchronous operations are introduced, the simple stack‑based approach breaks:
document.body.addEventListener('click', () => {
const id = nextId();
run(id, async () => {
log('starting');
await helper(doSomething);
// This log may lose the expected id.
log('done');
});
});
function doSomething() {
// Whether this log prints the expected id depends on whether helper awaited before calling it.
log("did something");
}

AsyncContext
AsyncContext is a storage mechanism that propagates arbitrary JavaScript values across both synchronous and asynchronous boundaries. It provides three minimal operations:
class AsyncContext<T> {
// Capture a snapshot of all AsyncContext instances in the current execution context and return a function that restores it.
static wrap<R>(fn: (...args: any[]) => R): (...args: any[]) => R;
// Immediately execute fn while setting value as the current AsyncContext instance's value. The value is snapshot for any async work started inside fn.
run<R>(value: T, fn: () => R): R;
// Retrieve the current value of this AsyncContext instance.
get(): T;
}

AsyncContext.prototype.run() writes a value, get() reads it, and wrap() takes a snapshot of the current AsyncContext state and returns a function that restores that snapshot when invoked.
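Given this API, the my-awesome-library from earlier can be rewritten so that ids would also survive await boundaries. Since no engine ships AsyncContext yet, the sketch below substitutes a minimal synchronous stand-in (ContextStub, a hypothetical name) that implements run()/get() but not the async snapshotting the real proposal gets from engine support:

```javascript
// ContextStub is a hypothetical, synchronous-only stand-in for AsyncContext.
class ContextStub {
  #value;
  run(value, fn) {
    const prev = this.#value;
    this.#value = value;
    try { return fn(); } finally { this.#value = prev; }
  }
  get() { return this.#value; }
}

const currentId = new ContextStub(); // would be `new AsyncContext()`

function log(...args) {
  const id = currentId.get();
  if (id === undefined) throw new Error('must be inside a run call stack');
  console.log(`[${id}]`, ...args);
}

function run(id, cb) {
  // With a real AsyncContext, the id would also be visible after `await`
  // points inside cb, fixing the broken async example above.
  return currentId.run(id, cb);
}

run('id1', () => log('starting')); // prints "[id1] starting"
```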
// simple task‑queue example
const loop = {
  queue: [],
  addTask(fn) { this.queue.push(AsyncContext.wrap(fn)); },
  run() { while (this.queue.length > 0) { const fn = this.queue.shift(); fn(); } }
};
const ctx = new AsyncContext();
ctx.run('1', () => {
loop.addTask(() => { console.log('task:', ctx.get()); });
setTimeout(() => { console.log(ctx.get()); }, 1000); // => 1
});
ctx.run('2', () => {
setTimeout(() => { console.log(ctx.get()); }, 500); // => 2
});
console.log(ctx.get()); // => undefined
loop.run(); // => task: 1

Usage Scenarios
Async chain tracing
OpenTelemetry and other APM tools need to propagate trace data without requiring developers to modify business code. By storing the current span in an AsyncContext, the runtime can retrieve the trace information at any point in the async call chain.
// tracer.js
const context = new AsyncContext();
export function run(cb) {
  const span = {
    parent: context.get(),
    startTime: Date.now(),
    traceId: crypto.randomUUID(),
    spanId: crypto.randomUUID()
  };
  return context.run(span, cb);
}
export function end() {
  const span = context.get();
  if (span) span.endTime = Date.now();
}
// automatic fetch instrumentation
const originalFetch = globalThis.fetch;
globalThis.fetch = (...args) => {
return run(() => originalFetch(...args).finally(() => end()));
};

Application code remains unchanged:
// my-app.js
import * as tracer from './tracer.js';
button.onclick = e => {
tracer.run(async () => {
await clickHandler();
tracer.end();
});
};
const clickHandler = () => {
return fetch('https://example.com')
.then(res => processBody(res.body))
.then(data => {
const dialog = html`<dialog>Here's some cool data: ${data}<button>OK, cool</button></dialog>`;
dialog.show();
});
};

Async task attribute propagation
Web APIs such as the Scheduling APIs can benefit from automatically propagating task attributes like priority. By using AsyncContext, a scheduler can store the priority in the async context and retrieve it later, eliminating the need for explicit parameter passing.
// simple scheduler example
const scheduler = {
context: new AsyncContext(),
postTask(task, options) {
  return this.context.run({ priority: options.priority }, task);
},
currentTask() { return this.context.get() ?? { priority: 'default' }; }
};
// user code
const res = await scheduler.postTask(task, { priority: 'background' });
console.log(res);
async function task() {
const resp = await fetch('/hello');
const text = await resp.text();
scheduler.currentTask(); // => { priority: 'background' }
return doStuffs(text);
}
async function doStuffs(text) { return text; }

This pattern addresses a known challenge in the WICG Scheduling APIs proposal.
Prior Arts
Thread‑local variables
Thread‑local storage gives each thread its own instance of a variable, solving re‑entrancy problems such as the global errno in C. Example:
#include <iostream>
#include <string>
#include <thread>
#include <mutex>
thread_local unsigned int rage = 1;
std::mutex cout_mutex;
void increase_rage(const std::string& thread_name) {
++rage;
std::lock_guard<std::mutex> lock(cout_mutex);
std::cout << "Rage counter for " << thread_name << ": " << rage << '\n';
}
int main() {
std::thread a(increase_rage, "a"), b(increase_rage, "b");
a.join(); b.join();
{
std::lock_guard<std::mutex> lock(cout_mutex);
std::cout << "Rage counter for main: " << rage << '\n';
}
return 0;
}

The output shows that each thread increments its own independent copy of rage, while main's copy is untouched.
AsyncLocalStorage
Node.js provides AsyncLocalStorage as an async-local variable mechanism; the AsyncContext proposal is modeled on this API.
class AsyncLocalStorage<T> {
constructor();
// Run callback with store set to value.
run<R>(store: T, callback: (...args: any[]) => R, ...args: any[]): R;
// Get current store.
getStore(): T;
}
class AsyncResource {
constructor();
// Snapshot current async‑local state and run fn with it restored.
runInAsyncScope<R>(fn: (...args: any[]) => R, thisArg, ...args: any[]): R;
}

The Async Context proposal is still at Stage 1; the API may evolve, but the core concepts are expected to remain stable.
Noslate & WinterCG
Noslate Aworker, a member of the Web‑Interoperable Runtimes CG (WinterCG), is implementing a subset of AsyncLocalStorage that aligns with the future AsyncContext API, providing an early‑adoption path for runtimes such as Cloudflare Workers and Deno.
More ECMAScript proposals
The JavaScript Chinese Interest Group (JSCIG) invites contributors to discuss ECMAScript proposals on GitHub.
References
Async Context proposal: https://github.com/tc39/proposal-async-context
React Context: https://reactjs.org/docs/context.html
OpenTelemetry: https://opentelemetry.io/
Scheduling APIs: https://github.com/WICG/scheduling-apis
Challenge in unified task model: https://github.com/WICG/scheduling-apis/blob/main/misc/userspace-task-models.md#challenges-in-creating-a-unified-task-model
Thread‑local variables: https://zh.wikipedia.org/wiki/线程局部存储
errno man page: http://man7.org/linux/man-pages/man3/errno.3.html
Noslate Aworker: https://noslate.midwayjs.org/docs/noslate_workers/intro
WinterCG: https://wintercg.org/
AsyncLocalStorage subset: https://github.com/wintercg/proposal-common-minimum-api/blob/main/asynclocalstorage.md