From Reactive Streams to Virtual Threads: A Deep Dive
The article examines Project Loom's virtual threads, compares them with reactive streams, explores their underlying mechanisms, presents a direct‑style flow library (Jox) that combines both paradigms, and evaluates performance, backpressure, and error handling in Java concurrency.
Virtual Threads: History and Outlook
Virtual threads, introduced by Project Loom, were released in Java 21 (September 2023). Features such as Structured Concurrency and Scoped Values are preview‑only and are expected to be finalized in JDK 24 (March 2025). The implementation lives in the JVM; the standard library adds API changes so that any JVM language (Java, Kotlin, Scala, Clojure) can use virtual threads.
Motivation for Virtual Threads
Traditional Java used a one‑thread‑per‑request model where each java.lang.Thread mapped to an OS platform thread. Platform threads are heavyweight: they consume megabytes of stack memory and context switches are costly, limiting the practical number of concurrent threads to a few thousand. As network traffic grew, threads spent most of their time blocked on I/O, leading to poor scalability.
Thread pools and executor abstractions (Java's ExecutorService, Scala's ExecutionContext) were introduced to improve utilization, and developers migrated to Future-based asynchronous code. This migration introduced three concrete problems that Project Loom aims to solve:
Syntax: developers must write asynchronous code using library‑provided combinators instead of plain control‑flow statements.
Function‑coloring: a method that returns a Future forces callers to also return a Future, propagating asynchrony through the call graph.
Context loss: stack traces are fragmented because exceptions are thrown from a different thread than the original call site.
Virtual threads keep the familiar synchronous syntax while providing the same thread-utilization benefits that Futures were designed to achieve.
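The difference in programming model is easy to see with the JDK's own API. The sketch below (plain JDK 21, no library code; names are illustrative) runs thousands of blocking tasks, each on its own virtual thread, in ordinary synchronous style:

```java
import java.time.Duration;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;

public class VirtualThreadDemo {
    // Runs n blocking tasks, one virtual thread per task, and returns
    // how many completed. The sleep parks only the virtual thread; the
    // underlying platform (carrier) thread keeps running other tasks.
    static int runTasks(int n) throws Exception {
        AtomicInteger completed = new AtomicInteger();
        try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
            for (int i = 0; i < n; i++) {
                executor.submit(() -> {
                    Thread.sleep(Duration.ofMillis(100)); // simulated blocking I/O
                    completed.incrementAndGet();
                    return null;
                });
            }
        } // close() waits for all submitted tasks to finish
        return completed.get();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(runTasks(10_000)); // prints 10000
    }
}
```

Running 10,000 such tasks on platform threads would exhaust memory or take minutes with a small pool; on virtual threads the whole batch finishes in roughly the duration of one sleep.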
Futures vs. Virtual Threads: Internal Mechanism
Executors maintain a pre‑created pool of platform threads. Submitting a task returns a Future that completes when the task finishes. Project Loom moves this executor logic into the VM: the Thread API becomes transparent, so an instance may represent either a platform thread or a virtual thread. Blocking APIs are rewritten so that only the virtual thread blocks; the underlying platform thread remains free to run other virtual threads. Synchronized blocks are the one exception: they pin the virtual thread to its carrier, a limitation removed in Java 24 (JEP 491).
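The transparency of the Thread API can be observed directly with the JDK 21 thread builders; a minimal illustration:

```java
public class ThreadKindDemo {
    // Both builders yield an ordinary java.lang.Thread instance;
    // only isVirtual() reveals which kind of thread it is.
    static boolean[] kinds() {
        Thread platform = Thread.ofPlatform().unstarted(() -> {});
        Thread virtual = Thread.ofVirtual().unstarted(() -> {});
        return new boolean[] { platform.isVirtual(), virtual.isVirtual() };
    }

    public static void main(String[] args) {
        boolean[] k = kinds();
        System.out.println(k[0]); // false: a classic OS-backed thread
        System.out.println(k[1]); // true: a Loom virtual thread
    }
}
```

All existing code that accepts a Thread keeps working unchanged with either kind.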
Reactive Streams: Origin
Reactive Streams were drafted in late 2013, finalized in 2015, and incorporated into Java 9 (2017) as the java.util.concurrent.Flow interfaces. The specification defines three low‑level interfaces (Publisher, Subscriber, and Subscription) to enable asynchronous, non‑blocking data exchange with back‑pressure. In practice developers use higher‑level libraries such as Akka Streams, Vert.x, Helidon, RxJava, or Reactor (the latter used in Spring) to obtain declarative APIs, error‑handling utilities, and integration with I/O.
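A minimal sketch of the request-based back-pressure protocol, using only the JDK's own Flow interfaces and SubmissionPublisher (plain JDK, not one of the libraries above):

```java
import java.util.List;
import java.util.concurrent.CopyOnWriteArrayList;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.Flow;
import java.util.concurrent.SubmissionPublisher;

public class FlowApiDemo {
    // A subscriber that requests one element at a time: back-pressure is
    // expressed by how many elements it asks for via the Subscription.
    static List<Integer> collect(int... values) throws InterruptedException {
        List<Integer> received = new CopyOnWriteArrayList<>();
        CountDownLatch done = new CountDownLatch(1);
        try (SubmissionPublisher<Integer> publisher = new SubmissionPublisher<>()) {
            publisher.subscribe(new Flow.Subscriber<>() {
                private Flow.Subscription subscription;
                public void onSubscribe(Flow.Subscription s) {
                    subscription = s;
                    s.request(1); // ask for the first element
                }
                public void onNext(Integer item) {
                    received.add(item);
                    subscription.request(1); // ask for the next one
                }
                public void onError(Throwable t) { done.countDown(); }
                public void onComplete() { done.countDown(); }
            });
            for (int v : values) publisher.submit(v);
        } // close() signals onComplete to the subscriber
        done.await();
        return received;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(collect(1, 2, 3)); // [1, 2, 3]
    }
}
```

Even this trivial pipeline shows the inverted control flow that the higher-level libraries exist to hide: the data path is spread across four callbacks instead of a loop.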
Simple Flow Implementation (Jox)
The Jox library demonstrates a minimal direct‑style flow API that relies only on plain Java control‑flow constructs (if, for, while, try‑catch‑finally) and ordinary sequential statements. Each pipeline stage implements FlowStage<T> with a run(FlowEmit<T> emit) method; a Flow<T> wrapper holds the final stage and provides combinators.
public interface FlowStage<T> {
    void run(FlowEmit<T> emit) throws Exception;
}

public interface FlowEmit<T> {
    void apply(T t) throws Exception;
}

A factory method creates an infinite stream:
public class Flows {
    public static <T> Flow<T> iterate(T zero, Function<T, T> nextFn) {
        return new Flow<>(emit -> {
            T current = zero;
            while (true) {
                emit.apply(current);
                current = nextFn.apply(current);
            }
        });
    }
}

The terminal operator runToList() collects emitted elements into a List<T>:
public class Flow<T> {
    private final FlowStage<T> last;

    public Flow(FlowStage<T> last) { this.last = last; }

    public List<T> runToList() throws Exception {
        List<T> result = new ArrayList<>();
        last.run(result::add);
        return result;
    }
}

Common combinators are built by wrapping the previous stage. Example map implementation:
public class Flow<T> {
    public <U> Flow<U> map(ThrowingFunction<T, U> fn) {
        return new Flow<>(emit -> {
            last.run(t -> emit.apply(fn.apply(t)));
        });
    }
}

Using these combinators, a concise pipeline can be expressed:
List<Integer> result = Flows
    .iterate(1, i -> i + 1)
    .map(i -> i * 2)
    .filter(i -> i % 3 == 2)
    .take(10)
    .runToList();

Asynchronous Flow: Channels and Structured Concurrency
Concurrency is handled via virtual threads and Channel<T> primitives inspired by Go. A channel can be marked done (no more elements) or error (short‑circuit). Channels also support a select operation to receive from the first ready channel.
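The bounded-buffer and done semantics can be modeled in a few lines of plain Java; the ToyChannel below is an illustrative simplification, not the Jox API (which adds error propagation and select):

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

// A toy bounded channel with a "done" sentinel; illustrative only.
public class ToyChannel<T> {
    private static final Object DONE = new Object();
    private final BlockingQueue<Object> queue;

    public ToyChannel(int capacity) {
        queue = new ArrayBlockingQueue<>(capacity + 1); // room for DONE marker
    }

    public void send(T value) throws InterruptedException {
        queue.put(value); // blocks when the buffer is full -> back-pressure
    }

    public void done() throws InterruptedException {
        queue.put(DONE); // no more elements will follow
    }

    // Returns null once the channel is done.
    @SuppressWarnings("unchecked")
    public T receive() throws InterruptedException {
        Object o = queue.take();
        return o == DONE ? null : (T) o;
    }

    public static void main(String[] args) throws Exception {
        ToyChannel<Integer> ch = new ToyChannel<>(4);
        Thread producer = Thread.ofVirtual().start(() -> {
            try {
                for (int i = 1; i <= 3; i++) ch.send(i);
                ch.done();
            } catch (InterruptedException e) { throw new RuntimeException(e); }
        });
        List<Integer> got = new ArrayList<>();
        for (Integer v = ch.receive(); v != null; v = ch.receive()) got.add(v);
        producer.join();
        System.out.println(got); // [1, 2, 3]
    }
}
```

The real Channel signals done and error as first-class closed states rather than a null sentinel, which is what makes the selectOrClosed pattern matching shown later possible.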
Structured concurrency provides a supervised scope where child virtual threads (forked via scope.fork) are guaranteed to finish before the scope exits, even on failure:
var result = supervised(scope -> {
    var f1 = scope.fork(() -> { Thread.sleep(500); return 5; });
    var f2 = scope.fork(() -> { Thread.sleep(1000); return 6; });
    return f1.join() + f2.join();
});

Merging two flows creates a new flow that runs both sub‑flows in a supervised scope, each writing to its own channel, and then selects from the channels; in this simplified version the loop ends when the first channel signals done:
public Flow<T> merge(Flow<T> other) {
    return new Flow<>(emit -> {
        supervised(scope -> {
            Channel<T> c1 = this.runToChannel(scope);
            Channel<T> c2 = other.runToChannel(scope);
            boolean continueLoop = true;
            while (continueLoop) {
                switch (selectOrClosed(c1.receiveClause(), c2.receiveClause())) {
                    // simplified: the loop ends at the first done signal
                    case ChannelDone _ -> continueLoop = false;
                    case ChannelError e -> throw e.toException();
                    case Object r -> emit.apply((T) r);
                }
            }
            return null;
        });
    });
}

Back‑Pressure
Back‑pressure is implicit: channels have bounded buffers, and a blocked send call naturally pauses upstream production, keeping memory usage bounded.
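The same mechanism can be demonstrated with any bounded blocking queue; in the sketch below (plain JDK, illustrative names), the producer pauses as soon as the buffer fills and resumes only when the consumer drains it:

```java
import java.util.concurrent.ArrayBlockingQueue;

public class BackPressureDemo {
    // Returns the buffer size observed while the producer is blocked.
    static int blockedBufferSize() throws InterruptedException {
        var buffer = new ArrayBlockingQueue<Integer>(2); // bounded buffer
        var producer = Thread.ofVirtual().start(() -> {
            try {
                for (int i = 0; i < 5; i++) buffer.put(i); // blocks when full
            } catch (InterruptedException e) { throw new RuntimeException(e); }
        });
        producer.join(200); // producer fills the buffer, then parks on put()
        int size = buffer.size();
        while (producer.isAlive()) buffer.take(); // draining resumes it
        producer.join();
        return size;
    }

    public static void main(String[] args) throws InterruptedException {
        System.out.println(blockedBufferSize()); // 2: the full bounded buffer
    }
}
```

No request(n) protocol is needed: the blocked put() call is the back-pressure signal, and memory stays bounded by the buffer capacity.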
Error Handling
Synchronous flow errors propagate as ordinary exceptions through the emit call chain and are re‑thrown by terminal operators such as runForeach or runDiscard. In asynchronous flows, structured concurrency guarantees that if any child virtual thread throws, the whole scope is closed, the exception is wrapped, and it is re‑thrown only after all children have terminated.
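The wrapping behavior mirrors what plain executors already do with a failing child task; a simplified JDK-only sketch (not Jox's supervised scope):

```java
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.Executors;

public class ErrorPropagationDemo {
    // A child task's exception surfaces on the caller's thread, wrapped
    // in ExecutionException; the original cause is preserved.
    static String childFailureMessage() throws Exception {
        try (var executor = Executors.newVirtualThreadPerTaskExecutor()) {
            Callable<Integer> boom = () -> { throw new IllegalStateException("boom"); };
            var failing = executor.submit(boom);
            try {
                failing.get();
                return "no exception";
            } catch (ExecutionException e) {
                return e.getCause().getMessage();
            }
        }
    }

    public static void main(String[] args) throws Exception {
        System.out.println(childFailureMessage()); // boom
    }
}
```

A supervised scope goes further than this sketch: on a child failure it also interrupts the surviving children and waits for them before re-throwing, so no thread outlives the scope.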
Performance Evaluation
Benchmarks of the channel implementation show performance comparable to Java concurrent collections and Kotlin’s channel library, with a slight slowdown relative to Go’s native channels. HTTP‑server benchmarks comparing a Loom‑based implementation with a Reactive‑Streams‑based implementation (e.g., using Reactor) showed no statistically significant difference; both ultimately rely on OS‑level thread pools.
Conclusion
Jox proves that a virtual‑thread‑based library can match the capabilities of Reactive‑Streams libraries while offering a more straightforward direct‑style programming model. Virtual threads will not entirely replace the Reactive Streams specification, which remains valuable for interoperability, but they simplify writing resilient data‑processing pipelines. Mature libraries such as Akka Streams or Reactor still provide broader API surface and integration, yet the virtual‑thread ecosystem is expected to close the gap and may become the dominant approach for many developers.