Boost Java Parallel Stream Performance with Custom Thread Pools

This tutorial explains Java 8 parallel streams, their performance trade‑offs, and how to run them with a custom ForkJoinPool to improve speed while avoiding memory leaks, including code examples and best‑practice recommendations.

Cognitive Technology Team

1. Overview

Java 8 introduced the Stream API as an expressive way to perform bulk data operations, and parallel streams let those operations run across multiple threads.

These streams can bring performance improvements — but at the cost of multithreading overhead.

This quick tutorial shows one limitation of the Stream API and demonstrates how to make parallel streams work with a custom thread‑pool instance.

2. Parallel Stream

We start with a simple example that calls the parallelStream method on a collection, which returns a possibly parallel stream:

@Test
public void givenList_whenCallingParallelStream_shouldBeParallelStream() {
    List<Long> aList = new ArrayList<>();
    Stream<Long> parallelStream = aList.parallelStream();

    assertTrue(parallelStream.isParallel());
}

By default, such streams are processed by ForkJoinPool.commonPool(), a thread pool shared by the entire application.
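We can inspect this shared pool directly. By default its parallelism is one less than the number of available processors (with a minimum of one); a quick check, assuming a standard JVM with no `-Djava.util.concurrent.ForkJoinPool.common.parallelism` override:

```java
import java.util.concurrent.ForkJoinPool;

public class CommonPoolInfo {
    public static void main(String[] args) {
        // The shared pool used by parallel streams unless we supply our own
        ForkJoinPool commonPool = ForkJoinPool.commonPool();

        // Default parallelism: max(1, availableProcessors - 1)
        System.out.println("Cores: " + Runtime.getRuntime().availableProcessors());
        System.out.println("Common pool parallelism: " + commonPool.getParallelism());
    }
}
```

Because every parallel stream in the application competes for these same workers, one slow stream can delay all the others.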

3. Custom Thread Pool

We can pass a custom thread pool when processing a stream.

The following example shows how to use a custom thread pool to sum long values from 1 to 1,000,000 in parallel:

@Test
public void givenRangeOfLongs_whenSummedInParallel_shouldBeEqualToExpectedTotal()
    throws InterruptedException, ExecutionException {
    long firstNum = 1;
    long lastNum = 1_000_000;

    List<Long> aList = LongStream.rangeClosed(firstNum, lastNum).boxed()
        .collect(Collectors.toList());

    ForkJoinPool customThreadPool = new ForkJoinPool(4);
    long actualTotal = customThreadPool.submit(
        () -> aList.parallelStream().reduce(0L, Long::sum)).get();

    assertEquals((lastNum + firstNum) * lastNum / 2, actualTotal);
}

We use a ForkJoinPool with parallelism level 4; the optimal value depends on the environment, typically based on CPU core count.
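For CPU‑bound work, a common starting point is to size the pool to the machine's core count; a minimal sketch (the right number still needs measuring for your workload):

```java
import java.util.concurrent.ForkJoinPool;

public class PoolSizing {
    public static void main(String[] args) {
        // For CPU-bound work, one worker per core is a reasonable default
        int cores = Runtime.getRuntime().availableProcessors();
        ForkJoinPool cpuBoundPool = new ForkJoinPool(cores);
        try {
            System.out.println("Pool parallelism: " + cpuBoundPool.getParallelism());
        } finally {
            // Release the pool's threads once we are done with it
            cpuBoundPool.shutdown();
        }
    }
}
```

For I/O‑heavy work, a higher parallelism level can pay off, since workers spend much of their time blocked rather than computing.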

We then reduce the parallel stream to compute the sum.

This simple example may not fully showcase the benefits of a custom thread pool. Its advantages become clear when we do not want long‑running tasks (for example, processing data fetched over the network) to occupy the common pool, or when the common pool is already saturated by other components of the application.
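To illustrate the isolation point, here is a sketch that runs slow, blocking lookups on a dedicated pool so the common pool stays free for the rest of the application. The `slowLookup` method is a hypothetical stand‑in for a network call:

```java
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ForkJoinPool;

public class IsolateSlowWork {
    // Hypothetical stand-in for a slow, network-like call
    static long slowLookup(long id) {
        try {
            Thread.sleep(10); // simulate I/O latency
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return id * 2;
    }

    public static void main(String[] args)
        throws InterruptedException, ExecutionException {
        List<Long> ids = List.of(1L, 2L, 3L, 4L);

        // Dedicated pool for the blocking work: the common pool stays
        // available for other parallel streams in the application
        ForkJoinPool ioPool = new ForkJoinPool(4);
        try {
            long total = ioPool.submit(
                () -> ids.parallelStream()
                         .mapToLong(IsolateSlowWork::slowLookup)
                         .sum()).get();
            System.out.println("Total: " + total); // 2 + 4 + 6 + 8 = 20
        } finally {
            ioPool.shutdown();
        }
    }
}
```

Because the stream is started inside a task submitted to `ioPool`, its work is executed by that pool's workers rather than the common pool's.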

If we run the test method, it passes.

However, repeatedly instantiating ForkJoinPool objects in ordinary application code, without ever shutting them down, can exhaust memory over time.

4. Beware of Memory Leaks

By default the whole application uses the common thread pool, which is a static instance, so no memory leak occurs.

When we create a ForkJoinPool ourselves, it is not automatically shut down when the test method finishes: its worker threads keep it alive, so it is not garbage‑collected and sits idle waiting for new tasks.

Thus each test invocation creates a new custom pool that is never released.

The simple solution is to shut down the custom pool after use (note: frequent creation and destruction of thread pools also has a cost):

try {
    long actualTotal = customThreadPool.submit(
        () -> aList.parallelStream().reduce(0L, Long::sum)).get();
    assertEquals((lastNum + firstNum) * lastNum / 2, actualTotal);
} finally {
    customThreadPool.shutdown();
}
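Since Java 19, ExecutorService (and therefore ForkJoinPool) implements AutoCloseable, so on recent JDKs the cleanup can be written as a try‑with‑resources block instead; a sketch, assuming Java 19 or later (on older JDKs, keep the explicit finally block above):

```java
import java.util.List;
import java.util.concurrent.ExecutionException;
import java.util.concurrent.ForkJoinPool;
import java.util.stream.Collectors;
import java.util.stream.LongStream;

public class TryWithResourcesPool {
    public static void main(String[] args)
        throws InterruptedException, ExecutionException {
        List<Long> aList = LongStream.rangeClosed(1, 1_000).boxed()
            .collect(Collectors.toList());

        // close() (Java 19+) shuts the pool down and waits for termination
        try (ForkJoinPool customThreadPool = new ForkJoinPool(4)) {
            long total = customThreadPool.submit(
                () -> aList.parallelStream().reduce(0L, Long::sum)).get();
            System.out.println(total); // 500500
        }
    }
}
```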

5. Conclusion

We briefly covered how to run parallel streams with a custom thread pool, which can yield performance gains in suitable environments and with appropriate parallelism levels.

When creating a custom thread pool, remember to call its shutdown() method to avoid memory leaks.

Tags: Java, Performance, ForkJoinPool, Parallel Streams, custom-thread-pool