
Why Executors Should Not Be Used to Create Thread Pools: Understanding ThreadPoolExecutor and OOM Risks

This article explains the definition of thread pools, the pitfalls of using Executors to create them, details the ThreadPoolExecutor constructor and parameters, illustrates execution logic, demonstrates OOM scenarios with code examples, and provides guidelines for safely configuring thread pools in Java.

Architecture Digest

First, thank you for reading this article. You will learn: the definition of a thread pool, the ways Executors create thread pools, the ThreadPoolExecutor class, how the execution logic relates to the pool parameters, which Executors methods return ThreadPoolExecutor objects, and an OOM test.

Thread Pool Definition

A thread pool manages a group of worker threads. Reusing threads offers several advantages:

Reduces resource creation → lowers memory overhead.

Decreases system overhead → avoids latency caused by thread creation.

Improves stability → prevents unlimited thread creation that leads to OutOfMemoryError (OOM).

Ways Executors Create Thread Pools

Executors can create thread pools that return three types of objects:

ThreadPoolExecutor

ScheduledThreadPoolExecutor

ForkJoinPool

This article focuses only on the methods that return a ThreadPoolExecutor.

ThreadPoolExecutor Object

Before discussing Executors, we introduce ThreadPoolExecutor. It has four constructor overloads; the most complete one takes seven parameters, and the other three delegate to it:

public ThreadPoolExecutor(int corePoolSize,
                           int maximumPoolSize,
                           long keepAliveTime,
                           TimeUnit unit,
                           BlockingQueue<Runnable> workQueue,
                           ThreadFactory threadFactory,
                           RejectedExecutionHandler handler)

Parameter meanings:

corePoolSize → number of core threads.

maximumPoolSize → maximum number of threads.

keepAliveTime → idle thread lifetime.

unit → time unit.

workQueue → task queue used by the pool.

threadFactory → factory for creating new threads.

handler → policy for handling rejected tasks.
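As a sketch of how the seven parameters fit together (the sizes, queue capacity, and class name here are illustrative choices, not recommendations from the original article):

```java
import java.util.concurrent.*;

public class ExplicitPoolDemo {
    public static void main(String[] args) throws Exception {
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2,                                     // corePoolSize
                4,                                     // maximumPoolSize
                60L, TimeUnit.SECONDS,                 // keepAliveTime + unit
                new ArrayBlockingQueue<>(100),         // bounded workQueue
                Executors.defaultThreadFactory(),      // threadFactory
                new ThreadPoolExecutor.AbortPolicy()); // rejection handler

        pool.execute(() ->
                System.out.println("task ran on " + Thread.currentThread().getName()));

        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.SECONDS);
    }
}
```

Spelling out every parameter like this, rather than relying on an Executors factory method, is exactly the control this article argues for: the queue is bounded and the rejection behavior is explicit.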

Thread Pool Execution Logic and Parameter Relationship

The execution flow is:

If fewer than corePoolSize threads are running, a new core thread is created to run the task.

If the core threads are all occupied, the task is offered to the queue; if the queue is not full, it is enqueued.

If the queue is full and the pool has not reached maximumPoolSize, a new non‑core thread is created to run the task.

If the queue is full and the pool has reached maximumPoolSize, the task is handed to the rejection policy.
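The four steps above can be observed with a deliberately tiny pool (all sizes here are illustrative): one core thread, a queue of capacity 1, and a maximum of two threads, so a fourth concurrent task must be rejected.

```java
import java.util.concurrent.*;

public class FlowDemo {
    public static void main(String[] args) throws Exception {
        CountDownLatch hold = new CountDownLatch(1);
        // Each task blocks until released, so submissions pile up deterministically.
        Runnable blocker = () -> {
            try { hold.await(); } catch (InterruptedException ignored) {}
        };
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 2, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(1));

        pool.execute(blocker); // 1st: fewer than corePoolSize -> new core thread
        pool.execute(blocker); // 2nd: core busy, queue not full -> enqueued
        pool.execute(blocker); // 3rd: queue full, below maximumPoolSize -> non-core thread
        try {
            pool.execute(blocker); // 4th: queue and pool both full -> rejected
        } catch (RejectedExecutionException e) {
            System.out.println("4th task rejected: " + e.getClass().getSimpleName());
        }

        hold.countDown();
        pool.shutdown();
    }
}
```

The default AbortPolicy surfaces the overflow as a RejectedExecutionException on the fourth submission.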

Executors Methods That Return ThreadPoolExecutor

There are three Executors methods that create a ThreadPoolExecutor:

Executors#newCachedThreadPool → creates a cached thread pool.

Executors#newSingleThreadExecutor → creates a single‑thread pool.

Executors#newFixedThreadPool → creates a fixed‑size pool.

Executors#newCachedThreadPool

public static ExecutorService newCachedThreadPool() {
    return new ThreadPoolExecutor(0, Integer.MAX_VALUE,
                                 60L, TimeUnit.SECONDS,
                                 new SynchronousQueue<Runnable>());
}

Characteristics:

corePoolSize = 0

maximumPoolSize = Integer.MAX_VALUE (practically unlimited)

keepAliveTime = 60 seconds

workQueue = SynchronousQueue (no buffering)

Because the pool can create an unbounded number of threads, it easily leads to OOM in resource‑constrained environments.

Executors#newSingleThreadExecutor

public static ExecutorService newSingleThreadExecutor() {
    return new FinalizableDelegatedExecutorService(
        new ThreadPoolExecutor(1, 1,
                               0L, TimeUnit.MILLISECONDS,
                               new LinkedBlockingQueue<Runnable>()));
}

Characteristics:

corePoolSize = 1

maximumPoolSize = 1

keepAliveTime = 0

workQueue = LinkedBlockingQueue (unbounded)

The unbounded queue can accumulate unlimited tasks, causing OOM; the max pool size and keepAliveTime become ineffective.

Executors#newFixedThreadPool

public static ExecutorService newFixedThreadPool(int nThreads) {
    return new ThreadPoolExecutor(nThreads, nThreads,
                                 0L, TimeUnit.MILLISECONDS,
                                 new LinkedBlockingQueue<Runnable>());
}

Characteristics:

corePoolSize = nThreads

maximumPoolSize = nThreads

keepAliveTime = 0

workQueue = LinkedBlockingQueue (unbounded)

Like SingleThreadExecutor, the unbounded queue can cause OOM when resources are limited.
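To see the queue‑side risk (as opposed to the thread‑side risk of the cached pool), a small sketch: with the fixed pool's single worker kept busy, every further submission just grows the unbounded LinkedBlockingQueue. The task count of 10,000 below is arbitrary.

```java
import java.util.concurrent.*;

public class QueueGrowthDemo {
    public static void main(String[] args) throws Exception {
        // newFixedThreadPool returns a ThreadPoolExecutor, so the cast is safe.
        ThreadPoolExecutor pool = (ThreadPoolExecutor) Executors.newFixedThreadPool(1);
        CountDownLatch hold = new CountDownLatch(1);

        // Occupy the only worker thread.
        pool.execute(() -> {
            try { hold.await(); } catch (InterruptedException ignored) {}
        });

        // The worker is blocked, so these all pile up in the unbounded queue.
        for (int i = 0; i < 10_000; i++) {
            pool.execute(() -> {});
        }
        System.out.println("queued tasks: " + pool.getQueue().size());

        hold.countDown();
        pool.shutdown();
    }
}
```

With a default capacity of Integer.MAX_VALUE, nothing ever pushes back on the submitter; memory simply fills until OOM.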

Summary

FixedThreadPool and SingleThreadExecutor use an unbounded LinkedBlockingQueue (default capacity Integer.MAX_VALUE) → possible OOM.

CachedThreadPool can create an unbounded number of threads (Integer.MAX_VALUE) → possible OOM.

Therefore, using Executors to create thread pools is discouraged; creating ThreadPoolExecutor directly gives full control over parameters.

OOM Test

The following test demonstrates OOM when using a cached thread pool:

import java.util.concurrent.*;

public class TaskTest {
    // Each task blocks forever, so every submission pins a live thread.
    static class Task implements Runnable {
        Task(int id) {}
        public void run() {
            try { Thread.sleep(Long.MAX_VALUE); } catch (InterruptedException ignored) {}
        }
    }

    public static void main(String[] args) {
        ExecutorService es = Executors.newCachedThreadPool();
        int i = 0;
        while (true) {
            es.submit(new Task(i++));
        }
    }
}

Run the test in a resource‑constrained environment (e.g., a small heap such as -Xms10M -Xmx10M). Because the cached pool keeps creating new threads, the program throws an OutOfMemoryError (typically "unable to create new native thread") after creating tens of thousands of threads.

Exception: java.lang.OutOfMemoryError thrown from the UncaughtExceptionHandler in thread "main"
Disconnected from the target VM, address: '127.0.0.1:60416', transport: 'socket'

How to Define Thread Pool Parameters

CPU‑bound: pool size ≈ CPU count + 1 (obtainable via Runtime.getRuntime().availableProcessors()).

IO‑bound: pool size = CPU count × CPU utilization × (1 + waitTime / computeTime).

Hybrid: separate pools for CPU‑bound and IO‑bound tasks.

Work queue: prefer bounded queues to avoid resource exhaustion.

Rejection policy: the default is AbortPolicy; alternatives include CallerRunsPolicy, DiscardPolicy, DiscardOldestPolicy, or a custom RejectedExecutionHandler.
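As a small sketch of how the choice of policy changes behavior (the pool sizes here are illustrative): with CallerRunsPolicy, an overflow task is executed by the submitting thread itself instead of throwing, which naturally throttles the submitter.

```java
import java.util.concurrent.*;

public class CallerRunsDemo {
    public static void main(String[] args) throws Exception {
        CountDownLatch hold = new CountDownLatch(1);
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                1, 1, 0L, TimeUnit.MILLISECONDS,
                new ArrayBlockingQueue<>(1),
                new ThreadPoolExecutor.CallerRunsPolicy());

        // Fill the single worker and the single queue slot.
        Runnable blocker = () -> {
            try { hold.await(); } catch (InterruptedException ignored) {}
        };
        pool.execute(blocker);
        pool.execute(blocker);

        // Overflow task: CallerRunsPolicy runs it directly on the main thread.
        pool.execute(() ->
                System.out.println("overflow ran on " + Thread.currentThread().getName()));

        hold.countDown();
        pool.shutdown();
    }
}
```

The printed thread name is "main", confirming the caller absorbed the overflow instead of the pool rejecting it.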

When using Executors' static methods, you can also apply a Semaphore to limit task submission and prevent OOM.
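A sketch of that Semaphore technique (the limit of 100 and the task count of 1,000 are arbitrary): acquire a permit before submitting and release it when the task finishes, so at most 100 tasks are in flight even with an otherwise unbounded cached pool.

```java
import java.util.concurrent.*;

public class BoundedSubmitDemo {
    public static void main(String[] args) throws Exception {
        ExecutorService es = Executors.newCachedThreadPool();
        Semaphore permits = new Semaphore(100); // cap on in-flight tasks

        for (int i = 0; i < 1_000; i++) {
            permits.acquire();                  // blocks once 100 tasks are in flight
            es.submit(() -> {
                try {
                    // ... do the real work here ...
                } finally {
                    permits.release();          // free the slot for the next submission
                }
            });
        }

        es.shutdown();
        es.awaitTermination(1, TimeUnit.MINUTES);
        System.out.println("done, permits available: " + permits.availablePermits());
    }
}
```

The release in a finally block is essential: a task that throws must still return its permit, or the submitter eventually deadlocks on acquire().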

Author: He Tiantian (lv‑3) Source: https://juejin.im/post/5dc41c165188257bad4d9e69
Tags: Backend, Java, Concurrency, OOM, ThreadPoolExecutor, Executors
Written by Architecture Digest

Focusing on Java backend development, covering application architecture from top-tier internet companies (high availability, high performance, high stability), big data, machine learning, Java architecture, and other popular fields.