Types of Blocking Queues Used in Java Thread Pools
This article explains the different blocking queue implementations—ArrayBlockingQueue, LinkedBlockingQueue, SynchronousQueue, and PriorityBlockingQueue—used by Java thread pools, describing their structures, behavior, fairness settings, and performance characteristics for concurrent task execution.
Hello, I am Wufan.
Today I bring you a set of interview questions from major tech companies:
What kinds of blocking queues are used in thread pools?
A blocking queue stores tasks waiting for execution. When the number of running threads reaches the corePoolSize, subsequent tasks are placed into the blocking queue.
The thread pool can use the following blocking queues:
ArrayBlockingQueue
LinkedBlockingQueue
SynchronousQueue
PriorityBlockingQueue
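To make the relationship concrete, here is a minimal sketch of where the blocking queue plugs into a ThreadPoolExecutor. The pool sizes, timeout, and queue capacity are arbitrary values chosen for illustration, not recommendations:

```java
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.ThreadPoolExecutor;
import java.util.concurrent.TimeUnit;

public class PoolQueueDemo {
    public static void main(String[] args) {
        // Core pool of 2 threads, max 4; tasks submitted while all core
        // threads are busy wait in the bounded queue (capacity 10 here).
        ThreadPoolExecutor pool = new ThreadPoolExecutor(
                2, 4, 60L, TimeUnit.SECONDS,
                new ArrayBlockingQueue<>(10));
        for (int i = 0; i < 6; i++) {
            final int id = i;
            pool.execute(() ->
                System.out.println("task " + id + " on " + Thread.currentThread().getName()));
        }
        pool.shutdown();
    }
}
```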
ArrayBlockingQueue
It is a bounded blocking queue based on an array structure that orders elements in FIFO (first‑in‑first‑out) order.
ArrayBlockingQueue is implemented with an array and is bounded.
Insertion blocks when the queue is full; removal blocks when the queue is empty.
Elements are ordered FIFO.
By default it does not guarantee fair access.
Fair access means threads acquire the queue in the order they were blocked.
Unfair mode allows any blocked thread to compete for the queue when it becomes available, which may let a later‑blocked thread acquire it first.
Fair mode usually reduces throughput.
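A short sketch of the points above: the two-argument constructor's boolean toggles fair access, the queue blocks once full, and elements come out in FIFO order.

```java
import java.util.concurrent.ArrayBlockingQueue;

public class FairnessDemo {
    public static void main(String[] args) throws InterruptedException {
        // Second constructor argument enables fair (FIFO) access for
        // blocked threads; the single-argument constructor is unfair.
        ArrayBlockingQueue<String> fairQueue = new ArrayBlockingQueue<>(2, true);
        fairQueue.put("a");
        fairQueue.put("b");                    // queue is now full; a third put would block
        System.out.println(fairQueue.take()); // prints "a" — FIFO order
    }
}
```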
LinkedBlockingQueue
It is a blocking queue based on a linked‑list structure, also FIFO ordered, and usually offers higher throughput than ArrayBlockingQueue. The static factory method Executors.newFixedThreadPool() uses this queue.
Backed by a linked list; a bounded capacity can optionally be specified.
Insertion blocks when the queue is full; removal blocks when empty.
The default capacity is Integer.MAX_VALUE, making the queue effectively unbounded.
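The two construction modes can be sketched as follows; the capacity of 100 is an arbitrary example:

```java
import java.util.concurrent.LinkedBlockingQueue;

public class LinkedQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // No-arg constructor: capacity defaults to Integer.MAX_VALUE
        // (effectively unbounded — beware of memory pressure under load).
        LinkedBlockingQueue<Integer> unbounded = new LinkedBlockingQueue<>();

        // A capacity can also be supplied to make the queue bounded.
        LinkedBlockingQueue<Integer> bounded = new LinkedBlockingQueue<>(100);
        bounded.put(1);
        System.out.println(bounded.remainingCapacity()); // prints 99
    }
}
```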
SynchronousQueue
This queue does not store elements; each insert operation must wait for a corresponding remove operation, otherwise the insert remains blocked. It typically provides higher throughput than LinkedBlockingQueue. The static factory method Executors.newCachedThreadPool() uses this queue.
Acts like a perfect hand‑off: a producer must wait for a consumer to take the element.
Transfers data directly from producer to consumer.
After a put call, the queue remains empty.
Each put must be paired with a take to complete.
Suitable for hand‑off scenarios.
Throughput is typically higher than both ArrayBlockingQueue and LinkedBlockingQueue.
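The hand-off behavior described above can be sketched like this: a put blocks until a consumer's take pairs with it, and the queue itself never holds an element.

```java
import java.util.concurrent.SynchronousQueue;

public class HandOffDemo {
    public static void main(String[] args) throws InterruptedException {
        SynchronousQueue<String> queue = new SynchronousQueue<>();

        // Consumer thread: take() blocks until a producer hands an element over.
        Thread consumer = new Thread(() -> {
            try {
                System.out.println("received: " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        queue.put("hello");               // blocks until the consumer's take() pairs with it
        consumer.join();
        System.out.println(queue.size()); // prints 0 — the queue never stores elements
    }
}
```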
PriorityBlockingQueue
This is an unbounded blocking queue that orders its elements according to their priority.
Combines the features of PriorityQueue and BlockingQueue.
Elements are sorted by their natural ordering (via Comparable) by default.
You can customize the ordering by supplying a Comparator to the constructor.
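A minimal sketch of custom ordering: passing Comparator.reverseOrder() makes the largest element dequeue first instead of the natural smallest-first order. The initial capacity of 10 is just a starting hint, since the queue grows as needed:

```java
import java.util.Comparator;
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityDemo {
    public static void main(String[] args) {
        // Custom Comparator: the highest value is dequeued first.
        PriorityBlockingQueue<Integer> queue =
                new PriorityBlockingQueue<>(10, Comparator.reverseOrder());
        queue.put(3);
        queue.put(1);
        queue.put(2);
        System.out.println(queue.poll()); // prints 3
        System.out.println(queue.poll()); // prints 2
    }
}
```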
Reference: 45 diagrams explaining 18 types of queues
Thread pool question summary:
Principles of thread pools? (07‑30)
What blocking queues exist? (08‑01)
How to use thread pools?
What rejection policies are available?
How to configure thread pool parameters reasonably?
How to monitor thread pools?
Executor framework?
What types of thread pools does Executor provide?
Wukong Talks Architecture
Explaining distributed systems and architecture through stories. Author of the "JVM Performance Tuning in Practice" column, open-source author of "Spring Cloud in Practice PassJava", and independently developed a PMP practice quiz mini-program.