
Mastering Java Concurrent Queues: When to Use Blocking, Non‑Blocking, and Transfer Queues

This article provides a comprehensive overview of Java's concurrent queue implementations, explaining the differences between blocking and non‑blocking queues, their underlying mechanisms, typical use‑cases, and code examples for classes such as ArrayBlockingQueue, LinkedBlockingQueue, PriorityBlockingQueue, DelayQueue, SynchronousQueue, LinkedTransferQueue, and LinkedBlockingDeque.


Preface

JUC utilities can be roughly divided into six categories: executors & thread pools, concurrent queues, synchronization tools, concurrent collections, locks, and atomic variables.

The current series mainly covers executors & thread pools, synchronization tools, and locks, and frequently mentions queues, so this article gives a high‑level overview of the various concurrent queues in Java.

Concurrent Queues

Java concurrent queues are classified by implementation into two groups: blocking queues and non‑blocking queues.

The former rely on locks, the latter on CAS (compare‑and‑swap) algorithms.

Common queues include: ArrayBlockingQueue, LinkedBlockingQueue, PriorityBlockingQueue, DelayQueue, SynchronousQueue, LinkedTransferQueue, and LinkedBlockingDeque.

Why are there so many different queue types? Because each queue is designed to handle specific concurrency scenarios, similar to how different locks address different synchronization needs.

ArrayBlockingQueue

This is a bounded, array‑based blocking queue. Its capacity is set when the queue is constructed.

<code>public ArrayBlockingQueue(int capacity, boolean fair) {
    if (capacity <= 0)
        throw new IllegalArgumentException();
    this.items = new Object[capacity];
    lock = new ReentrantLock(fair);
    notEmpty = lock.newCondition();
    notFull = lock.newCondition();
}</code>

By default it uses a non‑fair lock, meaning threads are not guaranteed to acquire the lock in the order they arrived.

Why does the default use a non‑fair lock? What advantages does it bring, and what potential problems can arise?
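A minimal sketch of the bounded behaviour (class and element names are illustrative): put blocks when the queue is full, while offer reports rejection immediately instead of blocking.

```java
import java.util.concurrent.ArrayBlockingQueue;

public class ArrayBlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // Capacity fixed at 2; fairness flag defaults to false (non-fair lock)
        ArrayBlockingQueue<String> queue = new ArrayBlockingQueue<>(2);

        queue.put("a");                      // succeeds immediately
        queue.put("b");                      // queue is now full
        boolean accepted = queue.offer("c"); // returns false instead of blocking
        System.out.println(accepted);        // false

        System.out.println(queue.take());    // "a" (FIFO order)
    }
}
```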

LinkedBlockingQueue

This is an optionally-bounded blocking queue backed by a linked list. Its default maximum capacity is Integer.MAX_VALUE, which is why it is described as "optionally-bounded".

<code>public LinkedBlockingQueue() {
    this(Integer.MAX_VALUE);
}

public LinkedBlockingQueue(int capacity) {
    if (capacity <= 0) throw new IllegalArgumentException();
    this.capacity = capacity;
    last = head = new Node<E>(null);
}</code>
Linked lists make insertion and removal at the ends cheap, and LinkedBlockingQueue uses separate locks for put and take, so producers and consumers can proceed concurrently. ArrayBlockingQueue's preallocated array and single lock, on the other hand, can offer more predictable performance and less allocation overhead.
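A short sketch (class name is illustrative) contrasting the two constructors: the no-argument form is effectively unbounded, while the bounded form rejects offers once capacity is reached.

```java
import java.util.concurrent.LinkedBlockingQueue;

public class LinkedBlockingQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        // Default constructor: capacity is Integer.MAX_VALUE, effectively unbounded
        LinkedBlockingQueue<Integer> unbounded = new LinkedBlockingQueue<>();
        unbounded.put(1); // never blocks in practice

        // Bounded variant: offer() reports rejection instead of blocking
        LinkedBlockingQueue<Integer> bounded = new LinkedBlockingQueue<>(1);
        bounded.put(1);
        System.out.println(bounded.offer(2)); // false: capacity reached
    }
}
```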

PriorityBlockingQueue

This unbounded blocking queue orders its elements according to their natural ordering or a supplied Comparator.

Null elements are not allowed, and elements that cannot be compared with one another cause a ClassCastException.

Elements with equal priority may be ordered arbitrarily unless a tie-breaker is provided, as in the FIFOEntry wrapper below.

<code>class FIFOEntry<E extends Comparable<? super E>> implements Comparable<FIFOEntry<E>> {
    static final AtomicLong seq = new AtomicLong(0);
    final long seqNum;   // monotonically increasing insertion number
    final E entry;
    public FIFOEntry(E entry) {
        seqNum = seq.getAndIncrement();
        this.entry = entry;
    }
    public int compareTo(FIFOEntry<E> other) {
        int res = entry.compareTo(other.entry);
        // Equal priority: fall back to insertion order (FIFO tie-breaker)
        if (res == 0 && other.entry != this.entry)
            res = (seqNum < other.seqNum ? -1 : 1);
        return res;
    }
}</code>
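A quick sketch of the basic ordering behaviour (class name is illustrative): regardless of insertion order, poll always returns the smallest element first.

```java
import java.util.concurrent.PriorityBlockingQueue;

public class PriorityBlockingQueueDemo {
    public static void main(String[] args) {
        // Unbounded; Integer's natural ordering puts the smallest value first
        PriorityBlockingQueue<Integer> queue = new PriorityBlockingQueue<>();
        queue.put(3);
        queue.put(1);
        queue.put(2);
        // poll() returns the current minimum, not insertion order
        System.out.println(queue.poll()); // 1
        System.out.println(queue.poll()); // 2
        System.out.println(queue.poll()); // 3
    }
}
```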
What criteria do you use to choose an initial capacity for a collection?

DelayQueue

A delay queue is an unbounded blocking queue of elements that implement the Delayed interface. Elements become available for retrieval only after their delay has expired.

<code>public long getDelay(TimeUnit unit) {
    // best to use nanoseconds for precision
    return unit.convert(time - now(), NANOSECONDS);
}</code>
<code>public int compareTo(Delayed other) {
    if (other == this) return 0;
    if (other instanceof ScheduledFutureTask) {
        ScheduledFutureTask<?> x = (ScheduledFutureTask<?>) other;
        long diff = time - x.time;
        if (diff < 0) return -1;
        if (diff > 0) return 1;
        return (sequenceNumber < x.sequenceNumber ? -1 : 1);
    }
    long diff = getDelay(NANOSECONDS) - other.getDelay(NANOSECONDS);
    return (diff < 0) ? -1 : (diff > 0) ? 1 : 0;
}</code>

Typical use cases include cache expiration and scheduled task execution.
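A minimal end-to-end sketch of the cache-expiration pattern (DelayedItem is a hypothetical element type, not part of the JDK): each element records its expiry time, and take blocks until the earliest delay has elapsed.

```java
import java.util.concurrent.DelayQueue;
import java.util.concurrent.Delayed;
import java.util.concurrent.TimeUnit;

public class DelayQueueDemo {
    // Hypothetical element: becomes available delayMillis after creation
    static class DelayedItem implements Delayed {
        final String name;
        final long expireAtNanos;

        DelayedItem(String name, long delayMillis) {
            this.name = name;
            this.expireAtNanos = System.nanoTime() + TimeUnit.MILLISECONDS.toNanos(delayMillis);
        }

        @Override
        public long getDelay(TimeUnit unit) {
            // Remaining delay, computed in nanoseconds for precision
            return unit.convert(expireAtNanos - System.nanoTime(), TimeUnit.NANOSECONDS);
        }

        @Override
        public int compareTo(Delayed other) {
            return Long.compare(getDelay(TimeUnit.NANOSECONDS),
                                other.getDelay(TimeUnit.NANOSECONDS));
        }
    }

    public static void main(String[] args) throws InterruptedException {
        DelayQueue<DelayedItem> queue = new DelayQueue<>();
        queue.put(new DelayedItem("slow", 200));
        queue.put(new DelayedItem("fast", 50));
        // take() blocks until the earliest delay expires, so "fast" comes out first
        System.out.println(queue.take().name); // fast
        System.out.println(queue.take().name); // slow
    }
}
```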

SynchronousQueue

This queue does not store elements; each put must wait for a corresponding take, and vice versa.

<code>ExecutorService executor = Executors.newFixedThreadPool(2);
SynchronousQueue<Integer> queue = new SynchronousQueue<>();

Runnable producer = () -> {
    Integer e = ThreadLocalRandom.current().nextInt();
    try { queue.put(e); } catch (InterruptedException ex) { Thread.currentThread().interrupt(); }
};

Runnable consumer = () -> {
    try { Integer e = queue.take(); } catch (InterruptedException ex) { Thread.currentThread().interrupt(); }
};

// put() blocks until the consumer's take() arrives, and vice versa
executor.submit(producer);
executor.submit(consumer);
executor.shutdown();</code>

The cached thread pool created by Executors.newCachedThreadPool() internally uses a SynchronousQueue: because the pool can create threads on demand (up to Integer.MAX_VALUE), tasks are handed off directly to a worker thread without being queued.

Why does newCachedThreadPool use a SynchronousQueue, while newSingleThreadExecutor and newFixedThreadPool use a LinkedBlockingQueue?

LinkedTransferQueue

This queue adds three methods beyond a regular blocking queue: transfer, tryTransfer, and a timed tryTransfer.

transfer blocks until a consumer takes the element, providing hand-off semantics even though the queue itself is unbounded.

A BlockingQueue blocks producers when the queue is full; a TransferQueue's transfer blocks when no consumer is waiting to receive the element.
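A small sketch of the hand-off semantics (class name is illustrative): transfer does not return until a consumer has received the element, while tryTransfer fails immediately when no consumer is waiting.

```java
import java.util.concurrent.LinkedTransferQueue;

public class TransferQueueDemo {
    public static void main(String[] args) throws InterruptedException {
        LinkedTransferQueue<String> queue = new LinkedTransferQueue<>();

        // Consumer thread: takes a single element
        Thread consumer = new Thread(() -> {
            try {
                System.out.println("got: " + queue.take());
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        // transfer() blocks until the consumer has received the element,
        // even though the queue itself has no capacity limit
        queue.transfer("hand-off");
        consumer.join();

        // tryTransfer() returns false immediately when no consumer is waiting,
        // and the element is NOT enqueued
        System.out.println(queue.tryTransfer("nobody-waiting")); // false
    }
}
```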

LinkedBlockingDeque

A double-ended blocking queue backed by a linked list. It supports insertion and removal at both the head and the tail, which enables patterns such as work stealing, where one thread takes from one end while others insert at the other.

Note, however, that LinkedBlockingDeque guards both ends with a single lock, so the extra entry point improves flexibility more than raw throughput under heavy concurrent enqueues.
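A minimal sketch of the double-ended operations (class name is illustrative): elements can be added and taken from either end.

```java
import java.util.concurrent.LinkedBlockingDeque;

public class DequeDemo {
    public static void main(String[] args) throws InterruptedException {
        LinkedBlockingDeque<String> deque = new LinkedBlockingDeque<>();
        deque.putFirst("b");  // deque: [b]
        deque.putFirst("a");  // deque: [a, b]
        deque.putLast("c");   // deque: [a, b, c]
        System.out.println(deque.takeFirst()); // "a"
        System.out.println(deque.takeLast());  // "c"
    }
}
```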

Conclusion

This article quickly categorises and clarifies the purpose of the various Java concurrent queues, helping readers understand their design, blocking behaviours, and appropriate application scenarios. By reviewing the source code of these queues, developers can more easily select the right implementation for their concurrency problems.

Tags: Java, concurrency, thread pool, JUC, BlockingQueue
Written by

macrozheng

Dedicated to Java tech sharing and dissecting top open-source projects. Topics include Spring Boot, Spring Cloud, Docker, Kubernetes and more. Author’s GitHub project “mall” has 50K+ stars.
