Sooner or later, Java developers have to deal with the abstract data types queue, deque, and stack. This article explains the basic functionality of these data structures and gives a detailed overview of all implementations available in the JDK. Numerous code examples make it easier to follow along.
In detail, the article answers the following questions:
- How do queues, deques, and stacks work?
- Which queue, deque, and stack implementations does the JDK provide, and how do they differ?
- Which implementation should you use for which purpose?
As always, you can find the code samples in my GitHub repository.
In this chapter, I describe the general operation of queues, deques, and stacks. General means: independent of a specific programming language.
A queue is a list of elements where elements are inserted on one side and removed in the same order on the other side:
A queue provides the following operations:
- Enqueue: inserts an element at one end (the tail) of the queue.
- Dequeue: removes and returns the element at the other end (the head) of the queue.
- Examine (peek): returns the element at the head of the queue without removing it.
(The corresponding methods of Java queue implementations have different names, as you will see below.)
Since the order of the elements is maintained, a queue is also called a FIFO list ("first-in-first-out").
A stack is a list of elements in which elements are inserted ("stacked") and removed on the same side (in representations classically at the top):
The stack's operations are:
- Push: places an element on top of the stack.
- Pop: removes and returns the element on top of the stack.
- Peek (examine): returns the top element without removing it.
Since the last element added is the first one removed, a stack is also called a LIFO list ("last-in-first-out").
A deque (Double-ended queue, pronounced "deck") is a list of elements where the elements can be inserted and removed both on one side and on the other:
The operations are "enqueue" and "dequeue" on both sides, similar to the queue:
(As with the queue, the corresponding methods of the Java deque implementations have different names, more on this below.)
A deque can be used both as a queue (FIFO) and as a stack (LIFO). It is, however, not limited to this, since elements can be added to and removed from any side at any time.
If a queue can only hold a limited number of elements, it is called a "bounded queue". The maximum number of elements is called "capacity" and is defined once in the queue's constructor.
If the number of elements in the queue is not limited (or only limited by the available memory), it is called an "unbounded queue".
The same definition applies to deques and stacks.
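To illustrate the difference, here is a minimal sketch (my own, not from the article's repository) using two implementations that are introduced later in this article; the capacity of 1,000 is an arbitrary example value:

// Bounded queue: the capacity is fixed once in the constructor.
BlockingQueue<String> boundedQueue = new ArrayBlockingQueue<>(1000);

// Unbounded queue: limited only by the available memory.
Queue<String> unboundedQueue = new ConcurrentLinkedQueue<>();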
The following exceptional cases can occur in the queue operations presented:
- Enqueue on a full (bounded) queue.
- Dequeue or examine on an empty queue.

In a non-blocking queue, those operations – depending on the implementation (more on this later) – either return a specific value (false or null) or throw an exception.
A blocking queue provides the following additional blocking operations:
- A blocking enqueue operation that, on a full queue, waits until space becomes available.
- A blocking dequeue operation that, on an empty queue, waits until an element becomes available.
Blocked methods are not automatically resumed in the order in which they were called. Processing in call sequence can be activated for some queue implementations by an optional "fairness policy". However, this increases the overhead and thus reduces the throughput of the queue.
This definition also applies to deques and stacks.
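As a small illustration (my own snippet, not part of the article's examples): for ArrayBlockingQueue – one of the implementations with an optional fairness policy, presented in detail later – the policy is activated via a constructor flag:

// The second constructor argument activates the fairness policy:
// blocked threads are resumed in the order in which they arrived,
// at the cost of additional overhead and lower throughput.
BlockingQueue<Integer> fairQueue = new ArrayBlockingQueue<>(10, true);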
Before I introduce the Java Queue
and Deque
interfaces in detail, I'd like to provide an overview in the form of a UML class diagram:
I will describe the interfaces in the following chapters and explain them with code examples.
Since Java 5.0, the JDK contains the interface java.util.Queue
and several queue implementations, which differ in various properties (such as bounded/unbounded, blocking/non-blocking, thread-safe/not thread-safe).
The java.util.Queue
interface (→ Javadoc) inherits from java.util.Collection
and extends it with methods to add and remove elements and to view the element at the head of the queue without removing it.
There are two methods for each of the three operations, which differ only in error handling: One method throws an exception in case of an error, the other returns a specific return value (false
or null
). Errors can occur, for example, if a bounded queue is full, or if you try to remove an element from an empty queue.
|                          | Throws exception | Return value |
|--------------------------|------------------|--------------|
| Add element (enqueue)    | add(E e)         | offer(E e)   |
| Remove element (dequeue) | remove()         | poll()       |
| View element (examine)   | element()        | peek()       |
Here is a simple example (→ code in GitHub):
public class QueueExample {
public static void main(String[] args) {
Queue<Integer> queue = new ConcurrentLinkedQueue<>();
queue.offer(1);
queue.offer(2);
queue.offer(3);
System.out.println("queue = " + queue);
System.out.println("queue.peek() = " + queue.peek());
System.out.println("queue.poll() = " + queue.poll());
System.out.println("queue.poll() = " + queue.poll());
System.out.println("queue.poll() = " + queue.poll());
System.out.println("queue.poll() = " + queue.poll());
System.out.println("queue.remove() = " + queue.remove());
}
}
The program uses offer()
to add the elements 1, 2, and 3 to the queue, then prints the head element using peek()
and then removes all elements using poll()
. After every element has been removed, the program calls poll()
again and then remove()
.
The program prints out the following:
queue = [1, 2, 3]
queue.peek() = 1
queue.poll() = 1
queue.poll() = 2
queue.poll() = 3
queue.poll() = null
Exception in thread "main" java.util.NoSuchElementException
	at [...]
As you can see, the elements are printed in FIFO order, i.e., in the same order as they were inserted. You can also see the difference between poll()
and remove()
: While poll()
returns null
when the queue is empty, remove()
throws a NoSuchElementException
.
Similarly, peek()
would return null
if the queue was empty and element()
would throw a NoSuchElementException
.
The interface java.util.concurrent.BlockingQueue
(→ Javadoc), also available since Java 5.0, extends Queue
with additional blocking operations that wait for an element to be available when removing an element from an empty queue – or wait until free space is available when inserting an element into a full queue.
Both operations are available in two variants. One that waits indefinitely, and one that gives up and returns false
or null
after a specified waiting time (timeout). There are no blocking methods that throw exceptions.
|                          | Throws exception (inherited from Queue) | Return value (inherited from Queue) | Blocks (new in BlockingQueue) | Timeout (new in BlockingQueue) |
|---|---|---|---|---|
| Add element (enqueue)    | add(E e)  | offer(E e) | put(E e) | offer(E e, long timeout, TimeUnit unit) |
| Remove element (dequeue) | remove()  | poll()     | take()   | poll(long timeout, TimeUnit unit) |
| View element (examine)   | element() | peek()     | –        | – |
Here is an example, which is much more complex than the first one due to its concurrency (→ code in GitHub):
public class BlockingQueueExample {
private static final long startTime = System.currentTimeMillis();
public static void main(String[] args) {
BlockingQueue<Integer> queue = new LinkedBlockingQueue<>(3);
ScheduledExecutorService pool = Executors.newScheduledThreadPool(10);
// Start reading from the queue immediately, every 3 seconds
for (int i = 0; i < 10; i++) {
pool.schedule(() -> dequeue(queue), i * 3, TimeUnit.SECONDS);
}
// Start writing to the queue after 3.5 seconds (so there are already 2
// threads waiting), every 1 seconds (so that the queue fills faster than
// it's emptied, so that we see a full queue soon)
for (int i = 0; i < 10; i++) {
int finalI = i;
pool.schedule(() -> enqueue(queue, finalI),
3500 + i * 1000, TimeUnit.MILLISECONDS);
}
pool.shutdown();
try {
pool.awaitTermination(1, TimeUnit.MINUTES);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
private static void dequeue(BlockingQueue<Integer> queue) {
log(" Calling queue.take() (queue = %s)...", queue);
try {
Integer e = queue.take();
log(" queue.take() returned %d (queue = %s)", e, queue);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
private static void enqueue(BlockingQueue<Integer> queue, int i) {
log("Calling queue.put(%d) (queue = %s)...", i, queue);
try {
queue.put(i);
log("queue.put(%d) returned (queue = %s)", i, queue);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
private static void log(String format, Object... args) {
System.out.printf(Locale.US, "[%4.1fs] [%-16s] %s%n",
(System.currentTimeMillis() - startTime) / 1000.0,
Thread.currentThread().getName(),
String.format(format, args));
}
}
In this example, we create a blocking, bounded queue with a capacity of 3 and schedule ten enqueue and ten dequeue operations.
The enqueue operations start later, so we can see blocking dequeue operations at the beginning. Furthermore, the enqueue operations occur in shorter intervals, so that the capacity limit of the queue is reached after a while, and we see blocking enqueue operations.
Here the output of the program:
[ 0.0s] [pool-1-thread-1 ] Calling queue.take() (queue = [])...
[ 3.0s] [pool-1-thread-3 ] Calling queue.take() (queue = [])...
[ 3.5s] [pool-1-thread-2 ] Calling queue.put(0) (queue = [])...
[ 3.5s] [pool-1-thread-2 ] queue.put(0) returned (queue = [])
[ 3.5s] [pool-1-thread-1 ] queue.take() returned 0 (queue = [])
[ 4.5s] [pool-1-thread-10] Calling queue.put(1) (queue = [])...
[ 4.5s] [pool-1-thread-10] queue.put(1) returned (queue = [])
[ 4.5s] [pool-1-thread-3 ] queue.take() returned 1 (queue = [])
[ 5.5s] [pool-1-thread-8 ] Calling queue.put(2) (queue = [])...
[ 5.5s] [pool-1-thread-8 ] queue.put(2) returned (queue = [2])
[ 6.0s] [pool-1-thread-9 ] Calling queue.take() (queue = [2])...
[ 6.0s] [pool-1-thread-9 ] queue.take() returned 2 (queue = [])
[ 6.5s] [pool-1-thread-5 ] Calling queue.put(3) (queue = [])...
[ 6.5s] [pool-1-thread-5 ] queue.put(3) returned (queue = [3])
[ 7.5s] [pool-1-thread-6 ] Calling queue.put(4) (queue = [3])...
[ 7.5s] [pool-1-thread-6 ] queue.put(4) returned (queue = [3, 4])
[ 8.5s] [pool-1-thread-7 ] Calling queue.put(5) (queue = [3, 4])...
[ 8.5s] [pool-1-thread-7 ] queue.put(5) returned (queue = [3, 4, 5])
[ 9.0s] [pool-1-thread-4 ] Calling queue.take() (queue = [3, 4, 5])...
[ 9.0s] [pool-1-thread-4 ] queue.take() returned 3 (queue = [4, 5])
[ 9.5s] [pool-1-thread-2 ] Calling queue.put(6) (queue = [4, 5])...
[ 9.5s] [pool-1-thread-2 ] queue.put(6) returned (queue = [4, 5, 6])
[10.5s] [pool-1-thread-1 ] Calling queue.put(7) (queue = [4, 5, 6])...
[11.5s] [pool-1-thread-10] Calling queue.put(8) (queue = [4, 5, 6])...
[12.0s] [pool-1-thread-3 ] Calling queue.take() (queue = [4, 5, 6])...
[12.0s] [pool-1-thread-3 ] queue.take() returned 4 (queue = [5, 6, 7])
[12.0s] [pool-1-thread-1 ] queue.put(7) returned (queue = [5, 6, 7])
[12.5s] [pool-1-thread-8 ] Calling queue.put(9) (queue = [5, 6, 7])...
[15.0s] [pool-1-thread-9 ] Calling queue.take() (queue = [5, 6, 7])...
[15.0s] [pool-1-thread-9 ] queue.take() returned 5 (queue = [6, 7, 8])
[15.0s] [pool-1-thread-10] queue.put(8) returned (queue = [6, 7, 8])
[18.0s] [pool-1-thread-5 ] Calling queue.take() (queue = [6, 7, 8])...
[18.0s] [pool-1-thread-5 ] queue.take() returned 6 (queue = [7, 8, 9])
[18.0s] [pool-1-thread-8 ] queue.put(9) returned (queue = [7, 8, 9])
[21.0s] [pool-1-thread-6 ] Calling queue.take() (queue = [7, 8, 9])...
[21.0s] [pool-1-thread-6 ] queue.take() returned 7 (queue = [8, 9])
[24.0s] [pool-1-thread-7 ] Calling queue.take() (queue = [8, 9])...
[24.0s] [pool-1-thread-7 ] queue.take() returned 8 (queue = [9])
[27.0s] [pool-1-thread-4 ] Calling queue.take() (queue = [9])...
[27.0s] [pool-1-thread-4 ] queue.take() returned 9 (queue = [])
In the beginning, the queue is empty so that the first two read attempts (after 0 and 3 s) are blocked.
After 3.5 s (when two reading threads are waiting at the queue), the program starts writing to the queue every second. You can see in the output how each reading thread resumes and immediately removes the element added by a writing thread (at 3.5 and 4.5 s).
Since the program writes to the queue three times as fast as it reads from it, the attempt to write a 7 to the queue is blocked after 10.5 s, because it has reached its capacity limit of 3 with the elements [4, 5, 6].
Only once the 4 has been removed from the queue after 12 seconds, the 7 can be inserted. For the 8 and the 9, we see similar behavior.
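The example uses only the indefinitely blocking put() and take() methods. The following minimal sketch (mine, not part of the GitHub examples) shows the timeout variants from the table above; the capacity of 1 and the timeout of 500 ms are arbitrary values chosen for the demonstration:

public class BlockingQueueTimeoutExample {
  public static void main(String[] args) throws InterruptedException {
    BlockingQueue<Integer> queue = new LinkedBlockingQueue<>(1);

    // Empty queue --> poll() waits up to 500 ms, then gives up and returns null
    System.out.println("queue.poll(...)     = " + queue.poll(500, TimeUnit.MILLISECONDS));

    // Fill the queue up to its capacity of 1
    queue.put(1);

    // Full queue --> offer() waits up to 500 ms, then gives up and returns false
    System.out.println("queue.offer(2, ...) = " + queue.offer(2, 500, TimeUnit.MILLISECONDS));
  }
}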
In JDK 6.0, the interface java.util.Deque
and several Deque implementations were added. Like the queue implementations, these differ in the characteristics bounded/unbounded, blocking/non-blocking, and thread-safe/not thread-safe.
The interface java.util.Deque
(→ JavaDoc) inherits from java.util.Queue
and extends it with methods to insert elements at the front of the deque, remove elements from the back of the deque, and view the element at the back of the deque without removing it.
Again, these methods differ in whether they throw an exception or return a specific value in case of an error.
For consistency reasons, those operations that Deque
inherits from Queue
have been reimplemented with new names (for example, Queue.offer()
as Deque.offerLast()
and Queue.poll()
as Deque.pollFirst()
). You can recognize these cases by the fact that two method names are given in the following table.
|                          | Front: throws exception | Front: return value | Back: throws exception | Back: return value |
|---|---|---|---|---|
| Add element (enqueue)    | addFirst(E e)           | offerFirst(E e)     | addLast(E e) add(E e)  | offerLast(E e) offer(E e) |
| Remove element (dequeue) | removeFirst() remove()  | pollFirst() poll()  | removeLast()           | pollLast() |
| View element (examine)   | getFirst() element()    | peekFirst() peek()  | getLast()              | peekLast() |
Here is a code example that produces the deque that was shown at the beginning of the article (→ code in GitHub):
public class DequeExample {
public static void main(String[] args) {
Deque<Integer> dequeue = new ArrayDeque<>();
dequeue.offerFirst(20); // -> 20
dequeue.offerFirst(21); // -> 21 20
dequeue.offerFirst(22); // -> 22 21 20
dequeue.offerLast(23); // -> 22 21 20 23
dequeue.offerLast(24); // -> 22 21 20 23 24
dequeue.offerLast(25); // -> 22 21 20 23 24 25
dequeue.offerFirst(26); // -> 26 22 21 20 23 24 25
dequeue.offerFirst(27); // -> 27 26 22 21 20 23 24 25
System.out.println("dequeue = " + dequeue);
System.out.println();
System.out.println("dequeue.offerLast(28) = " + dequeue.offerLast(28));
System.out.println("dequeue.offerFirst(29) = " + dequeue.offerFirst(29));
System.out.println("dequeue = " + dequeue);
while (!dequeue.isEmpty()) {
System.out.println();
System.out.println("dequeue.pollFirst() = " + dequeue.pollFirst());
System.out.println("dequeue.pollLast() = " + dequeue.pollLast());
System.out.println("dequeue = " + dequeue);
}
}
}
With several calls to offerFirst()
and offerLast()
, the program fills the deque, prints it, and then alternately removes elements from both sides until the deque is empty.
This is the output of the program:
dequeue = [27, 26, 22, 21, 20, 23, 24, 25]
dequeue.offerLast(28) = true
dequeue.offerFirst(29) = true
dequeue = [29, 27, 26, 22, 21, 20, 23, 24, 25, 28]
dequeue.pollFirst() = 29
dequeue.pollLast() = 28
dequeue = [27, 26, 22, 21, 20, 23, 24, 25]
dequeue.pollFirst() = 27
dequeue.pollLast() = 25
dequeue = [26, 22, 21, 20, 23, 24]
dequeue.pollFirst() = 26
dequeue.pollLast() = 24
dequeue = [22, 21, 20, 23]
dequeue.pollFirst() = 22
dequeue.pollLast() = 23
dequeue = [21, 20]
dequeue.pollFirst() = 21
dequeue.pollLast() = 20
dequeue = []
If you look closely, you will notice that ArrayDeque.toString() prints the elements in the opposite order (namely from head to tail) compared to how they were shown in the image at the beginning of the article and how they were printed in the queue example by ConcurrentLinkedQueue.toString() (from tail to head).
Similar to BlockingQueue
, java.util.concurrent.BlockingDeque
(→ Javadoc) offers additional blocking operations that wait until an element is available when removing an element from the deque, or wait until space is available when inserting an element.
Again for consistency reasons, the operations that BlockingDeque
inherits from BlockingQueue
have been reimplemented with new names. You can recognize these cases by the fact that two method names are listed in a single table field.
I have divided the now 30 methods into two tables.
Operations at the front (head) of the deque:

|                          | Throws exception (inherited from Deque) | Return value (inherited from Deque) | Blocks (new in BlockingDeque) | Timeout (new in BlockingDeque) |
|---|---|---|---|---|
| Add element (enqueue)    | addFirst(E e)          | offerFirst(E e)    | putFirst(E e)       | offerFirst(E e, long timeout, TimeUnit unit) |
| Remove element (dequeue) | removeFirst() remove() | pollFirst() poll() | takeFirst() take()  | pollFirst(long timeout, TimeUnit unit) poll(long timeout, TimeUnit unit) |
| View element (examine)   | getFirst() element()   | peekFirst() peek() | –                   | – |
Operations at the back (tail) of the deque:

|                          | Throws exception (inherited from Deque) | Return value (inherited from Deque) | Blocks (new in BlockingDeque) | Timeout (new in BlockingDeque) |
|---|---|---|---|---|
| Add element (enqueue)    | addLast(E e) add(E e) | offerLast(E e) offer(E e) | putLast(E e) put(E e) | offerLast(E e, long timeout, TimeUnit unit) offer(E e, long timeout, TimeUnit unit) |
| Remove element (dequeue) | removeLast()          | pollLast()                | takeLast()            | pollLast(long timeout, TimeUnit unit) |
| View element (examine)   | getLast()             | peekLast()                | –                     | – |
The code example for BlockingDeque
is even longer than the previous ones. Therefore I refrain from printing it here in this article. You can find it (like all the other code examples from this article) in my GitHub repository.
The example is based on the previous BlockingQueue
example and adds elements to / removes them from a random side of the deque.
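To give you at least an impression of it without reproducing the whole program, here is a heavily condensed sketch of the basic idea (this shortened variant is mine, not the GitHub code; the capacity of 3 is arbitrary):

public class BlockingDequeSketch {
  public static void main(String[] args) throws InterruptedException {
    BlockingDeque<Integer> deque = new LinkedBlockingDeque<>(3);

    // Writer thread: puts 10 elements on a random side of the deque,
    // blocking whenever the capacity of 3 is reached
    Thread writer = new Thread(() -> {
      try {
        for (int i = 0; i < 10; i++) {
          if (ThreadLocalRandom.current().nextBoolean()) deque.putFirst(i);
          else deque.putLast(i);
        }
      } catch (InterruptedException e) {
        Thread.currentThread().interrupt();
      }
    });
    writer.start();

    // Main thread: takes 10 elements from a random side of the deque,
    // blocking whenever the deque is empty
    for (int i = 0; i < 10; i++) {
      Integer e = ThreadLocalRandom.current().nextBoolean()
          ? deque.takeFirst() : deque.takeLast();
      System.out.println("took " + e + " --> deque = " + deque);
    }
    writer.join();
  }
}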
Just as old as Java is the class java.util.Stack
(→ Javadoc), which has been available since version 1.0. Stack
inherits from java.util.Vector
and extends it with the stack methods push(E item)
, pop()
, and peek()
.
Like Vector
, Stack
is thread-safe: All methods are synchronized
.
Here is a short example of how to use Stack
:
public class StackExample {
public static void main(String[] args) {
Stack<Integer> stack = new Stack<>();
stack.push(1);
stack.push(2);
stack.push(3);
System.out.println("stack = " + stack);
System.out.println("stack.peek() = " + stack.peek());
System.out.println("stack.pop() = " + stack.pop());
System.out.println("stack.pop() = " + stack.pop());
System.out.println("stack.pop() = " + stack.pop());
System.out.println("stack.pop() = " + stack.pop());
}
}
The output of this program is:
stack = [1, 2, 3]
stack.peek() = 3
stack.pop() = 3
stack.pop() = 2
stack.pop() = 1
Exception in thread "main" java.util.EmptyStackException
	at java.base/java.util.Stack.peek(Stack.java:101)
	at java.base/java.util.Stack.pop(Stack.java:83)
	at eu.happycoders.cds.stack.StackExample.main(StackExample.java:16)
Both pop()
and peek()
throw an EmptyStackException
if the stack contains no elements.
The Java developers recommend not to use Stack
anymore. The Javadoc says: "A more complete and consistent set of LIFO stack operations is provided by the Deque interface and its implementations, which should be used in preference to this class."
What exactly does that mean? In my opinion, Stack
should not be used for the following reasons:
- synchronized is not a particularly high-performance means of making a data structure thread-safe. Better approaches are ReentrantLocks and CAS ("compare-and-swap") operations, as found in the various queue and deque implementations.
- As a subclass of Vector, Stack provides operations that have no place in a stack, such as accessing elements via their index and inserting and deleting elements at any position.
- Stack does not implement an interface. Therefore, by using Stack, you commit yourself to a specific implementation.

Instead, the Java developers recommend using one of the deque implementations, such as ArrayDeque. To be fair, deques also offer operations that a stack should not provide, such as inserting and removing elements at the bottom of the stack.
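If you do not want to define your own stack interface (as shown below), you can also use ArrayDeque directly as a stack: the Deque interface itself declares the stack methods push(), pop(), and peek(), which operate on the head of the deque. A minimal sketch:

Deque<Integer> stack = new ArrayDeque<>();
stack.push(1);
stack.push(2);
stack.push(3);
System.out.println(stack.peek()); // 3
System.out.println(stack.pop());  // 3
System.out.println(stack.pop());  // 2
System.out.println(stack.pop());  // 1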
A clean stack implementation consists of a stack interface and one or more implementations – though there is no reason not to use them as adapters around an existing deque implementation.
Stack interface (→ code in GitHub):
public interface Stack<E> extends Collection<E> {
void push(E item);
E pop();
E peek();
}
Not thread-safe stack implementation based on ArrayDeque
(→ code in GitHub):
public class ArrayDequeStack<E> implements Stack<E> {
private final Deque<E> deque = new ArrayDeque<>();
@Override
public void push(E item) {
deque.addFirst(item);
}
@Override
public E pop() {
return deque.removeFirst();
}
@Override
public E peek() {
return deque.peekFirst();
}
// All other Collection methods are forwarded
// to their corresponding methods in deque:
// [...]
}
Before we come to the queue and deque implementations and their characteristics, here is a short digression to the underlying data structures: arrays and linked lists.
Usually, array-based data structures deliver significantly better performance in Java.
Firstly, the memory requirement for an array is significantly lower*:
- Array (e.g., ArrayList): 4 bytes per entry.
- Doubly-linked list (e.g., LinkedList): 24 bytes per item – each item requires a Node object, and these each require 12 bytes for the object header + 3 × 4 bytes for the item, next, and prev references.
- Without prev references (as in a singly-linked list), 4 pad bytes are inserted per node object, since objects must always occupy a multiple of 8 bytes of memory.

Secondly, arrays have the advantage that their elements are located at consecutive memory addresses so that, when accessing main memory, all array elements located on the same memory page are loaded into the CPU cache simultaneously.
The nodes of a linked list, on the other hand, can be spread across the heap, so that when iterating over a linked list, in the worst case, a new memory page must be loaded into the CPU cache for each element.
A third disadvantage of linked lists is that they require more effort for the garbage collector than arrays because the GC has to follow all nodes of the list to mark reachable objects during its reachability analysis.
Generally, array-based data structures should, therefore, be preferred. However, depending on the type of use and the implementation of thread safety, queues/deques based on linked lists may be faster. I will go into this in more detail in the description of the individual queue/deque implementations.
(* For the memory considerations, I assume a 64-bit JVM with compressed oops enabled.)
In the following sections, you will find descriptions and the fundamental characteristics of all queue and deque implementations available in the JDK – first, the non-blocking variants, followed by the blocking ones.
For each implementation, I recommend appropriate use cases.
The class java.util.ArrayDeque
(→ JavaDoc) is – as the name suggests – based on an array, or more precisely: on a circular array. You can read about how it works in detail in this Wikipedia article.
The circular array has the advantage that no elements need to be moved within the array when inserting or removing elements on either side of the deque. So we get the flexibility of a linked list without its performance disadvantages.
The array grows as needed, but is neither automatically shrunk nor can it be shrunk manually.
Further characteristics of ArrayDeque:
- Not thread-safe.
- Iterators are neither fail-fast (i.e., they are not guaranteed to throw a ConcurrentModificationException) nor weakly consistent (i.e., there are no guarantees of consistency for concurrent changes).
- Time complexity of size(): O(1)

The ArrayDeque
is a good choice for (and only for) single-threaded applications. The fact that the underlying array cannot be shrunk should be kept in mind.
You can find an ArrayDeque
code example in the section "Java Deque Example" above.
The class java.util.LinkedList
(→ Javadoc) implements a classic doubly-linked list. It has been part of the JDK since version 1.2 – much longer than the Deque
interface. The deque methods were added with the introduction of this interface in JDK 6.0.
Further characteristics of LinkedList are:
- Not thread-safe; the iterator is fail-fast, i.e., it throws a ConcurrentModificationException if the list is changed concurrently.
- Time complexity of size(): O(1) – the size is stored in an int field of the LinkedList, i.e., it is not necessary to iterate over the complete list to determine its size – which would result in time complexity O(n).

I generally do not recommend using LinkedList
. If you need a list, ArrayList
is the better choice for the performance reasons mentioned above; if you need a not thread-safe queue or deque, use an ArrayDeque
.
To experiment with LinkedList
, you can copy the above "Java Deque Example" and replace the line
Deque<Integer> dequeue = new ArrayDeque<>();
with
Deque<Integer> dequeue = new LinkedList<>();
Let's move on to the first thread-safe queue and deque implementations.
The class java.util.concurrent.ConcurrentLinkedQueue
(→ Javadoc) is based on a singly-linked list; java.util.concurrent.ConcurrentLinkedDeque
(→ Javadoc) on a doubly-linked list.
Both implementations guarantee thread safety through non-blocking CAS ("compare-and-swap") operations on VarHandles, resulting in a significant performance gain over using synchronized
or locks such as ReentrantLock
.
Further characteristics of ConcurrentLinkedQueue and ConcurrentLinkedDeque:
- Time complexity of size(): O(n) – all elements of the queue are iterated over to determine its size. If queue elements are added or removed during the size determination, the result may be inaccurate.

ConcurrentLinkedQueue
and ConcurrentLinkedDeque
are a good choice if a non-blocking, unbounded, and thread-safe queue or deque is required.
This applies despite my general recommendation to prefer array-based data structures. ArrayDeque
is not thread-safe. And ArrayBlockingQueue
described in the next chapter is a) bounded and b) implements thread safety with a single ReentrantLock, which is much less performant than compare-and-swap on VarHandles.
We have already used ConcurrentLinkedQueue
in the first code example in the section "Java Queue Example".
The following example demonstrates the thread safety of ConcurrentLinkedDeque
. Four writing and three reading threads concurrently add and remove elements on random sides of the deque (→ code in GitHub):
public class ConcurrentLinkedDequeExample {
private static final Deque<Integer> deque = new ConcurrentLinkedDeque<>();
public static void main(String[] args) {
// Start 4 writing threads
for (int i = 0; i < 4; i++)
new ModifyingThread(true).start();
// Start 3 reading threads
for (int i = 0; i < 3; i++)
new ModifyingThread(false).start();
}
private static final class ModifyingThread extends Thread {
private final boolean write;
private ModifyingThread(boolean write) {
this.write = write;
}
@Override
public void run() {
for (; ; ) {
ThreadLocalRandom random = ThreadLocalRandom.current();
try {
Thread.sleep(random.nextInt(500, 2000));
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
boolean first = random.nextBoolean();
if (write) {
Integer e = random.nextInt(1000);
if (first) {
deque.offerFirst(e);
System.out.printf("deque.offerFirst(%3d) --> deque = %s%n",
e, deque);
} else {
deque.offerLast(e);
System.out.printf("deque.offerLast(%3d) --> deque = %s%n",
e, deque);
}
} else {
if (first)
System.out.printf(" deque.pollFirst() = %4d --> deque = %s%n",
deque.pollFirst(), deque);
else
System.out.printf(" deque.pollLast() = %4d --> deque = %s%n",
deque.pollLast(), deque);
}
}
}
}
}
Example output:
deque.offerFirst(388) --> deque = [388]
deque.offerLast(900) --> deque = [388, 900]
  deque.pollFirst() =  388 --> deque = [900]
deque.offerLast(906) --> deque = [900, 906]
  deque.pollLast() =  906 --> deque = [900]
  deque.pollFirst() =  900 --> deque = []
deque.offerLast(727) --> deque = [727]
deque.offerFirst(720) --> deque = [720, 727]
  deque.pollLast() =  727 --> deque = [720]
deque.offerFirst(415) --> deque = [415, 720]
deque.offerLast(614) --> deque = [415, 720, 614]
deque.offerFirst(130) --> deque = [130, 415, 720, 614]
The class java.util.PriorityQueue
(→ Javadoc) is not a queue in its classical sense since its elements are not removed in FIFO order, but according to their natural ordering or according to a Comparator
passed to the constructor.
The smallest element is located at the head of the queue. Sorting is not stable; that is, two elements that are equal according to the sort order are not necessarily removed in the order in which they were added to the queue.
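As a brief illustration of the Comparator variant (this snippet is mine, not from the article's repository), the following sketch reverses the ordering so that the largest element is at the head of the queue:

Queue<Integer> maxQueue = new PriorityQueue<>(Comparator.reverseOrder());
maxQueue.offer(44);
maxQueue.offer(18);
maxQueue.offer(58);
System.out.println(maxQueue.poll()); // 58 – the largest element comes out first
System.out.println(maxQueue.poll()); // 44
System.out.println(maxQueue.poll()); // 18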
Further characteristics of PriorityQueue:
- Not thread-safe; the iterator is fail-fast, i.e., it throws a ConcurrentModificationException if the queue is modified concurrently.
- Time complexity of size(): O(1)

You can use PriorityQueue
when you need a not thread-safe queue with the dequeue order described above.
However, it should be noted that PriorityQueue
is only used in very few places in the JDK and, therefore, there is a certain probability for the presence of bugs (what is little used is little tested).
Here is a short example with ten random numbers being added to and then taken from a PriorityQueue
(→ code in GitHub):
public class PriorityQueueExample {
public static void main(String[] args) {
Queue<Integer> queue = new PriorityQueue<>();
// Enqueue random numbers
for (int i = 0; i < 10; i++) {
Integer e = ThreadLocalRandom.current().nextInt(100);
queue.offer(e);
System.out.printf("queue.offer(%2d) --> queue = %s%n", e, queue);
}
// Dequeue all elements
while (!queue.isEmpty()) {
System.out.printf("queue.poll() = %2d --> queue = %s%n",
queue.poll(), queue);
}
}
}
The following is a sample output of the program, where you can nicely see how the elements in the queue are displayed in no particular order (only the smallest element is always in front) and how they are eventually removed in ascending order:
queue.offer(44) --> queue = [44]
queue.offer(18) --> queue = [18, 44]
queue.offer(58) --> queue = [18, 44, 58]
queue.offer(98) --> queue = [18, 44, 58, 98]
queue.offer(45) --> queue = [18, 44, 58, 98, 45]
queue.offer(83) --> queue = [18, 44, 58, 98, 45, 83]
queue.offer(31) --> queue = [18, 44, 31, 98, 45, 83, 58]
queue.offer(21) --> queue = [18, 21, 31, 44, 45, 83, 58, 98]
queue.offer(22) --> queue = [18, 21, 31, 22, 45, 83, 58, 98, 44]
queue.offer(19) --> queue = [18, 19, 31, 22, 21, 83, 58, 98, 44, 45]
queue.poll() = 18 --> queue = [19, 21, 31, 22, 45, 83, 58, 98, 44]
queue.poll() = 19 --> queue = [21, 22, 31, 44, 45, 83, 58, 98]
queue.poll() = 21 --> queue = [22, 44, 31, 98, 45, 83, 58]
queue.poll() = 22 --> queue = [31, 44, 58, 98, 45, 83]
queue.poll() = 31 --> queue = [44, 45, 58, 98, 83]
queue.poll() = 44 --> queue = [45, 83, 58, 98]
queue.poll() = 45 --> queue = [58, 83, 98]
queue.poll() = 58 --> queue = [83, 98]
queue.poll() = 83 --> queue = [98]
queue.poll() = 98 --> queue = []
In the following sections, I describe all of the JDK's blocking queue and deque implementations, i.e., those classes that implement the BlockingQueue
or BlockingDeque
interfaces.
The class java.util.concurrent.ArrayBlockingQueue
(→ Javadoc) is based – like ArrayDeque
– on an array, but is – in contrast to ArrayDeque – bounded, blocking, and thread-safe.
Thread safety is guaranteed by a single ReentrantLock
, so that high thread contention is possible with simultaneous reading and writing threads.
Further characteristics of ArrayBlockingQueue:
- Time complexity of size(): O(1)

Due to the possibly high contention with simultaneous read and write access, you should – if you need a blocking, thread-safe queue – test for your specific use case whether a LinkedBlockingQueue
might have a better performance. It's based on a linked list but uses two separate ReentrantLock
s for reading and writing, which reduces thread contention.
To try out ArrayBlockingQueue
, you can copy the "Java BlockingQueue Example" from above and replace
BlockingQueue<Integer> queue = new LinkedBlockingQueue<>(3);
with
BlockingQueue<Integer> queue = new ArrayBlockingQueue<>(3);
The classes java.util.concurrent.LinkedBlockingQueue
(→ Javadoc) and java.util.concurrent.LinkedBlockingDeque
(→ Javadoc) are based – just like ConcurrentLinkedQueue
and ConcurrentLinkedDeque
– on linked lists, but are – just like ArrayBlockingQueue – blocking, thread-safe, and (optionally) bounded.
–
In contrast to ArrayBlockingQueue
, thread safety of ConcurrentLinkedQueue
is guaranteed not with a single but with two separate ReentrantLocks – one for writing and one for reading. This reduces thread contention between producer and consumer threads.
Further characteristics of LinkedBlockingQueue
and LinkedBlockingDeque
:
size()
: O(1) – the implementations internally use an AtomicInteger
to keep their size at hand (in contrast to ConcurrentLinkedQueue
and ConcurrentLinkedDeque
, which must iterate over all elements for computing their sizes).I recommend using LinkedBlockingQueue
or LinkedBlockingDeque
if you need a blocking, thread-safe queue, or deque.
The LinkedBlockingQueue
class is used by Executors.newFixedThreadPool()
and Executors.newSingleThreadExecutor()
as a "work queue" for the executor. It is thus used intensively, which keeps the probability of bugs fairly low.
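To illustrate this, here is – slightly simplified – how Executors.newFixedThreadPool() is implemented in the JDK: an unbounded LinkedBlockingQueue serves as the work queue of the ThreadPoolExecutor (see the JDK sources for the exact code):

public static ExecutorService newFixedThreadPool(int nThreads) {
  return new ThreadPoolExecutor(
      nThreads, nThreads,                    // core and maximum pool size
      0L, TimeUnit.MILLISECONDS,             // keep-alive time for idle threads
      new LinkedBlockingQueue<Runnable>());  // unbounded work queue
}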
You can find code examples for LinkedBlockingQueue
and LinkedBlockingDeque
in the sections "Java BlockingQueue Example" and "Java BlockingDeque Example" above.
The class java.util.concurrent.PriorityBlockingQueue
(→ Javadoc) is a thread-safe and blocking variant of PriorityQueue
.
Thread safety is provided by a single ReentrantLock
.
Unlike all blocking implementations presented so far, PriorityBlockingQueue
is not bounded, so it has no capacity limit. This means that the method add(e)
throws no exception, offer(e)
never returns false
, and put(e)
and offer(e, time, unit)
never block. Only the dequeue operations block or return null
if the queue is empty.
Further characteristics of PriorityBlockingQueue:
- As with PriorityQueue, the elements are stored in an array representing a "min heap"; the iterator traverses the elements in the corresponding order.
- Time complexity of size(): O(1)

PriorityBlockingQueue
is not used in the JDK. So there is a certain probability that it contains bugs. If you need a queue with these characteristics and you use PriorityBlockingQueue
, you should test intensively.
As an example of PriorityBlockingQueue
, I have slightly modified the code from the section "Java BlockingQueue Example": Instead of a LinkedBlockingQueue
, it uses a PriorityBlockingQueue
; and instead of the numbers 0 to 9, it inserts random numbers into the queue.
Due to the minimal code changes, I do not print the example here. You can find it here in my GitHub repository. When you run it, you'll see that PriorityBlockingQueue
– just like PriorityQueue
– always has the smallest element at its head and returns it first.
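If you just want a quick impression without opening the repository, the following condensed sketch (my own shortened variant, not the GitHub code) shows the essential behavior – random numbers go in, and the smallest remaining element always comes out first:

public class PriorityBlockingQueueSketch {
  public static void main(String[] args) throws InterruptedException {
    BlockingQueue<Integer> queue = new PriorityBlockingQueue<>();
    for (int i = 0; i < 5; i++) {
      // put() never blocks here because the queue is unbounded
      queue.put(ThreadLocalRandom.current().nextInt(100));
    }
    while (!queue.isEmpty()) {
      // take() always returns the smallest remaining element
      System.out.println(queue.take());
    }
  }
}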
Towards the end, we come to the JDK's very special queue implementations.
The class java.util.concurrent.DelayQueue
(→ Javadoc) is – like PriorityQueue
, which uses it internally – no FIFO queue. Instead, elements can only be removed after a specific delay assigned to each element has expired.
To do this, elements must implement the interface java.util.concurrent.Delayed
(→ Javadoc) and its getDelay()
method. For each call, this method returns the remaining delay that must expire before the element can be removed from the queue.
Just like PriorityBlockingQueue
, DelayQueue
is blocking but unbounded. It can, therefore, take any number of elements, and only blocks when trying to remove elements if it's empty.
Thread safety is ensured by a single ReentrantLock
.
Further characteristics of DelayQueue:
- Time complexity of size(): O(1), as for the underlying PriorityQueue.
I have never needed DelayQueue
before and cannot recommend it for any useful purpose known to me. It is only used once in the JDK (in an old Swing class that could probably have been implemented more elegantly with a ScheduledExecutorService
), so it may contain bugs.
In the following example (→ code in GitHub), a DelayQueue
is filled with objects of the class DelayedElement
, which contain a random number and a random initial delay between 100 and 1,000 ms. After that, poll()
is called until the queue is empty again.
public class DelayQueueExample {
public static void main(String[] args) {
BlockingQueue<DelayedElement<Integer>> queue = new DelayQueue<>();
ThreadLocalRandom tlr = ThreadLocalRandom.current();
long startTime = System.currentTimeMillis();
// Enqueue random numbers with random initial delays
for (int i = 0; i < 7; i++) {
DelayedElement<Integer> e = new DelayedElement<>(
tlr.nextInt(10, 100),
tlr.nextInt(100, 1000));
queue.offer(e);
System.out.printf("[%3dms] queue.offer(%s) --> queue = %s%n",
System.currentTimeMillis() - startTime, e, queue);
}
// Dequeue all elements
while (!queue.isEmpty()) {
try {
DelayedElement<Integer> e = queue.take();
System.out.printf("[%3dms] queue.poll() = %s --> queue = %s%n",
System.currentTimeMillis() - startTime, e, queue);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
}
}
And here is the associated class DelayedElement
(→ code in GitHub). To avoid making the code even longer, sorting is not stable, i.e., if two elements with the same delay are placed in the queue, they will be removed in random order.
public class DelayedElement<E extends Comparable<E>> implements Delayed {
private final E e;
private final long initialDelayMillis;
private final long expiration;
public DelayedElement(E e, long initialDelayMillis) {
this.e = e;
this.initialDelayMillis = initialDelayMillis;
this.expiration = System.currentTimeMillis() + initialDelayMillis;
}
@Override
public long getDelay(TimeUnit unit) {
long remainingDelayMillis = expiration - System.currentTimeMillis();
return unit.convert(remainingDelayMillis, TimeUnit.MILLISECONDS);
}
@Override
public int compareTo(Delayed o) {
DelayedElement<?> other = (DelayedElement<?>) o;
return Long.compare(expiration, other.expiration);
}
@Override
public String toString() {
return "{" + e + ", " + initialDelayMillis + "ms}";
}
}
Here is a sample output of the program. It is clearly visible that the queue is not sorted, but the element with the shortest remaining delay is always at the front (left), and elements are removed after their respective delay has expired:
[ 1ms] queue.offer({19, 713ms}) --> queue = [{19, 713ms}]
[ 28ms] queue.offer({15, 643ms}) --> queue = [{15, 643ms}, {19, 713ms}]
[ 29ms] queue.offer({35, 253ms}) --> queue = [{35, 253ms}, {19, 713ms}, {15, 643ms}]
[ 29ms] queue.offer({11, 781ms}) --> queue = [{35, 253ms}, {19, 713ms}, {15, 643ms}, {11, 781ms}]
[ 29ms] queue.offer({17, 844ms}) --> queue = [{35, 253ms}, {19, 713ms}, {15, 643ms}, {11, 781ms}, {17, 844ms}]
[ 29ms] queue.offer({40, 490ms}) --> queue = [{35, 253ms}, {19, 713ms}, {40, 490ms}, {11, 781ms}, {17, 844ms}, {15, 643ms}]
[ 30ms] queue.offer({39, 119ms}) --> queue = [{39, 119ms}, {19, 713ms}, {35, 253ms}, {11, 781ms}, {17, 844ms}, {15, 643ms}, {40, 490ms}]
[151ms] queue.poll() = {39, 119ms} --> queue = [{35, 253ms}, {19, 713ms}, {40, 490ms}, {11, 781ms}, {17, 844ms}, {15, 643ms}]
[283ms] queue.poll() = {35, 253ms} --> queue = [{40, 490ms}, {19, 713ms}, {15, 643ms}, {11, 781ms}, {17, 844ms}]
[520ms] queue.poll() = {40, 490ms} --> queue = [{15, 643ms}, {19, 713ms}, {17, 844ms}, {11, 781ms}]
[673ms] queue.poll() = {15, 643ms} --> queue = [{19, 713ms}, {11, 781ms}, {17, 844ms}]
[716ms] queue.poll() = {19, 713ms} --> queue = [{11, 781ms}, {17, 844ms}]
[811ms] queue.poll() = {11, 781ms} --> queue = [{17, 844ms}]
[874ms] queue.poll() = {17, 844ms} --> queue = []
Another special queue is java.util.concurrent.LinkedTransferQueue
(→ Javadoc). Like PriorityBlockingQueue
, this is an unbounded blocking queue, i.e., only the dequeue operations can fail or block.
LinkedTransferQueue
is based on a linked list, and thread safety is achieved through non-blocking compare-and-swap operations, just like ConcurrentLinkedQueue
and ConcurrentLinkedDeque
.
The special feature of this queue is that it provides additional transfer()
and tryTransfer()
methods that block until the element added by them has been removed from the queue again (in contrast to: until space in the queue has become available). For details of these methods, I refer to the Javadoc linked above.
Further characteristics of LinkedTransferQueue:
- Time complexity of size(): O(n) – the method iterates over the entire queue to determine its size.

LinkedTransferQueue
is not used in the JDK. It was initially implemented for the fork/join framework introduced in JDK 7 but was ultimately not used for this purpose. The probability of bugs is, therefore, quite high so that this class should not be used.
The following example (→ code in GitHub) starts five threads that invoke LinkedTransferQueue.transfer()
. After that, elements are taken from the queue until it is empty again.
public class LinkedTransferQueueExample {
public static void main(String[] args) {
LinkedTransferQueue<Integer> queue = new LinkedTransferQueue<>();
// Start 5 threads calling queue.transfer()
for (int i = 0; i < 5; i++) {
final int finalI = i;
new Thread(() -> {
log("Calling queue.transfer(%d) (queue = %s)...", finalI, queue);
try {
queue.transfer(finalI);
log("queue.transfer(%d) returned (queue = %s)", finalI, queue);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}).start();
try {
Thread.sleep(100);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
log("queue = " + queue);
// Now take all elements until the queue is empty
while (!queue.isEmpty()) {
log(" Calling queue.take() (queue = %s)...", queue);
try {
Integer e = queue.take();
log(" queue.take() returned %d (queue = %s)", e, queue);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
}
private static void log(String format, Object... args) {
System.out.printf(Locale.US, "[%-8s] %s%n",
Thread.currentThread().getName(),
String.format(format, args));
}
}
In the output of the example, you can clearly see how the queue.transfer()
calls first block and then resume as soon as queue.take()
has taken the value passed by transfer()
:
[Thread-0] Calling queue.transfer(0) (queue = [])...
[Thread-1] Calling queue.transfer(1) (queue = [0])...
[Thread-2] Calling queue.transfer(2) (queue = [0, 1])...
[Thread-3] Calling queue.transfer(3) (queue = [0, 1, 2])...
[Thread-4] Calling queue.transfer(4) (queue = [0, 1, 2, 3])...
[main ] queue = [0, 1, 2, 3, 4]
[main ] Calling queue.take() (queue = [0, 1, 2, 3, 4])...
[main ] queue.take() returned 0 (queue = [1, 2, 3, 4])
[Thread-0] queue.transfer(0) returned (queue = [1, 2, 3, 4])
[main ] Calling queue.take() (queue = [1, 2, 3, 4])...
[main ] queue.take() returned 1 (queue = [2, 3, 4])
[Thread-1] queue.transfer(1) returned (queue = [2, 3, 4])
[main ] Calling queue.take() (queue = [2, 3, 4])...
[main ] queue.take() returned 2 (queue = [3, 4])
[Thread-2] queue.transfer(2) returned (queue = [3, 4])
[main ] Calling queue.take() (queue = [3, 4])...
[main ] queue.take() returned 3 (queue = [4])
[Thread-3] queue.transfer(3) returned (queue = [4])
[main ] Calling queue.take() (queue = [4])...
[main ] queue.take() returned 4 (queue = [])
[Thread-4] queue.transfer(4) returned (queue = [])
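The example uses only transfer(). For completeness, here is a brief sketch of the tryTransfer() variants mentioned above (my own snippet; the 500 ms timeout is an arbitrary value): tryTransfer(e) hands an element over only if a consumer is already waiting, and the timed variant waits at most for the given time before giving up:

public class TryTransferSketch {
  public static void main(String[] args) throws InterruptedException {
    LinkedTransferQueue<Integer> queue = new LinkedTransferQueue<>();

    // No consumer is waiting --> the element is not enqueued; false is returned immediately
    System.out.println("tryTransfer(1) = " + queue.tryTransfer(1)
        + " (queue = " + queue + ")");

    // Waits up to 500 ms for a consumer to arrive, then gives up and returns false
    System.out.println("tryTransfer(2, ...) = "
        + queue.tryTransfer(2, 500, TimeUnit.MILLISECONDS)
        + " (queue = " + queue + ")");
  }
}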
Let's have a look at the final special blocking queue, java.util.concurrent.SynchronousQueue
(→ Javadoc). "Synchronous" is not to be confused with "synchronized". It rather means that each enqueue operation has to wait for a corresponding dequeue operation, and each dequeue operation has to wait for an enqueue operation.
A SynchronousQueue
never contains elements, even if enqueue operations are currently waiting for dequeue operation. Likewise, the size of a SynchronousQueue
is always 0, and peek()
always returns null
.
Besides ArrayBlockingQueue
, SynchronousQueue
is the second queue implementation that offers a fairness policy. There is a peculiarity here: If the fairness policy is not activated, blocking method calls are resumed in unspecified order according to the documentation. In fact, these methods are resumed in precisely the opposite order (LIFO order), since a stack is used internally.
Further characteristics of SynchronousQueue:
- Time complexity of size(): O(1) – size() always returns 0.

Just like DelayQueue
and LinkedTransferQueue
, I have never used SynchronousQueue
directly in my projects.
If its characteristics match your requirements, you can use it safely. In the JDK, SynchronousQueue
is used by Executors.newCachedThreadPool()
as a "work queue" for the executor, so the probability of bugs is extremely low.
The following example (→ code in GitHub) first starts three threads that call queue.put()
, then six threads that call queue.take()
, and then another three threads that invoke queue.put()
:
public class SynchronousQueueExample {
private static boolean FAIR = false;
public static void main(String[] args) {
BlockingQueue<Integer> queue = new SynchronousQueue<>(FAIR);
// Start 3 producing threads
for (int i = 0; i < 3; i++) {
final int finalI = i;
new Thread(() -> enqueue(queue, finalI)).start();
sleep(250);
}
// Start 6 consuming threads
for (int i = 0; i < 6; i++) {
new Thread(() -> dequeue(queue)).start();
sleep(250);
}
// Start 3 more producing threads
for (int i = 3; i < 6; i++) {
final int finalI = i;
new Thread(() -> enqueue(queue, finalI)).start();
sleep(250);
}
}
private static void enqueue(BlockingQueue<Integer> queue, int finalI) {
log("Calling queue.put(%d) (queue = %s)...", finalI, queue);
try {
queue.put(finalI);
log("queue.put(%d) returned (queue = %s)", finalI, queue);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
private static void dequeue(BlockingQueue<Integer> queue) {
log(" Calling queue.take() (queue = %s)...", queue);
try {
Integer e = queue.take();
log(" queue.take() returned %d (queue = %s)", e, queue);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
private static void sleep(long millis) {
try {
Thread.sleep(millis);
} catch (InterruptedException e) {
Thread.currentThread().interrupt();
}
}
private static void log(String format, Object... args) {
System.out.printf(Locale.US, "[%-9s] %s%n",
Thread.currentThread().getName(),
String.format(format, args));
}
}
In the output, you can see how the first three calls of queue.put()
block until the added elements are taken by queue.take()
in reverse order.
After that, the three following invocations of queue.take()
block until three more elements have been written to the queue with queue.put()
.
The queue remains empty for the entire time.
[Thread-0 ] Calling queue.put(0) (queue = [])...
[Thread-1 ] Calling queue.put(1) (queue = [])...
[Thread-2 ] Calling queue.put(2) (queue = [])...
[Thread-3 ] Calling queue.take() (queue = [])...
[Thread-3 ] queue.take() returned 2 (queue = [])
[Thread-2 ] queue.put(2) returned (queue = [])
[Thread-4 ] Calling queue.take() (queue = [])...
[Thread-4 ] queue.take() returned 1 (queue = [])
[Thread-1 ] queue.put(1) returned (queue = [])
[Thread-5 ] Calling queue.take() (queue = [])...
[Thread-5 ] queue.take() returned 0 (queue = [])
[Thread-0 ] queue.put(0) returned (queue = [])
[Thread-6 ] Calling queue.take() (queue = [])...
[Thread-7 ] Calling queue.take() (queue = [])...
[Thread-8 ] Calling queue.take() (queue = [])...
[Thread-9 ] Calling queue.put(3) (queue = [])...
[Thread-9 ] queue.put(3) returned (queue = [])
[Thread-8 ] queue.take() returned 3 (queue = [])
[Thread-10] Calling queue.put(4) (queue = [])...
[Thread-10] queue.put(4) returned (queue = [])
[Thread-7 ] queue.take() returned 4 (queue = [])
[Thread-11] Calling queue.put(5) (queue = [])...
[Thread-11] queue.put(5) returned (queue = [])
[Thread-6 ] queue.take() returned 5 (queue = [])
If you set the constant FAIR
to true
, you see how the elements are not taken in LIFO but FIFO order.
All thread-safe queue and deque implementations provided by the JDK can be safely used in multi-producer-multi-consumer environments. This means that one or more writing threads and one or more reading threads can access the JDK queues and deques concurrently.
With specific algorithms, it is possible to optimize queues and deques in such a way that the overhead for providing thread safety is minimized if there is a restriction to one reading and/or one writing thread.
Accordingly, one distinguishes the following four cases:
- Single producer, single consumer (SPSC)
- Single producer, multiple consumers (SPMC)
- Multiple producers, single consumer (MPSC)
- Multiple producers, multiple consumers (MPMC)
The open-source library JCTools provides highly optimized queue implementations for all four cases.
In this Ultimate Guide, I explained the basic workings of queues, deques, and stacks; I presented the corresponding interfaces and implementations in the JDK in detail using numerous code examples, and I made recommendations on which implementations to use in which cases.
For everyday use, these are:
- ArrayDeque for single-threaded applications.
- ConcurrentLinkedQueue and ConcurrentLinkedDeque as thread-safe, non-blocking, and unbounded queues/deques.
- ArrayBlockingQueue as a thread-safe, blocking, bounded queue, provided you expect little contention between producer and consumer threads.
- LinkedBlockingQueue as a thread-safe, blocking, bounded queue if you expect rather high contention between producer and consumer threads (it is best to test which implementation performs better for your use case).
- LinkedBlockingDeque as a thread-safe, blocking, bounded deque.