
1. Introduction
In this tutorial, we’ll compare Java Stream and Flux.fromIterable(). We’ll start by exploring their similarities and then dive into their key differences.
2. Java Stream Overview
A Stream represents a sequence of elements supporting functional-style operations. Stream instances are synchronous, pull-based, and process elements only when a terminal operation is invoked.
Let’s look at its key characteristics:
- Supports sequential and parallel execution
- Uses functional-style operations (map(), filter(), etc.)
- Lazy execution (executes when the terminal operation is applied)
- Synchronous and pull-based
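The characteristics above can be seen in a minimal sketch (the class name is illustrative): building the pipeline does nothing, and the terminal operation pulls elements through synchronously:

```java
import java.util.List;
import java.util.stream.Stream;

public class StreamCharacteristicsDemo {
    public static void main(String[] args) {
        // Building the pipeline is lazy: nothing is computed yet
        Stream<Integer> pipeline = List.of(1, 2, 3, 4).stream()
          .map(n -> n * 10)      // functional-style transformation
          .filter(n -> n > 10);  // functional-style filtering

        // The terminal operation pulls elements through the pipeline synchronously
        List<Integer> result = pipeline.toList();
        System.out.println(result); // [20, 30, 40]
    }
}
```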
3. Flux.fromIterable() Overview
Flux.fromIterable() is a factory method in Project Reactor that creates a Flux from an existing Iterable such as a List, Set, or any other Collection. It’s useful when we have a collection of data and want to process it reactively.
Here are its key characteristics:
- Supports asynchronous and non-blocking processing
- Provides reactive programming operations (map(), filter(), flatMap(), etc.)
- Push-based – data is emitted as it becomes available
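As a minimal sketch of these characteristics (assuming reactor-core is on the classpath; the class name is illustrative), the source pushes each element to the subscriber once a subscription exists:

```java
import java.util.List;
import reactor.core.publisher.Flux;

public class FluxFromIterableDemo {
    public static void main(String[] args) {
        Flux.fromIterable(List.of("a", "b", "c"))
          .map(String::toUpperCase)        // reactive operator, applied per element
          .subscribe(System.out::println); // nothing is emitted until we subscribe
    }
}
```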
4. Similarities Between Stream and Flux.fromIterable()
Despite being built for different paradigms, both Stream and Flux.fromIterable() share some commonalities. Let’s take a closer look at how they align in data processing.
4.1. Functional Style
Both of these support the functional style of programming. We can use filter(), map(), and reduce()-like operations and also chain multiple operations together to create a declarative and readable pipeline for processing data.
Let’s look into an example where we filter a list of numbers to retain only the even ones and then double their values:
@Test
void givenList_whenProcessedWithStream_thenReturnDoubledEvenNumbers() {
    List<Integer> numbers = List.of(1, 2, 3, 4, 5);
    List<Integer> doubledEvenNumbers = numbers.stream()
      .filter(n -> n % 2 == 0)
      .map(n -> n * 2)
      .toList();
    assertEquals(List.of(4, 8), doubledEvenNumbers);
}
Now, let’s execute the same example using Flux.fromIterable():
@Test
void givenList_whenProcessedWithFlux_thenReturnDoubledEvenNumbers() {
    List<Integer> numbers = List.of(1, 2, 3, 4, 5);
    Flux<Integer> fluxPipeline = Flux.fromIterable(numbers)
      .filter(n -> n % 2 == 0)
      .map(n -> n * 2);
    StepVerifier.create(fluxPipeline)
      .expectNext(4, 8)
      .verifyComplete();
}
4.2. Lazy Evaluation
Both Stream and Flux instances are lazy, meaning operations are only performed when the result is actually needed.
For Stream instances, a terminal operation is needed to execute the pipeline:
@Test
void givenList_whenNoTerminalOperator_thenNoResponse() {
    List<Integer> numbers = List.of(1, 2, 3, 4, 5);
    Function<Integer, Integer> mockMapper = mock(Function.class);
    Stream<Integer> streamPipeline = numbers.stream()
      .map(mockMapper);
    verifyNoInteractions(mockMapper);
    streamPipeline.toList();
    verify(mockMapper, times(5)).apply(anyInt());
}
In this example, we can see that there is no interaction with the mockMapper function until the terminal operation toList() is called.
Similarly, the Flux only starts processing when there is a subscriber:
@Test
void givenList_whenFluxNotSubscribed_thenNoResponse() {
    Function<Integer, Integer> mockMapper = mock(Function.class);
    when(mockMapper.apply(anyInt())).thenAnswer(i -> i.getArgument(0));
    List<Integer> numbers = List.of(1, 2, 3, 4, 5);
    Flux<Integer> flux = Flux.fromIterable(numbers)
      .map(mockMapper);
    verifyNoInteractions(mockMapper);
    StepVerifier.create(flux)
      .expectNextCount(5)
      .verifyComplete();
    verify(mockMapper, times(5)).apply(anyInt());
}
In this example, we can verify that Flux doesn’t interact with mockMapper initially. However, once we subscribe through StepVerifier, Flux invokes the mapper for each element.
5. Key Differences Between Stream and Flux.fromIterable()
Although there are a few similarities between Stream and Flux.fromIterable(), they differ in their purpose, nature, and how they process elements. Let’s explore some key differences between these two.
5.1. Synchronous vs. Asynchronous
One of the key differences between Stream and Flux.fromIterable() is how they handle execution.
Stream operates synchronously, meaning computations happen sequentially on the same thread invoking the terminal operation. If we process a large dataset or perform time-consuming tasks with a Stream, it can block the thread until completion. There is no built-in way to execute a Stream instance asynchronously without explicitly using parallelStream() or manually managing threads. Even in parallel mode, streams run in a blocking manner.
On the other hand, Flux.fromIterable() supports both execution models. By default, it behaves synchronously, emitting elements on the calling thread much like a sequential stream. However, Flux provides built-in support for asynchronous, non-blocking execution, which we can enable with operators like subscribeOn(), publishOn(), and delayElements(). This allows Flux to process elements concurrently without blocking the main thread, making it a better fit for reactive applications.
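For instance, subscribeOn() can shift the entire pipeline onto another scheduler, freeing the calling thread (a minimal sketch; the class name is illustrative and thread names depend on the runtime):

```java
import java.util.List;
import reactor.core.publisher.Flux;
import reactor.core.scheduler.Schedulers;

public class FluxSubscribeOnDemo {
    public static void main(String[] args) throws InterruptedException {
        Flux.fromIterable(List.of(1, 2, 3))
          // subscribeOn moves the subscription, and hence emission, to another scheduler
          .subscribeOn(Schedulers.boundedElastic())
          .map(n -> n * 2)
          .subscribe(n -> System.out.println(Thread.currentThread().getName() + " -> " + n));

        // The main thread is free; wait briefly so the demo can print before the JVM exits
        Thread.sleep(200);
    }
}
```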
Let’s examine the asynchronous behavior of Flux.fromIterable():
@Test
void givenList_whenProcessingTakesLongerThanEmission_thenEmittedBeforeProcessing() {
    VirtualTimeScheduler.set(VirtualTimeScheduler.create());
    List<Integer> numbers = List.of(1, 2, 3, 4, 5);
    Flux<Integer> sourceFlux = Flux.fromIterable(numbers)
      .delayElements(Duration.ofMillis(500));
    Flux<Integer> processedFlux = sourceFlux.flatMap(n -> Flux.just(n * n)
      .delayElements(Duration.ofSeconds(1)));
    StepVerifier.withVirtualTime(() -> Flux.merge(sourceFlux, processedFlux))
      .expectSubscription()
      .expectNoEvent(Duration.ofMillis(500))
      .thenAwait(Duration.ofMillis(500 * 5))
      .expectNextCount(7)
      .thenAwait(Duration.ofMillis(5000))
      .expectNextCount(3)
      .verifyComplete();
}
In the above example, we emit numbers every 500ms and delay the processing by 1s. Using merge(), we merge the results of both Flux instances. At the end of 2500ms, all elements are emitted and two have been processed, hence we expect seven items in total. After five seconds, all the elements are processed, so we expect the remaining three items.
Here, the emitter doesn’t wait for the processor to finish its operation; rather, it keeps emitting values independently.
5.2. Exception Handling
In a Stream, an exception causes immediate termination of the pipeline. If an element throws an exception during processing, the entire stream stops, and no further elements are processed.
This is because Stream instances treat exceptions as terminal events. To prevent a failure from propagating, we must rely on try-catch blocks inside map() or filter(), or add custom exception-handling logic at each stage.
Let’s take a look at how a Stream pipeline behaves when encountering a division by zero exception:
@Test
void givenList_whenDividedByZeroInStream_thenThrowException() {
    List<Integer> numbers = List.of(1, 2, 0, 4, 5);
    assertThrows(ArithmeticException.class, () -> numbers.stream()
      .map(n -> 10 / n)
      .toList());
}
Here, the exception terminates the pipeline immediately, and 4 and 5 are never processed.
In contrast, Flux treats errors as first-class signals: an error is propagated through a dedicated onError channel rather than being thrown up the call stack. This lets us handle exceptions gracefully with built-in operators like onErrorResume(), onErrorContinue(), and onErrorReturn(), so processing can recover even when individual elements fail.
Now, let’s see how the same processing works with Flux.fromIterable():
@Test
void givenList_whenDividedByZeroInFlux_thenReturnFallbackValue() {
    List<Integer> numbers = List.of(1, 2, 0, 4, 5);
    Flux<Integer> flux = Flux.fromIterable(numbers)
      .map(n -> 10 / n)
      .onErrorResume(e -> Flux.just(-1));
    StepVerifier.create(flux)
      .expectNext(10, 5, -1)
      .verifyComplete();
}
Here, the exception is caught and, instead of failing, the Flux emits a fallback value (-1). This makes Flux more resilient in real-world scenarios like network failures, database errors, or unexpected input.
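Note that onErrorResume() switches to the fallback publisher, so the remaining elements (4 and 5) are not processed. If we want to skip only the failing element and keep going, onErrorContinue() is the operator to reach for. A minimal sketch (the class name is illustrative):

```java
import java.util.List;
import reactor.core.publisher.Flux;

public class FluxOnErrorContinueDemo {
    public static void main(String[] args) {
        Flux.fromIterable(List.of(1, 2, 0, 4, 5))
          .map(n -> 10 / n)
          // Drop only the failing element and continue with the rest:
          // 10 and 5 are emitted, 0 is skipped, then 2 and 2 follow
          .onErrorContinue((error, value) ->
              System.out.println("Skipping " + value + ": " + error.getMessage()))
          .subscribe(System.out::println);
    }
}
```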
5.3. Single Pipeline vs. Multiple Subscribers
A fundamental limitation of Stream is that it represents a single-use pipeline. Once a terminal operation, like forEach() or collect(), is invoked, the Stream is consumed and cannot be reused. Each time we need to process the same dataset, a new Stream must be created:
@Test
void givenStream_whenReused_thenThrowException() {
    List<Integer> numbers = List.of(1, 2, 3, 4, 5);
    Stream<Integer> doubleStream = numbers.stream()
      .map(n -> n * 2);
    assertEquals(List.of(2, 4, 6, 8, 10), doubleStream.toList());
    assertThrows(IllegalStateException.class, doubleStream::toList);
}
In the above example, when we call the terminal operation toList() on the Stream for the first time, it returns the expected result. However, when we attempt to invoke it again, it throws an IllegalStateException because Stream instances cannot be reused once they have been consumed.
On the other hand, Flux supports multiple subscribers, allowing different consumers to react to the same data independently. This enables event-driven architectures where multiple components in a system can respond to the same data stream without reloading the source.
Now, let’s perform the same operation using Flux and see how it behaves:
@Test
void givenFlux_whenMultipleSubscribers_thenEachReceivesData() {
    List<Integer> numbers = List.of(1, 2, 3, 4, 5);
    Flux<Integer> flux = Flux.fromIterable(numbers).map(n -> n * 2);
    StepVerifier.create(flux)
      .expectNext(2, 4, 6, 8, 10)
      .verifyComplete();
    StepVerifier.create(flux)
      .expectNext(2, 4, 6, 8, 10)
      .verifyComplete();
}
Here, we can see that each subscriber receives the same data and no exception is thrown. This makes Flux more versatile than Stream, especially when multiple components need to listen to and process the same data without duplicating the source.
5.4. Performance
Stream instances are optimized for high-performance, in-memory processing. Although the pipeline is built lazily, once a terminal operation is invoked, all transformations are fused into a single pass over the data.
Additionally, Stream supports parallel processing using parallelStream(), utilizing multiple CPU cores via ForkJoinPool to further improve execution speed for large datasets.
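As a quick sketch of this (the class name is illustrative), parallelStream() distributes the work across the common ForkJoinPool, but the call still blocks the caller until every element has been processed:

```java
import java.util.List;
import java.util.stream.Collectors;

public class ParallelStreamDemo {
    public static void main(String[] args) {
        List<Integer> numbers = List.of(1, 2, 3, 4, 5, 6, 7, 8);

        // Work is split across the common ForkJoinPool, yet the call blocks
        // until all elements are processed; collect() preserves encounter order
        List<Integer> squares = numbers.parallelStream()
          .map(n -> n * n)
          .collect(Collectors.toList());

        System.out.println(squares); // [1, 4, 9, 16, 25, 36, 49, 64]
    }
}
```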
On the other hand, Flux.fromIterable() is designed for reactive data handling, making it more flexible but inherently slower for in-memory computations.
Since Flux is designed for asynchronous use, every data element is wrapped in reactive signals (onNext, onError, onComplete), which adds per-element processing overhead.
Flux follows a non-blocking execution model, which can be beneficial for I/O heavy operations but less efficient for purely computational tasks.
6. Feature Comparison: Java Stream vs. Flux.fromIterable()
Feature | Java Stream | Flux.fromIterable()
---|---|---
Execution Model | Pull-based – the consumer requests data when needed | Push-based – the producer pushes data to the consumer
Processing Style | Functional, pipeline-based | Functional, reactive, event-based
Synchronous or Asynchronous | Synchronous – can be parallelized using parallelStream() | Synchronous by default – built-in support for asynchronous, non-blocking execution
Error Handling | No built-in support – we need try-catch blocks | Errors propagate through a dedicated error channel and can be handled with operators
Multiple Subscribers | Not supported – once consumed, a stream cannot be reused | Multiple subscribers can subscribe to the same Flux
Use Case | Fast, CPU-bound, in-memory transformations | Reactive pipelines that need asynchronous, non-blocking processing and declarative error handling
7. Conclusion
In this article, we explored the similarities and key differences between Java’s Stream and Reactor’s Flux.fromIterable(). Understanding these distinctions allows us to choose the right approach based on our application’s needs.
As always, all code snippets used in this article are available over on GitHub.
The post Comparing Java Stream and Flux.fromIterable() first appeared on Baeldung.