WebFlux for Reactive Spring Boot microservice implemented with TDD, Testcontainers, and DynamoDB Async clients.

Andres Solorzano
11 min read · Feb 1, 2023


My previous tutorial used TDD with Testcontainers to develop our Timer Service microservice with Spring Boot 3. Now it's time to add reactive programming support to our service using WebFlux, following the same TDD approach. So let's get started.

To complete this guide, you’ll need the following tools:

NOTE: You can download the project’s source code from my GitHub repository to review the latest changes made in this tutorial.

1. Project Structure.

We will use the same project structure as in the previous tutorial, but we need to start again from scratch because refactoring the existing code, including the integration tests, is tedious. The easiest way is to clone the previous project under a different name and start working from there.

Then, remove the classes from the service and controller packages, including their test classes:

The next step is adding the corresponding Maven dependencies needed for our project.

2. Maven Dependencies.

Let's start by adding the required dependencies to our <pom.xml>:

<dependency>
<groupId>org.springframework.boot</groupId>
<artifactId>spring-boot-starter-webflux</artifactId>
</dependency>
<dependency>
<groupId>io.projectreactor</groupId>
<artifactId>reactor-test</artifactId>
<scope>test</scope>
</dependency>

The first dependency replaces the regular Spring Boot <web> starter because we're now using reactive web support.

We must also upgrade to version 2 of the AWS SDK. For our project, this specifically concerns the DynamoDB dependency:

<dependency>
<groupId>software.amazon.awssdk</groupId>
<artifactId>dynamodb-enhanced</artifactId>
</dependency>

Version 2 of the AWS SDK lets us combine reactive programming with the CRUD operations against DynamoDB, so this change is also essential for our project.

Finally, after these changes, some classes, like <Device> or <DynamoDBConfig>, will no longer compile. Remove the code that doesn't compile; later, we will add the specific code for those classes. Remember that we can copy the required code from the previous project.

2.1 Spring Data R2DBC.

Unfortunately, at the time of writing, the Quartz and Flyway libraries operate only over JDBC. These two libraries require Spring Data JPA and cannot work with Spring Data R2DBC for a reactive database layer. I tried many approaches, but it isn't easy, and the resulting code would not be maintainable. I hope these two dependencies will support the reactive R2DBC library soon.

3. Test-Driven Development (TDD).

In our previous tutorial, we used TDD from the beginning. We added some components in the data layer until we arrived at the controller layer, building integration tests in each step. We must do the same thing here, so let’s get started.

3.1 Testcontainers (Postgres and DynamoDB).

The <AbstractContainerBase> class must remain the same. It configures and starts the Postgres and DynamoDB containers, and the configuration made in the class is unchanged:

Remember that these configurations are set on the fly when the integration tests run.
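For reference, a minimal sketch of such a base class could look like the following. The image tags, property names, and the use of <@BeforeAll> instead of a static block are assumptions; adapt them to the configuration of the previous project:

```java
import org.junit.jupiter.api.BeforeAll;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.utility.DockerImageName;

public abstract class AbstractContainerBase {

    // Image tags are assumptions; use the versions from your project.
    static final PostgreSQLContainer<?> POSTGRES_CONTAINER =
            new PostgreSQLContainer<>(DockerImageName.parse("postgres:14-alpine"));
    static final GenericContainer<?> DYNAMODB_CONTAINER =
            new GenericContainer<>(DockerImageName.parse("amazon/dynamodb-local:latest"))
                    .withExposedPorts(8000);

    @BeforeAll
    static void startContainers() {
        POSTGRES_CONTAINER.start();
        DYNAMODB_CONTAINER.start();
    }

    // Push the containers' dynamic connection settings into the Spring
    // environment on the fly, when the integration tests run.
    @DynamicPropertySource
    static void registerProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", POSTGRES_CONTAINER::getJdbcUrl);
        registry.add("spring.datasource.username", POSTGRES_CONTAINER::getUsername);
        registry.add("spring.datasource.password", POSTGRES_CONTAINER::getPassword);
        registry.add("aws.dynamodb.endpoint", () -> "http://"
                + DYNAMODB_CONTAINER.getHost() + ":"
                + DYNAMODB_CONTAINER.getMappedPort(8000));
    }
}
```

The test classes that extend this base class share the same two containers, so they start only once per test run.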

3.2 Quartz and Flyway.

As we mentioned before, these libraries still use the Spring Data JPA configuration. So, the same configuration must be defined in the <application.properties> file:

spring.flyway.enabled=true
spring.flyway.connect-retries=3
spring.flyway.baseline-on-migrate=true
spring.flyway.locations=classpath:db/migration
spring.flyway.baseline-description=Initial Quartz migration

spring.quartz.job-store-type=jdbc
spring.quartz.jdbc.initialize-schema=never
spring.quartz.properties.org.quartz.scheduler.instanceName=HiperiumCityTasksScheduler
spring.quartz.properties.org.quartz.jobStore.driverDelegateClass=org.quartz.impl.jdbcjobstore.PostgreSQLDelegate

The migration scripts used by the Flyway library are the same, and they must be saved in the <db/migration> directory as usual.

3.3 Tasks Data Layer.

The <Task>, <TaskRepository>, and <TaskRepositoryTest> must be the same. Copy them from the previous project and paste them into the corresponding packages. Run the <TaskRepositoryTest> class to verify that the persistence layer is working correctly:

So far, so good. The next step is to continue with the <Service> layer, using a reactive programming fashion.

4. Project Reactor.

Before discussing WebFlux, let's talk about the library behind the scenes: Project Reactor. It's a reactive library based on the Reactive Streams specification for building non-blocking applications. It provides the <Mono> class to emit 0 or 1 element and the <Flux> class to emit 0 to many elements in a reactive way. The idea is to use these classes in our <Service> layer and write some integration tests using the <StepVerifier> class provided by the Reactor library. So let's get started.

4.1 Service Layer.

First, verify that the classes <TaskJob> and <JobsUtil> contain the same code as the previous TDD project. Then, create the <TaskService> class and copy only the method signatures that we need for the current project:

As mentioned earlier, to use the Reactor library, we need to change the return types of the methods: <Mono> for methods that return a single <Task> object and <Flux> for methods that return a list of <Task> objects:
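As an illustration only, the reactive signatures could take the following shape. This sketch uses a simplified in-memory store and a placeholder <Task> record; the real service delegates to the JPA repository and the Quartz scheduler:

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import reactor.core.publisher.Flux;
import reactor.core.publisher.Mono;

public class TaskService {

    // Placeholder entity; the real project uses the Task class from the data layer.
    public record Task(Integer id, String name, String description) {}

    private final Map<Integer, Task> store = new ConcurrentHashMap<>();

    public Mono<Task> findById(Integer id) {
        return Mono.justOrEmpty(store.get(id));   // emits 0 or 1 element
    }

    public Flux<Task> findAll() {
        return Flux.fromIterable(store.values()); // emits 0 to many elements
    }

    public Mono<Task> insert(Task task) {
        return Mono.fromSupplier(() -> {
            store.put(task.id(), task);
            return task;
        });
    }

    public Mono<Void> delete(Integer id) {
        return Mono.fromRunnable(() -> store.remove(id));
    }
}
```

Note that nothing happens until a subscriber arrives: the suppliers above run lazily, which is the core idea of the reactive approach.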

We now have the service shell. Then, create the <TaskServiceTest> class with the test methods that we need:

With the test shell class in place, we can start a new TDD iteration to complete the required functionality and test code.

This is an example of how the update <Task> method could look:
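Since the repository layer is still blocking JPA, one common pattern is to wrap the blocking call and subscribe it on a dedicated scheduler so it does not block the event loop. A minimal sketch, where the <Task> and <TaskRepository> types are simplified placeholders for the real data-layer types:

```java
import java.util.NoSuchElementException;
import reactor.core.publisher.Mono;
import reactor.core.scheduler.Schedulers;

public class TaskUpdateSketch {

    // Simplified placeholders for the real data-layer types.
    public record Task(Integer id, String name, String description) {}
    public interface TaskRepository {
        Task save(Task task);
        boolean existsById(Integer id);
    }

    private final TaskRepository taskRepository;

    public TaskUpdateSketch(TaskRepository taskRepository) {
        this.taskRepository = taskRepository;
    }

    public Mono<Task> update(Task task) {
        // Wrap the blocking JPA call in a Mono and run it on the
        // boundedElastic scheduler, which is intended for blocking work.
        return Mono.fromCallable(() -> {
                    if (!taskRepository.existsById(task.id())) {
                        throw new NoSuchElementException("Task not found: " + task.id());
                    }
                    return taskRepository.save(task);
                })
                .subscribeOn(Schedulers.boundedElastic());
    }
}
```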

And the <update> test method using the <StepVerifier> utility class:
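The shape of such a test could be as follows. Here a stand-in <Mono> replaces the real call to <taskService.update(...)>, just to show the <StepVerifier> flow:

```java
import reactor.core.publisher.Mono;
import reactor.test.StepVerifier;

public class UpdateTaskStepVerifierSketch {

    record Task(Integer id, String name) {}

    // Stand-in for taskService.update(task); the real test calls the service layer.
    static Mono<Task> update(Task task) {
        return Mono.just(task);
    }

    static void verifyUpdate() {
        StepVerifier.create(update(new Task(1, "updated name")))
                // assertNext() runs our assertions on the emitted element.
                .assertNext(task -> {
                    if (!"updated name".equals(task.name())) {
                        throw new AssertionError("Task name was not updated");
                    }
                })
                // verifyComplete() subscribes and expects a completion signal.
                .verifyComplete();
    }
}
```

<StepVerifier> subscribes to the publisher for us and fails the test if the sequence does not emit, error, or complete as declared.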

After we complete the rest of the service methods and integration tests, executing the test class must produce these results:

So far, so good. The next step is to start a new TDD cycle for the <Controller> layer. This time using the WebFlux library.

5. Spring WebFlux.

Spring WebFlux requires the <Reactor> library as a core dependency. A WebFlux API accepts a plain <Publisher> as input, adapts it to a <Reactor> type internally, and returns either a Flux or a Mono as output. So as we did in the <Service> layer, let’s create the required classes.

5.1 Controller layer.

Let’s create the shell classes, one for the <Tasks> controller and another for the <Tasks> tests:

As usual in this tutorial, we replaced the original return types with the <Mono> and <Flux> classes as appropriate. The test class follows the same approach:

Here we have 2 new updates. The first is the <WebTestClient> class, which belongs to Spring's testing support and is used to access the reactive endpoints of our controllers. The second is the <@AutoConfigureWebTestClient> annotation, which allows us to inject a valid <WebTestClient> object using <@Autowired>; the injected client is already configured with the properties needed to access the URI endpoints.

So now, let’s complete the required code with a new TDD cycle using our integration test and controller classes.

The <TaskController> class is straightforward:

And one of the integration test methods must be as follows:

Notice that the <WebTestClient> class has a method called <value> where we can extract the <Task> object from the response and execute all the assertions we want to validate the results of our web invocations.
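To illustrate the pattern in isolation, the following sketch binds a minimal controller directly to a <WebTestClient> (no running server needed). The endpoint path and the placeholder <Task> record are assumptions; the real test autowires the client against the full application context:

```java
import org.springframework.test.web.reactive.server.WebTestClient;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RestController;
import reactor.core.publisher.Mono;

public class TaskControllerTestSketch {

    // Simplified placeholder types for the sketch.
    public record Task(Integer id, String name) {}

    @RestController
    static class TaskController {
        @GetMapping("/api/tasks/{id}")   // hypothetical path
        public Mono<Task> findById(@PathVariable Integer id) {
            return Mono.just(new Task(id, "Test task"));
        }
    }

    static void verifyFindById() {
        WebTestClient webTestClient =
                WebTestClient.bindToController(new TaskController()).build();
        webTestClient.get().uri("/api/tasks/1")
                .exchange()
                .expectStatus().isOk()
                .expectBody(Task.class)
                // value() exposes the deserialized Task for our assertions.
                .value(task -> {
                    if (task.id() != 1) {
                        throw new AssertionError("Wrong task ID");
                    }
                });
    }
}
```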

After completing all TDD iteration cycles, executing the <TaskControllerTest> must produce these results:

We have now completed the implementation of the <Repository>, <Service>, and <Controller> layers using TDD with integration tests. The next step is to perform some functional tests.

6. DynamoDB Async Client (AWS SDK 2).

Here we have some significant changes: we are now using AWS SDK version 2, and our DynamoDB client can be asynchronous. So the bean producer in the <DynamoDBConfig> class must change to generate the new DynamoDB client:
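A hedged sketch of what that configuration could look like. The endpoint and credentials shown here are placeholders for a local environment (for example, LocalStack); in a real deployment they come from the environment:

```java
import java.net.URI;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import software.amazon.awssdk.auth.credentials.AwsBasicCredentials;
import software.amazon.awssdk.auth.credentials.StaticCredentialsProvider;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbEnhancedAsyncClient;
import software.amazon.awssdk.regions.Region;
import software.amazon.awssdk.services.dynamodb.DynamoDbAsyncClient;

@Configuration
public class DynamoDBConfig {

    // The async (non-blocking) low-level client from AWS SDK v2.
    @Bean
    public DynamoDbAsyncClient dynamoDbAsyncClient() {
        return DynamoDbAsyncClient.builder()
                .region(Region.US_EAST_1)                       // placeholder region
                .endpointOverride(URI.create("http://localhost:4566")) // LocalStack
                .credentialsProvider(StaticCredentialsProvider.create(
                        AwsBasicCredentials.create("test", "test")))   // dummy creds
                .build();
    }

    // The enhanced async client maps annotated beans to DynamoDB items.
    @Bean
    public DynamoDbEnhancedAsyncClient dynamoDbEnhancedAsyncClient(DynamoDbAsyncClient client) {
        return DynamoDbEnhancedAsyncClient.builder()
                .dynamoDbClient(client)
                .build();
    }
}
```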

And once more, we need to create shell classes for <DeviceRepository> and <DeviceRepositoryTest>, which still use the same methods as the previous TDD project:

The DynamoDB async client uses the <CompletableFuture> class to operate against the tables on AWS. So, as we are using the Reactor library, we can use the <fromFuture> method to adapt the asynchronous calls:
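A sketch of how a repository method could adapt the future. The <Device> bean here is a minimal placeholder (the real class has more attributes), and the table name follows the article:

```java
import java.util.concurrent.CompletableFuture;
import reactor.core.publisher.Mono;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbAsyncTable;
import software.amazon.awssdk.enhanced.dynamodb.DynamoDbEnhancedAsyncClient;
import software.amazon.awssdk.enhanced.dynamodb.Key;
import software.amazon.awssdk.enhanced.dynamodb.TableSchema;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbBean;
import software.amazon.awssdk.enhanced.dynamodb.mapper.annotations.DynamoDbPartitionKey;

public class DeviceRepository {

    // Minimal placeholder bean; the real Device class has more attributes.
    @DynamoDbBean
    public static class Device {
        private String id;
        @DynamoDbPartitionKey
        public String getId() { return id; }
        public void setId(String id) { this.id = id; }
    }

    private final DynamoDbAsyncTable<Device> deviceTable;

    public DeviceRepository(DynamoDbEnhancedAsyncClient enhancedClient) {
        this.deviceTable = enhancedClient.table("Devices", TableSchema.fromBean(Device.class));
    }

    // getItem() returns a CompletableFuture, which Reactor adapts lazily
    // with the fromFuture() supplier overload.
    public Mono<Device> findById(String id) {
        return Mono.fromFuture(() ->
                deviceTable.getItem(Key.builder().partitionValue(id).build()));
    }

    // Generic helper showing the same adaptation for any future.
    public static <T> Mono<T> toMono(CompletableFuture<T> future) {
        return Mono.fromFuture(future);
    }
}
```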

And the test method could be as used in the previous sections:

When running the <DeviceRepositoryTest> class, the results must be as follows:

Finally, we must update the <TaskJob> class to use a reactive programming approach based on the last changes:

So far, we have almost completed the microservice refactor in terms of code, but one last step remains: an improvement to our local environment for testing purposes.

7. LocalStack.

LocalStack is a cloud service emulator that runs in a single container in our local environment. We can deploy our AWS applications locally without connecting to AWS, reducing costs while developing our apps. As its website mentions:

LocalStack supports a growing number of AWS services, like AWS Lambda, S3, DynamoDB, Kinesis, SQS, SNS, and more! LocalStack Pro supports additional APIs and advanced features to make your cloud development experience a breeze!

The idea is to replace the <dynamodb-local> Docker container with a <localstack> container. We can configure the services we want to activate when <localstack> starts. So the configuration for our <localstack> container in the <docker-compose.yml> file is the following:
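A sketch of such a service definition could look like this. The service name matches the command used later in this tutorial; the image tag, host paths, and the init-hook directory (supported by LocalStack 1.0 and later) are assumptions:

```yaml
  tasks-localstack:
    image: localstack/localstack:1.4
    ports:
      - "4566:4566"
    environment:
      - SERVICES=dynamodb          # activate only the services we need
      - AWS_DEFAULT_REGION=us-east-1
    volumes:
      # Scripts in this hook directory run once the container is ready.
      - ./utils/docker/create-resources.sh:/etc/localstack/init/ready.d/create-resources.sh
      - ./utils/docker/devices.json:/var/lib/localstack/devices.json
```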

Notice that we are mounting a bash script called <create-resources.sh> that is executed by the <localstack> container at boot time:
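The script could be sketched as follows. The table and file names follow the article, while the key schema and the mounted path of <devices.json> are assumptions; <awslocal> is the AWS CLI wrapper that ships inside the LocalStack image:

```shell
#!/bin/bash
# Create the Devices table (key schema is an assumption).
awslocal dynamodb create-table \
    --table-name Devices \
    --attribute-definitions AttributeName=id,AttributeType=S \
    --key-schema AttributeName=id,KeyType=HASH \
    --billing-mode PAY_PER_REQUEST

# Seed the initial Device item from the mounted devices.json file (path assumed).
awslocal dynamodb batch-write-item \
    --request-items file:///var/lib/localstack/devices.json
```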

We can run only the <localstack> container with the following command to see the results:

$ docker compose up tasks-localstack

And we can see that our <Devices> table is created:

And also the <Device> item in the <devices.json> file:

In a new terminal window, execute the following command to retrieve the <Device> item created in our local DynamoDB:

$ aws dynamodb scan    \
--table-name Devices \
--endpoint-url http://localhost:4566

And you will see our <Device> item:

Finally, notice that I'm also using the <localstack> container for the integration tests with Testcontainers in the <AbstractContainerBase> class:

Notice that I'm setting <DYNAMODB> as a dependent service in the static block. With the Testcontainers library, this configuration is straightforward, and the container is automatically configured for our integration tests.
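The LocalStack portion of the updated base class could be sketched like this (the image tag and property name are assumptions; the Testcontainers LocalStack module resolves the mapped endpoint for us):

```java
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.localstack.LocalStackContainer;
import org.testcontainers.utility.DockerImageName;

public abstract class AbstractContainerBase {

    static final LocalStackContainer LOCALSTACK_CONTAINER =
            new LocalStackContainer(DockerImageName.parse("localstack/localstack:1.4"))
                    // Activate only the DynamoDB service for the tests.
                    .withServices(LocalStackContainer.Service.DYNAMODB);

    static {
        LOCALSTACK_CONTAINER.start();
    }

    // Expose the emulated DynamoDB endpoint to the application on the fly.
    @DynamicPropertySource
    static void registerProperties(DynamicPropertyRegistry registry) {
        registry.add("aws.dynamodb.endpoint", () -> LOCALSTACK_CONTAINER
                .getEndpointOverride(LocalStackContainer.Service.DYNAMODB)
                .toString());
    }
}
```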

Now it’s time to deploy the entire solution locally to see the results of all our changes.

8. Functional Testing (Docker Compose).

Well, it's time to deploy our Reactive Spring Boot application locally and make some HTTP calls to verify the correct functionality of our microservice. So, execute the following command to deploy a local Docker cluster using the Compose plugin:

$ docker compose up --build

Then, all the required containers must start, and the Task Service microservice must start successfully:

Then, open the Postman tool to perform some CRUD operations and see the results. First, create a task with an execution time in the near future:

Then, when the scheduled job is executed, you must see the following output in the console:

Now let’s query the existing tasks in our microservice:

This is the only task we have created and executed so far. So let's update the execution time of this task with a PUT operation:

The execution time, name, and description of the <Task> were updated. So let's query this task by ID:

Back in the terminal, we can see that the <Task> now executes at minute 10, followed by the last query by ID:

Finally, let’s delete this task using the Postman tool:

Notice that the HTTP response is 200 OK, so the <Task> with ID 1 was successfully deleted.

9. Running Integration Tests with Maven.

So far, we have been running the integration tests from the IDE, but Maven can execute all the test classes too. So, go to your terminal window and execute the following command:

$ mvn test

And you must see that all integration tests are executed and pass successfully:

And that’s it!!!

We developed a Reactive Spring Boot microservice using WebFlux, TDD, and Testcontainers. We also used the DynamoDB Async client, which works very well with the Reactor library.

I hope this tutorial has been of interest to you. I'll see you in my next article, where I will discuss how to implement a Spring Boot Native microservice using these same technologies and best practices.

I will see you soon.
