Implementing TDD with Integration Testing and Testcontainers for a Spring Boot application.

Andres Solorzano
13 min read · Jan 26, 2023

In previous tutorials, I used the Quarkus Framework to develop and deploy our Tasks Service on AWS (the back-end side). Now, it’s time to try the new Spring Boot Framework 3, using the same component architecture (Quartz, JPA, and Flyway) that we used with Quarkus. So, we aim to develop and deploy the Task Service using Spring Boot with TDD from the beginning. I will cover Spring Native support in another tutorial. So let’s get started.

To complete this guide, you’ll need the following tools:

NOTE 1: You can download the project’s source code from my GitHub repository to review the changes made in this tutorial.

NOTE 2: I will not write unit tests because mocking components makes our tests less realistic. That’s why I prefer integration tests, ideally combined with Testcontainers.

1. Test Driven Development (TDD).

First, go to the Spring Initializr page and generate a basic project using the Web module, the H2 database, and the Lombok dependency. Then, create the initial package structure for our components. During development, we can add more packages as required:

It’s good practice to create the tests from the Repository layer up to the Controller layer, passing through the Service layer. So, mirror the exact package structure in the “test” directory:

Then, create a new test class inside our Repository layer with the basic CRUD operations:

So let’s start developing the first CRUD operation test method.

The <Task> and <TaskRepository> classes must be created first. So let’s start with the <Task> class in the Model package:
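The original listing is a screenshot, so here is a minimal sketch of what the <Task> class could look like. The table name and the field names (name, hour, minute) are assumptions, and the ID has no generation strategy yet because it is assigned manually at first (the database sequence only arrives later with Flyway). Plain accessors are shown here, although the project includes Lombok to generate them:

```java
import jakarta.persistence.Column;
import jakarta.persistence.Entity;
import jakarta.persistence.Id;
import jakarta.persistence.Table;

// Minimal Task entity sketch. Field names are assumptions; the ID has no
// @GeneratedValue yet because the sequence is added later via Flyway.
@Entity
@Table(name = "HIP_CTY_TASKS")
public class Task {

    @Id
    private long id;

    @Column(nullable = false, length = 60)
    private String name;

    @Column(nullable = false)
    private int hour;

    @Column(nullable = false)
    private int minute;

    public long getId() { return id; }
    public void setId(long id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public int getHour() { return hour; }
    public void setHour(int hour) { this.hour = hour; }
    public int getMinute() { return minute; }
    public void setMinute(int minute) { this.minute = minute; }
}
```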

And for the <TaskRepository> interface in the Repository package:
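A sketch of the interface, assuming Spring Data’s <CrudRepository> is enough for the basic CRUD operations:

```java
import org.springframework.data.repository.CrudRepository;

// Spring Data generates the CRUD implementation from this declaration at runtime.
public interface TaskRepository extends CrudRepository<Task, Long> {
}
```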

Then, develop the first test method:
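Since the listing is an image, here is a sketch of what that first test method could look like; the test class name matches the text, but the method name, field values, and assertions are assumptions:

```java
import static org.assertj.core.api.Assertions.assertThat;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.boot.test.autoconfigure.orm.jpa.DataJpaTest;

// Sketch of the first CRUD test: persist a Task and verify it can be found.
@DataJpaTest
class TaskRepositoryTest {

    @Autowired
    private TaskRepository taskRepository;

    @Test
    void givenNewTask_whenSave_thenTaskIsFound() {
        Task task = new Task();
        task.setId(1L); // assigned manually for now; a DB sequence comes later
        task.setName("Turn on the living room lights");

        taskRepository.save(task);

        assertThat(taskRepository.findById(1L)).isPresent();
    }
}
```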

If you run this method using the IntelliJ IDE, for example, this is the result:

We are using an in-memory database for this test. That’s why we have the H2 dependency in our project, but later we will replace it with the Postgres library.

So let’s complete the rest of the test CRUD methods and execute the entire test class:

In the next section, let’s add our Postgres database to make the tests more real.

2. Testcontainers.

In the POM file, remove the H2 dependency and add the following ones:

<dependency>
    <groupId>org.postgresql</groupId>
    <artifactId>postgresql</artifactId>
    <scope>runtime</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>postgresql</artifactId>
    <version>1.17.6</version>
    <scope>test</scope>
</dependency>

The first dependency is the Postgres JDBC driver; the second is the Testcontainers module for Postgres.

Then, create the <PostgresContainerBase> class that starts the Postgres database as a Docker container:
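A sketch of that base class, assuming the <@DynamicPropertySource> approach for wiring the container’s connection properties into Spring; the Postgres image tag is an assumption. The <ddl-auto> entry matches the property that section 4 later removes:

```java
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.PostgreSQLContainer;

// Starts one Postgres container shared by every test class extending this one.
public abstract class PostgresContainerBase {

    static final PostgreSQLContainer<?> POSTGRES =
            new PostgreSQLContainer<>("postgres:14.4"); // image tag is an assumption

    static {
        POSTGRES.start();
    }

    @DynamicPropertySource
    static void datasourceProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", POSTGRES::getJdbcUrl);
        registry.add("spring.datasource.username", POSTGRES::getUsername);
        registry.add("spring.datasource.password", POSTGRES::getPassword);
        registry.add("spring.jpa.hibernate.ddl-auto", () -> "update");
    }
}
```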

Update the <TaskRepositoryTest> class to extend the previous one. Also, add the “@AutoConfigureTestDatabase” annotation to tell Spring to use the defined connection properties instead of the in-memory ones:

Run the <TaskRepositoryTest> class. All methods should pass:

Notice that Testcontainers downloads (pulls) the Postgres image from Docker Hub before running the integration tests. If the Docker image already exists in your local environment, it is not pulled again.

Now it’s time to work with the Quartz objects to create timed tasks.

3. Quartz

Let’s now work on the Service layer. It’s time to develop the business logic that creates the Quartz Jobs in conjunction with the <Task> objects in the database. So, create a new test class called <TaskServiceTest> and write the initial CRUD test methods:

Append the Quartz dependency on the POM file:

<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-quartz</artifactId>
</dependency>

Then, add the following lines to the <application-test.properties> file:

tasks.time.zone.id=-05:00
spring.quartz.job-store-type=jdbc
spring.quartz.jdbc.initialize-schema=always

The first property configures the Time Zone used when we create the Quartz Job. The other two properties tell Quartz to use the defined JDBC connection to create its required tables. But remember that the database connection properties are defined in the <PostgresContainerBase> class.

Now, develop the <create> method in the <TaskService> class:

Here, we create the Quartz Job before persisting the <Task> object into the Postgres database. I copied much of this code from our previous Quarkus project, so the new part is adapting it to the corresponding Spring components and annotations, apart from the TDD classes.
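As the listing is a screenshot, here is a sketch of what the <create> flow could look like. The job group name, the job-data key, and the daily-schedule choice are assumptions; the time zone comes from the <tasks.time.zone.id> property configured earlier:

```java
import java.time.ZoneOffset;
import java.util.TimeZone;
import java.util.UUID;

import org.quartz.CronScheduleBuilder;
import org.quartz.JobBuilder;
import org.quartz.JobDetail;
import org.quartz.Scheduler;
import org.quartz.SchedulerException;
import org.quartz.Trigger;
import org.quartz.TriggerBuilder;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

// Sketch of create(): schedule the Quartz job first, then persist the Task.
@Service
public class TaskService {

    private final Scheduler scheduler;
    private final TaskRepository taskRepository;

    @Value("${tasks.time.zone.id}")
    private String timeZoneId;

    public TaskService(Scheduler scheduler, TaskRepository taskRepository) {
        this.scheduler = scheduler;
        this.taskRepository = taskRepository;
    }

    @Transactional
    public Task create(Task task) throws SchedulerException {
        JobDetail job = JobBuilder.newJob(TaskJobExecution.class)
                .withIdentity(UUID.randomUUID().toString(), "Task-Group") // group name is an assumption
                .usingJobData("taskId", task.getId())                     // data key is an assumption
                .build();
        Trigger trigger = TriggerBuilder.newTrigger()
                .withIdentity(job.getKey().getName(), "Task-Group")
                .withSchedule(CronScheduleBuilder
                        .dailyAtHourAndMinute(task.getHour(), task.getMinute())
                        .inTimeZone(TimeZone.getTimeZone(ZoneOffset.of(timeZoneId))))
                .build();
        scheduler.scheduleJob(job, trigger);
        return taskRepository.save(task);
    }
}
```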

The Quartz library needs an execution class and method to call when the scheduled job is triggered. For this reason, I created the <TaskJobExecution> class:

The <execute> method must be called by the Quartz library when the scheduled job is triggered.
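A sketch of that class, assuming the <Job> interface from Quartz; the job-data key and the log message are illustrative assumptions:

```java
import org.quartz.Job;
import org.quartz.JobExecutionContext;
import org.slf4j.Logger;
import org.slf4j.LoggerFactory;

// Quartz instantiates this class and calls execute() when the trigger fires.
public class TaskJobExecution implements Job {

    private static final Logger LOGGER = LoggerFactory.getLogger(TaskJobExecution.class);

    @Override
    public void execute(JobExecutionContext context) {
        long taskId = context.getJobDetail().getJobDataMap().getLong("taskId"); // assumed data key
        LOGGER.info("Executing scheduled job for Task ID: {}", taskId);
    }
}
```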

Now, execute our integration test class to see the result:

So let’s complete the rest of the CRUD test methods and refactor to optimize the code as we test. Notice that when we use integration tests in conjunction with Testcontainers, we don’t need to deploy our Spring Boot application locally to verify the overall functionality. In our case, we’re doing so later. That’s the spirit of TDD.

After running all integration tests, the result must be the following:

Now, we need to move to the controller layer. Create a test class called <TaskControllerIT> with the following initial content:

Those are the initial test methods that we need to develop. So let’s create the <TaskController> class with the following content:
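Since the listing is an image, here is a sketch of the controller’s starting point; the base path is an assumption, and only the create endpoint is shown, with the remaining CRUD endpoints following the same pattern:

```java
import org.quartz.SchedulerException;
import org.springframework.http.HttpStatus;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestBody;
import org.springframework.web.bind.annotation.RequestMapping;
import org.springframework.web.bind.annotation.ResponseStatus;
import org.springframework.web.bind.annotation.RestController;

// Sketch of the controller; it delegates to the service layer developed above.
@RestController
@RequestMapping("/api/tasks") // base path is an assumption
public class TaskController {

    private final TaskService taskService;

    public TaskController(TaskService taskService) {
        this.taskService = taskService;
    }

    @PostMapping
    @ResponseStatus(HttpStatus.CREATED)
    public Task create(@RequestBody Task task) throws SchedulerException {
        return taskService.create(task);
    }
}
```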

Test each method one at a time. You’ll need to do some refactoring when you develop the second one. Remember the mantra of TDD:

Red, green, refactor.

But don’t forget to think about what you need to test in the first place:

Image from Kaizenko blog.

When you have completed the required code in the controller layer and made the refactors, run all the integration tests in <TaskControllerIT> to see the results:

So far, so good. Until now, we have developed and tested our 3 application layers using integration tests and Testcontainers:

But from the Quartz perspective, it’s a good practice to create the required tables in the database manually or automatically to take better control instead of letting the library do this for us. So in the next section, we’re viewing how to do that.

4. Flyway

Remember that in our previous project using Quarkus, the application was responsible for creating or recreating the Quartz tables in the Postgres database. That’s our objective in this section, but now using our Spring Boot application with the help of the Flyway library.

We can get the Quartz database scripts from its GitHub repository, or copy the one used in our previous Quarkus project. The difference from the original script provided by Quartz is that we set a sequence for our <Task> table, and we provide the table script as well:

The next step is to add the sequence to the <Task> entity class:

@Id
@GeneratedValue(strategy = GenerationType.SEQUENCE, generator = "HIP_CTY_TASKS_SEQ")
@SequenceGenerator(name = "HIP_CTY_TASKS_SEQ", sequenceName = "HIP_CTY_TASKS_SEQ", allocationSize = 1)
private long id;

Furthermore, add the Flyway dependency to our POM file:

<dependency>
    <groupId>org.flywaydb</groupId>
    <artifactId>flyway-core</artifactId>
</dependency>

Also, we no longer need Spring JPA to create this table when the Postgres container starts. So remove this property from the <PostgresContainerBase> class:

registry.add("spring.jpa.hibernate.ddl-auto", () -> "update");

Next, add the following lines to the <application.properties> file:

spring.flyway.enabled=true
spring.flyway.connect-retries=3
spring.flyway.baseline-on-migrate=true
spring.flyway.locations=classpath:db/migration
spring.flyway.baseline-description=Initial Quartz migration

We also need to update the <spring.quartz.jdbc.initialize-schema> property to the <never> value because we don’t want the Quartz library to create the tables. That is now the job of our Flyway dependency:

spring.quartz.jdbc.initialize-schema=never

Finally, we need to remove the assignment of <1L> to the task ID. We did this initially because we didn’t have a database sequence. Now with the Quartz database migration, we have a table sequence.

So, it’s time for the truth. Execute all integration tests in the <TaskControllerIT> class to see the results:

You could also execute all integration test classes to verify that all are working successfully.

So far, so good. Now it’s time to deploy our tested Task Service locally and perform functional testing.

5. Docker Compose (Functional Testing).

Before configuring our Docker Compose file, let’s develop an essential missing part: the database connection. We used Testcontainers with Postgres data for our integration tests. Now, it’s time to adopt a similar approach for a <real> database connection. So let’s update our <TasksApplication> class with the following content:

As usual, I created a multi-stage <Dockerfile> for our project:

And the <docker-compose.yml> file contains the following:

Time for the truth. Execute the following command to build our Docker container and run it in conjunction with the Postgres database container:

$ docker compose up --build

After that, our Task Service is deployed locally using Docker Compose:

Now, open the Postman tool to post a Task in conjunction with a scheduled Quartz job:

Then, in the console, you should see the logs of the creation process:

And after a few minutes, you should see the logs of the execution method at the specified task time:

Excellent! Now we have a complete CRUD microservice developed with Spring Boot 3 that creates scheduled Quartz Jobs in conjunction with a <Task> entity object.

Remember that the spirit of the Task Service project is to be used in an Internet of Things (IoT) platform on AWS. So let’s create a DynamoDB table with the information of the devices we need to activate/deactivate with a scheduled Quartz Job.

6. DynamoDB (Testcontainers for Devices Data).

The first thing we need to do is add the required Maven dependencies to connect our app to the DynamoDB container locally and to DynamoDB on AWS:

<dependency>
    <groupId>com.amazonaws</groupId>
    <artifactId>aws-java-sdk-dynamodb</artifactId>
    <version>1.12.335</version>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>junit-jupiter</artifactId>
    <scope>test</scope>
</dependency>

Next, let’s create a <Device> class with the required data we need to store on DynamoDB. This class also uses AWS SDK annotations to map the <Device> class to the DynamoDB table:
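A sketch of that class, assuming the DynamoDBMapper annotations from the aws-java-sdk-dynamodb dependency added above. The <status> field (ON/OFF) matches the text; the other field names are assumptions, and plain accessors stand in for Lombok:

```java
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBAttribute;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBHashKey;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBTable;

// Maps the Device class to the "Devices" DynamoDB table.
@DynamoDBTable(tableName = "Devices")
public class Device {

    @DynamoDBHashKey(attributeName = "id")
    private String id;

    @DynamoDBAttribute(attributeName = "name")
    private String name;

    @DynamoDBAttribute(attributeName = "status")
    private String status; // "ON" or "OFF"

    public String getId() { return id; }
    public void setId(String id) { this.id = id; }
    public String getName() { return name; }
    public void setName(String name) { this.name = name; }
    public String getStatus() { return status; }
    public void setStatus(String status) { this.status = status; }
}
```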

Then, let’s create the <DynamoDBContainerBase> test class as we did the Postgres class counterpart:
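A sketch of that base class, assuming a generic Testcontainers container running the DynamoDB-local image (the image tag is an assumption). The endpoint-override property it registers is the one the config class reads later:

```java
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.GenericContainer;

// Starts a DynamoDB-local container and exposes its endpoint to Spring tests.
public abstract class DynamoDBContainerBase {

    static final GenericContainer<?> DYNAMODB =
            new GenericContainer<>("amazon/dynamodb-local:latest") // image tag is an assumption
                    .withExposedPorts(8000);

    static {
        DYNAMODB.start();
    }

    @DynamicPropertySource
    static void dynamoDbProperties(DynamicPropertyRegistry registry) {
        registry.add("aws.dynamodb.endpoint-override",
                () -> "http://" + DYNAMODB.getHost() + ":" + DYNAMODB.getMappedPort(8000));
    }
}
```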

Finally, let’s create the <DynamoDBTest> class that connects with the DynamoDB container to generate the device table and update an item:

Notice that we didn’t use the <@SpringBootTest> annotation for this test. This is because we’re using a pure Java client from the AWS SDK to connect to DynamoDB. We don’t need to start a Spring context to perform these tests.

If you run the previous test class, you must get the following output:

Excellent! So now we need to unify the Testcontainers configuration into a single class. So let’s rename <PostgresContainerBase> to <AbstractContainerBase> and append the DynamoDB container definition:

Now we have a single class defining our app’s containers for integration tests. Then, we must create our integration test for the <DeviceRepository> class. But first, let’s create a config class for our DynamoDB connection:

Notice that if we have defined the <aws.dynamodb.endpoint-override> property, the DynamoDB builder uses that endpoint to construct the client. In our case, that property is set in the <AbstractContainerBase> class for integration testing.
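A sketch of that config class, assuming the AWS SDK v1 client builder; the fallback region value is an assumption. When the <aws.dynamodb.endpoint-override> property is empty, the builder falls back to the default AWS endpoint resolution:

```java
import com.amazonaws.client.builder.AwsClientBuilder;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDB;
import com.amazonaws.services.dynamodbv2.AmazonDynamoDBClientBuilder;
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.util.StringUtils;

@Configuration
public class DynamoDBConfig {

    @Value("${aws.dynamodb.endpoint-override:}")
    private String endpointOverride;

    @Bean
    public AmazonDynamoDB amazonDynamoDB() {
        AmazonDynamoDBClientBuilder builder = AmazonDynamoDBClientBuilder.standard();
        if (StringUtils.hasText(endpointOverride)) {
            // Point the client at the local or Testcontainers endpoint.
            builder.withEndpointConfiguration(
                    new AwsClientBuilder.EndpointConfiguration(endpointOverride, "us-east-1")); // region is an assumption
        }
        return builder.build();
    }

    @Bean
    public DynamoDBMapper dynamoDBMapper(AmazonDynamoDB client) {
        return new DynamoDBMapper(client);
    }
}
```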

The next step is to create the <DeviceRepository> class with only 2 methods:

The goal of the Task Service is to Activate/Deactivate a device. Our intention here is only to update the status of a device, not to create or delete it.
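A sketch of the repository; the method names are assumptions, but the intent matches the text: find a device and update its status, nothing more:

```java
import com.amazonaws.services.dynamodbv2.datamodeling.DynamoDBMapper;
import org.springframework.stereotype.Repository;

// Only two operations: load a Device by its hash key, and persist a status change.
@Repository
public class DeviceRepository {

    private final DynamoDBMapper mapper;

    public DeviceRepository(DynamoDBMapper mapper) {
        this.mapper = mapper;
    }

    public Device findById(String id) {
        return mapper.load(Device.class, id);
    }

    public void updateStatus(Device device, String status) {
        device.setStatus(status);
        mapper.save(device);
    }
}
```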

So finally, the <DeviceRepositoryIT> class will be:

We must create the <Device> table and add an item for our repository tests. Notice that the test methods ordered with labels 1 and 2 are identical to those in the previous test. Also, notice that this class has the <@SpringBootTest> annotation, so we can inject the DynamoDB client and mapper components from the Spring context.

Let’s run the test class to see the results:

It’s time to integrate the <Devices> data into the <Task> table. So add the following 2 properties to the <Task> entity class:

@Column(name = "device_id", length = 30, nullable = false)
private String deviceId;

@Column(name = "device_action", length = 30, nullable = false)
private String deviceAction;

NOTE: Don’t forget to add a new version file for the migration script that uses Flyway to migrate the database schema.

Then, complete the <execute> method of our <TaskJob> class:

Now the device status field is updated when the scheduled job executes. Remember that you can also send an event to an SQS queue or SNS topic to notify other processes in your architecture to perform some activity based on this event. Think in terms of Event Driven Architecture ;)

7. Docker Compose (with DynamoDB-local).

Let’s add the DynamoDB-local container to our <docker-compose.yml>:

This DynamoDB-local container doesn’t contain any data. So we need to add another container, using the provided AWS CLI support, to create our desired <Devices> table:

Now we can deploy our entire solution using the docker command:

$ docker compose up --build

We must see the following output in the console:

So far, so good. But our <Devices> table doesn’t have any device items. So, I created a file called <device-item.json> that contains the data of one device we can use for testing purposes. We must execute the following command to put this item into DynamoDB:

$ aws dynamodb put-item                     \
--table-name Devices \
--endpoint-url http://localhost:8000 \
--item file://utils/dynamodb/items/device-item.json

We can get this item with a scan operation:

$ aws dynamodb scan    \
--table-name Devices \
--endpoint-url http://localhost:8000

Notice that the <status> field has a value of <OFF>, assuming that our device is deactivated.

Now let’s open our Postman tool again and POST a new task that must be executed soon:

The debug logs must appear in our terminal window, indicating the methods executed and the created task:

And after a minute or so, the Job must be executed, and the device status must be updated in the DynamoDB local:

$ aws dynamodb scan    \
--table-name Devices \
--endpoint-url http://localhost:8000

Notice in the previous image that the scheduled <Task> changed the status of the device from <OFF> to <ON>.

8. Testing with Maven

So far, we have been running the integration tests from the IDE, and the test classes end with the <IT> suffix. We used this convention in the previous Task Service project with Quarkus. But with that suffix, the tests are not executed when we run the Maven test command. So rename all test classes to end with the word <Test>:

Maven can now execute these test classes as if they were unit tests. So, go to your terminal and run the following command:

$ mvn test

And all 20 integration tests must be executed:

And that’s it!! We now have a Spring Boot application developed from scratch using TDD with Testcontainers and Integration Testing.

I hope this tutorial has been of interest to you, and I’ll see you in my next article, where I will discuss the implementation of Reactive Programming in this project using WebFlux.

I will see you soon.
