Now it’s time to make our Timer Service application more company-oriented. I want the project to become an open-source solution, so we also need tools like SonarCloud and Snyk, which I will cover in the following articles. For this reason, I created a new GitHub repository with our application’s code base inside a GitHub organization, and I did the same with a Docker Hub organization.
IMPORTANT: The last changes I made in this project are detailed in a new article, “Implementing a Multi-Account Environment on AWS.” So I suggest you go to the new one after reading this article to see the latest project improvements.
To complete this guide, you’ll need the following:
- An AWS account.
- Amplify CLI.
- GraalVM 22.1.0 with OpenJDK 17. You can use the SDKMAN tool.
- Apache Maven 3.8 or later.
- Docker and Docker Compose.
- IntelliJ or Eclipse IDE.
There are essential changes in the backend service that I will share in the following sections before talking about our test components.
Using PostgreSQL as Central Data Store
An essential change to our API service is the exclusive use of PostgreSQL to store our task data; we are not using DynamoDB anymore. This way, we can keep the task data alongside the Quartz jobs data in the same database, and we can use a single JTA transaction to persist both data types. Remember that we use the “Flyway” library to create the tables that Quartz needs in the database. I’m using the same migration mechanism to create our task table, so its structure is the following:
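As an illustration, the Flyway migration for the task table could look like the following sketch; the file name and column definitions here are my assumptions, not the exact ones from the repository:

```sql
-- Hypothetical migration file, e.g. V2__create_task_table.sql
CREATE TABLE task (
    id             UUID PRIMARY KEY,
    name           VARCHAR(40)  NOT NULL,
    description    VARCHAR(255),
    execution_time TIMESTAMP    NOT NULL,
    created_at     TIMESTAMP,
    updated_at     TIMESTAMP
);
```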
The Task class now uses JPA annotations to map its fields to the database table:
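As a sketch, the entity might look like the following; the field names and types are assumptions, not the exact ones from the repository (note that Quarkus 2.x uses the javax.persistence namespace, while Quarkus 3 uses jakarta.persistence):

```java
// Hypothetical sketch of the Task entity; field names are assumptions.
import java.time.LocalDateTime;
import java.util.UUID;
import javax.persistence.Column;
import javax.persistence.Entity;
import javax.persistence.Id;
import javax.persistence.Table;

@Entity
@Table(name = "task")
public class Task {

    @Id
    private UUID id;

    @Column(nullable = false, length = 40)
    private String name;

    @Column(length = 255)
    private String description;

    @Column(name = "execution_time", nullable = false)
    private LocalDateTime executionTime;

    // Getters and setters omitted for brevity.
}
```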
Now, we can store our task data in the PostgreSQL database.
Changing the Order of Persistence
Using the same functional and reactive programming approach, we now persist the Quartz jobs data first and then the task data in the same PostgreSQL instance. Remember that we use the “RESTEasy Reactive Jackson” Maven dependency for our Quarkus REST endpoints. Previously, we persisted the task data to DynamoDB first and then the Quartz data to PostgreSQL. So, our new approach is the following:
NOTE: In my previous tutorial, I implemented the Angular NgRx library to convert our front-end project into a reactive one. Following this library’s best practices and conventions, I split the task endpoint into two different components. One resource is dedicated to managing CRUD operations on a single data object, and the other is to query the task objects and return the results.
In this case, we first create the Quartz job by calling the create method of the “JobRepository” component, which is also in charge of managing CRUD operations on the Quartz objects. Then, we use Mutiny’s “invoke” operation to persist the task object in the persistence context. Finally, we return an HTTP response to the caller in the event of a successful operation.
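The create endpoint can be sketched like this. It is only an illustration: the Task, JobRepository, and TaskRepository types are the application’s own components, and the method signatures are assumptions based on the description above:

```java
import io.smallrye.mutiny.Uni;
import javax.inject.Inject;
import javax.ws.rs.POST;
import javax.ws.rs.Path;
import javax.ws.rs.core.Response;

@Path("/api/tasks")
public class TaskResource {

    @Inject
    JobRepository jobRepository;    // CRUD operations for the Quartz job objects

    @Inject
    TaskRepository taskRepository;  // CRUD operations for the task entities

    @POST
    public Uni<Response> create(Task task) {
        return jobRepository.create(task)                    // 1. persist the Quartz job data
                .invoke(() -> taskRepository.persist(task))  // 2. then persist the task entity
                                                             //    (persist is assumed to be a
                                                             //    synchronous JPA operation)
                .map(job -> Response.status(Response.Status.CREATED)
                        .entity(task)
                        .build());                           // 3. respond on success
    }
}
```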
This is the new approach I’m using in the other task endpoints. The rest of the components inside the backend service are the same.
Improvements for Testing Purposes
Now, it’s time to describe the changes I made to improve the testing experience for our backend project.
KeyCloak TestContainer for OIDC Testing
Remember that we use an “OpenID Connect” (OIDC) Maven dependency to interact with the “Amazon Cognito” service on AWS to validate our users’ credentials. We must test our task endpoints with a real JWT in the HTTP request header. Since Keycloak also supports the OIDC protocol, we can use it as a test container to simulate a real scenario in our JUnit tests. To accomplish this, we need to add the Keycloak dependency to our Maven POM file:
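Assuming the Quarkus-provided test artifact, the dependency looks roughly like the following; validate the exact coordinates against the Quarkus BOM of your project:

```xml
<dependency>
    <groupId>io.quarkus</groupId>
    <artifactId>quarkus-test-keycloak-server</artifactId>
    <scope>test</scope>
</dependency>
```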
Notice that the Quarkus Framework provides this test container, so we don’t need any specific configuration to deploy it during the testing phase. The rest of the test configuration is part of the next section.
Property Files for each Environment
Before, I used a single “application.properties” file to configure the entire microservice. Now, I have split this file into multiple ones, each dedicated to a specific environment: the common “dev,” “test,” and “prod.” The new structure is the following:
Following the Quarkus convention for property files, we add the name of the desired environment to the property file name, as you can see in the previous image. In the standard “application.properties” file, I set the properties common to all environments, like the time zone.
I’m using this approach to keep the testing configuration in a single file, which also makes the application easier to maintain. Note that the “test” environment uses its own configuration properties, like the Keycloak ones, and doesn’t define a data source configuration, because the Quarkus Framework deploys a “PostgreSQL Testcontainer” that our application uses to create the initial Quartz tables and the task table for our testing purposes:
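A hedged sketch of what “application-test.properties” might contain; the realm, client id, secret, and script names are illustrative values, not the exact ones from the repository:

```properties
# Only the database kind is declared; with no JDBC URL defined, Quarkus Dev
# Services starts a PostgreSQL Testcontainer for the test run.
quarkus.datasource.db-kind=postgresql
quarkus.flyway.migrate-at-start=true

# OIDC pointing at the Keycloak test container (illustrative realm/client).
quarkus.oidc.auth-server-url=${keycloak.url}/realms/quarkus
quarkus.oidc.client-id=backend-service
quarkus.oidc.credentials.secret=secret

# Seed data loaded only in the test profile (illustrative file name).
quarkus.hibernate-orm.sql-load-script=import-test.sql
```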
We also define a SQL script that loads the initial data for testing purposes. This data is only used in the test environment, so I separated this configuration into its own properties file.
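Such a seed script might look like the following; the table and column names are assumptions:

```sql
-- Hypothetical test data for the task table.
INSERT INTO task (id, name, description, execution_time)
VALUES ('7f000101-0000-0000-0000-000000000001',
        'Test task',
        'Task created only for integration testing',
        '2030-01-01 10:00:00');
```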
The Testing Components
When I built the testing methods for all of the task endpoints, I noticed that the test class had a lot of methods, so I decided to split it into different classes, each focused on a specific test scenario:
For the CRUD operations and the “Not Found” tests, I use a Keycloak client to obtain a JWT from the running test container:
With this token added to the request header, we can access our task endpoints without receiving an HTTP 401 error:
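A sketch of such a test, assuming the Quarkus-provided KeycloakTestClient, a hypothetical “/api/tasks” path, and the default “alice” test user:

```java
import static io.restassured.RestAssured.given;

import io.quarkus.test.junit.QuarkusTest;
import io.quarkus.test.keycloak.client.KeycloakTestClient;
import org.junit.jupiter.api.Test;

@QuarkusTest
public class TaskResourceTest {

    // Obtains a JWT from the Keycloak container started for the test run.
    private final KeycloakTestClient keycloakClient = new KeycloakTestClient();

    @Test
    public void shouldFindTasksWithValidToken() {
        given()
                .auth().oauth2(keycloakClient.getAccessToken("alice"))
                .when().get("/api/tasks")
                .then().statusCode(200);
    }
}
```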
For the “Unauthorized” tests, the idea is not to attach a JWT to the request header, so we can validate that our task endpoints respond with an HTTP 401 error:
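The unauthorized scenario can be sketched like this (again, “/api/tasks” is an assumed path):

```java
import static io.restassured.RestAssured.given;

import io.quarkus.test.junit.QuarkusTest;
import org.junit.jupiter.api.Test;

@QuarkusTest
public class TaskResourceUnauthorizedTest {

    @Test
    public void shouldReturn401WithoutToken() {
        // No JWT is attached to the request, so OIDC must reject it.
        given()
                .when().get("/api/tasks")
                .then().statusCode(401);
    }
}
```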
Quarkus Integration Tests using Testcontainers
Notice that each test class has a corresponding “IT” class, which stands for Integration Test. These classes have the following structure:
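A sketch of that structure, using hypothetical class names:

```java
import io.quarkus.test.junit.QuarkusIntegrationTest;

// Runs the @Test methods inherited from the unit test class against the
// packaged application (JAR or native executable) instead of dev mode.
@QuarkusIntegrationTest
public class TaskResourceIT extends TaskResourceTest {
    // No extra code needed: the inherited scenarios are executed as-is.
}
```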
Notice that each integration test class extends its corresponding unit test class. That’s because the integration tests reuse the “@Test” methods defined in the unit test classes.
For this reason, I like executing integration tests and not only unit tests. I don’t particularly appreciate mocking every component just to pass the unit tests; instead, we test our components against a more realistic scenario. Because the “IT” classes are executed by the Maven Failsafe plugin during the verify phase, we run our integration tests using the following command:
# mvn verify
When the test phase starts, you should see messages in the console log indicating the creation of the Testcontainers:
And at the end of the Maven execution, we must see all five tests passing:
So at this point, the idea is to cover the principal use cases for our task endpoints. In my next article, I will discuss the integration of our Timer Service project into the “SonarCloud” platform to verify the code coverage, among other things. We will create new test classes and improve business logic to accomplish our Quality Gates. We will also automate this process through a “CI/CD pipeline” on AWS.
I hope this article interests you, and I will see you in my following tutorial.