Testcontainers Guide: Stop Using H2 Database for Integration Testing

Docker tutorial - IT technology blog

A Familiar Scene: Tests Pass Locally but Turn ‘Bright Red’ on CI

If you’ve ever written integration tests, you’ve likely encountered this: tests pass 100% on your local machine against an in-memory H2 database, but the moment the code hits production on real PostgreSQL, SQL syntax errors appear. Or worse, the whole team shares a single database server, leading to data collisions between test suites and completely unreliable results.

After more than 6 months of applying Testcontainers to large projects, I’ve found it to be the best way to ensure environment parity. Instead of manual setup, Testcontainers automatically spins up the exact Docker container version you need. Once the tests are finished, it cleans up everything. This process has helped me reduce the rate of flaky tests (tests that pass or fail inconsistently) to nearly 0%.

Why is Testcontainers More ‘Worth Its Weight in Gold’ Than Docker Compose?

Many of you might wonder: ‘Why not just use Docker Compose?’ The answer lies in dynamic control. With Testcontainers, you manage the container lifecycle directly within your Java or Go code.

The biggest advantage is the random port mapping mechanism. If your machine is already running Postgres on port 5432, Testcontainers will automatically find another available port (like 32768) to initialize the new container. This completely avoids conflicts. Additionally, it features Ryuk — a sidecar container dedicated to ‘cleaning up the battlefield.’ Even if the test process crashes unexpectedly, Ryuk ensures the test containers are destroyed, preventing resource leaks.
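To make the random port mapping concrete, here is a minimal sketch using the standard Testcontainers Java API (the class name `RandomPortDemo` is just an illustration). It starts a throwaway Postgres container and reads back whichever host port Docker assigned:

```java
import org.testcontainers.containers.PostgreSQLContainer;

public class RandomPortDemo {
    public static void main(String[] args) {
        // try-with-resources stops the container when the block exits;
        // if the JVM crashes instead, Ryuk still reaps it
        try (PostgreSQLContainer<?> postgres =
                     new PostgreSQLContainer<>("postgres:15-alpine")) {
            postgres.start();
            // Inside the container, Postgres still listens on 5432; on the
            // host, Docker maps it to a random free port such as 32768
            System.out.println("mapped port = " + postgres.getMappedPort(5432));
            System.out.println("jdbc url    = " + postgres.getJdbcUrl());
        }
    }
}
```

Because the port is only known after `start()`, you always ask the container object for it rather than hardcoding anything.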

When working with the Docker API, if I encounter long JSON responses, I often run them through a JSON formatter to check configurations faster. This saves a significant amount of time compared to squinting at raw, unformatted output.

Real-world Implementation: Integrating PostgreSQL and Redis

Here is how I set it up for a Spring Boot project. Other languages follow a similar logic.

1. Add Necessary Libraries

You need to declare the dependencies in your pom.xml. Make sure to use the module specific to your database so you get its pre-configured container class (for example, PostgreSQLContainer with its JDBC URL and wait-strategy helpers) rather than a bare GenericContainer.

<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>postgresql</artifactId>
    <version>1.19.7</version>
    <scope>test</scope>
</dependency>
<dependency>
    <groupId>org.testcontainers</groupId>
    <artifactId>junit-jupiter</artifactId>
    <version>1.19.7</version>
    <scope>test</scope>
</dependency>

2. Dynamic Container Configuration

Don’t hardcode URLs. Let Testcontainers return the connection info after the container starts successfully. I usually create a BaseIntegrationTest class to be reused across the entire project.

import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.test.context.DynamicPropertyRegistry;
import org.springframework.test.context.DynamicPropertySource;
import org.testcontainers.containers.GenericContainer;
import org.testcontainers.containers.PostgreSQLContainer;
import org.testcontainers.junit.jupiter.Container;
import org.testcontainers.junit.jupiter.Testcontainers;
import org.testcontainers.utility.DockerImageName;

@SpringBootTest
@Testcontainers
public abstract class BaseIntegrationTest {

    @Container
    static PostgreSQLContainer<?> postgres = new PostgreSQLContainer<>("postgres:15-alpine")
            .withDatabaseName("test_db")
            .withUsername("user")
            .withPassword("pass");

    @Container
    static GenericContainer<?> redis = new GenericContainer<>(DockerImageName.parse("redis:7-alpine"))
            .withExposedPorts(6379);

    @DynamicPropertySource
    static void configureProperties(DynamicPropertyRegistry registry) {
        registry.add("spring.datasource.url", postgres::getJdbcUrl);
        registry.add("spring.datasource.username", postgres::getUsername);
        registry.add("spring.datasource.password", postgres::getPassword);
        registry.add("spring.data.redis.host", redis::getHost);
        registry.add("spring.data.redis.port", () -> redis.getMappedPort(6379));
    }
}

The @DynamicPropertySource mechanism is the key here. It injects the exact credentials and port allocated by Docker into your Spring application at runtime, so nothing is hardcoded in application.properties.

3. Writing Test Cases

At this point, writing tests is no different from interacting with a real database. You can be confident that if the code passes here, it will run correctly in production.

import static org.junit.jupiter.api.Assertions.assertNotNull;

import org.junit.jupiter.api.Test;
import org.springframework.beans.factory.annotation.Autowired;

class UserServiceTest extends BaseIntegrationTest {

    @Autowired
    private UserRepository userRepository;

    @Test
    void shouldCreateUserSuccessfully() {
        User user = new User("admin", "[email protected]");
        userRepository.save(user);

        assertNotNull(userRepository.findByUsername("admin"));
    }
}

3 Optimization Lessons to Keep Your Builds Fast

Using Testcontainers is great, but if you’re not careful, your CI build times can skyrocket. Here’s how I optimize:

  • Use Singleton Containers: Don’t let every test class start a new container. Use a single static container for the entire test suite. In one project I worked on, this reduced build time from 15 minutes down to 6 minutes.
  • Prioritize Alpine Images: Always use lightweight versions like postgres:15-alpine instead of full versions. Smaller image sizes help CI servers pull them faster, saving bandwidth and disk space.
  • Allocate Sufficient Resources to Docker: If you’re running Postgres, Redis, and Kafka simultaneously, ensure Docker Desktop is allocated at least 4GB of RAM. Insufficient RAM is the primary reason containers start slowly or crash mid-way.
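The singleton pattern from the first bullet can be sketched like this (a minimal sketch; the class name `AbstractIntegrationTest` is an illustration, and the manual static start deliberately replaces the annotation-driven setup shown earlier):

```java
import org.testcontainers.containers.PostgreSQLContainer;

public abstract class AbstractIntegrationTest {

    // One container shared by every test class that extends this base.
    // Note: no @Container annotation, because the JUnit extension would
    // stop the container after each test class, which is what we avoid.
    static final PostgreSQLContainer<?> POSTGRES =
            new PostgreSQLContainer<>("postgres:15-alpine");

    static {
        // Started once, when the first test class loads. Ryuk removes the
        // container after the JVM exits, so no explicit stop() is needed.
        POSTGRES.start();
    }
}
```

The trade-off is test isolation: since all classes share one database, you should clean up or use distinct data per test to keep them independent.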

Conclusion

Testcontainers is more than just a tool; it’s a mindset for mastering test environments. Instead of wasting time complaining about environment discrepancies, use code to define them. If you’re struggling with dynamic port setup or Jenkins connectivity, leave a question below, and we’ll solve it together.
