Le blog Tech d'Indy

3 different ways to write integration tests with external services dependencies


Integration testing can be tricky to set up because of all the external services that can be involved: a database, file storage in the cloud, and so on. As developers, the implementation cost can seem so high that we don't take the time to write integration tests. But the benefits are, in my opinion, worth the price, especially for applications that are not updated very often. For example, we can upgrade NPM dependencies with a high level of confidence without having to manually test the application each time.

In this article we will focus, as an example, on two external services: a MongoDB database and AWS S3. We will look at different ways of dealing with these services in order to implement integration tests. This is not a full tutorial but rather a reflection on the problem.

Mocks and in-memory DB

One of the easiest ways to solve the problem is to mock the external services we rely on. This way we don't have to deal with a complex setup to run our integration tests.

beforeEach(async () => {
  // Stub the AWS helper so the tests never hit the real S3 API
  getAwsLinkStub = sinon.stub().resolves('STUB_URL');
  aws.getReadSignedUrlWithAws = getAwsLinkStub;
});

One of the drawbacks of this solution is that we rely on our own interpretation of the external service's responses. For example, if the AWS API changes, our tests won't fail, but the code will break at runtime because of a change in the API response that we are not aware of. Mocking the responses can also be tedious and time-consuming, especially when there are a lot of them.
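The stub above uses sinon, but the underlying idea can be sketched without any library: replace the real service call with a function that returns a canned value, then assert on both the result and how the stub was called. This is a minimal sketch with hypothetical names (`buildDownloadResponse` wrapping the AWS helper is an assumption, not code from the article):

```typescript
// Hypothetical function under test: wraps the AWS signed-URL helper,
// which is injected so it can be replaced in tests.
type GetSignedUrl = (key: string) => Promise<string>;

async function buildDownloadResponse(key: string, getSignedUrl: GetSignedUrl) {
  const url = await getSignedUrl(key);
  return { key, url };
}

// Hand-rolled stub: records every call and resolves with a canned URL.
function makeStub(value: string) {
  const calls: string[] = [];
  const fn: GetSignedUrl = async (key) => {
    calls.push(key);
    return value;
  };
  return { fn, calls };
}

async function main() {
  const stub = makeStub('STUB_URL');
  const response = await buildDownloadResponse('invoice.pdf', stub.fn);
  console.log(response.url);      // STUB_URL
  console.log(stub.calls.length); // 1
}

main();
```

The test never talks to S3, which is exactly why it stays green when the real API changes under our feet.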

this.mongod = await MongoMemoryServer.create({
  instance: {
    dbName: 'myDb',
    storageEngine: 'ephemeralForTest',
  },
  binary: {
    version: mongoVersion,
  },
});

This is a good solution, even though it has some limitations (on performance, for example) that should not be a problem in most cases. One drawback is that it is not exactly the same database as the one we have in production, so the behaviour can differ between the two environments.

GitHub Actions Services

In GitHub Actions there is a concept of “services”, which are similar to docker-compose services but not exactly the same. They allow us to “deploy” our services as we would have them in production. Our code can then run the same way it does in production, which is really handy for writing integration tests.

Below you can find an example configuration for GitHub Actions. There is an actual MongoDB container running and exposing port 27017, so we can request, save and delete data from a database as we would in production. We also have a MinIO service, an open-source object storage solution that exposes the same API as AWS S3.

jobs:
  test:
    runs-on: ubuntu-latest
    services:
      mongo:
        image: mongo:5.0.4
        ports:
          - 27017:27017
      minio:
        image: minio/minio:latest
        ports:
          - 9000:9000

This is a great solution to facilitate the writing of integration tests, but one of the caveats is that we have to run the same containers manually on our local machine to be able to run the tests. So we need a docker-compose file that exposes the same containers as the ones we have in our CI.
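Such a docker-compose file could look like the sketch below, mirroring the images used in the CI configuration (the MinIO `command` is an assumption based on the image's usual entrypoint; adjust versions and credentials to match your setup):

```yaml
version: '3.8'
services:
  mongo:
    image: mongo:5.0.4
    ports:
      - '27017:27017'
  minio:
    image: minio/minio:latest
    command: server /data
    ports:
      - '9000:9000'
```

Note that docker-compose lets us set the container command explicitly, which, as we will see, GitHub Actions services do not.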

GitHub Actions services also have some limitations: for example, we can't explicitly define the command to be executed inside the container, which is annoying when the Dockerfile doesn't set a CMD. Finally, this relies on the CI tool, which is GitHub Actions in our example, but not all companies use it…


TestContainers

As we have seen, GitHub Actions services are a good solution to our problem, but the main issue is that they rely on the fact that you are using GitHub, which may not be the case. The good news is that if your test environment supports Docker, you can get the same kind of benefits as GitHub Actions services directly inside your code. At least this is the promise of TestContainers, a solution that helps us developers set up and write integration tests. Their website describes TestContainers as:

Testcontainers is a Java library that supports JUnit tests, providing lightweight, throwaway instances of common databases, Selenium web browsers, or anything else that can run in a Docker container.

The other good news is that a version exists for NodeJS. Its main advantage is that it is CI-agnostic, as long as you have access to the Docker daemon.

From a code standpoint it looks a bit like this:

describe('testContainer', () => {
  let mongoContainer: StartedTestContainer;
  let mongoClient;

  let minioContainer: StartedTestContainer;
  let minioClient;

  beforeAll(async () => {
    mongoContainer = await startMongoContainer();
    mongoClient = await getMongoClient({ container: mongoContainer });

    minioContainer = await startMinioContainer();
    minioClient = await getMinioClient({ container: minioContainer });
  });

  afterAll(async () => {
    await mongoContainer.stop();
    await minioContainer.stop();
  });
});

Basically, you create containers from your test code to get access to your Mongo database and your MinIO instance. Now let's see in detail how the containers are created inside the startMongoContainer and startMinioContainer functions.

async function startMongoContainer() {
  const container = await new GenericContainer('mongo:4.2.6')
    .withExposedPorts({ container: 27017, host: 27018 })
    .start();

  return container;
}

async function startMinioContainer() {
  const container = await new GenericContainer('minio/minio:latest')
    .withEnv('MINIO_ROOT_USER', 'indy')
    .withEnv('MINIO_ACCESS_KEY', 'indy')
    .withExposedPorts({ container: 9000, host: 9999 })
    .withCmd(['server', '/data'])
    .start();

  return container;
}

As you can see, it's a simple way to create containers and configure them directly in code. We can easily specify the Docker image to use, provide environment variables, expose ports and execute a specific command to launch the container. We then get a container instance that we can use to create the clients that consume the services. To see that, let's dive into the getMongoClient and getMinioClient functions.

async function getMongoClient({ container }: { container: StartedTestContainer }) {
  const mongoClient = await new MongoClient(
    `mongodb://${container.getHost()}:${container.getMappedPort(27017)}`,
  ).connect();

  return mongoClient;
}

async function getMinioClient({ container }: { container: StartedTestContainer }) {
  const minioClient = new Minio.Client({
    endPoint: container.getHost(),
    port: container.getMappedPort(9000),
    useSSL: false,
    // Credentials must match the ones configured on the container
    accessKey: 'indy',
    secretKey: 'indy',
  });

  return minioClient;
}

The container instance that we get from TestContainers is very convenient, as it exposes various methods to access the mapped port, the host, and so on, which is very useful for instantiating our clients.
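For instance, a tiny helper (hypothetical, not part of TestContainers) can turn those accessors into a connection string; in a real test you would pass `container.getHost()` and `container.getMappedPort(27017)` instead of the literals used below:

```typescript
// Hypothetical helper: builds a MongoDB URI from the host and mapped
// port that TestContainers reports for a started container.
function buildMongoUri(host: string, mappedPort: number, dbName: string): string {
  return `mongodb://${host}:${mappedPort}/${dbName}`;
}

console.log(buildMongoUri('localhost', 27018, 'myDb'));
// → mongodb://localhost:27018/myDb
```

Keeping this logic in one place means the tests never hard-code a port, so they keep working even when TestContainers maps to a random free host port.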

We can then easily run our tests using our containers and stop them at the end. We have the power of containers directly inside our tests, which can be very helpful for integration testing. Another benefit is that you run your integration tests in the same environment locally and in the CI, which is great for debugging.

I will end with two warnings. Firstly, be careful with the performance of your tests: since we are creating containers, it might take some time, especially compared to the mock solution (but still less than asking your ops team to create a DB for your tests). Secondly, TestContainers for NodeJS is fairly new compared to the original Java version, so not all functionalities may be available yet, and the overall stability of the project might be slightly below its counterpart.


What does TestContainers bring to the table?

What is the cost?

This is a project that exists for many languages (Java, JavaScript, Go, Rust,…) and seems to be used by various companies (JetBrains, Spring, Wise,…). It's quite easy to start experimenting with, so I heavily encourage you to give it a try.


Main website:

TestContainers NodeJS:

Interesting talk about integration tests and TestContainers:
