Vitest (recommended)

Pre-requisites

Make sure you are using the minimum required version of the plugin: @codspeed/vitest-plugin>=3.1.0

All the code shown on this page is available in the CodSpeedHQ/codspeed-nestjs-mongodb repository. It uses NestJS with Mongoose (MongoDB), Vitest with SWC, and pnpm.

Sample application

We are going to use a simple NestJS application exposing a REST API to manage cats.

The following Cat model is defined:

src/cats/schemas/cat.schema.ts
import { Prop, Schema, SchemaFactory } from "@nestjs/mongoose";
import { HydratedDocument } from "mongoose";

export type CatDocument = HydratedDocument<Cat>;

@Schema()
export class Cat {
  @Prop({ index: 1, required: true })
  name: string;

  @Prop({ required: true })
  age: number;

  @Prop({ required: true })
  breed: string;
}

export const CatSchema = SchemaFactory.createForClass(Cat);

The CatsController exposes the following endpoints:

src/cats/cats.controller.ts
import { Controller, Get, Param } from "@nestjs/common";
import { CatsService } from "./cats.service";
import { Cat } from "./schemas/cat.schema";

@Controller("cats")
export class CatsController {
  constructor(private readonly catsService: CatsService) {}

  @Get("name/:name")
  async findByName(@Param("name") name: string): Promise<Cat[]> {
    return this.catsService.findByName(name);
  }

  @Get("breed/:breed")
  async findByBreed(@Param("breed") breed: string): Promise<Cat[]> {
    return this.catsService.findByBreed(breed);
  }
}
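
The CatsService injected into this controller is not shown on this page. As a rough sketch (the actual src/cats/cats.service.ts in the repository may differ), it could look like this, with each endpoint delegating to a Mongoose find query:

import { Injectable } from "@nestjs/common";
import { InjectModel } from "@nestjs/mongoose";
import { Model } from "mongoose";
import { Cat } from "./schemas/cat.schema";

@Injectable()
export class CatsService {
  constructor(@InjectModel(Cat.name) private readonly catModel: Model<Cat>) {}

  // find all the cats with the given name
  findByName(name: string): Promise<Cat[]> {
    return this.catModel.find({ name }).exec();
  }

  // find all the cats with the given breed
  findByBreed(breed: string): Promise<Cat[]> {
    return this.catModel.find({ breed }).exec();
  }
}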

Complete setup with Docker

Setup SWC + Vitest

To use SWC and Vitest with NestJS, follow the setup guide on the NestJS website. At the end of this setup, you should be able to run e2e tests against a running MongoDB instance with the following command:

pnpm test:e2e
# equivalent to
pnpm vitest run --config ./vitest.config.e2e.ts

Setup CodSpeed and Vitest for e2e benchmarks

Install the dependencies:

npm install --save-dev @codspeed/vitest-plugin vite-tsconfig-paths

vite-tsconfig-paths is used to resolve the paths defined in the tsconfig.json file automatically.
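
For example, a path mapping like the following (a hypothetical tsconfig.json excerpt; the repository's actual configuration may differ) is what lets non-relative imports such as import { AppModule } from "app.module" resolve from the src directory:

{
  "compilerOptions": {
    "baseUrl": "./",
    "paths": {
      "*": ["src/*"]
    }
  }
}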

Rename the file vitest.config.e2e.ts to vitest.config.e2e.mts since @codspeed/vitest-plugin is only available in ESM. Apply the following modifications to the file:

vitest.config.e2e.mts
import codspeedPlugin from "@codspeed/vitest-plugin";
import swc from "unplugin-swc";
import tsconfigPaths from "vite-tsconfig-paths";
import { defineConfig } from "vitest/config";

export default defineConfig({
  plugins: [swc.vite(), tsconfigPaths(), codspeedPlugin()],
  test: {
    root: "./",
    passWithNoTests: true,
    include: ["**/*.e2e.spec.ts"],
    benchmark: { include: ["**/*.e2e.bench.ts"] },
    // ensure we run only one test at a time since they all use the same database
    // this could be removed by using a different database for each test
    poolOptions: { forks: { singleFork: true } },
  },
});

Create benchmarks

Similar to how we would create an e2e test in NestJS in a *.e2e.spec.ts file, we can create a benchmark in a *.e2e.bench.ts file.

src/cats/cats.controller.e2e.bench.ts
import { faker } from "@faker-js/faker";
import { INestApplication } from "@nestjs/common";
import { getModelToken } from "@nestjs/mongoose";
import { Test } from "@nestjs/testing";
import { AppModule } from "app.module";
import { Model } from "mongoose";
import request from "supertest";
import {
  afterAll,
  beforeAll,
  beforeEach,
  bench,
  describe,
  expect,
} from "vitest";
import { CatsFactory } from "./cats.factory";
import { Cat } from "./schemas/cat.schema";

faker.seed(1); // enforce the same seed, to remove randomness from generated data

const cats: Cat[] = Array.from({ length: 100 }, () => ({
  name: ["river", "felix", "toto", "marcel"][faker.number.int(3)],
  age: faker.number.int(20),
  breed: ["chausie", "toyger", "abyssinian", "birman"][faker.number.int(3)],
}));

describe("Cats (bench)", () => {
  let app: INestApplication;
  let catsModel: Model<Cat>;
  let catsFactory: CatsFactory;

  // initialize the application before the benchmarks
  beforeAll(async () => {
    const moduleRef = await Test.createTestingModule({
      imports: [AppModule],
    }).compile();

    app = moduleRef.createNestApplication();
    catsModel = moduleRef.get(getModelToken(Cat.name));
    catsFactory = new CatsFactory(catsModel);
    await app.init();
  });

  // reset the database before each benchmark
  beforeEach(async () => {
    await catsModel.deleteMany();
    await catsFactory.createMany(cats);
  });

  afterAll(async () => {
    await app.close();
  });

  bench("GET /cats/name/:name", async () => {
    const response = await request(app.getHttpServer()).get("/cats/name/river");

    // the response should contain 29 cats with the name "river"
    expect(response.body).toHaveLength(29);
    expect(response.body[0]).toEqual(
      expect.objectContaining({
        _id: expect.any(String),
        age: expect.any(Number),
        breed: expect.any(String),
        name: "river",
      })
    );
  });

  bench("GET /cats/breed/:breed", async () => {
    const response = await request(app.getHttpServer()).get(
      "/cats/breed/chausie"
    );

    // the response should contain 27 cats with the breed "chausie"
    expect(response.body).toHaveLength(27);
  });
});

Here we have defined two benchmarks for the cats endpoints:

  • GET /cats/name/:name: retrieve all the cats with the given name
  • GET /cats/breed/:breed: retrieve all the cats with the given breed
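
The CatsFactory used in the beforeEach hook above is a small seeding helper that is not shown on this page. A minimal sketch, assuming it simply wraps Model.insertMany, could be:

import { Model } from "mongoose";
import { Cat } from "./schemas/cat.schema";

export class CatsFactory {
  constructor(private readonly catModel: Model<Cat>) {}

  // insert all the generated cats in a single bulk operation
  async createMany(cats: Cat[]): Promise<void> {
    await this.catModel.insertMany(cats);
  }
}
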
tip

Note the usage of expect in the benchmarks.

const response = await request(app.getHttpServer()).get("/cats/name/river");
expect(response.body).toHaveLength(29);
expect(response.body[0]).toEqual(
  expect.objectContaining({
    _id: expect.any(String),
    age: expect.any(Number),
    breed: expect.any(String),
    name: "river",
  })
);

This makes authoring benchmarks easier, since the assertions confirm that the request actually behaved as expected. They are optional, and you can remove them if you prefer:

await request(app.getHttpServer()).get("/cats/name/river");

Setup Docker locally

Add the following file to the root of the project:

docker-compose.yml
version: "3"

services:
  mongodb:
    image: mongo:latest
    environment:
      - MONGODB_DATABASE="test"
    ports:
      - 27017:27017
    volumes:
      - mongo:/data/db

volumes:
  mongo:

Run the following command to start the MongoDB instance:

docker-compose up -d

Run the benchmarks locally

Add the following script to your package.json:

package.json
{
  "scripts": {
    "bench:e2e": "vitest -c vitest.config.e2e.mts bench"
  }
}

Run the following command to run the benchmarks:

pnpm bench:e2e

You should see an output similar to the following:

Benchmarking is an experimental feature.
Breaking changes might not follow SemVer, please pin Vitest's version when using it.
[CodSpeed] bench detected but no instrumentation found

DEV v1.2.0 /Users/user/projects/CodSpeedHQ/codspeed-nestjs-mongodb

[CodSpeed] @codspeed/vitest-plugin v3.1.0 - setup
[CodSpeed] running suite src/cats/cats.controller.e2e.bench.ts
[CodSpeed] src/cats/cats.controller.e2e.bench.ts::Cats (bench)::GET /cats/name/:name done
[CodSpeed] src/cats/cats.controller.e2e.bench.ts::Cats (bench)::GET /cats/breed/:breed done
[CodSpeed] running suite src/cats/cats.controller.e2e.bench.ts done

✓ src/cats/cats.controller.e2e.bench.ts (2) 698ms
· Cats (bench) (2)

Run the benchmarks in the CI

Add the following file to the project:

.github/workflows/codspeed.yml
name: CodSpeed

on:
  # Run on pushes to the main branch
  push:
    branches:
      - "main"
  # Run on pull requests
  pull_request:
  workflow_dispatch:

jobs:
  benchmarks:
    name: Run benchmarks
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pnpm/action-setup@v2
      - uses: actions/setup-node@v3
        with:
          cache: pnpm
          node-version-file: .nvmrc

      # easily set up a MongoDB cluster
      - uses: art049/mongodb-cluster-action@v0
        id: mongodb-cluster-action

      - name: Install dependencies
        run: pnpm install

      - name: Run benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          token: ${{ secrets.CODSPEED_TOKEN }}
          instruments: mongodb
          mongo-uri-env-name: MONGO_URL
          run: |
            pnpm bench:e2e
        env:
          # we need MONGO_URL to be set in the environment before actually running
          # the benchmark command, so we set it here instead of inside the `run` command
          MONGO_URL:
            ${{ steps.mongodb-cluster-action.outputs.connection-string }}

With this configuration, the CodSpeed MongoDB instrument will be activated and data from MongoDB queries will be sent to CodSpeed.

Setup using testcontainers

Instead of relying on a MongoDB instance started externally with Docker, we can leverage testcontainers to start a MongoDB instance dynamically during the benchmarks.

For this setup, we assume the application is in a state similar to the one described in the previous section.

Setup Vitest + testcontainers

Install the testcontainers dependencies:

npm install --save-dev @testcontainers/mongodb

Create a new file src/global.d.ts with the following content:

src/global.d.ts
declare var __MONGO_URI__: string;
info

This will make the globalThis.__MONGO_URI__ variable available in the whole application with the correct type.

⚠️ Make sure to use var and not let or const, otherwise the property will not be added to the type of globalThis.

Create a new file src/testUtils/setup-vitest.ts with the following content:

src/testUtils/setup-vitest.ts
import { setupInstruments } from "@codspeed/vitest-plugin";
import {
  MongoDBContainer,
  StartedMongoDBContainer,
} from "@testcontainers/mongodb";
import { beforeAll } from "vitest";

let mongodbContainer: StartedMongoDBContainer;

async function setupMongoDB() {
  // if the database is already set up, we can skip this step
  if (globalThis.__MONGO_URI__) return;

  mongodbContainer = await new MongoDBContainer("mongo:7.0.5").start();
  const mongoUrl =
    mongodbContainer.getConnectionString() +
    "/test?replicaSet=rs0&directConnection=true";

  const { remoteAddr } = await setupInstruments({ mongoUrl });

  globalThis.__MONGO_URI__ = remoteAddr;
}

async function setup() {
  await setupMongoDB();
}

beforeAll(async () => {
  await setup();
});
testcontainers on macOS

On macOS, we recommend using colima to run Docker containers. However, there are known issues when using testcontainers with colima. To work around them, some environment variables need to be set when running the tests, as described below.

Enforce that the correct environment variables are set

To make testcontainers work on macOS with colima, the following environment variables need to be set:

TESTCONTAINERS_DOCKER_SOCKET_OVERRIDE=/var/run/docker.sock NODE_OPTIONS="$NODE_OPTIONS --dns-result-order=ipv4first" <command>
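
To avoid typing this prefix every time, it can be wrapped in a dedicated package.json script (the bench:e2e:mac name is just an illustration, not part of the repository):

{
  "scripts": {
    "bench:e2e:mac": "TESTCONTAINERS_DOCKER_SOCKET_OVERRIDE=/var/run/docker.sock NODE_OPTIONS=\"$NODE_OPTIONS --dns-result-order=ipv4first\" pnpm bench:e2e"
  }
}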

We will add a function to enforce that they are set when running vitest. Add the following function to your src/testUtils/setup-vitest.ts file:

src/testUtils/setup-vitest.ts
function checkColimaTestcontainersDarwin() {
  if (
    process.platform === "darwin" &&
    (process.env.TESTCONTAINERS_DOCKER_SOCKET_OVERRIDE === undefined ||
      !process.env.NODE_OPTIONS?.includes("--dns-result-order=ipv4first"))
  ) {
    throw new Error(
      'On macOS, run with the following command to make testcontainers + colima work: `TESTCONTAINERS_DOCKER_SOCKET_OVERRIDE=/var/run/docker.sock NODE_OPTIONS="$NODE_OPTIONS --dns-result-order=ipv4first" <command>`'
    );
  }
}

And call it at the top of the setupMongoDB function:

src/testUtils/setup-vitest.ts
async function setupMongoDB() {
  checkColimaTestcontainersDarwin();

  ...
}

Now the execution will stop with an explicit error message if the environment variables are not set when running on macOS.

Add the file as a setupFiles entry in vitest.config.e2e.mts:

vitest.config.e2e.mts
import codspeedPlugin from "@codspeed/vitest-plugin";
import swc from "unplugin-swc";
import tsconfigPaths from "vite-tsconfig-paths";
import { defineConfig } from "vitest/config";

export default defineConfig({
  plugins: [swc.vite(), tsconfigPaths(), codspeedPlugin()],
  test: {
    root: "./",
    passWithNoTests: true,
    include: ["**/*.e2e.spec.ts"],
    benchmark: { include: ["**/*.e2e.bench.ts"] },
    // ensure we run only one test at a time since they all use the same database
    // this could be removed by using a different database for each test
    poolOptions: { forks: { singleFork: true } },
    setupFiles: ["./src/testUtils/setup-vitest.ts"],
  },
});

We can now change the src/app.module.ts file to use the globalThis.__MONGO_URI__ variable when it is defined, falling back to the MONGO_URL environment variable otherwise:

src/app.module.ts
MongooseModule.forRootAsync({
  useFactory: async () => ({
    uri: globalThis.__MONGO_URI__ ?? process.env.MONGO_URL,
  }),
}),
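
For context, the surrounding AppModule might look roughly like this (a sketch only; the actual src/app.module.ts also wires up the cats feature and may differ):

import { Module } from "@nestjs/common";
import { MongooseModule } from "@nestjs/mongoose";
// assumed module path, matching the structure shown earlier on this page
import { CatsModule } from "./cats/cats.module";

@Module({
  imports: [
    MongooseModule.forRootAsync({
      useFactory: async () => ({
        // prefer the URI injected by the test setup, fall back to the environment
        uri: globalThis.__MONGO_URI__ ?? process.env.MONGO_URL,
      }),
    }),
    CatsModule,
  ],
})
export class AppModule {}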

Run the benchmarks locally

You can now run the benchmarks locally without having to start a MongoDB instance:

pnpm bench:e2e

Run the benchmarks in the CI

You can now simplify the codspeed.yml file to the following:

  • Remove the mongodb-cluster-action step
  • Remove the mongo-uri-env-name input
  • Remove the MONGO_URL environment variable
.github/workflows/codspeed.yml
name: CodSpeed

on:
  # Run on pushes to the main branch
  push:
    branches:
      - "main"
  # Run on pull requests
  pull_request:
  workflow_dispatch:

jobs:
  benchmarks:
    name: Run benchmarks
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: pnpm/action-setup@v2
      - uses: actions/setup-node@v3
        with:
          cache: pnpm
          node-version-file: .nvmrc

      - name: Install dependencies
        run: pnpm install

      - name: Run benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          token: ${{ secrets.CODSPEED_TOKEN }}
          instruments: mongodb
          run: |
            pnpm bench:e2e