
tinybench

tip

If using Vitest is possible in your project, we highly recommend using it instead of tinybench.

Click here to see how to use the MongoDB instrument with Vitest.

Pre-requisites

Make sure you are using the minimum required version of the plugin: @codspeed/tinybench-plugin>=3.0.0

All the code shown on this page is available in the CodSpeedHQ/codspeed-nestjs-mongodb repository.

Sample application

We are going to use a simple NestJS application exposing a REST API to manage cats.

The following Cat model is defined:

src/cats/schemas/cat.schema.ts
import { Prop, Schema, SchemaFactory } from "@nestjs/mongoose";
import { HydratedDocument } from "mongoose";

export type CatDocument = HydratedDocument<Cat>;

@Schema()
export class Cat {
  @Prop({ index: 1, required: true })
  name: string;

  @Prop({ required: true })
  age: number;

  @Prop({ required: true })
  breed: string;
}

export const CatSchema = SchemaFactory.createForClass(Cat);

The CatsController exposes the following endpoints:

src/cats/cats.controller.ts
import { Controller, Get, Param } from "@nestjs/common";
import { CatsService } from "./cats.service";
import { CreateCatDto } from "./dto/create-cat.dto";
import { Cat } from "./schemas/cat.schema";

@Controller("cats")
export class CatsController {
  constructor(private readonly catsService: CatsService) {}

  @Get("name/:name")
  async findByName(@Param("name") name: string): Promise<Cat[]> {
    return this.catsService.findByName(name);
  }

  @Get("breed/:breed")
  async findByBreed(@Param("breed") breed: string): Promise<Cat[]> {
    return this.catsService.findByBreed(breed);
  }
}
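
The controller delegates to a CatsService that performs the Mongoose queries. The service is not shown on this page; here is a minimal sketch of what it can look like (treat it as an illustration, the exact implementation lives in the sample repository):

src/cats/cats.service.ts
import { Injectable } from "@nestjs/common";
import { InjectModel } from "@nestjs/mongoose";
import { Model } from "mongoose";
import { Cat } from "./schemas/cat.schema";

@Injectable()
export class CatsService {
  constructor(@InjectModel(Cat.name) private readonly catModel: Model<Cat>) {}

  // query cats by their (indexed) name
  findByName(name: string): Promise<Cat[]> {
    return this.catModel.find({ name }).exec();
  }

  // query cats by breed (not indexed, so this scans the collection)
  findByBreed(breed: string): Promise<Cat[]> {
    return this.catModel.find({ breed }).exec();
  }
}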

Complete setup with Docker

Setup dependencies

Install the @codspeed/tinybench-plugin:

npm install --save-dev @codspeed/tinybench-plugin

Create benchmarks

Let's create a script that defines benchmarks on the cats endpoints of the application.

src/cats/cats.controller.e2e.bench.ts
import { faker } from "@faker-js/faker";
import { INestApplication } from "@nestjs/common";
import { getModelToken } from "@nestjs/mongoose";
import { Test } from "@nestjs/testing";
import { AppModule } from "app.module";
import { Model } from "mongoose";
import request from "supertest";
import { Bench } from "tinybench";
import { CatsFactory } from "./cats.factory";
import { Cat } from "./schemas/cat.schema";

faker.seed(1); // enforce the same seed, to remove randomness from generated data

const cats: Cat[] = Array.from({ length: 100 }, () => ({
  name: ["river", "felix", "toto", "marcel"][faker.number.int(3)],
  age: faker.number.int(20),
  breed: ["chausie", "toyger", "abyssinian", "birman"][faker.number.int(3)],
}));

export function registerCatControllerBenches(bench: Bench) {
  let app: INestApplication;
  let catsModel: Model<Cat>;
  let catsFactory: CatsFactory;

  // initialize the application before the benchmark
  async function beforeAll() {
    const moduleRef = await Test.createTestingModule({
      imports: [AppModule],
    }).compile();
    app = moduleRef.createNestApplication();
    catsModel = moduleRef.get(getModelToken(Cat.name));
    catsFactory = new CatsFactory(catsModel);
    await app.init();
    await catsFactory.createMany(cats);
  }
  // clean up the application after the benchmark
  async function afterAll() {
    await catsModel.deleteMany();
    await app.close();
  }

  bench.add(
    "GET /cats/name/:name",
    async () => {
      await request(app.getHttpServer()).get("/cats/name/river");
    },
    { beforeAll, afterAll }
  );

  bench.add(
    "GET /cats/breed/:breed",
    async () => {
      await request(app.getHttpServer()).get("/cats/breed/chausie");
    },
    { beforeAll, afterAll }
  );
}

In total, the benchmark file defines 4 benchmarks for the cats endpoints (only the first two are shown in the snippet above):

  • GET /cats: retrieve all the cats
  • GET /cats/name/:name: retrieve all the cats with the given name
  • GET /cats/breed/:breed: retrieve all the cats with the given breed
  • GET /cats/age/greater/:age: retrieve all the cats with an age greater than the given age
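
The benchmark setup relies on a small CatsFactory helper to seed the database with the generated cats before the first iteration. It is not shown above; a minimal sketch, assuming it simply inserts the documents through the injected Mongoose model, could look like this:

src/cats/cats.factory.ts
import { Model } from "mongoose";
import { Cat } from "./schemas/cat.schema";

export class CatsFactory {
  constructor(private readonly catModel: Model<Cat>) {}

  // bulk-insert the generated cats so the endpoints have data to query
  async createMany(cats: Cat[]): Promise<void> {
    await this.catModel.insertMany(cats);
  }
}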

Finally, we register the benchmarks in the src/bench.e2e.ts file:

src/bench.e2e.ts
import { Bench } from "tinybench";
import { withCodSpeed } from "@codspeed/tinybench-plugin";
import { registerCatControllerBenches } from "cats/cats.controller.e2e.bench";

const bench = withCodSpeed(new Bench());

(async () => {
  registerCatControllerBenches(bench);

  await bench.run();
  console.table(bench.table());
})();

Setup Docker locally

Add the following file to the root of the project:

docker-compose.yml
version: "3"

services:
  mongodb:
    image: mongo:latest
    environment:
      - MONGODB_DATABASE="test"
    ports:
      - 27017:27017
    volumes:
      - mongo:/data/db

volumes:
  mongo:

Run the following command to start the MongoDB instance:

docker-compose up -d

Run the benchmarks locally

To use tinybench, we recommend using ts-node with swc:

npm install --save-dev @swc/core @swc/helpers ts-node

To enforce using swc when running ts-node, add the following to your tsconfig.json:

tsconfig.json
{
"ts-node": {
"swc": true
}
}

Add the following script to your package.json:

package.json
{
"scripts": {
"bench:e2e": "NODE_ENV=test ts-node --swc -r tsconfig-paths/register src/bench.e2e.ts"
}
}
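
Note that the bench:e2e script does not set a MongoDB connection string itself. If your application reads it from an environment variable (the CI setup below uses MONGO_URL for this purpose), export it so it points at the local container first, for example:

# assumption: the application reads its connection string from MONGO_URL
export MONGO_URL="mongodb://localhost:27017/test"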

Run the following command to run the benchmarks:

pnpm bench:e2e

You should see the following output:

[CodSpeed] 4 benches detected but no instrumentation found, falling back to tinybench
┌─────────┬──────────────────────────────┬─────────┬────────────────────┬──────────┬─────────┐
│ (index) │ Task Name                    │ ops/sec │ Average Time (ns)  │ Margin   │ Samples │
├─────────┼──────────────────────────────┼─────────┼────────────────────┼──────────┼─────────┤
│ 0       │ 'GET /cats/name/:name'       │ '260'   │ 3840144.435868008  │ '±9.38%' │ 131     │
│ 1       │ 'GET /cats'                  │ '348'   │ 2870076.392037528  │ '±4.93%' │ 175     │
│ 2       │ 'GET /cats/breed/:breed'     │ '489'   │ 2043167.1677803504 │ '±3.39%' │ 245     │
│ 3       │ 'GET /cats/age/greater/:age' │ '431'   │ 2318777.595405225  │ '±3.97%' │ 216     │
└─────────┴──────────────────────────────┴─────────┴────────────────────┴──────────┴─────────┘

Run the benchmarks in the CI

Add the following file to the project:

.github/workflows/codspeed.yml
name: codspeed-benchmarks

on:
  # Run on pushes to the main branch
  push:
    branches:
      - "main"
  # Run on pull requests
  pull_request:
  workflow_dispatch:

jobs:
  benchmarks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: pnpm/action-setup@v2
      - uses: actions/setup-node@v3
        with:
          cache: pnpm
          node-version-file: .nvmrc

      # easily setup a MongoDB cluster
      - uses: art049/mongodb-cluster-action@v0
        id: mongodb-cluster-action

      - name: Install dependencies
        run: pnpm install

      - name: Run benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          token: ${{ secrets.CODSPEED_TOKEN }}
          instruments: mongodb
          mongo-uri-env-name: MONGO_URL
          run: |
            pnpm bench:e2e
        env:
          # we need the MONGO_URL to be set in the environment before actually running
          # the benchmark command so we set it here instead of inside the `run` command
          MONGO_URL: ${{ steps.mongodb-cluster-action.outputs.connection-string }}

With this configuration, the CodSpeed MongoDB instrument will be activated and data from MongoDB queries will be sent to CodSpeed.
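
For the instrument to capture the queries, the application itself has to connect to MongoDB through the connection string exposed in the MONGO_URL environment variable (the name configured via mongo-uri-env-name above). As a rough sketch, and assuming a conventional AppModule layout with a CatsModule (adapt the names and default URI to your project), the wiring can look like this:

src/app.module.ts
import { Module } from "@nestjs/common";
import { MongooseModule } from "@nestjs/mongoose";
import { CatsModule } from "./cats/cats.module";

@Module({
  imports: [
    // connect with the MONGO_URL provided by the environment,
    // falling back to the local docker-compose instance
    MongooseModule.forRoot(
      process.env.MONGO_URL ?? "mongodb://localhost:27017/test"
    ),
    CatsModule,
  ],
})
export class AppModule {}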

Setup using testcontainers

Instead of relying on an externally managed MongoDB instance, we can leverage testcontainers to start a MongoDB container dynamically during the benchmarks.

For this setup, we assume that the state of the application is similar to the one described in the above section.

Setup tinybench + testcontainers

Install the testcontainers dependencies:

npm install --save-dev @testcontainers/mongodb

Change the src/bench.e2e.ts file to the following:

src/bench.e2e.ts
import { setupInstruments, withCodSpeed } from "@codspeed/tinybench-plugin";
import { MongoDBContainer } from "@testcontainers/mongodb";
import { registerCatControllerBenches } from "cats/cats.controller.e2e.bench";
import { Bench } from "tinybench";

async function setupDatabase() {
  const mongodbContainer = await new MongoDBContainer("mongo:7.0.5").start();
  const mongoUrl =
    mongodbContainer.getConnectionString() +
    "/test?replicaSet=rs0&directConnection=true";

  const { remoteAddr } = await setupInstruments({ mongoUrl });
  process.env.MONGO_URL = remoteAddr;
}

const bench = withCodSpeed(new Bench());

(async () => {
  await setupDatabase();

  registerCatControllerBenches(bench);

  await bench.run();
  console.table(bench.table());
})();
testcontainers on macOS

On macOS, we recommend using colima to run Docker containers. However, there are known issues when using testcontainers with colima. To bypass them, the following environment variables need to be set when running the benchmarks:

TESTCONTAINERS_DOCKER_SOCKET_OVERRIDE=/var/run/docker.sock NODE_OPTIONS="$NODE_OPTIONS --dns-result-order=ipv4first" <command>

Enforce a check that the correct environment variables are set

We will add a function to enforce that they are set when running tinybench. Add the following function to your src/bench.e2e.ts file:

src/bench.e2e.ts
function checkColimaTestcontainersDarwin() {
  if (
    process.platform === "darwin" &&
    (process.env.TESTCONTAINERS_DOCKER_SOCKET_OVERRIDE === undefined ||
      !process.env.NODE_OPTIONS?.includes("--dns-result-order=ipv4first"))
  ) {
    throw new Error(
      'On macOS, run with the following command to make testcontainers + colima work: `TESTCONTAINERS_DOCKER_SOCKET_OVERRIDE=/var/run/docker.sock NODE_OPTIONS="$NODE_OPTIONS --dns-result-order=ipv4first" <command>`'
    );
  }
}

And use it at the top of the setupDatabase function:

src/bench.e2e.ts
async function setupDatabase() {
  checkColimaTestcontainersDarwin();

  // ... the MongoDB container setup shown above stays unchanged
}

Now the execution will stop with an explicit error message if the environment variables are not set when running on macOS.

Run the benchmarks locally

You can now run the benchmarks locally without having to start a MongoDB instance:

pnpm bench:e2e

Run the benchmarks in the CI

You can now simplify the codspeed.yml file with the following changes:

  • Remove the mongodb-cluster-action step
  • Remove the mongo-uri-env-name input
  • Remove the MONGO_URI environment variable
.github/workflows/codspeed.yml
name: codspeed-benchmarks

on:
  # Run on pushes to the main branch
  push:
    branches:
      - "main"
  # Run on pull requests
  pull_request:
  workflow_dispatch:

jobs:
  benchmarks:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v3
      - uses: pnpm/action-setup@v2
      - uses: actions/setup-node@v3
        with:
          cache: pnpm
          node-version-file: .nvmrc

      - name: Install dependencies
        run: pnpm install

      - name: Run benchmarks
        uses: CodSpeedHQ/action@v3
        with:
          token: ${{ secrets.CODSPEED_TOKEN }}
          instruments: mongodb
          run: |
            pnpm bench:e2e