
# Benchmark

Measures the latency added by `SafeResponseInterceptor` and `SafeExceptionFilter` compared to raw NestJS responses.

## What We Measure

| Benchmark | Description |
| --- | --- |
| Success (200) | Raw response vs wrapped `{ success, statusCode, data }` |
| Error (404) | Raw NestJS exception vs wrapped error response |
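For reference, the success envelope being measured can be sketched in plain TypeScript. Only the `success`, `statusCode`, and `data` field names come from this page; the error shape and the `wrapSuccess` helper are illustrative assumptions, not the library's documented API:

```typescript
// Hypothetical sketch of the envelopes compared in the benchmark.
interface SafeSuccess<T> {
  success: true;
  statusCode: number;
  data: T;
}

// Assumed error shape, for illustration only.
interface SafeError {
  success: false;
  statusCode: number;
  message: string;
}

// Wrap a raw payload the way an interceptor might before serialization.
function wrapSuccess<T>(data: T, statusCode = 200): SafeSuccess<T> {
  return { success: true, statusCode, data };
}

const wrapped = wrapSuccess({ id: 1 });
console.log(JSON.stringify(wrapped));
// → {"success":true,"statusCode":200,"data":{"id":1}}
```

The benchmark compares returning the raw payload directly against producing this wrapped object on every response.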

## Test Setup

- Framework: NestJS with `@nestjs/testing` + `supertest`
- No database — pure HTTP interceptor overhead
- Warmup: 50 iterations (discarded)
- Measured: 500 iterations per benchmark
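The warmup-then-measure split above can be sketched as a plain Node loop. The `run` callback stands in for one HTTP request through the NestJS pipeline; the names here are illustrative, not the actual benchmark script's API:

```typescript
// Sketch of a warmup/measure loop; WARMUP and ITERATIONS match the setup above.
const WARMUP = 50;
const ITERATIONS = 500;

async function bench(run: () => Promise<void>): Promise<number[]> {
  // Warmup iterations are executed but their timings are discarded.
  for (let i = 0; i < WARMUP; i++) await run();

  const samplesMs: number[] = [];
  for (let i = 0; i < ITERATIONS; i++) {
    const start = process.hrtime.bigint();
    await run();
    const end = process.hrtime.bigint();
    samplesMs.push(Number(end - start) / 1e6); // ns → ms
  }
  return samplesMs;
}
```

In the real script, `run` would issue a supertest request such as `request(app.getHttpServer()).get('/').expect(200)`, so every sample includes the full NestJS pipeline, not just the interceptor.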

## Running Locally

```bash
npx ts-node benchmarks/response-overhead.ts
```

## Results

Measured on Apple M-series. Your results will vary.

### Success Response

| Variant | Avg | P50 | P95 | P99 |
| --- | --- | --- | --- | --- |
| Raw NestJS | 0.61ms | 0.58ms | 0.80ms | 1.21ms |
| With safe-response | 0.44ms | 0.40ms | 0.65ms | 0.96ms |

**Overhead: -0.17ms (-28%)** — faster with safe-response

### Error Response (404)

| Variant | Avg | P50 | P95 | P99 |
| --- | --- | --- | --- | --- |
| Raw NestJS | 0.39ms | 0.37ms | 0.52ms | 0.58ms |
| With safe-response | 0.52ms | 0.44ms | 0.87ms | 1.36ms |

**Overhead: +0.13ms (+33%)**

## Interpretation

In this run, the success path is actually faster with safe-response. The interceptor pipeline introduces a small amount of structural overhead, but the response wrapping itself is lightweight — the net effect can be neutral or even beneficial depending on the NestJS pipeline configuration.

The error path adds roughly 0.13ms because `SafeExceptionFilter` catches the exception, serializes it into the standard envelope, and computes additional metadata. At sub-millisecond scale, this is noise for any real workload.

Even at 1,000 requests/second, the per-request overhead remains negligible — well within measurement variance.

## Methodology

- `process.hrtime.bigint()` for nanosecond-precision timing
- Each iteration makes a full HTTP request through the NestJS pipeline via `supertest`
- Two separate NestJS applications are created: one raw, one with `SafeResponseModule`
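The P50/P95/P99 columns in the results tables can be derived from the collected samples. A simple nearest-rank percentile is sketched below; the actual benchmark script may use a different convention (e.g. interpolation):

```typescript
// Nearest-rank percentile over a list of latency samples (in ms).
function percentile(samples: number[], p: number): number {
  const sorted = [...samples].sort((a, b) => a - b);
  const rank = Math.ceil((p / 100) * sorted.length);
  return sorted[Math.max(0, rank - 1)];
}

const samples = [0.40, 0.42, 0.44, 0.58, 0.96];
console.log(percentile(samples, 50)); // → 0.44 (3rd of 5 sorted samples)
console.log(percentile(samples, 99)); // → 0.96 (worst observed sample)
```

With only 500 samples, P99 rests on the slowest handful of requests, which is why tail columns are noisier than the average.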

Released under the MIT License.