Nearly 15 years ago, Ryan Tomayko wrote The Thing About Git. It was a time when SVN was still heavily used. Few people understood why Git was so special, and at the time I wasn't among them. Ryan's article captured the essence of Git and convinced me to make the switch.

Multiple pieces have been written on the whys and hows of adopting Fastify, but it's 2022 and Express — the most traditional web server framework for Node.js — still has roughly 49 times as many npm weekly downloads as Fastify:

npm weekly downloads
24,099,092 Express
486,761 Fastify

That isn't so surprising: Express is well written and reliable, so much so that it went through many years without any significant updates. It's as good as it can get.

But I think these download stats are probably a good indication that a lot of people don't see the full picture when it comes to Fastify.

On the surface, there aren't many practical differences between setting up web servers in Express and Fastify — they both let you register handlers for specific URLs, and they both let you chain middleware functions into the request handling pipeline.

But there's a whole lot more Fastify can do — things like logging, validation and serialization, all offered out of the box. The articles I linked above are good starting points for learning about them, and they also cover some of the various smart things Fastify does when it comes to performance.

I won't go over all of them again. I'll focus instead on the things that are the most significant to me, that really make a difference, followed by some closing thoughts.

The granularity of hooks

In Express applications, if you want to have any kind of granular control over what gets sent over the wire, you need to overwrite the current response write() and end() methods. There's probably a really hacky way to override them at the global level, but that would generate other kinds of maintainability problems. Here's an example I found of how to log the response body for an Express request:

function logResponseBody (req, res, next) {
  const oldWrite = res.write
  const oldEnd = res.end
  const chunks = []
  res.write = (chunk, ...args) => {
    chunks.push(chunk)
    return oldWrite.call(res, chunk, ...args)
  }
  res.end = (chunk, ...args) => {
    if (chunk) {
      chunks.push(chunk)
    }
    const body = Buffer.concat(chunks).toString('utf8')
    console.log(req.path, body)
    return oldEnd.call(res, chunk, ...args)
  }
  next()
}

There is a fundamental problem with this pattern: overwriting functions is expensive — there's a lot of work Node.js has to do under the hood to introduce a new function and perform garbage collection. Node.js applications tend to require a lot of CPU and memory — it's generally not so cheap to scale them for massive workloads. Anything you can do to run JavaScript more effectively can have a significant compounding effect on your application's performance and scalability.

Fastify has a hooks system that lets you intercept multiple stages of the request-response cycle. Note that this type of request and response logging is handled by Fastify already — it uses Pino internally and has log serialization functions that can be configured to capture requests and responses however you like.

Pino, by the way, is extremely fast — built to minimize the overhead of logging.

See The Cost of Logging by Fastify's co-creator Matteo Collina.

But if it didn't have these logging facilities built-in, we could very easily use the onSend hook to accomplish the same, that is, capture the response body:

fastify.addHook('onSend', (req, reply, payload, done) => {
  console.log(req.url, payload)
  done(null, payload)
})

Want to run something as soon as a request arrives? Or just before it parses the request body? Maybe before it gets validated? You get the picture.

Safe extension of requests and responses

Another key feature of Fastify is the ability to safely extend the request and response objects — and by safely I mean in a way that doesn't hurt performance.

If we couldn't use something like the onSend hook, we could just create a separate logSend() method that is safely attached to the prototype linked to the response object. That means Node.js will only spend time creating that function once, at boot time, as opposed to dynamically, on every single request. There's also the fact that changing the shape of an object on the fly will hurt performance.

So with Fastify's decoration API, we could do:

fastify.decorateReply('logSend', function (body) {
  // `this` is the Reply instance, which holds a reference to the request
  console.log(this.request.url, body)
  this.send(body)
})

And after that, reply.logSend() becomes available in your handlers. Again, Fastify offers logging out of the box and you wouldn't ever need to do anything like that, but it serves to illustrate how Fastify lets you extend core object classes.

The plugin encapsulation system

An underrated aspect of Fastify is its ability to provide encapsulation. Plugins can be set to run at the global context, or create a separate child context of their own. You can have an infinite tree of plugin ancestors and descendants.

When you create a separate child context, any hooks and decorators defined within that context are bound to that context only.

fastify.register(function privateContext (fastify, _, done) {
  fastify.addHook('onRequest', authorize)
  fastify.register(privateRoutes)
  done()
})
fastify.register(function publicContext (fastify, _, done) {
  fastify.register(publicRoutes)
  done()
})

This implicitly improves the safety of your code — if you try to access a method made available by a plugin in a different context, you'll get an error because it won't be available. This clean separation of concerns makes your code easier to structure, debug and keep free of errors.

What about worker environments?

One of the most exciting developments in JavaScript in recent times was the introduction of server-side worker environments, brought to popularity by Cloudflare Workers, Deno Deploy and more recently, Netlify Edge Functions.

It needs to be pointed out that Fastify is a web framework for Node.js — it relies heavily on Node.js-centric APIs. Both Node.js and server-side worker environments are based on the V8 engine — so they have a lot in common — but Node.js has extensions and abstractions of its own to deliver the best performance possible running JavaScript web servers, including for instance worker threads.

The server-side worker platforms are instead all based on the Service Workers standard, which originally governed service workers running in the browser. The idea of adopting the Service Workers API to write web servers took off rather unexpectedly, at least to me — I've always felt the API was somewhat counter-intuitive compared to Node.js frameworks.

addEventListener('fetch', (event) => {
  event.respondWith(handleRequest(event.request))
})

It builds upon the Fetch Standard. Once you've set up the necessary listeners and handlers, you're essentially working with both the Service Worker API and the URL, Request and Response classes as specified by the Fetch Standard.
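A sketch of what a `handleRequest()` function like the one referenced above might look like — the routes and bodies are hypothetical, but the shape is dictated by the standard: you receive a Fetch `Request` and must return a Fetch `Response`:

```javascript
// Receives a Fetch API Request, returns a Fetch API Response.
async function handleRequest (request) {
  const url = new URL(request.url)
  if (url.pathname === '/') {
    return new Response('Hello World!', {
      status: 200,
      headers: { 'content-type': 'text/plain' }
    })
  }
  return new Response('Not Found', { status: 404 })
}
```

Since Node.js 18 ships these same Fetch classes as globals, this style of handler is runnable there too — but notice there's no built-in routing, hooks or validation; everything above basic request-in, response-out is left to you.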

Nevertheless, the very fact that it's based on a standard, rigorously thought out and specified, and that standard is the same one adopted universally by browsers' service workers — makes it a compelling approach to build web servers.

That is, if you can afford to depend on any of the managed platforms that support it. At this point, even though the foundational code for Deno Deploy is available, going down the path of self-hosting a worker environment to run your code is likely to be more challenging and problematic than you'd expect.

Web frameworks targeting worker environments with a more friendly high-level API have started to pop up — worktop and h3 are popular examples. I see no reason why we can't have a lightweight version of Fastify that runs on these environments. I started an experiment to do so, providing the basic Fastify Request and Reply classes and a few request-response cycle hooks:

import FastifyEdge from 'fastify-edge'

const app = FastifyEdge()

app.addHook('onSend', (req, reply, payload) => {
  if (req.url === '/') {
    return `${payload} World!`
  }
})

app.get('/', (_, reply) => {
  reply.send('Hello')
})

You can check it out (and maybe contribute!) here: galvez/fastify-edge.

How to get started with Fastify

If you're finally convinced you should be using Fastify instead of Express, I recommend watching Matteo Collina's Fast Introduction to Fastify.

Simon Plenderleith's Learning Fastify series and Manuel Spigolon's articles are also good resources. Make sure to go through the documentation too.

It should be noted you might just not need everything Fastify offers. Express still powers the majority of Node.js web servers in production, and all is well. That sounds like a valid argument, and it is, to some extent — but there's nothing to lose in improving the efficiency of your Node.js web servers.

You spend less on server resources, you ready yourself to scale your application in a number of different ways, and you still get to integrate with legacy Express code.

In the past couple of years I've done a lot of consulting for clients experiencing problems scaling their Node.js applications, and migrating to Fastify has become my number one action item. That, and taking some time to fully understand the Node.js event loop — which is also absolutely essential if you want to avoid wasting your system resources on suboptimal code.