
Scott Laird

Did you know that by default your zippy static website created with Hugo and served via Caddy doesn't support HTTP content compression? And did you know that of the three common compression options (gzip, zstd, and brotli), brotli gets substantially better compression results for most content (better than zstd!) and is supported by basically everything?
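As a quick illustration, here's a stdlib-only sketch of what "precompressing" means: write the page, then write a compressed sibling next to it for the server to pick up. gzip stands in for brotli here because it ships with Python; the filenames and content are illustrative.

```python
# Sketch: write a page and a precompressed sibling next to it.
# gzip is used because it ships with Python; with the third-party
# `brotli` package installed, brotli.compress(html) works the same way
# and typically produces a smaller file.
import gzip
from pathlib import Path

html = b"<p>hello static web</p>" * 200
Path("page.html").write_bytes(html)
Path("page.html.gz").write_bytes(gzip.compress(html, compresslevel=9))

print(Path("page.html.gz").stat().st_size < len(html))  # → True
```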

I wrote up a few details here:

scottstuff.net/posts/2025/03/0

I also wrote a tool to make pre-compressing directories full of web pages much less complicated, see github.com/scottlaird/incremen.
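For reference, telling Caddy to serve those precompressed siblings is a one-liner in the file_server block. A minimal Caddyfile sketch (the site name and root path are illustrative):

```
example.com {
	root * /srv/www/public
	file_server {
		precompressed br zstd gzip
	}
}
```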

scottstuff.net · Precompressing content with Hugo and Caddy

I went down a bit of a rabbit hole this week, and I figured I’d share so hopefully no one else has to dive quite this deep. Modern web browsers generally prefer to fetch compressed content from web servers; they’ll pass an Accept-Encoding header as part of the HTTP request listing which compression types they support, and then the server will attempt to return content compressed in a form that the client supports. Most web servers have the ability to compress web content on the fly, but that feels kind of weird when you’re using a static site generator like Hugo. Why go to all of the work to generate a cheap-to-serve static site and then depend on your web server to do a bunch of expensive compression on the fly over and over again?
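The Accept-Encoding negotiation described in the excerpt can be sketched as a tiny helper: the server walks its own preference order and picks the first encoding the client advertised. The function name and preference order are illustrative, and real negotiation also honors q-values, which this sketch ignores.

```python
# Hedged sketch of server-side encoding negotiation: pick the first
# server-preferred encoding that the client's Accept-Encoding lists.
# q-value weighting (e.g. "gzip;q=0") is deliberately not handled.
def choose_encoding(accept_encoding: str, preferred=("br", "zstd", "gzip")):
    offered = {token.split(";")[0].strip().lower()
               for token in accept_encoding.split(",")}
    for enc in preferred:
        if enc in offered:
            return enc
    return None  # fall back to serving the uncompressed file

print(choose_encoding("gzip, deflate, br"))  # → br
print(choose_encoding("identity"))           # → None
```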