How are you compressing css/js/html with Hugo?


Since Hugo currently has no asset pipeline, I was wondering what sort of homebrew solutions people have come up with for compressing these resources?


I use simple NPM scripts to run a postcss build process, which in turn uses cssnano to compress CSS. You can do pretty much anything you like with NPM scripts, so for JS you can use UglifyJS, etc.
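If it helps, a minimal package.json along those lines might look something like this (the file paths and version numbers are illustrative, not from my actual setup):

```json
{
  "scripts": {
    "build:css": "postcss src/css/main.css --use cssnano -o static/css/main.css",
    "build:js": "uglifyjs src/js/*.js -c -m -o static/js/bundle.js",
    "build": "npm run build:css && npm run build:js && hugo"
  },
  "devDependencies": {
    "cssnano": "^3.10.0",
    "postcss-cli": "^4.1.0",
    "uglify-js": "^3.3.0"
  }
}
```

Then `npm run build` compresses the assets and hands off to Hugo. Note that passing several files to uglifyjs also concatenates them into one output file.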


I'm a little lost here; can you provide a small example, maybe?

My flow:
- I develop on localhost (inside a Gitlab repository)
- I push to the remote origin (master)

On my webserver, I use a bash script to:
- pull from the Gitlab repository
- delete the current /public directory
- run Hugo to generate the site on the server

This way it’s live. (I’m skipping the staging/testing server, as I do that on localhost.)

Where in my workflow should I be using compression? On localhost while testing? Or before Hugo generates the site on the remote?

Which npm libs do you use for CSS, and which for JS?

What I want to do is take all the JS files, compress/minify them, and package them together into just one file to serve. I want the same for the CSS files.

Do you know of any kind of an image optimizer script one could run recursively over the site and automagically compress images as well?

Thank you in advance! :slightly_smiling:


It’s a big topic, and my workflow is very much just how I like it; there are tons of ways to do this. My workflow emphasizes simplicity because I manage a lot of sites, and it’s less automated than some may like in order to keep the number of moving parts low. But I do recommend using NPM scripts if you can.

So it might be more helpful if I point you to some articles:

Maybe start with this:

and this:

CSS-Tricks article on NPM scripts:

and this:

I’ve got some more, including links to some starter repos, on the Hugo page at my site here:

Hope that helps!


Note that I’m not actively maintaining these with everything on my plate right now, but I put together two different Hugo starters, both of which have image optimization built in (not built by me, obviously, but using the awesome Gulp ecosystem):

I think @budparr is right in that the topic is a big one. If nothing else, the basis of the previously mentioned projects might provide some insight into the various techniques. Bud’s pointers are always good, since he is the one who taught me about a lot of this stuff when I built my first static site a few years ago.

This is going to sound like a commercial plug, but if you sign up for Netlify, you can do all of this with a single click, and it’s up to you to prioritize your time for dev vs. content creation (I seem to recall you and I chatting a few months back, and if my memory serves, you write quite a bit). Post-processing, however, is not part of their free tier, but if you open source this project, your hair will really be blown back by how easy they’ll make all of this for you. Responding to your other threads shortly :smile:


I’m allowing Netlify to handle all compressing, concatenating, and optimizing of assets. For anything needed in Gulp, I create a command hugo && gulp netlify to handle any additional processing.


I use gulp-htmlmin

Following options:

"htmlmin": {
    "collapseWhitespace": true,
    "removeComments": true,
    "minifyCSS": true,
    "minifyJS": true
}
Works fine.


Query - I use Netlify too, and they optimize everything but HTML. Are you saying you can run a gulp script after hugo runs to handle the HTML, and that works OK on Netlify?


@Raymond_Camden Are you sure about that? I’m pretty confident my Netlify sites have the HTML gzipped.


I meant minified.


I’m only compressing JavaScript right now (well, JSON), but it seems you’re more interested in the process, so…

I’m using Travis CI to deploy, so I just added a step to the build in my .travis.yml which runs json_reformat from the yajl-tools package on every *.json file in the output directory. Basically:

- hugo
- find web/ -name '*.json' | while read json; do json_reformat -m < "$json" > "$json.min" && mv "$json.min" "$json"; done

json_reformat is a bit annoying since it doesn’t support in-place minification. If they had such a flag (say, “-i”) the script would be a lot simpler:

find web/ -name '*.json' | while read json; do json_reformat -i "$json"; done

I don’t really want anything compressed during development, so this works pretty well for me. It wouldn’t be hard to add support for minifying CSS/HTML in principle; I’d just need a decent minifier.

IMHO it’s also a good flow for collaborative development. There are no extra tools (other than Hugo) to install locally, and it’s impossible to forget to minify before deploying. And, of course, all the benefits of CI.
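As an aside, if you want the in-place behavior today without json_reformat, Python's stdlib json module can do the same job; a sketch (the minify_json helper name and the web/ directory are just my own choices, matching the output directory above):

```shell
# Sketch: minify every *.json under a directory in place, using only
# Python's standard library (no extra packages to install).
minify_json() {
  find "$1" -name '*.json' -print0 | while IFS= read -r -d '' f; do
    # Load the JSON, then rewrite it with no whitespace between tokens.
    python3 -c 'import json,sys; p=sys.argv[1]; d=json.load(open(p)); json.dump(d, open(p,"w"), separators=(",",":"))' "$f"
  done
}
```

Usage would just be `minify_json web/` as the post-build step.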


@Raymond_Camden, that is correct!

My build command in Netlify is hugo && gulp netlify

In my gulpfile I have this task…

gulp.task('netlify', ['htmlminify', 'babel']);
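For anyone following along, a sketch of what the rest of that gulpfile might look like (gulp 3.x syntax to match the task above; the htmlmin options mirror the ones posted earlier in the thread, and the gulp-babel setup and paths are my assumptions, not the poster's actual file):

```javascript
// gulpfile.js — sketch only; assumes gulp 3.x, gulp-htmlmin, and gulp-babel
var gulp = require('gulp');
var htmlmin = require('gulp-htmlmin');
var babel = require('gulp-babel');

// Minify every HTML file Hugo wrote to public/
gulp.task('htmlminify', function () {
  return gulp.src('public/**/*.html')
    .pipe(htmlmin({
      collapseWhitespace: true,
      removeComments: true,
      minifyCSS: true,
      minifyJS: true
    }))
    .pipe(gulp.dest('public'));
});

// Transpile the site's JS in place
gulp.task('babel', function () {
  return gulp.src('public/js/**/*.js')
    .pipe(babel({ presets: ['env'] }))
    .pipe(gulp.dest('public/js'));
});

// The task Netlify runs after `hugo` finishes
gulp.task('netlify', ['htmlminify', 'babel']);
```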

You can see an example result here…

Additionally, to @rdwatters’s point, yes, Netlify also gzips your files; I just happen to be a little neurotic about minifying my HTML. In the screenshot below you can see that Hugo builds first, and once it’s complete, my gulp process begins to process my public directory.

Hope that helps! I’m new to the world of developer community forums.

[screenshot: Netlify deploy log showing the Hugo build completing, then the gulp process running]


Nice - my Gulp-fu is weak - would you mind sharing your script?


I use a modified version of the Gulp pipeline that is described here.

I haven’t found another way to do the cache-busting with just npm scripts.
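For what it's worth, the core idea of cache-busting is small enough to sketch in plain shell: hash the asset, rename it, and rewrite references to it. This is only an illustration (the cachebust helper and paths are mine; gulp-rev and friends handle manifests and edge cases this ignores):

```shell
# Sketch: content-hash an asset and update references in one page.
# cachebust FILE PAGE — hypothetical helper, not a real tool.
cachebust() {
  asset="$1" page="$2"
  # First 8 hex chars of the file's md5 as the fingerprint
  hash=$(md5sum "$asset" | cut -c1-8)
  base=$(basename "$asset")
  # main.css -> main.<hash>.css
  hashed="${base%.*}.${hash}.${base##*.}"
  mv "$asset" "$(dirname "$asset")/$hashed"
  # Point the page at the fingerprinted filename
  sed -i "s|$base|$hashed|g" "$page"
}
```

Since the name changes whenever the content does, you can then serve the asset with a far-future cache header.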



I’m relatively new to Hugo, but as an experienced developer I like to make sure my entire workflow is both simple and repeatable. I’ve built lemonade to manage minifying HTML, CSS, XML, SVG, JSON, and JS; crushing PNG, JPG, and GIF (with optional lossiness); compressing with gzip and brotli; and then pushing out a self-contained site to use with caddy. Deployment is just rsync / cp (or, in old DOS terms, XCOPY). No need to install ANYTHING on the eventual host (well, except for sh / bash and uname, aka BusyBox, so it should work on your router)…

lemonade isn’t finished yet, and you’ll find the instructions lacking, but it’s on its way. It works on Mac OS X using Homebrew and should work on Linux. It also downloads a bespoke caddy build with just the essential plugins installed…

As a bonus, it also generates and integrates favicons using RealFaviconGenerator, supports Bootstrap v4, and can compile your Sass for you. It prefers statically linked binaries, and will automatically install (locally, in its own cache, not on your path) any npm or Go tools (minify, hugo) or JSON tooling (jq) needed. It uses shellfire to manage binary dependencies and shell script idiosyncrasies; no need to install, just git clone and run!


@Raymond_Camden, I made this repo public so you can take a look around! Pretty straightforward example, but I’ve used it on some more complex builds.