What is the best way to automatically produce a compressed site (with gzip or brotli)? I am setting up a self-hosted site, served by nginx.
Thanks
I use a script to generate the site on the host:

```shell
#!/bin/bash
echo -e "\033[44m┌──────────────────────────────────────┐"
echo -e "│ Generating web site ... please wait! │"
echo -e "└──────────────────────────────────────┘\033[0m"
hugo -d /media/usbdisk/html --cleanDestinationDir --gc --enableGitInfo --templateMetrics --templateMetricsHints --logLevel info
echo -e "\033[102m\033[30m┌──────────────────────────────────────┐"
echo -e "│ Starting compression │"
echo -e "└──────────────────────────────────────┘\033[0m"
./brotli.sh > /dev/null &
./zopfli.sh > /dev/null &
```
and this for Brotli (`brotli.sh`):

```shell
find /media/html -type f -name '*.br' -delete
find /media/html -type f -name '*.html' -exec brotli -Zf '{}' \;
find /media/html -type f -name '*.css' -exec brotli -Zf '{}' \;
find /media/html -type f -name '*.js' -exec brotli -Zf '{}' \;
find /media/html -type f -name '*.xml' -exec brotli -Zf '{}' \;
find /media/html -type f -name '*.json' -exec brotli -Zf '{}' \;
find /media/html -type f -name '*.rss' -exec brotli -Zf '{}' \;
# A chmod glob only reaches the top level; use find so .br files
# in subdirectories get the permission change too.
find /media/html -type f -name '*.br' -exec chmod 755 '{}' +
```
and this for gzip (`zopfli.sh`):

```shell
find /media/html -type f -name '*.gz' -delete
find /media/html -type f -name '*.html' -exec zopfli '{}' \;
find /media/html -type f -name '*.css' -exec zopfli '{}' \;
find /media/html -type f -name '*.js' -exec zopfli '{}' \;
find /media/html -type f -name '*.xml' -exec zopfli '{}' \;
find /media/html -type f -name '*.json' -exec zopfli '{}' \;
find /media/html -type f -name '*.rss' -exec zopfli '{}' \;
# Same fix as in brotli.sh: reach .gz files in subdirectories.
find /media/html -type f -name '*.gz' -exec chmod 755 '{}' +
```
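The six per-extension `find` runs can also be collapsed into a single pass with one extended regex. A minimal sketch, using `gzip -9 -k` as a stand-in for `zopfli` so it runs even where zopfli isn't installed (the temp-dir layout is made up for the demo):

```shell
#!/bin/sh
# Sketch: one find pass replaces the six per-extension runs.
# Swap gzip -9 -k for zopfli to get the smaller output described above.
root=$(mktemp -d)
touch "$root/a.html" "$root/sub.css"
mkdir -p "$root/deep" && touch "$root/deep/b.js" "$root/deep/img.png"

# .png is not in the regex, so it is skipped.
find "$root" -type f -regextype posix-extended \
    -regex '.+\.(html|css|js|xml|json|rss)$' -exec gzip -9 -k '{}' \;
```

The same regex trick appears later in the thread for the GitLab Pages workflow.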
Don’t forget to set the matching rules in nginx!
Hope this helps you get it done.
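For the nginx side, the stock `ngx_http_gzip_static_module` and the third-party `ngx_brotli` module can serve the precompressed `.gz`/`.br` siblings directly. A minimal sketch, with the document root assumed from the scripts above:

```nginx
server {
    listen 80;
    root /media/html;

    # Serve foo.css.gz instead of compressing on the fly
    gzip_static on;

    # Serve foo.css.br (requires nginx built with ngx_brotli)
    brotli_static on;
}
```

With these directives, nginx checks `Accept-Encoding` and picks the precompressed file when the client supports it, falling back to the plain file otherwise.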
For example…
GitLab Pages does not compress when serving files, so you have to compress after the build. Our documented GitLab Pages setup includes this step at the end of the workflow file.
I would also compress using Zstandard, which at this point has pretty good browser support. Note that Cloudflare now serves `zst` files when the browser’s `Accept-Encoding` header indicates that it’s supported.
This table compares the Brotli, Gzip, and Zstandard compression formats, with a ranking from 1 (best) to 3 (worst) for each characteristic.
| Format | Compression Speed | Decompression Speed | File Size | Browser Support | Notes |
|---|---|---|---|---|---|
| Brotli | 3 | 3 | 1 | 2 | details |
| Gzip | 1 | 2 | 3 | 1 | details |
| Zstandard | 2 | 1 | 2 | 3 | details |
In the above, the compression speed is arguably irrelevant given that we’re compressing before the file is requested.
Our documented GitLab Pages workflow does not include Zstandard compression due to a known issue with GitLab Pages. Once this issue is addressed, we’ll change the compression step to this:
```shell
# Build a NUL-separated list of compressible files, then compress it
# three ways, spreading work across all CPUs (--max-procs=0).
find public/ -type f -regextype posix-extended -regex '.+\.(css|html|js|json|mjs|svg|txt|xml)$' -print0 > files.txt
time xargs --null --max-procs=0 --max-args=1 brotli --quality=10 --force --keep < files.txt
time xargs --null --max-procs=0 --max-args=1 gzip -9 --force --keep < files.txt
time xargs --null --max-procs=0 --max-args=1 zstd -15 --force --keep --quiet < files.txt
```
In the above, note that we’re using `xargs` instead of the `find` command’s `-exec` flag. I found `xargs` to be significantly (3×) faster. For example, for a site with 25,000 pages:
| Brotli invocation method | Time |
|---|---|
| `find` with `-exec` flag ending with `{} \;` | 2m46.689s |
| `find` with `-exec` flag ending with `{} +` | 1m26.396s |
| `find` + `xargs` | 0m25.438s |
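The gap in the table comes down to process count: `{} \;` forks the compressor once per file, while `{} +` and `xargs` batch many files into each invocation. A quick sketch, with `sh -c 'echo run'` standing in for brotli so each launch is countable:

```shell
#!/bin/sh
# Count child-process launches for 5 files under each invocation style.
tmp=$(mktemp -d)
for i in 1 2 3 4 5; do touch "$tmp/f$i.html"; done

# `{} \;` runs the command once per matched file
per_file=$(find "$tmp" -name '*.html' -exec sh -c 'echo run' \; | wc -l)

# `{} +` appends files to one argument list, so (up to ARG_MAX)
# the command runs once for the whole batch
batched=$(find "$tmp" -name '*.html' -exec sh -c 'echo run' sh '{}' + | wc -l)

echo "per-file: $per_file, batched: $batched"   # typically: per-file: 5, batched: 1
```

`xargs --max-procs=0` goes one step further and runs those batches in parallel across CPUs, which is where the 3× win comes from.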
So you let Git handle the automation.
At the highest compression settings, Zstandard still decompresses faster than Brotli: typically around 700 MB/s versus about 500 MB/s. This holds even though Brotli may achieve slightly better compression ratios at its maximum settings.
You sold me! But I notice you do not compress pictures… I only now get why people don’t. Makes sense. I’ve got the perfect workflow now, and it’s lightning fast (`xargs` is so cool!):
```shell
#!/usr/bin/sh
cd /home/drm/NNSITE
hugo --minify --cleanDestinationDir
pagefind
cd /var/www/html
lndir -silent /home/drm/NNSITE/htdocs
# Note the escaped dot: a bare '.' would match any character.
fdfind --exclude '*.zst' '\.(css|html|js|json|mjs|svg|txt|xml)$' -0 | xargs -0 --max-procs=0 --max-args=1 zstd -15 --force --keep --quiet
open https://localhost
```
How beautiful!
This topic was automatically closed 2 days after the last reply. New replies are no longer allowed.