Hi folks, I’m curious whether there is an accepted range/standard/guideline for build time. For example, I have a site that produces 10k-30k pages/files, but I’m not sure whether its build performance should be considered slow.
Btw, are Paginator pages counted towards the Pages metric?
@jmooring is right in that it depends on your site. Hugo also currently doesn’t scale very well to larger data sets.
I know I’m starting to sound like the boy who cried wolf, but I’m very close to getting this over the finish line now: the next Hugo will perform much better with bigger data sets.
This is a two-language site I run. Build times have been high (80K-140K ms, i.e. 80-140 seconds) even with caching configured, ever since I added images months ago. (Can computer specs affect speed? I have 8 GB of RAM.)
$ hugo env
hugo v0.121.1-00b46fed8e47f7bb0a85d7cfc2d9f1356379b740+extended windows/amd64 BuildDate=2023-12-08T08:47:45Z VendorInfo=gohugoio
GOOS="windows"
GOARCH="amd64"
GOVERSION="go1.21.5"
github.com/sass/libsass="3.6.5"
github.com/webmproject/libwebp="v1.3.2"
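I have not yet looked at where the build time actually goes; the standard template metrics flags should show which templates are the slow ones, so that is what I plan to try next:

# Print per-template timing (cumulative duration, average, call count) plus
# caching hints, e.g. partials that might be candidates for partialCached.
hugo --templateMetrics --templateMetricsHints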
I see. I will keep this in mind. My computer specs might be contributing somehow to the slow builds.
Here is mine, taken from a previous comment of yours that I saw somewhere in this forum:
[caches]
[caches.getresource]
dir = ':cacheDir/:project'
maxAge = "1h"
[caches.images]
dir = ':resourceDir/_gen'
maxAge = "1440h"
[caches.assets]
dir = ':resourceDir/_gen'
maxAge = -1
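One thing I’m unsure about: if I’m reading the docs right, maxAge = "1h" on getresource means remote resources fetched with resources.GetRemote are evicted and re-downloaded once they are more than an hour old. Something like this would keep them cached much longer (the 8760h value is just an arbitrary example, roughly a year):

[caches.getresource]
dir = ':cacheDir/:project'
maxAge = "8760h"  # example value only; -1 would mean never expire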
I have also been wondering whether using shuffle can cause resources to keep being regenerated on each build. Only one of my three uses of shuffle in my templates generates images (the related posts section, for roughly 250-300 pages).
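For reference, that related posts section looks roughly like this; the variable and resource names below are simplified placeholders, not my real template:

{{/* Sketch of the related posts section; names are placeholders. */}}
{{ $related := .Site.RegularPages.Related . | first 12 }}
{{/* shuffle reorders the candidates on every build, so a different subset of
     pages, and therefore a different set of thumbnails, can be rendered each
     time, even when no content has changed. */}}
{{ range $related | shuffle | first 4 }}
  {{ with .Resources.GetMatch "featured*" }}
    {{ $thumb := .Resize "400x webp" }}
    <img src="{{ $thumb.RelPermalink }}" width="{{ $thumb.Width }}" height="{{ $thumb.Height }}" alt="">
  {{ end }}
{{ end }}
{{/* A deterministic alternative would drop shuffle, e.g. range $related | first 4,
     so the same pages and the same processed images are picked on every build. */}}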