What's a normal/acceptable build time?

Hi folks, I’m curious whether there is an accepted range/standard/guideline for build time. For example, I have a site that produces 10k-30k pages/files, but I’m not sure whether its build performance should be considered slow.

Btw, are Paginator pages counted towards the Pages metric?

                   |  EN  | ZH-HANS | ZH-HANT
-------------------+------+---------+----------
  Pages            | 3282 |    3259 |    3261
  Paginator pages  |  195 |     195 |     195
...

As I am sure you already know, the number of content pages is only one of many factors that determine build time. For example:

git clone --single-branch -b hugo-github-issue-8602 https://github.com/jmooring/hugo-testing hugo-github-issue-8602
cd hugo-github-issue-8602
hugo
                   |  EN    
-------------------+--------
  Pages            | 10461  
  Paginator pages  |  1035  
  Non-page files   |     0  
  Static files     |     0  
  Processed images |     0   <-- no image processing
  Aliases          |    10  
  Sitemaps         |     0  
  Cleaned          |     0  

Total in 4445 ms

That’s about 0.0004 seconds per page (4445 ms ÷ 10,461 pages ≈ 0.42 ms).

No. In the example above…

  • Pages: 10,461 (content pages + section pages + home page)
  • Paginator pages (pagers): 1,035
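
For context, pager pages come from .Paginate calls in list templates. A minimal sketch of such a template (the page size of 10 is an assumption, not this site’s actual setting):

{{/* Minimal sketch of a list template that produces paginator pages;
     the page size of 10 is an assumption. */}}
{{ $paginator := .Paginate .RegularPages 10 }}
{{ range $paginator.Pages }}
  <a href="{{ .RelPermalink }}">{{ .Title }}</a>
{{ end }}
{{ template "_internal/pagination.html" . }}

The pager pages this produces are reported in the “Paginator pages” row of the build summary, separately from “Pages”.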

@jmooring is right in that it depends on your site. Hugo also currently doesn’t scale very well to larger data sets.

I know I’m starting to sound like the boy who cried wolf, but I’m very close to getting this over the finish line now: the next Hugo will perform much better with bigger data sets.

Just took a 300k-content-file site for a spin.


This is a two-language site I run. Build times have been high (80,000-140,000 ms) even with caching configured, ever since I added images months ago. (Can computer specs affect speed? I have 8 GB of RAM.)

$ hugo env
hugo v0.121.1-00b46fed8e47f7bb0a85d7cfc2d9f1356379b740+extended windows/amd64 BuildDate=2023-12-08T08:47:45Z VendorInfo=gohugoio
GOOS="windows"
GOARCH="amd64"
GOVERSION="go1.21.5"
github.com/sass/libsass="3.6.5"
github.com/webmproject/libwebp="v1.3.2"

Always: processor speed, number of cores, RAM speed, RAM capacity (depends on what you’re doing), and I/O (both storage and network).

As for the build time, if you assume that all of the time is spent processing images, that comes out to about 0.065 seconds per image.


@Arif I’m pretty sure that’s mostly image processing. Getting the cache set up is a key factor in getting that down on repeated builds.

I see. I’ll keep this in mind. My computer specs might be contributing somehow to the slow builds.

Here is my cache config, taken from a previous comment of yours that I saw somewhere in this forum:

[caches]
  [caches.getresource]
    dir    = ':cacheDir/:project'
    maxAge = "1h"
  [caches.images]
    dir    = ':resourceDir/_gen'
    maxAge = "1440h"
  [caches.assets]
    dir    = ':resourceDir/_gen'
    maxAge = -1

I have also been wondering whether using shuffle can cause resources to keep being regenerated on each build. Only one of the three uses of shuffle in my templates generates images (the related-posts section, for roughly 250-300 pages).
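
For illustration, a minimal sketch of what a shuffle-based related-posts loop like that might look like (the resource name "cover.jpg", the 300x200 size, and the limit of 5 are placeholders, not taken from my actual templates):

{{/* Minimal sketch of a shuffle-based related-posts loop; "cover.jpg",
     the 300x200 size, and the limit of 5 are placeholders. */}}
{{ $related := .Site.RegularPages.Related . | shuffle | first 5 }}
{{ range $related }}
  <a href="{{ .RelPermalink }}">{{ .Title }}</a>
  {{ with .Resources.GetMatch "cover.jpg" }}
    {{ $thumb := .Resize "300x200" }}
    <img src="{{ $thumb.RelPermalink }}" width="{{ $thumb.Width }}" height="{{ $thumb.Height }}" alt="">
  {{ end }}
{{ end }}

(My understanding is that processed images are cached under resources/_gen, keyed by the source image and the processing options, so the shuffle itself may only change the ordering rather than invalidate the cache, but I’m not sure.)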
