In layperson’s terms?
On every page you would be ranging (looping, iterating) over 24 pages.
And the site has 50,000 pages.
That’s 1.2 million times through the loop.
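A minimal sketch of that kind of pager loop, assuming a typical Hugo list template (illustrative only, not the actual template from this thread):

```go-html-template
{{/* list.html: this pager loop is rendered on every page.            */}}
{{/* With ~24 pagers per page across ~50,000 pages, the range body    */}}
{{/* below executes roughly 1.2 million times per build.              */}}
{{ $paginator := .Paginate .RegularPagesRecursive }}
{{ range $paginator.Pagers }}
  <a href="{{ .URL }}">{{ .PageNumber }}</a>
{{ end }}
```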
Famous analysis of, and fix for, ridiculously long loading times in Grand Theft Auto Online:
https://nee.lv/2021/02/28/How-I-cut-GTA-Online-loading-times-by-70/
It checks the entire array, one by one, comparing the hash of the item to see if it’s in the list or not. With ~63k entries that’s
(n^2 + n)/2 = (63000^2 + 63000)/2 = 1,984,531,500 checks
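For the reasoning behind that formula: checking whether an item is already present means comparing it against every item inserted so far, so over n items the worst case is 1 + 2 + ⋯ + n = n(n + 1)/2 = (n^2 + n)/2 comparisons, which for n = 63000 gives the ~2 billion checks quoted above.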
A few years ago we had a similar problem with our internal pagination template.
If I’m reading this thread correctly, you went from 12,403 pages in 600 s to 49,286 pages in 111 s. Assuming the build time from the small (aborted) sample would scale linearly, that’s a 95% reduction in build time.
0.048 s/page → 0.002 s/page
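Spelling out the arithmetic: 600 s ÷ 12,403 pages ≈ 0.048 s/page, and 111 s ÷ 49,286 pages ≈ 0.002 s/page, so the per-page build time dropped by roughly 95%.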
I get it now.
Is this why the first call to .Paginate or .Paginator is cached? And should first and last be discouraged for sites with thousands of pages?
No, the issue was related to the pagination template itself:
https://github.com/gohugoio/hugo/issues/8599
As to why we cache the results of .Paginate/.Paginator: it is obviously about performance, though I am not familiar with the inner workings of this feature. It is fast, really fast, and once you understand that the first call is cached, you can write your templates to accommodate that.
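For example, a hypothetical sketch of coding around that behavior (the .Paginate and .Paginator methods are real Hugo methods; the template itself is invented for illustration):

```go-html-template
{{/* The first call builds and caches the paginator for this page.  */}}
{{ $paginator := .Paginate .RegularPagesRecursive }}

{{ range $paginator.Pages }}
  <h2><a href="{{ .RelPermalink }}">{{ .LinkTitle }}</a></h2>
{{ end }}

{{/* Subsequent uses of .Paginator return the cached result from    */}}
{{/* the first call rather than paginating again.                   */}}
<p>Page {{ .Paginator.PageNumber }} of {{ .Paginator.TotalPages }}</p>
```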
Not at all. The culprit behind the past performance issues on large sites was the internal pagination template, and that was addressed a couple of years ago with:
https://github.com/gohugoio/hugo/pull/8602
I thought these thousands of iterations (mine number 500,000+) were slowing build speed a lot, but now I grasp the concept.