The range construct accepts an index variable, so you can use that to prevent Hugo from including any links past 50k.
In the example given in the docs, just change the range construct from {{ range .Pages }} to {{ range $i, $e := .Pages }}. This gives you access to the $i variable inside your loop, which counts up from 0 on each iteration. You can use that to make sure you never go past 50,000 entries.
Something like this:
{{ range $i, $e := .Pages }}
{{/* if $i is less than 50,000 */}}
{{ if lt $i 50000 }}
{{/* the <url> entry for $e goes here */}}
{{ end }}
{{ end }}
This is how I’d implement a 50k limit. There may or may not be a better method.
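For reference, here is a rough sketch of how that cap could be dropped into a complete custom layouts/sitemap.xml. The <url> internals are simplified from the default template in the docs, so double-check them against the built-in template for your Hugo version:

{{ printf "<?xml version=\"1.0\" encoding=\"utf-8\" standalone=\"yes\"?>" | safeHTML }}
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  {{ range $i, $e := .Data.Pages }}
  {{/* only emit the first 50,000 entries */}}
  {{ if lt $i 50000 }}
  <url>
    <loc>{{ $e.Permalink }}</loc>
    {{ if not $e.Lastmod.IsZero }}<lastmod>{{ safeHTML ($e.Lastmod.Format "2006-01-02T15:04:05-07:00") }}</lastmod>{{ end }}
  </url>
  {{ end }}
  {{ end }}
</urlset>

Everything past entry 50,000 simply disappears from the sitemap, so this is a hard cut rather than a real fix.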
How do you reach the limit? 50k seems like a very high number of links. I would attempt to create sitemaps per taxonomy or post type if it’s evenly distributed.
Other than that, it would make sense to support a limit on the number of links PER sitemap. I figure 50k entries will add up to several megabytes, which won’t make search engines happy.
@joshmossas’s solution is fine, but you will lose all links beyond that limit (or have to manually create another template).
Thinking about it, the best approach without changing Hugo or adding new features might be a custom post type that you can page through: its index page serves as the sitemap index, and the subsequent paginated files hold the items as individual sitemaps.
Consider post types like Photography tagged under Abstract, etc. It’s ridiculously easy to reach the limit, and I’m shocked this hasn’t been encountered by anyone else before.
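To make that a bit more concrete, here is a rough sketch of what a sitemap-index setup could look like with Hugo’s custom output formats: one sitemap per top-level section, plus an index at the root. The output format name sectionSitemap and the template paths are my own choices, the exact config keys may differ between Hugo versions, and a single oversized section would still blow through the limit, so treat this purely as a starting point.

config.toml:

[outputFormats.sectionSitemap]
  mediaType = "application/xml"
  baseName  = "sitemap"

[outputs]
  section = ["HTML", "sectionSitemap"]

layouts/_default/list.sectionsitemap.xml:

{{ printf "<?xml version=\"1.0\" encoding=\"utf-8\" standalone=\"yes\"?>" | safeHTML }}
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  {{/* every regular page in this section */}}
  {{ range .RegularPages }}
  <url><loc>{{ .Permalink }}</loc></url>
  {{ end }}
</urlset>

layouts/sitemap.xml (the root sitemap becomes the index):

{{ printf "<?xml version=\"1.0\" encoding=\"utf-8\" standalone=\"yes\"?>" | safeHTML }}
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  {{/* point at each section's /<section>/sitemap.xml */}}
  {{ range .Site.Sections }}
  <sitemap><loc>{{ .Permalink }}sitemap.xml</loc></sitemap>
  {{ end }}
</sitemapindex>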
I think it would be great if Hugo managed sitemap and sitemap index files for sites with more than 50k pages. It would make life easier for users maintaining websites with a significant number of pages and may add another reason for them to use Hugo.
Echoing sentiments on this issue, this really needs to be addressed.
Despite what is or is not considered normal, it is very easy to find yourself generating a site with more than 50K pages - I run over a dozen such sites.
Google’s hard limit of 50K entries in a single sitemap file is a problem. I’ve been paging through the Hugo code and looking to understand how an appropriate change could be made, but I just don’t have a grasp on the codebase yet.
I’ll contribute what I can if I can, but I would urge the developers to consider a change that produces a sitemap_index.xml by default and dynamically adds sitemap_XX.xml files as needed in batches of 49,999 entries.
This would be a much-needed addition and from searching this issue it would be very much appreciated by a lot of site developers.
How does your code solve this if I can hit the limit with a single section?
We need some way to generate extra output files from code.
For example, chunk pages by month (or even week) and output them into separate files: sitemap-2019W34.xml, sitemap-2019W35.xml, … The smaller each sitemap is, the faster it gets into Google’s index. I have tested this with a large WordPress site.
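In Hugo terms the index half of that idea could look roughly like the sketch below, grouping by month rather than ISO week, since Go’s date layouts have no week-number verb. This only emits the index; the per-month files it points at (sitemap-2019-08.xml and so on) would still have to be produced by something else, for example one stub page per period with a custom output format as sketched further up:

{{/* layouts/sitemap.xml, sketch: root sitemap index with one entry per month of content */}}
{{ printf "<?xml version=\"1.0\" encoding=\"utf-8\" standalone=\"yes\"?>" | safeHTML }}
<sitemapindex xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  {{ range .Site.RegularPages.GroupByDate "2006-01" }}
  <sitemap>
    <loc>{{ printf "sitemap-%s.xml" .Key | absURL }}</loc>
  </sitemap>
  {{ end }}
</sitemapindex>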