Performance: Hugo with 3.2k products and 8k images

Hi all, I love Hugo and I am working to deploy a site for an e-commerce client. They are relatively small, with about 3.2k products, but they have 8k images (~10GB in storage). The images are on Netlify LFS. But as soon as I added lots of images to my test site, which has about 200 Markdown files (content with front matter), the build time shot up to 1 minute.

I have set ignoreFiles for the build (*.jpg etc.), but it still feels like it is taking a lot of time on my local machine. I have 32GB of RAM. Not sure what is causing the slowdown.
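For reference, this is the kind of `ignoreFiles` setting I mean (a sketch; the exact patterns are assumptions). Note that `ignoreFiles` takes Go regular expressions and is matched against files in the content and data directories, so it does not stop Hugo from copying files under `static/`:

```toml
# config.toml — sketch with assumed patterns.
# ignoreFiles uses Go regular expressions and applies to content/data
# files, not to static/ (static files are always synced to the publish dir).
ignoreFiles = ['\.jpg$', '\.jpeg$', '\.png$']
```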

Any pointers would be great.
Thanks


Can you set up a test repo for this issue? Then others can try to recreate the issue and find solutions.

If you process images, that can add significantly to the build time. But the results are cached in the resources directory, so it's recommended to commit that. (I believe I have read that Netlify has a plugin to cache processed images as well.)

Unprocessed images should not add much to the build time, since they are just copied.

Thank you @frjo. Any pointers on how to set up a test repo? Is there a write-up? Thanks.
Regards

@frjo, I do use Netlify LFS, so I imagine there is no image processing going on, unless I have triggered it accidentally. Which Netlify plugin caches processed images? Thanks.

BTW: the build time I quoted is local. On Netlify, I imagine it will take even longer.
Thanks

If possible you can simply push your current site to a public GitHub repo.

If that is not desirable, create a new site with the same theme and configurations etc. Add enough test content and test images for the slowness to be evident and push that to a public GitHub repo.

OK, got it. Thanks!

Where are your images mounted in the project? /static?

Use cached partials for some components of your page. I run a blog and have all the images embedded as remote images in posts, so neither Hugo nor Netlify has to process them.
If you are using Netlify or Hugo to process the images, avoid it. Use jpegoptim on your local machine to perform that optimization once.
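As a sketch of what a cached partial looks like (the partial name and the cache-variant argument here are hypothetical examples, not from the original site):

```go-html-template
{{/* Hypothetical layout snippet.
     partialCached renders the partial once and reuses the result on
     subsequent pages; the optional extra argument (.Section here)
     creates one cached variant per section. */}}
{{ partialCached "sidebar.html" . .Section }}
```

For a partial that is identical on every page, you can drop the variant argument and it will be rendered exactly once per build.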

Indeed, in:
/static/images/_img

@Prashant_Verma there is no image processing going on. No compression, resizing, etc. None of that. Hugo should completely ignore my images except for uploading them to LFS during the git push.

Will look into cached partials. Thanks

If this is the hugo server, you may try hugo server --renderToDisk – if it’s the “real build”, then I’m a little surprised.


I had to convert a WordPress site with 20GB of images to a Hugo site. I brought all the images into a photo-editing app called Capture One to batch-optimize them, and was able to reduce the size of the images directory from 20GB to 850MB. I've used Git LFS with Hugo before, but this time I used Forestry.io and their Cloudinary integration. This enabled me to offload 7,500 images from the repo. Believe it or not, my client is still on the free tier with Cloudinary even though they have 7k images in there. Overall, this process allowed me to turn the site back over to the client, and they still have a WordPress-like experience. In other words, they can still manage the site on their own.

I’m not sure if this helps, but I just wanted to point out that you can place all of your images in another service and never have to worry about an image directory blowing up your repo or build times.


This is perhaps a little off topic but I’ve recently built a Hugo site and deployed it on shared hosting. As soon as I added image galleries it had a big impact on local build times.

Once I had the correct file path, the easiest option for me was to remove my galleries image folder from the static folder, store it separately, and then upload it to the server.

It may seem counterproductive, but once the initial deploy is done, it's faster to rebuild the site and then upload the site plus the additional images than it is to wait out the increased build time.

This hit on build time initially made me question whether Hugo was right for my needs, especially on shared hosting. This workaround convinced me it was. Hugo is great, and massively improved since I tried it a few years ago.

Thank you @bep, I will try that today. On a side note, I am looping through 8 taxonomies and going through site.Pages each time to build the list pages; maybe that is the cause. Still, I don't see why adding more static images would change anything.
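For what it's worth, scanning site.Pages once per taxonomy can often be replaced with Hugo's prebuilt taxonomy maps, which are already grouped by term. A sketch (the taxonomy name "categories" is an assumption; substitute your own):

```go-html-template
{{/* Instead of filtering site.Pages for every taxonomy, range over
     .Site.Taxonomies, a map of term -> weighted pages built once. */}}
{{ range $term, $weightedPages := .Site.Taxonomies.categories }}
  <h2>{{ $term }}</h2>
  <ul>
    {{ range $weightedPages }}
      <li><a href="{{ .Page.RelPermalink }}">{{ .Page.LinkTitle }}</a></li>
    {{ end }}
  </ul>
{{ end }}
```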

Also, my disk usage goes to 100%.

Hi @somratpro, I used the Netlify docs as-is. Make sure you use git pull origin master / git push origin master (if that is your branch) to get it syncing. It did not work with GitHub Desktop for me.

I would recommend reading https://gohugo.io/troubleshooting/build-performance/. You may be able to utilize cached partials to avoid rendering the same blocks of content over and over for each page.

This reminds me of a problem I still have: Generating a rather large website
I have not solved it since then. When I launch the server for the first time, Hugo seems to take a lot of time to discard the static files copied during the previous server launch, and then copies them again to the generated site, even though none of these images has changed since the last run. So I'm interested in any solutions you may find.

I have some tiled panoramas on my website; these are 80,000 small image files, about 5GB in total.
I moved them to a folder outside of Hugo and linked it into the published folder tree. It works fine for me.
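For anyone trying the same approach, here is a minimal sketch (the directory names are assumptions; adjust to your setup):

```shell
#!/bin/sh
# Sketch with assumed paths: keep a huge image set outside the Hugo
# project so Hugo never scans or copies it, then symlink it into the
# publish directory after each build.
ASSETS_DIR="$PWD/panoramas-external"   # external folder holding the images
PUBLISH_DIR="$PWD/public"              # Hugo's default publish directory

mkdir -p "$ASSETS_DIR" "$PUBLISH_DIR"
# -s: symbolic link, -f: replace a stale link, -n: treat an existing
# link as a file rather than descending into it
ln -sfn "$ASSETS_DIR" "$PUBLISH_DIR/panoramas"
```

If you clean the publish directory between builds, recreate the link afterwards; Hugo itself will not restore it.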