I have a blog running with Hugo and I love the SSG, but I’m getting a bit worried about the size of my GitHub repository.
When I write a post, I commit it to GitHub and it then deploys to Netlify. I’ve found I have to commit all the images and the _gen images for the Netlify build to succeed. If I don’t, it times out and fails.
The git repo is about 6GB and when I render the site locally, the public/ folder is about 1.8GB. This seems fine at the moment, but it is only getting bigger.
I make use of Hugo’s image processing features (a shortcode processes each image into four smaller sizes, which are deployed alongside the original), so my understanding is that I can’t use Git LFS?
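Roughly, the shortcode I mean looks like this (the widths and names here are illustrative, not my exact code):

```go-html-template
{{/* layouts/shortcodes/img.html — sketch of a responsive-image shortcode */}}
{{ $img := .Page.Resources.GetMatch (.Get "src") }}
{{ $set := slice }}
{{ range $w := slice 480 768 1024 1366 }}
  {{ $r := $img.Resize (printf "%dx" $w) }}
  {{ $set = $set | append (printf "%s %dw" $r.RelPermalink $w) }}
{{ end }}
<img src="{{ $img.RelPermalink }}"
     srcset="{{ delimit $set ", " }}"
     alt="{{ .Get "alt" }}">
```

Because Hugo has to read the actual image bytes to resize them at build time, LFS pointer files wouldn’t work here.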
So what are the recommendations on what to do here? Am I doing something wrong? What is the best practice for using Hugo’s image processing workflow and also deploying to Netlify?
I did try updating the timeout limit but it didn’t seem to have any effect - locally or on Netlify. Maybe I need to try again.
As for publishing - I think if I move, the only option I have is to stop using Git and GitHub and deploy manually via rclone or similar to a host like Cloudflare R2. A shame, as I love the git-commit-push workflow.
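Something like this, I imagine (the remote name and bucket are placeholders I’d set up beforehand with `rclone config`, using rclone’s S3 backend for R2):

```shell
# Build the site, then push the rendered output to an R2 bucket.
hugo --minify
rclone sync public/ r2remote:my-blog-bucket --progress
```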
My aim is to keep publishing as simple as possible (for me) while using a SSG!
If you have a lot of high-res base images and process each into multiple sizes (probably with the more intensive resampling algorithms?), then your first build will take a very long time.
For me a clean-start build is 40 minutes with only a fraction of my content up so far (my processing also does a little USM sharpening on the images).
You need the _gen cache so future builds are shorter.
Because of that I went this route:

- make sure the page’s base images aren’t copied over
- use a grunt script to remove any intermediates used in the processing pipeline but not linked in the HTML (hoping Hugo can do this natively in the future)
- sftp sync the result
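As a sketch of what that cleanup step does (mine is a grunt script; this is an equivalent shell version, assuming image filenames are distinctive enough that grepping the rendered HTML is a safe “is it linked?” check):

```shell
# Prune images under the rendered site dir that no HTML page references.
prune_unreferenced() {
  site_dir=$1
  find "$site_dir" -type f \( -name '*.jpg' -o -name '*.png' -o -name '*.webp' \) |
  while IFS= read -r img; do
    # If the file's basename appears in no HTML file, it's an unused intermediate.
    if ! grep -rqF --include='*.html' -e "$(basename "$img")" "$site_dir"; then
      echo "pruning $img"
      rm -- "$img"
    fi
  done
}
```

Run it against public/ after the build and before the sync, e.g. `prune_unreferenced public`.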
I can add a new post, rebuild, run the script and sync in a minute or two.