Large — really large — Hugo sites: 10GB of images

GitHub and Cloudflare Pages have size limits, but they’re free. Even if I could pay, there’s no option for a site that large.

Where would you go to have a site like this?

I’d suggest separating the images from the site. I store my blog’s gigabyte of images and binaries in Amazon S3 buckets; not (quite) free, but keeping them outside of the Hugo directory significantly speeds up my builds.
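One way to keep the images outside the Hugo directory while still referencing them from posts is a site param plus a tiny shortcode. A minimal sketch, assuming a hypothetical bucket URL and shortcode name (`s3img`), neither of which comes from the thread:

```toml
# hugo.toml — hypothetical S3 bucket; the images themselves never enter the repo
[params]
  imageBaseURL = "https://example-blog-assets.s3.amazonaws.com/"
```

And in `layouts/shortcodes/s3img.html`:

```html
<img src="{{ .Site.Params.imageBaseURL }}{{ .Get "src" }}" alt="{{ .Get "alt" }}" loading="lazy">
```

Used in content as `{{< s3img src="cat.jpg" alt="a cat" >}}`, so only tiny Markdown files go through Hugo’s build.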


How often is the site updated/deployed?

With Cloudflare you could certainly pay for and operate a large-volume project. And their limits are quite generous: max 20k files, max 25MB per file… that’s what, 500GB?
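The back-of-the-envelope math checks out, using the 20,000-file and 25MB-per-file figures from Cloudflare’s docs:

```python
# Rough ceiling for a Cloudflare Pages deployment:
# max files x max size per file, converted from MB to GB (decimal).
max_files = 20_000
max_mb_per_file = 25
total_gb = max_files * max_mb_per_file / 1_000
print(total_gb)  # 500.0
```

So a 10GB site is nowhere near the file-count or per-file ceiling on paper; the practical bottleneck is getting the assets there via Git.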

I’m not sure if there is a way to utilize Cloudflare Images or Netlify Large Media / Transform Image.

But I assume that all depends if you’re trying to stick to free options.

I’m not sure at what volume of assets a tool like Hugo starts to buckle.

I would also advise you to offload your images to external hosts, such as an S3 bucket, Cloudinary, etc. Both GitHub and Cloudflare can handle your site’s size though it will be slow to clone it, hence the need to offload the images.


I do wish Hugo’s Pipes had support for calling out to other programs in some defined way, so you could write a thing to integrate with Cloudflare Images, S3, R2, whatever. Git repos weren’t designed for all these images.


As far as I can tell, Cloudflare Pages isn’t the limitation, since they clearly state you can have 20k files at a max of 25MB each; but deploying from a Git repo wouldn’t be possible without large file storage or a CDN to serve large assets from.

With Pages, you could use their Wrangler CLI, or drag directly into the upload folder in the dashboard. I’d imagine Netlify has the same option.
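A direct-upload deploy with Wrangler is a one-liner; the project name here is a placeholder, not from the thread:

```
# Hypothetical project name; uploads the built "public" directory
# straight to Cloudflare Pages, bypassing Git entirely.
npx wrangler pages deploy ./public --project-name=my-large-site
```

Because nothing is cloned from a repo, the Git-side size concerns don’t apply to this path.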

You also could just serve directly out of S3.
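Serving straight from S3 is a few CLI calls; the bucket name is hypothetical, and this is a sketch of static website hosting, not a hardened setup (no CloudFront, no tightened policies):

```
# Create the bucket, enable static website hosting, and sync the built site.
aws s3 mb s3://example-hugo-site
aws s3 website s3://example-hugo-site --index-document index.html --error-document 404.html
aws s3 sync ./public s3://example-hugo-site --delete
```

In practice you’d usually put a CDN in front of the bucket, but this shows the minimum moving parts.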

I have faced a similar issue. Whenever I want to store such a large site with tons of images, the options are really limited.

I ended up taking a different approach when I created my site.

It may not be a solution for you, but here is what I do:
I host all images in albums on Google Photos.
The albums are shared publicly, and links to the images are generated using a separate solution.

The whole site shrinks to just 500MB, while all the gigabytes of images live on Google Photos.
It’s a suitable solution for this site, but it may not be best for everyone.

Also, have a look at this thread: Repository storage for Large Hugo site.

Theoretically, you can split a big site into multiple repos below 1GB each and then merge them together during the build on Netlify. I was looking into that, but in the end I chose the approach above.
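The multi-repo merge could be wired up in the build command itself. A sketch in `netlify.toml`, where the repo URLs and directory names are hypothetical:

```toml
# netlify.toml — clone the split-out asset repos into static/ before building.
# Repo names are placeholders; shallow clones keep the fetch small.
[build]
  command = """
    git clone --depth 1 https://github.com/example/site-images-1 static/images1 && \
    git clone --depth 1 https://github.com/example/site-images-2 static/images2 && \
    hugo --minify
  """
  publish = "public"
```

Each individual repo stays under the host’s size limit, and only the build machine ever sees the combined tree.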

Direct upload is currently limited to 1,000 files, though this will be lifted in the future according to them.

Yeah, those limitations seem to keep creeping up. I was pretty stoked on Netlify until I figured out that Git LFS is not what I want to spend my time maintaining.

The only feasible solution I can see is to offload image assets elsewhere and block large files with .gitignore if you want to keep the repo-based deploy. Which sucks to give up, because it’s nice.

While not ideal, it would solve the problem on a large site.

I’ve had this same issue, but with a client who had 30GB of images in their site. I chose to set their Hugo site up with a CMS that has a free integration with Cloudinary.

I actually ran a web image optimizer, which reduced the 30GB of images to 15GB. Then I uploaded all of the optimized images to Cloudinary for storage. That made the Git repo small and fast again.

Also, Cloudinary has amazing image optimizations through “transforms.” Setting this up greatly improved page loading speed as well.

Furthermore, it’s been my experience that clients don’t know how to optimize their images for the web, so they keep uploading 5MB images for a profile thumbnail. Cloudinary is great for this because it will let them upload that 5MB image, yet by setting up transforms I can have Cloudinary shrink it to around 80KB for display on the site. So it’s a win-win: Cloudinary stores the large file but serves an optimized image on the fly.
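The transforms mentioned above are just URL path segments in Cloudinary’s delivery API (`w_`, `h_`, `c_fill`, `q_auto`, `f_auto` are real parameters; the cloud name “demo” and the asset name are placeholders). A small helper shows the shape of those URLs:

```python
# Sketch of Cloudinary's URL-based delivery transforms.
# Cloud name and public_id below are hypothetical, not from this thread.
def cloudinary_url(cloud_name, public_id, *transforms):
    """Build a delivery URL; each transform is a string like 'w_200' or 'q_auto'."""
    chain = ",".join(transforms)  # transforms are comma-separated in one path segment
    return f"https://res.cloudinary.com/{cloud_name}/image/upload/{chain}/{public_id}"

# A 5MB original delivered as a 200x200, auto-quality, auto-format thumbnail:
url = cloudinary_url("demo", "profile.jpg", "w_200", "h_200", "c_fill", "q_auto", "f_auto")
print(url)
```

The original full-size upload stays untouched in storage; only the URL decides what size and format the visitor actually downloads.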
