Hosting Images on Amazon S3

I have no personal experience with any of those, but as long as you get a static URL for the images, you should be fine using them.

However, I host my websites on Netlify, and the free tier gives more or less unmetered storage and 100 GB/month of bandwidth (with the chance to upgrade if needed). Since you’re planning to use Hugo, it’s probably going to be a static website, so if you haven’t already, you could consider it.

I use Netlify for hosting as well. Do you upload your images straight to the repo, or do you use Netlify Large Media through Git LFS?

No, I don’t use Git LFS. I upload them directly to the repo.

From what I know, Git LFS is useful if you have a single file that’s large and bulky, not for thousands of small files (I might be wrong). So, unless you have some really heavy images (more than 50 MiB), Git LFS won’t be needed. Also, GitHub has a size limit of 100 MiB per file without LFS. I’m using plain Git for my website without any problems, with one or two files a little above 50 MiB. I get a warning in the console while pushing that I should consider using LFS for those files, but it works just fine. I have a lot of images too.
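To see whether LFS would even matter for you, you can check which files cross GitHub’s 50 MiB warning threshold. A minimal sketch (the `/tmp` path and dummy file are just for illustration; point `find` at your own repo instead):

```shell
# Create a dummy 60 MiB file so the check has something to find
# (hypothetical path, for demonstration only)
mkdir -p /tmp/lfs-demo
truncate -s 60M /tmp/lfs-demo/big.bin

# List files above GitHub's 50 MiB warning threshold
find /tmp/lfs-demo -type f -size +50M
```

Anything this prints is a candidate for LFS; if it prints nothing, you can skip LFS entirely.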

Check my repo here:

I see you’re using Lazy Load. I may look into using that if I’m going to keep uploading images to my repo. Most of my clients won’t know to optimize images before uploading, so when they upload a 2 MB file, it ends up really slowing down the page load.
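As an aside (my own suggestion, not something the plugin requires): modern browsers can defer offscreen images natively, which covers the basic case without any JS. The file path here is just a placeholder:

```html
<!-- Native lazy loading: the browser defers fetching until the image
     nears the viewport. Explicit width/height reserve space and
     avoid layout shift while the (possibly huge) file loads. -->
<img src="/images/photo.jpg" loading="lazy" width="800" height="600" alt="A photo">
```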

So if S3 or any other service does that automatically for you and gives you a static URL for your assets, it should be really easy; instead of relative URLs to the images, they would be absolute. But if it needs APIs to be configured and things like that, it might get difficult. I can’t say more than that because I’ve never had the chance to explore S3.

Different topic, but do you like using Material? I currently use Bulma as it is lightweight, but I’ve been hesitating about moving to MDC.

Yeah, I have personally loved the look of the Material Design Language (MDL) since it was introduced around 2014. The official CSS and JS from Google are definitely not lightweight: the CSS is 266 KB (minified, with vendor prefixes removed), and the JS is 319 KB. The entire Roboto font family and Material Icons need another 899 KB. The JS isn’t needed if you don’t want animations like the ripple effect, but then that takes the fun out of using MDL. Also, the MDC class names are very long (for example, for a heading 5 style, I have to type mdc-typography--headline5). More than the time taken to type, the concern is that each extra character adds a byte to the page size. It’s available as a Node module too, and since you can import individual components from it, I’m guessing its size would be smaller for the end user. There are also various other lightweight alternatives that look similar to the official one but aren’t as good, in my opinion.

I personally use MDL because of the following reasons:

  1. It’s a complete design language and not just a CSS framework. It has its own set of fonts, icons and colours that are documented for use. So, if it’s all put together properly, it gives a consistent look.

  2. People are already familiar with it. Google has been using it for a long time in its apps and websites, so users already know the design and the icons.

  3. On smartphones, it gives the look of an app, and when it’s combined with a PWA experience, it feels almost like a native app.

Other than that, it’s probably similar to any other CSS framework.

Hugo has good functions for image processing built in.
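For example, a sketch using a page bundle (the resource name “cover.jpg” is a placeholder):

```go-html-template
{{/* Resize a page-bundle image to 800px wide at quality 75.
     "cover.jpg" is a hypothetical resource name. */}}
{{ with .Resources.GetMatch "cover.jpg" }}
  {{ $small := .Resize "800x q75" }}
  <img src="{{ $small.RelPermalink }}"
       width="{{ $small.Width }}" height="{{ $small.Height }}" alt="">
{{ end }}
```

Hugo generates and caches the resized copy at build time, so the heavy original never ships to visitors.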

Thanks for the detailed response! I’ll probably stick with Bulma after doing some research, since I haven’t really had the time to learn any new frameworks… Your website is awesome, btw!

I was looking into that recently and was trying to figure out how it optimizes images. I’ll definitely have to figure out how it would work, though, since by default Forestry creates an uploads folder in static, but Hugo’s image processing requires the images to be resources, e.g. in the same content directory (page bundle).

In Forestry’s settings you can set it so uploads go to /assets/ instead. All files inside the assets dir can be processed.

You can also use “mounts” as described here:
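For reference, a mounts sketch along those lines in config.toml (the uploads path is an assumption; note that declaring any custom mount replaces Hugo’s defaults, so the default assets mount is re-declared too):

```toml
[module]
  # Re-declare the default assets mount, since custom mounts replace defaults
  [[module.mounts]]
    source = "assets"
    target = "assets"
  # Expose the CMS upload folder to Hugo's image processing
  [[module.mounts]]
    source = "static/uploads"
    target = "assets/uploads"
```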

I’ve used Cloudinary. I have a few clients set up on it with about 8k images in Cloudinary each, and they are still on the free tier. I hate to say it, but optimizing before upload is crucial. Letting a client upload a 2 MB file is one thing, but uploading a 12 or 32 MB file is another. I instruct my clients to use Adobe Lightroom or CaptureOne to manage their images and then EASILY export their images optimized for the web.

In Cloudinary, 1 credit = 1 GB of managed storage. Their free tier gives you a 25-credit limit, or 25 GB of storage. So if your clients can upload optimized images from the jump, they get a lot of free storage! They may never need to use the image optimizations in Cloudinary.

I like that. I have an idea for how Hugo can solve the file-size issue: use page bundles and upload only the pipe-optimized images via GitHub Actions. Let the customer do what they want; the system uploads minified and optimized images to Cloudinary… That 25 GB of bandwidth sounds great. running to the tinker chamber

The only thing that scares me about Cloudinary is my clients going over the free tier. But 25 GB is a good amount… I might just add it in. I have a client who does blogging, so I assume she’s going to be uploading a lot of images.

You can also try a free image optimization tool, like [link removed]. I use it for optimizing my photos, and it really saves a lot of space without any practical quality loss.

Hi @chevindu

There were concerns raised about the link to the unmaintained software that you posted, hence I had to remove it.

Personally I use a mix of command line tools like jpegoptim, pngquant and when needed I have found that ezgif through its web interface provides excellent compression results.

There are plenty of tools out there that are currently maintained.


Hi @alexandros

Apologies for that link; I had no clue that software had any concerns. Thanks for removing it, and for suggesting more optimization tools!

No problem. Thanks for the understanding.

On macOS I use ImageOptim. Nice tool.

Although I do all of the image optimization on my own for the images I upload to Cloudinary, the “Transformations” that Cloudinary lets you set up are amazing. Using Transformations does use your “credits”; however, it also reduces your “bandwidth” usage. So if your client can’t optimize images, or needs a 50 MB image uploaded to Cloudinary, you can set up a Transformation that will display that huge file at minimal size, yet still allow you to link to the full-sized image if needed for download.
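A sketch of what that looks like in practice (the cloud name and file name are placeholders; f_auto, q_auto and w_800 are Cloudinary’s automatic-format, automatic-quality and width transformation parameters):

```text
# Optimized delivery: auto format, auto quality, resized to 800px wide
https://res.cloudinary.com/<cloud-name>/image/upload/f_auto,q_auto,w_800/photo.jpg

# Original full-size file, still linkable for download
https://res.cloudinary.com/<cloud-name>/image/upload/photo.jpg
```

The transformation is just a path segment in the URL, so no API code is needed to use it.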

So, host all the images at any size you want on Cloudinary, then set up Transformations for stellar optimization when displaying the images in your site.