Netlify's Large Media support

Found in a Tweet today:

I will eventually look into how well this plays with Hugo and image processing, but my initial thoughts are that it:

  • Should be a great fit!
  • And, looking at the pricing model, you pay for the image transformations, not the storage – which should make Hugo’s built-in image processing look really tempting …

Thanks for this. I’ve moved everything from Flickr to Cloudinary; you can get up to 12GB of storage “free”, but I will be running out of space in a year’s time!

As relevant as Amazon’s closed source services.


Hello @anon8657309

You have been making OT contributions consistently, in several topics.

This is the Hugo Support Forum. We may occasionally mention different services, but this is not a place to troll about Netlify, trimming JS, or whatever else.


Trying this out today, but it doesn’t seem to work out of the box with Hugo Image Processing.

During the build, I’m encountering the following error:

10:08:10 AM: Error: Error building site: failed to render pages: [en] page "/opt/build/repo/content/overons/": render of "page" failed: execute of template failed: template: overons/hetbedrijf.html:3:6: executing "main" at <partial "hero-image....>: error calling partial: "/opt/build/repo/themes/callvoiptelefonie/layouts/partials/hero-image.html:4:14": execute of template failed: template: partials/hero-image.html:4:14: executing "partials/hero-image.html" at <.Fill>: error calling Fill: fill /opt/build/repo/content/uploads/logo-bord-breed.jpg: image: unknown format

Template code:

{{ with .Params.hero.image }}
{{ $imageResource := ($.Site.GetPage "section" "uploads").Resources.GetMatch (strings.TrimPrefix "/uploads/" .) }}
{{ with $imageResource }}
{{ $src1x := .Fill "1920x550 Center" }}
{{ $src2x := .Fill "3840x1100 Center" }}
<header class="h-64 md:h-96 bg-grey-lightest">
  <picture class="h-full">
    <img src="{{ $src1x.RelPermalink }}" srcset="{{ $src1x.RelPermalink }} 1x, {{ $src2x.RelPermalink }} 2x" class="h-full w-full" style="object-fit: cover;" />
  </picture>
</header>
{{ end }}
{{ end }}

Thanks for trying.

I’m not familiar with Git LFS (that is backing this), but I would guess that the files are stored as some kind of empty proxies, which would explain the error you see from Hugo trying to process it as a JPG when it’s not.

It was also my first encounter with Git LFS. I must say it’s not a smooth ride. The biggest issue is removing Git LFS from your repo if you don’t want to use it anymore. I gave up, removed the .git folder, and started over. Basically, there is no easy way back.
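For anyone hitting the same wall: the git-lfs CLI does ship an escape hatch, though it rewrites history, so this is a sketch to adapt (the file patterns and branch name below are examples, not from the thread) and you should back up the repo first:

```sh
# Rewrite history so LFS-tracked files become regular Git blobs again
git lfs migrate export --include="*.jpg,*.png" --everything

# Stop tracking the patterns and remove the LFS filters/hooks
git lfs untrack "*.jpg" "*.png"
git lfs uninstall

# The history has changed, so the remote must be force-pushed
git push --force origin master
```

Because every collaborator’s clone still has the old history, they all need to re-clone afterwards, which is why starting over can genuinely feel easier.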

  • Currently, Netlify password-protected websites do not work with Netlify Large Media support.

I think that you are correct in your analysis. And I don’t think there is an easy way to integrate Git LFS with Hugo Image processing.

If you want to dig a little deeper into Netlify Large Media support, there is a demo repo built with Hugo.

This is the file content for a JPG image when using Git LFS:

oid sha256:e6df77690697794deacb9d6963e574044448bef923f7d688aab1d856c54388a3
size 4074082
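If memory serves, the full pointer file also carries a version line as its first field. You can generate a pointer for any local file with the git-lfs CLI (the filename here is taken from the error message above, as an example):

```sh
git lfs pointer --file=logo-bord-breed.jpg
```

which prints something along these lines:

```
version https://git-lfs.github.com/spec/v1
oid sha256:e6df77690697794deacb9d6963e574044448bef923f7d688aab1d856c54388a3
size 4074082
```

So when Hugo opens the “image”, it gets this small text file instead of JPEG data, hence the `image: unknown format` error.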

The resizing methods of Hugo’s image processing can be handled by Netlify’s image transformation service, using query string parameters added to image file paths. Taking the examples from the Image processing methods doc, this is how the Hugo methods would translate to Netlify parameters:

Hugo method                  Netlify parameter
$resource.Resize "600x"      ?nf_resize=fit&w=600
$resource.Resize "x400"      ?nf_resize=fit&h=400
$resource.Fit "600x400"      ?nf_resize=fit&w=600&h=400
$resource.Fill "600x400"     ?nf_resize=smartcrop&w=600&h=400

On the plus side, this makes builds faster and more efficient. On the minus side, other processing options (quality, rotate, anchor, filters) aren’t supported.

If all you need is resize/fit/fill, it’s possible to alter your templates to replace the image processing calls with query parameters appended to the file path.
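As a sketch of what that template change could look like (reworking the hero-image partial from earlier in the thread, with the image path used directly rather than as a Hugo page resource — the sizes are carried over from that example):

```go-html-template
{{ with .Params.hero.image }}
<header class="h-64 md:h-96 bg-grey-lightest">
  <picture class="h-full">
    <img src="{{ . }}?nf_resize=smartcrop&w=1920&h=550"
         srcset="{{ . }}?nf_resize=smartcrop&w=1920&h=550 1x, {{ . }}?nf_resize=smartcrop&w=3840&h=1100 2x"
         class="h-full w-full" style="object-fit: cover;" />
  </picture>
</header>
{{ end }}
```

Note that nothing here touches the file at build time; the query string is only interpreted by Netlify’s transformation service when the image is requested.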


Netlify’s image processing is hardly a drop-in replacement for Hugo’s image resources. Also, I don’t see how it would make the builds any faster, so could you please elaborate on that?

Sure. To clarify, I’m not talking about builds in local development. Those work exactly the same in either case.

For CI/CD builds, however, using Netlify with Large Media shortens the build time in two ways: by cloning a smaller repo, and by avoiding resizing the same images in every build.

A CI/CD server needs to clone the repo to run the build. Netlify clones Large-Media-enabled repos with GIT_LFS_SKIP_SMUDGE=1, meaning it pulls the tiny pointer files instead of the large asset files. Fewer bytes to move means a faster clone during the build.
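You can reproduce the effect of skip-smudge locally; the clone pulls only the small pointer files (the repo URL below is a placeholder):

```sh
# Clone without downloading the LFS-stored binaries
GIT_LFS_SKIP_SMUDGE=1 git clone https://github.com/example/site.git
cd site

# The working tree now contains pointer files; fetch the real
# content later, on demand, if you need it
git lfs pull
```

For a repo with gigabytes of images, the difference between moving pointers and moving binaries is the bulk of the clone time.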

Second, if you’re handling image resizing outside of the build (as you would with Netlify’s image transformation, though this principle would also apply to other image handling services like Cloudinary or Imgix), you don’t need to resize them during the build. Less to build means less build time.

To be clear, I’m not saying this is a complete replacement for Hugo’s image resources. Like anything, it depends on the situation. For example, if you had a photo gallery site with a very large number of very large images, regularly added to the repo at full size but frequently viewed on small mobile screens, Netlify Large Media could be a good solution for that.

On the other hand, if your image needs are more in “normal” range, and you want to perform other transformations that Hugo offers, Hugo’s image resources could be a better fit. Another thing worth noting is that you determine which files are tracked by LFS/Large Media. This means you could enable Large Media to handle non-image files like audio, video, and PDFs, and handle your images with Hugo.

  1. Hugo, in its default setup on Netlify, caches processed files in /opt/build/cache/hugo_cache/ (this cache survives between builds) and will only reprocess an image if the source has changed. With that in mind, I guess it’s a question of “what’s the fastest cache?” – which should not matter too much.
  2. I will investigate this vs. Netlify’s Git LFS support, but I would hope that Hugo could use the SHA256 hashes in those pointer files to determine whether to read from cache, which I presume is what Netlify does behind the scenes.
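A netlify.toml sketch of the caching setup described in point 1, assuming you want to pin the cache directory explicitly (Hugo respects the HUGO_CACHEDIR environment variable, and /opt/build/cache/ is the directory Netlify persists between builds):

```toml
[build]
  command = "hugo"

[build.environment]
  # Point Hugo's file cache at Netlify's persisted cache directory
  HUGO_CACHEDIR = "/opt/build/cache/hugo_cache/"
```

With this in place, image processing results from one build are reused by the next as long as the sources are unchanged.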

Cool about the build caching. That means the time cost of in-build transformations is only paid the first time an image is transformed.

Regarding #2, I think you’re talking about bringing the actual LFS-tracked images into the build process, correct? In this case, Netlify doesn’t check whether to read from cache, because the buildbot doesn’t read those files at all. Large Media files are uploaded directly to the Netlify LFS store on push, separate from any builds that might be triggered by the push. The images are not included or transformed in the build, and their paths in files go unchanged.

When it comes to viewing the images on the built site, Netlify’s proxy server uses the pointer file to retrieve the image from the LFS store, but this happens on the fly, not during the build. Image transformations requested by query parameters in the image path are similarly handled on the fly, then cached for future requests.

So in short, Large-Media-tracked files are not available during the build in any way. I can see how one might want to override this in some cases, though, and it’s something we’ve talked a bit about internally.