Please explain fingerprinting for static websites to me

This is a question I did already on another thread but @bep told me to bring it into its own thread, so here we go.

I might have understood this all wrong.

What’s the advantage of having fingerprint on .css files for a static website?

I tried it myself.

Instead of css/style.css now I have css/style.938toasdijgokadsn0a9w5.css

Every time I upload this to my CDN provider (CloudFront), do I also need to update all the HTML files linking to that CSS file? Because the fingerprint hash will change each time, right? That means the link will change too.
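To make the renaming concrete, here is a minimal Python sketch of what a fingerprinting step does (my own illustration, not Hugo's actual implementation): hash the file's content and embed the digest in the filename, so the name changes whenever the content changes.

```python
import hashlib

def fingerprint_name(filename: str, content: bytes, digest_len: int = 16) -> str:
    """Return e.g. 'style.<hash>.css' for 'style.css'."""
    digest = hashlib.sha256(content).hexdigest()[:digest_len]
    stem, dot, ext = filename.rpartition(".")
    return f"{stem}.{digest}.{ext}" if dot else f"{filename}.{digest}"

v1 = fingerprint_name("style.css", b"h1 { color: black; }")
v2 = fingerprint_name("style.css", b"h1 { color: red; }")
# v1 and v2 differ, so every HTML file referencing the old name
# must be regenerated to point at the new one.
```

Same content always hashes to the same name, which is what makes the long cache lifetimes safe: the URL only changes when the bytes do.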

So, say I have 400 HTML files with content and 1 CSS file. The CSS starts out as:

h1 { color: black; }

and I change it to:

h1 { color: red; }


Without fingerprinting:

  1. Upload style.css
  2. Invalidate css/style.css
  3. Drink grog

With fingerprinting:

  1. Upload style.csao8i9tu89a.css
  2. Upload 400 HTML files.
  3. Cry
  4. Drink grog.

Is that correct?

If it is, the only thing I save is not having to invalidate one file, but I have to pay for 400 file uploads. Not sure that's the expected result…

I believe fingerprinting for CSS files works when you have a server-side header.php include shared by all your pages. There you only have to upload two files: the CSS with the hashed name (style.9084390849.css) and the header.php that links to it.

But I'm not so sure it's worth it on a static site, where you have to pay for uploading the entire HTML site.


So, I have not done the exact math here, so please do not send me a bill if I’m wrong.

For the CloudFront CDN you will typically store the files on Amazon S3, with CloudFront acting as the CDN cache in front. I did some investigation into this when I implemented CloudFront cache invalidations in s3deploy, and here is what I know:

  • The first 1,000 cache invalidation paths per month are free. Note that one invalidation path can contain a wildcard, so “/*” counts as 1.
  • After that you pay, but I don’t think it’s that much.
  • If you use s3deploy to do the deployment, I have tried to be smart about this: for 400 changed files it will fall back to a small number of invalidations, so even if you deploy 30 times a day, it will still be free.
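The pricing above can be put into a tiny back-of-envelope calculation. My assumption (check current AWS pricing before billing anyone): the first 1,000 invalidation paths per month are free, and roughly $0.005 per path after that.

```python
def invalidation_cost(paths_per_month: int,
                      free_paths: int = 1000,
                      price_per_path: float = 0.005) -> float:
    """Monthly USD cost for CloudFront invalidation paths (assumed pricing)."""
    return max(0, paths_per_month - free_paths) * price_per_path

# One wildcard path ("/*") per deploy, 30 deploys a day for a month:
cheap = invalidation_cost(30 * 30)      # 900 paths -> still free
# Invalidating 400 files individually, 30 times a month:
pricey = invalidation_cost(400 * 30)    # 12,000 paths -> you pay for 11,000
```

The takeaway is the same as the bullet list: as long as your tooling collapses invalidations (wildcards, batching), the cost stays at or near zero.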

I have some sites running on S3/Cloudfront. Not big sites. But I often force upload everything to test out stuff, and my Amazon bill related to this is around 6 USD a month.

All that said, updating 400 files because of 1 tiny change in a CSS file may sound wasteful.

Until you think about it in scale.

If you set the HTTP cache headers on static resources to 10 years or something, returning visitors avoid having to re-download these assets every time they click a link.
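For reference, “set the cache headers to 10 years” just means serving the fingerprinted asset with a long-lived Cache-Control response header, something like (values illustrative; 315360000 seconds is roughly 10 years):

```
Cache-Control: public, max-age=315360000, immutable
```

This is only safe because the fingerprinted URL changes whenever the content does; with a plain style.css name, a header like this would pin visitors to a stale stylesheet.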

So, if you have 10,000 visitors a day, it adds up and makes those 400 files you needed to upload a small number in comparison. Not only does it save bandwidth, it also makes your site much faster.

And the only practical way to do this is via fingerprinting.


I hear you @bep, but wouldn't it in that case be more cost-effective to upload the CSS file (no fingerprint) and push the invalidation for that one file?

The HTML files remain cached in the users' browsers, and the invalidation just causes the CSS file to be downloaded again, which then shows the style change.

Versus uploading 400 HTML files (besides the CSS).

Say 10,000 visitors:

Invalidation method:
Invalidate CSS file. 10,000 visitors will download that file.

Fingerprinting method:
Fingerprinted CSS file. 10,000 visitors will download that file.
The 400 HTML files will also be downloaded because they were updated. That means visitors have to download each of those files again (it's unlikely that all 10,000 visitors download the entire site, but say just 10% of them re-download those files…).
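To put rough numbers on the comparison above (all figures hypothetical: 15 KB per HTML page, 50 KB of CSS, and 10% of visitors re-fetching 5 updated pages each):

```python
visitors = 10_000
css_kb, html_kb = 50, 15

# Invalidation method: everyone re-downloads only the CSS.
invalidation_kb = visitors * css_kb

# Fingerprinting method: everyone re-downloads the CSS, plus the
# re-fetched HTML pages whose stylesheet link changed.
fingerprint_kb = visitors * css_kb + int(visitors * 0.10) * 5 * html_kb

print(invalidation_kb, fingerprint_kb)  # 500000 575000
```

For this single one-off change, fingerprinting does transfer more. The counter-argument in the thread is about the long run: the fingerprinted assets can then be cached for years, so repeat visits stop re-fetching them at all.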

Again, I'm not sure about this, just thinking out loud about whether it's worth it for static sites. Please remember I am totally ignorant when it comes to all this.

If you only think about yourself and not your site's visitors, that may be true. But it means you need to set very short HTTP cache timeouts on these assets.

Also, 1 cache invalidation is not somehow less than 1 cache invalidation.

Mmmm not sure if I will be doing my visitors a favor with fingerprinting.

Say you visit a site, with fingerprinting:

You download the HTML file you already have in your browser AGAIN, just because there was a tiny CSS change. Then the CSS.

Now the CSS is cached, but if you visit another section you have to download that section's HTML file again.

So you end up re-downloading all the HTML files for every section you had already visited and had in your browser's cache. You see no difference in the content, just an aesthetic change that's hardly noticeable.


Without fingerprinting: you download the CSS file, and that's it. You won't be downloading any of the other HTML files for the other sections of the site, AND you will see the style change because you updated (and have now cached) the CSS file.

Maybe I'm looking at this wrong (again, ignorant here), but a website that makes me download every HTML file I visit AGAIN, even when I already have it in my browser's cache, just because the CSS file name changed, doesn't seem very cost-effective for me as a visitor either, compared to downloading just the CSS file.

Am I looking at this wrong? Please, it's eating me!

You can Google this, but my conclusion is this:

The simplest and most effective way to speed up your site is HTTP client caching, via carefully set cache-expiry headers etc.

This isn’t just about CSS – but all static assets. The gain is, of course, more for bigger assets (images etc.).

The way you write, I get the feeling that you update your CSS “all the time”. That isn't common. Most sites stay fairly static in their look and feel for long periods of time.

Ahhh yes, you are right. I update my CSS quite a lot after I launch a website. Bad habit ;(

Forgot to thank you, thanks for clearing it up for me @bep.