Hugo and syncing to S3

Hi all, new Hugo user here (in the process of migrating from Jekyll on GitHub Pages) with a question about using Hugo with S3. I’ve read numerous posts about using aws s3 sync to upload the pages Hugo generates, but I’m wondering whether that command always uploads the entire site (I have close to 1900 posts spanning 12 years) or only the files that have changed. I suppose the answer depends partly on whether Hugo rewrites files that haven’t changed when it regenerates the site. Anyone have any insight to share?
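
For reference, the command I keep seeing recommended is roughly this (the bucket name here is just a placeholder):

aws s3 sync public/ s3://my-bucket --delete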

As far as I know, Hugo rebuilds the entire site on every run. Incremental builds have been discussed, but I don’t think they’re on the near-term horizon. However, depending on the complexity of your templates and things like pagination and taxonomies, 1900 posts should build so quickly that I doubt it would be an issue. Are you seeing slow build times?
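
One related note, assuming I’m reading the docs right: Hugo doesn’t prune stale files from public/ on its own, so if you sync that directory verbatim it may be worth building with:

hugo --cleanDestinationDir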

It’s not build times that concern me; my builds currently run about 10 seconds, which is a whirlwind compared to the 5+ minutes I was seeing with Jekyll. My question is really about uploads to S3: I don’t want to upload the entire site on every build if only a subset of files has changed. As it happens, I think @bep has already answered my question with his s3deploy tool, found at https://github.com/bep/s3deploy.
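
From a quick look at the README, a minimal run seems to be something like this (bucket and region are made up); it keeps a hash of each file and only re-uploads the ones that actually changed:

s3deploy -source=public/ -bucket=example.com -region=us-east-1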

Ah, of course; I see. I use Netlify in part because their deployments are super-fast, but having built many large sites, I’ve found CI solutions at least acceptable in most instances.

CI solves some of the pain, but when Amazon charges for bandwidth, requests, uploads, etc., re-uploading a massive site on every deploy can also cost real money. Netlify is good, but it is also very expensive compared to Amazon once you go beyond the free tier.

Hey, it’s the Scott Lowe. Welcome to Hugo, Steve. :slight_smile:

I’m not “the” anyone…I’m just another IT guy trying to get ahead. :slight_smile:

Slightly off topic, but I have looked around for a cheap and simple way of adding Let’s Encrypt TLS support and managing my S3 sites without going the CloudFront path (there aren’t that many, and some of them are frankly just domain placeholders for side projects that never took off … yet) …

And this project looks very promising:

No TLS/Let’s Encrypt support, but that should be fairly trivial to add. So if I, say, have 10 domains and 10 sites, it would be really cool if I could:

  • Put them in one S3 bucket
  • Deploy them separately with s3deploy (see the sketch after this list)
  • Configure them all in one config file with TLS support.
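
Assuming s3deploy’s -path flag (an optional sub-path inside the bucket) works the way I expect, the first two points could look something like this, with all the names made up:

s3deploy -source=site1/public -bucket=shared-bucket -path=site1 -region=us-east-1
s3deploy -source=site2/public -bucket=shared-bucket -path=site2 -region=us-east-1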

I did some experimenting a few months back and found that the s3cmd sync command was the most efficient at syncing changes to S3.

Something like this works for me:

s3cmd sync --delete-removed --no-mime-magic -M --acl-public --recursive public/ s3://$BUCKET_NAME
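
For what it’s worth: --delete-removed prunes objects whose local files are gone, --no-mime-magic combined with -M makes s3cmd guess the Content-Type from the file extension instead of the file contents (so e.g. CSS isn’t uploaded as text/plain), and --acl-public makes the uploaded objects world-readable.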

http://s3tools.org/s3cmd

I have now released v2.0.0 of s3deploy (https://github.com/bep/s3deploy), which is effectively a total rewrite.
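
One of the bigger changes is a .s3deploy.yml configuration file for per-route headers and gzip; a sketch of what a route setup can look like (the patterns and cache values below are just illustrative):

routes:
  - route: "^.+\\.(js|css|svg|ttf)$"
    headers:
      Cache-Control: "max-age=31536000, no-transform, public"
    gzip: true
  - route: "^.+\\.(html|xml|json)$"
    gzip: true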

This is the correct solution: aws s3 sync doesn’t work for this, but s3cmd is awesome. It can also invalidate the CloudFront pages that changed in the same step.
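
If I remember right, that’s the --cf-invalidate flag, so something like:

s3cmd sync --delete-removed --no-mime-magic -M --acl-public --cf-invalidate public/ s3://$BUCKET_NAME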

Thanks for that, working perfectly here :pray:t2:

Can you explain how to use s3cmd to invalidate CloudFront pages that have changed, please?

s3deploy is a really good option for people looking to deploy to S3. Before this I used a Gulp task (with the gulp-awspublish plugin), but that one was hard to configure. It also didn’t work reliably; sometimes it wouldn’t upload a file even though its hash manifest claimed it had been uploaded.

Anyhow, s3deploy is a tool that’s easy to configure, performs reliable uploads, and is just quick; much quicker than any Gulp task. :slight_smile:
