Deployment - is there a way to determine and output changes locally?

Hi. I have just configured my first Hugo site and migrated a WordPress site over. This is all fine and running locally and everything is looking good. It will take a while to get used to writing in Markdown, but I guess that’s part of the fun… :slight_smile:

Anyway, the question I have is around deploying. An initial deploy of the full site is relatively simple - I am on macOS so use CyberDuck or Transmit to just (S)FTP(S) the whole lot up to the webhost, no probs. It’s slow, but does the job and works.

But now that things are live, incremental deployments are a bit more interesting. I don’t want to redeploy all of it every time for just a new post/article. And I have also looked through the docs on deployments (Hosting & Deployment | Hugo) - and that’s great if you are using one of the many popular services listed there… Azure, GCS… etc.

Aside: I am using Bitbucket for source control so the tie-up with Aerobatic was immediately of interest; but at time of writing it looks like Aerobatic is not a thing anymore as the domain is up for sale?

I realise there are plenty of options if I want to move to one of those listed, such as GitHub, or pipelines, or similar. But I currently have a ‘regular’ / vanilla webhost; I have had it for years and it all works absolutely fine, other than that there is no SSH access, so I can’t use rsync, and none of the other deployment methods appear to be supported directly, other than possibly the Nanobox route, which seems massively over the top.

What options are there for simpler deployments to a regular host? I’ve tried using the ‘synchronise’ feature in Transmit but this is also very slow.

The docs say of the deploy command:

Hugo will identify and apply any local changes that need to be reflected to the remote target

and I’m wondering if there’s a way of doing the same but just outputting the required updates locally (which can then be uploaded via SFTP)?

If it’s not built in, then I would consider writing something to do it. The docs say to clear down the public folder each time before rendering output, which I have been doing, but I assume this effectively loses the history of changes, so it would need some thought. I think it would be fairly straightforward to, e.g., maintain a ‘last deployed’ copy of the site, compare it against ‘public’, and copy the differences to a new folder for upload via SFTP.
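The ‘last deployed’ comparison idea above can be sketched in a few lines of shell. Nothing here is a Hugo feature; the function name and the directory names in the usage comment are made up for illustration:

```shell
#!/usr/bin/env sh
# Sketch: stage files that changed since the last deploy for SFTP upload.
# Directory names are placeholders for illustration only.

# stage_changes SITE SNAPSHOT STAGE
#   Copies every file in SITE that is new or differs (by content) from
#   SNAPSHOT into STAGE, ready for upload via SFTP.
stage_changes() {
  site=$1
  snapshot=$2
  stage=$3
  mkdir -p "$snapshot" "$stage"
  ( cd "$site" && find . -type f ) | while IFS= read -r f; do
    # cmp -s compares file contents, so rebuild timestamps do not matter
    if ! cmp -s "$site/$f" "$snapshot/$f" 2>/dev/null; then
      mkdir -p "$stage/$(dirname "$f")"
      cp "$site/$f" "$stage/$f"
    fi
  done
}

# Typical use after `hugo --cleanDestinationDir`:
#   stage_changes public .last-deployed to-upload
#   ...upload to-upload/ via SFTP, then refresh the snapshot:
#   rm -rf .last-deployed && cp -R public .last-deployed
```

Because the comparison is by content rather than modification time, rebuilding the site from a different machine would not cause spurious uploads.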

Thanks for any thoughts / help / advice!

No, there is not.

Instead have a look at Deployment with Rsync | Hugo

Notably, the rsync flags -az (archive mode plus compression) mean:

On the first run, rsync copies over everything. On subsequent runs it only transfers what has changed: for existing files it does a block checksum and copies over only the parts which have been modified, it copies new files over, and (with --delete, as in the script below) it removes files which are no longer there.


For all Hugo sites that I deploy to standard web servers I use this simple one-line script:

#!/usr/bin/env sh

# user@example.com and the remote path are placeholders; substitute your own host and document root.
hugo --cleanDestinationDir && rsync -e 'ssh -ax' --archive --delete --verbose --compress --human-readable --exclude '.DS_Store' public/ user@example.com:/var/www/site/

Thanks; what does the x in the -ax do? I’m not familiar with that flag. If I can have SSH access enabled by the host, this sounds like the most workable solution. A slight concern is that I will publish the same site from multiple different places (so build modified/created dates etc. will vary), but if rsync determines changes based on a checksum, hopefully this should not be an issue…
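On the multiple-machines point: by default rsync decides which files to inspect using size and modification time, which will differ between rebuilds; its --checksum option makes it compare file content instead, at the cost of reading every file on both sides. A sketch of the deploy script adapted accordingly; the host and remote path are placeholders:

```shell
#!/usr/bin/env sh
# Variant using --checksum so that identical files rebuilt on different
# machines (with fresh timestamps) are not re-uploaded.
# user@example.com and the remote path are placeholders.
hugo --cleanDestinationDir && rsync -e 'ssh -ax' --archive --checksum --delete --compress --human-readable --exclude '.DS_Store' public/ user@example.com:/var/www/site/
```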

Not sure if it helps, but depending on your OS, you might find rclone a better choice than rsync. When I’m on Windows and doing SFTP to plain vanilla hosting I find it to be a ‘life-saver’.
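For what it’s worth, a typical rclone invocation for this scenario looks like the following. The remote name `myhost` and the remote path are placeholders; the remote would first be set up as an SFTP remote via `rclone config`:

```shell
# Hypothetical: sync the rendered site to an SFTP remote named "myhost"
# (configured beforehand with `rclone config`). --dry-run previews the
# changes without uploading anything; drop it to actually deploy.
rclone sync public/ myhost:public_html/ --progress --dry-run
```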

Thanks; funnily enough I found that on another thread last week and it looks perfect. I’ve not had a chance to try it yet, though.