Uploading directly from the command line?

I’ve gotten very used to using git through the command line over the last 6 months, notably pushing my files up to my remote repo after a commit.

Is there a way to do something like git push origin using hugo? Would be great to do something like

hugo push origin --all

and have hugo upload all my old files in one go. Is there any way to do this besides ftp?

Do you mean deploying the current version of your public/ files to your server, or do you have something more specific in mind when you say “upload all my old files”?

yes, exactly that. I don’t know where the “old” crept in there. I mean I make a change to, say, about.md, and create a new post, then I run

hugo --theme=hugo-cactus-theme

to build the static files and then

hugo upload origin bla bla

to upload them to my site, ideally with credentials saved in a config file somewhere, prompting me only for a password

I’d guess hugo is unlikely to include built-in tools for deployment in exactly the way you have in mind. But you could use make and a Makefile. The make command executes a series of tasks specified in a Makefile. It can understand which tasks depend on others, and the order in which tasks have to occur. You can use make to e.g. invoke hugo to build your site, and then deploy it to your server using rsync or similar.

The Makefile I use to build and deploy my own hugo site is here. You would need to change the obvious details, such as paths and server names, but also likely the steps that generate and compress the CSS files and so on. A minimal build-and-deploy Makefile could be very simple. In any case, here’s my example:
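For illustration, here is a minimal sketch of the build-and-deploy step as a shell script; the theme name, server, and paths are hypothetical placeholders, and it assumes rsync over SSH is available on both ends:

```shell
#!/bin/sh
# Minimal build-and-deploy sketch. All paths and the server
# name are placeholders; adjust them for your own site.
set -e

deploy() {
    # Build the static site into public/
    hugo --theme=hugo-cactus-theme

    # Mirror public/ to the web root on the server, deleting
    # remote files that no longer exist locally.
    rsync -avz --delete public/ "$1"
}

# Example invocation (uncomment to use):
# deploy "user@example.com:/var/www/blog/"
```

The same two commands are all a basic Makefile would run; make just adds dependency tracking on top.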

And here’s a bare-bones introduction to make, with links to some other information:

It’s also possible to trigger actions (such as build and deployment scripts) from git, using post-commit or post-receive hooks.
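As a sketch of the hook idea: a server-side post-receive hook could check out the pushed revision and rebuild the site. The repository and web-root paths below are hypothetical:

```shell
#!/bin/sh
# Sketch of a server-side git post-receive hook.
# GITDIR and WORKTREE are hypothetical paths; adapt them.
set -e

GITDIR=/home/git/blog.git
WORKTREE=/var/www/blog

rebuild() {
    # Check out the newly pushed files into the work tree...
    git --git-dir="$GITDIR" --work-tree="$WORKTREE" checkout -f
    # ...then rebuild the site straight into the web root.
    hugo -s "$WORKTREE" -d /var/www/html
}

# The hook body would simply call:
# rebuild
```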

You could use something like https://zammu.in/hugo?invitation_code=RORYOK to build and deploy your website to Github Pages whenever you push your code to Github. Or if you want something on your own server, then you’ll have to setup a git hook to do the build and push in a post commit hook.

Hi - I just built such a thing over Christmas… Let me know what you think. It currently supports FTP over TLS only (which must cover 99.98% of low cost hosting accounts).

A few other points:

  1. It tracks what has previously been sent and only sends diffs. Good if you have lots of media etc.
  2. It optionally minifies HTML, CSS, js etc.
  3. You will have to build from source. Let me know if that’s a problem and I’ll see if I can compile for you.
  4. Read the instructions/warnings etc. I’ve tried to document it reasonably well. If there’s anything I’ve missed, let me know.
  5. Yes, I know you can probably use Grunt etc. to accomplish this, and Kieran’s makefile approach will work fine, but for me nothing beats the beauty of a single, fast, tight executable. That’s why we love Hugo, right?
  6. If your website host doesn’t support FTP, let me know. The code is organised to be able to plug in different deployment methods with relative ease.
  7. Currently the password is held in the config file. Again, if you feel uneasy with that, let me know. It’s easy enough to move it to a command-line flag.



Wow. That’s pretty awesome.

I was thinking about taking a slightly different approach and have been laying some of the foundation for that approach.

Permit me to outline my approach in hopes that it may inspire you further…

My approach would be to add this feature directly to Hugo.

Hugo deploy/publish <destination>

deploy would build it locally and then synchronize it to the destination.
It would use afero & fsync to do this. Hugo already uses them in a lot of places so leveraging them further makes a lot of sense. They also have the benefit of being completely cross platform.

A lot of work has been done as of late on Afero and it’s maturing quickly. We would need to add a backend for the different types of places we would want to support (SFTP, SSH, S3, Zip, etc.).

Given the work you’ve already done, I imagine porting sftp to the afero backend would be straightforward enough.

Lastly, the destinations would be defined in the config file. Different fields would be used by the various types.

Does that make sense?

I think it would be great to have you adding this directly to Hugo so all users can benefit from it. Of course you are welcome to keep it as an independent binary, but I think the experience would be much cleaner and consequently the impact much broader if it was together.

Another note: if you are using Viper for the config, then it would be trivial to support passing secure information via environment variables, e.g. HOST__PASSWORD=123456


Hi Steve, it does indeed make sense. I’ll have a look through the Afero & fsync code to see how to hook it up. I did have a brief look at Afero when I started out on this venture.

I assume the entry points in hugo would simply be a matter of pulling the commands across, and the config would move into the main config file.

We may have to get a little smarter with the file diffing depending on the target.

Since we’re unlikely to use Hugo on two different machines, could you just keep a timestamp of the last push, and then only upload files with a newer creation / modification date than that?

Also a password command line flag would be great. Thanks for building this!

That makes too many bad assumptions. You can’t assume you only push from one machine. You also can’t assume that the destination files haven’t changed since you last touched them.

That’s fair enough. This is why I leave the programming to you guys!

I’ll fix up the password flag for you in the next day or so @roryok, then work on refactoring the current standalone project ready for integration into hugo…
Probable refactors / changes:

  1. Modify deployer to use Afero
  2. Wrap FTP in Afero
  3. Fix up config file to use same or compatible settings to Hugo config file. Any ideas on config layout for Afero-backed deployment would be appreciated
  4. Possibly tinker with command structure to make it easier to snap into hugo -
    i.e. hugodeploy deploy --dry-run in place of hugodeploy preview,
    and hugodeploy deploy in place of hugodeploy push

Alternatively, use lftp.
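For example, lftp’s mirror command can push a local directory to an FTP server in one go; the host, credentials, and remote path below are placeholders:

```shell
#!/bin/sh
# Sketch of deploying public/ with lftp's reverse mirror.
# Host, credentials, and remote path are placeholders.

deploy_lftp() {
    # mirror -R uploads (reverse-mirrors) the local directory,
    # transferring only files that are new or have changed.
    lftp -u "$1" -e "mirror -R public/ /public_html/; quit" "$2"
}

# Example invocation (uncomment to use):
# deploy_lftp "username,password" ftp.example.com
```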

Check out this deployment workflow, based on the Travis CI service, but it can be run from the local command line as well:

Another very bare-bones way of doing this. I wrote a little Windows batch file to run hugo and upload the new build using the WinSCP FTP client (both hugo and WinSCP are in my PATH):

@echo off

hugo -s "c:\hugo\sites\blog.com"

winscp.com /ini=nul /command ^
    "open ftp://username:password@ftp.blog.com/ -rawsettings ProxyPort=0" ^
    "lcd C:\hugo\sites\blog.com\public" ^
    "cd /public_html/" ^
    "put *" ^
    "exit"

set WINSCP_RESULT=%ERRORLEVEL%
if %WINSCP_RESULT% equ 0 (
  echo Success
) else (
  echo Error
)