Deployment workflow

Thanks so much, all. Now I don’t feel so crazy! Cheers, looking forward to using Hugo a lot more. :)

Hi @cloudunicorn,

After some procrastination on my part (mea culpa), I have finally updated the docs:

Please check http://gohugo.io/extras/urls/ and http://gohugo.io/tutorials/github-pages-blog/#toc_2 and see if the revision looks okay to you. Comments welcome! :slight_smile:


If you wanted to simplify deployments, you could connect your GitHub repo to CodeShip and have your tests and builds auto-deploy to your production server on pushes or pull requests to your repository.
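As a rough sketch (not an actual CodeShip configuration; the host, user and paths below are placeholders), the deploy step of such a pipeline could be as simple as:

      # build the site, then push the generated files to the production server
      hugo
      rsync -avz --delete public/ deploy@example.com:/var/www/site/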

Just wanted to mention that this function in .zshrc works perfectly for me, @spf13.

I had OS X Yosemite’s default rsync, which was v2.2 or something, so I used brew to tap dupes and upgrade it to 3.1.1. But it still does not have the --force or --progress switches.
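For reference, that upgrade boils down to roughly these commands (assuming the homebrew/dupes tap, which is where rsync lived in Homebrew at the time):

      # install a newer rsync from the dupes tap alongside the system one
      brew tap homebrew/dupes
      brew install rsync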

In my case, my function looks like:

function hugorccdeploy {
  rm -rf /tmp/rcc
  hugo -s /Users/rc/dev/rcc/ -d /tmp/rcc
  rsync -avze "ssh -p 22" /tmp/rcc/ me@myhost.com:/path/to/hugo
}

Just wanted to corroborate that this works like a charm for me. When I run hugo server to test locally, my baseurl parameter gets overridden automatically to localhost:1313, as @natefinch mentions, so I don’t have to make any changes to config.toml. I had a little glitchiness with trailing slashes (or the lack thereof) in some CSS and JavaScript references in the head, but that was easy to catch just by looking at the source.
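In case it helps anyone following along, the behaviour being described is roughly this (the domain is a placeholder):

      # config.toml contains something like:
      #   baseurl = "https://example.com/"
      #
      # For local testing, hugo server serves the site at http://localhost:1313/
      # and overrides baseurl automatically, so config.toml needs no edits:
      hugo server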

Speaking of deployment workflows, I automate mine through hosted CI platforms, but the gist of the idea is as follows (if you were using GitHub Pages to host your website):

      # Set your environment variables
      export ACCNT_NAME=''
      export GITHUB_PAGES=''  # example would be name.github.io
      NOW_HOUR=$(date +'%Y-%m-%d %H:%M')  # timestamp used in the commit message

      # clone the github repo
      git clone git@github.com:$ACCNT_NAME/$GITHUB_PAGES.git

      # build the html files into the cloned repo (overwriting stuff)
      hugo -d $GITHUB_PAGES/

      # commit and push from inside the cloned repo
      cd $GITHUB_PAGES

      # Automatically add all new files to be committed
      echo -e "a\n*\nq\n" | git add -i

      # commit all files with some log message and push
      git commit -vam "Build done at - $NOW_HOUR"
      git push -v

You could wrap all of this up in a bash/zsh function.
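For example, a minimal sketch of such a wrapper (the account and repo names are placeholders, and git add -A is used in place of the interactive add purely for brevity):

      function deploy_gh_pages {
        local ACCNT_NAME='yourname'
        local GITHUB_PAGES='yourname.github.io'

        # run from the root of your Hugo site source:
        # clone the Pages repo, build into the clone, then commit and push
        git clone git@github.com:$ACCNT_NAME/$GITHUB_PAGES.git
        hugo -d $GITHUB_PAGES/
        (
          cd $GITHUB_PAGES || exit 1
          git add -A
          git commit -m "Build done at $(date +'%Y-%m-%d %H:%M')"
          git push
        )
      }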

I felt similarly to @sheki - I didn’t want to check my generated site in to git. So, I built something similar that leverages AWS for building and hosting my site. It deploys immediately after a git push and costs pennies to host per month. If you’re interested, check it out here: https://github.com/ryansydnor/hugo-cd

This is how I do it.


I have two questions:

  1. In the tutorial you have said:

On the server, there is another copy of Hugo; when Git on the server receives the pushed changes, it tells its neighbor Hugo to generate a fresh copy of the site.

Why do you need to have a copy of Hugo on the server, when Hugo does not need to be installed on the server at all?

  2. The second question is: in the post-receive file you have:

path/to/hugo

Is this where the local Hugo site is, or where the Hugo site will go on the VPS? Also, in the Jekyll example it simply has

jekyll build

which doesn’t appear to be a path; I’m just wondering why this is.

Why do you need to install Hugo on the VPS, seeing as the beauty of a static site generator is that it generates the static site, which you can then upload anywhere?


One reason for installing Hugo on the server is to avoid having to re-upload the entire website for small changes.

When Hugo re-generates a site, it re-generates all HTML files, even if nothing on that page has changed. The “modified” time-stamp is re-set to “now”.

If you use rsync or a similar tool that relies on the modification timestamp to determine whether a file has changed, that means every HTML file must be re-uploaded, even if nothing has actually changed.

By installing Hugo on the server, you only need to send the content changes (small) to the server, which can then do all the generation server-side.
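For context, the kind of server-side setup being described might look roughly like the following post-receive hook (the repository, working-tree and webroot paths are assumptions, not the tutorial’s exact script):

      #!/bin/sh
      # post-receive hook in the bare repo on the server (paths are placeholders):
      # check the pushed source out into a working tree, then rebuild with Hugo
      GIT_WORK_TREE=/home/deploy/site-src git checkout -f
      hugo -s /home/deploy/site-src -d /var/www/site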

Edit: Adding to the literature on this topic -

I’ve written a guide on how to deploy a website to NearlyFreeSpeech using a git post-receive hook. Hopefully helpful to NearlyFreeSpeech.Net users, and everyone else in general.

lws - this is half (sort of) correct:

By default rsync decides whether to transfer a file based on its modification time and file size (it skips a file only if both match). You can skip the mod-time check by passing --size-only, which would make the above a non-issue.

Also, the selling point of rsync is efficient delta transfers, is it not? rsync doesn’t just send the whole file; it only sends the differences. So even if files were marked for transfer thanks to a changed mod-time, the actual data transferred would be tiny, as the difference between the files would be zero (it’s the same file, just with a different mod-time).

So it’s doubly a non-issue.

In either case, you can sidestep the whole issue by simply passing --checksum to your rsync command, so files are skipped based on checksum rather than size or mod-time. That will checksum each file on both the sending side and the server to work out what needs to be updated or deleted.
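For reference, the variants being discussed look like this (the host and destination path are placeholders):

      # default quick check: skip files whose size and mod-time both match
      rsync -avz public/ user@example.com:/var/www/site/

      # compare by size only, ignoring mod-times
      rsync -avz --size-only public/ user@example.com:/var/www/site/

      # compare file contents by checksum instead of the size/mod-time quick check
      rsync -avz --checksum public/ user@example.com:/var/www/site/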

There’s a case for doing this anyway: on the off chance something has altered the files on the server, this method guarantees you are replacing them with exactly what you generated on your machine. But I digress.

There is also a performance penalty for using --checksum, so be aware of that if you have a massive site to transfer. Still, the default options are more sophisticated than just sending everything when the mod-time changes. But I digress further.

The upshot is that there may be reasons to run Hugo on the server (good ones, too), but rsync transfer efficiency isn’t really one of them.

I have rebuilt my own portfolio and blog website recently and documented my workflow in doing so.

I started out experimenting with getting a development environment working using Docker and Docker Compose.

I then started familiarising myself with Hugo itself and how it works. I tied this in to Gulp, which I use as my task runner for static assets.

I then wanted to come up with a way of deploying content with CircleCI to a server where I would host the site: a $5 Digital Ocean VPS with Let’s Encrypt installed.

I documented the whole process in a series of six blog posts which can be seen at https://mattfinucane.com/blog/.

I hope somebody finds this helpful; constructive feedback is always welcome.

Hi matfin,

Thanks for sharing your code. I wonder if you might find this repo inspiring:

Best

Hi Stefan,

This looks great. I was taking a quick look at your site and the source for it. It’s good to see the approach that somebody else took when trying to tackle a similar problem.

I couldn’t find anyone else who had done what I was trying to do, so a lot of my approaches might seem unorthodox or unusual, but they work for me now.

I will be making improvements to my own site soon and redoing certain sections that I was unsure about before. Seeing your approaches will help with that.

In the meantime, I have started with a new site, a pet project of mine so I can upload my photos. You can see the source at https://github.com/matfin/cinematt/tree/develop and I think I have made improvements to the process for setting up development and local build environments. I will be bringing these changes in to my portfolio project.

Matt

Hi matfin,

Great to hear that you like it. But, honor to whom honor is due, I’m not Stefan and it’s not my code. I just came across his blog while searching for tutorials to level up my Gulp skills, and found it very, very helpful for getting my own Hugo site into a Gulp workflow.

I’ll have a look at your code. Thanks for sharing.

Ah ok.

I took a quick look at the code for that repo and there are some things I would do differently. His solution certainly works, but if you have a look at my Gulp setup, you’ll see I have made mine much simpler.

With that said, there are different needs for different projects. I am someone who prefers to keep things as simple as possible.

If you check out the link to the photography site, you’ll see I have a good setup going for creating local development and build environments.

If you haven’t checked out Docker, I would highly recommend doing so. It makes things so much simpler :slight_smile:

Yep, Stefan’s solution does indeed seem a little bit over-engineered :wink:

I’ll have a look into your code and Docker. Thanks again.