Hugo -t THEME behavior

macbook:hugo aizanfahri$ hugo version
Hugo Static Site Generator v0.13-DEV

I installed Hugo using Homebrew on OS X.

I’ve been experimenting with Hugo for an hour or so, figuring out how it really works. This is my setup:

  • Generate the site (hugo -t THEME).
  • cd public/ && git push origin master (yes, I use git to deploy the generated site to my server).
  • On my server, the traffic is served by nginx. Below is my post-receive hook:
#!/bin/sh
GIT_REPO=$HOME/source
TMP_GIT_CLONE=$HOME/tmp/source
PUBLIC_WWW=$HOME/serve

# Clone the pushed repo into a temporary directory, copy the generated
# files into the web root, then remove the temporary clone.
git clone "$GIT_REPO" "$TMP_GIT_CLONE"
cp -R "$TMP_GIT_CLONE"/* "$PUBLIC_WWW"
rm -Rf "$TMP_GIT_CLONE"
exit

As you can see, I don’t have Hugo on my server; I just use cp -R to copy the static files into the serve/ folder and have nginx serve traffic from that folder.

After testing with four themes (tinyce, liquorice, purehugo, herring-cove; the rest are quite broken), I found that Hugo behaves in a way that doesn’t really sit well with me. After I issue hugo -t THEME, all the posts and theme assets are generated into the public/ folder. When I then switch to another theme, Hugo doesn’t delete the old theme’s files, leaving behind a bunch of stale CSS and JS files. This looks a bit untidy.
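To make this concrete, switching themes just layers the new theme’s assets on top of the old ones; a quick illustration using two of the themes above:

hugo -t purehugo       # public/ now contains purehugo's CSS/JS
hugo -t herring-cove   # herring-cove's assets are added, but purehugo's remain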

Any thoughts on this?


There’s an open issue for that.

Me, I’m fine with adding an extra clean step to my deploy script. I actually prefer it this way, as it gives me more control.
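For the record, here’s roughly what such a clean step looks like; a minimal sketch assuming the default public/ output directory and the git-based deploy from the first post (the glob deliberately skips dotfiles, so public/.git survives the wipe):

rm -rf public/*   # clear previous output; the glob does not match dotfiles such as .git
hugo -t THEME     # regenerate with only the current theme's assets
cd public && git add -A && git commit -m "Rebuild site" && git push origin master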


I agree, and I’m not satisfied with nuking the directory from the command line either. I also think this is something Hugo should be responsible for: since it is the one putting files into its own directories, it should be capable of managing those resources better, or at minimum clean up after itself.

The current approach causes problems for me in development, mainly due to browser caching (I think), and has led me on several wild goose chases where I had already fixed a problem but kept looking for its source elsewhere because I had forgotten to clear out the public directory (whose name I’m not entirely satisfied with either). However, that is largely self-inflicted pain.

Just generating content and writing it to the published directory creates broader site issues, though. Leaving stale content in web-accessible directories can lead to people landing on the wrong page, obsolete content, or the wrong version; it can cause search engine indexing problems; and, in the worst case, it is a potential security issue.

I was going to add a simple delete flag to handle this, but it seems inadequate, especially with live reload.

My ideal is the ability to archive and delete either the entire contents of the directory the published content gets written to, or only a portion of them, e.g. only files modified since the last site generation. This would include handling content generated via live reload. Unfortunately, this is not currently supportable; I’ve been contemplating how best to address it.
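To sketch the idea (this is not existing Hugo functionality, just a rough shell outline assuming GNU find/tar, using a marker file to record the last generation time):

STAMP=$HOME/.hugo-last-build
ARCHIVE=$HOME/site-archive-$(date +%Y%m%d%H%M%S).tar.gz

# On the first run there is no marker yet, so treat everything as modified.
[ -f "$STAMP" ] || touch -t 197001010000 "$STAMP"

# Archive files in public/ modified since the last generation...
find public -type f -newer "$STAMP" -print0 | tar czf "$ARCHIVE" --null -T -
# ...then delete them and reset the marker for the next build.
find public -type f -newer "$STAMP" -delete
touch "$STAMP"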

Having some solution to cleaning up the public folder, even if it is just a delete process, is on my Hugo todo list.

@mohae, I’ve just added a comment with a suggested solution to the GitHub issue about this: https://github.com/spf13/hugo/issues/379. Please let me know what you think.

Tip: use rsync instead; it only copies things that have changed. Using cp is going to give you grief.

To mirror my local directory to my server, I use something like this:

rsync -av --delete htdocs/ myserver:/home/websites/sitex

Also, you don’t have to use this across the network; it works just as well as a local recursive copy.
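For example, mirroring locally (paths illustrative); --delete removes anything in the destination that no longer exists in the source, which also takes care of stale theme files on the serving side:

rsync -av --delete public/ $HOME/serve/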
