I host a simple personal blog with Hugo (yay). I have (ulp) ~18 years of content that I ported out of Wordpress, Twitter, Facebook, and Octopress. Hugo’s the only thing I trust. But I feel like my workflow is sub-optimal. I’m looking for the workflow that’s envisioned for:
- Composing a local blog post
- Adding images to the post body
- Verifying the post + its images locally
- Deploying the rich-media post to a “public” web server
I’ve tried a number of options, but they all seem to slow my builds down enough to make Hugo un-fun to use. I know I must be at fault. There’s no way that a simple text edit should take a minute for the `hugo server` to process and refresh (right?).
My Process That’s Not Working
- Think of a post
- `git checkout local` (a branch with a few commits on top of `master` that keep `content/posts/*` small, in pursuit of fast reloads)
- `hugo new content/posts/derp-de-derp.markdown`
- Edit the post; also place images in `static/images/`
- `git checkout master`
- `git add content/posts; git commit -m 'adds derp-de-derp.markdown'; git push`
- `rsync static/images/* remote-host:/path/for/images/`
- `http://my-site/posts/YYYY-MM-DD-derp-de-derp/` is available with images working
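Rolled together, the steps above amount to roughly the following script (a sketch only; the slug, remote host, and paths are the same placeholders used in the list, not real values):

```shell
#!/bin/sh
# Sketch of the publishing workflow described above.
set -e

git checkout local                               # small-content branch for fast reloads
hugo new content/posts/derp-de-derp.markdown     # scaffold the post

# ... edit the post, drop its images into static/images/ ...

git checkout master
git add content/posts
git commit -m 'adds derp-de-derp.markdown'
git push

# Ship the images out-of-band, bypassing git entirely.
rsync -av static/images/ remote-host:/path/for/images/
```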
- Is this the imagined golden path?
- This works OK, but I’m using a lot of `git` magic to try to make things easier. Is this needless work if I configure my setup differently?
- If I ever try to use the full complement of posts (~4,000), the `hugo server` process gets incredibly slow. I realize this might be pushing a “simple local web server” beyond what’s reasonable to expect. Is that unreasonable? What factors could I tweak to make it operate faster?
How can I unleash Hugo’s true power? I love the templating and the rational template resolution hierarchy, but for something that’s simple, joyful, and beautiful, I just feel like I’m doing it wrong and don’t know how to fix it.
Mostly by reading the documentation from start to end at least once. The feature you seem to have either missed or not know about is called “Page Bundles”: you put everything for a blog post into the same content folder.
Then you add these to the repo, run hugo on the remote server, and you’re done.
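For what it’s worth, `hugo new` creates a bundle when you ask for an `index.md` inside a folder; the slug and image names below are hypothetical examples:

```shell
# Creating a leaf page bundle instead of a bare .markdown file:
hugo new content/posts/derp-de-derp/index.md

# The post and its images then live together in one folder:
#
#   content/posts/derp-de-derp/
#   ├── index.md
#   ├── maui.jpg
#   └── maui-thumb.jpg
#
# Inside index.md, reference the images relatively: ![Maui](maui.jpg)
```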
Then there are plenty of CLI options for finding out why the development server is getting slow. Most of the time (in my case) it’s the pagination, which gets rendered individually over and over, plus other layouts that could be cached.
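As a starting point, Hugo ships flags that report per-template render cost and suggest caching opportunities (these flags exist in recent Hugo releases; older versions may differ):

```shell
# Print cumulative render time per template, sorted by cost,
# with hints about templates that are candidates for partialCached:
hugo --templateMetrics --templateMetricsHints

# Run the dev server with more verbose logging to see what each rebuild does:
hugo server --logLevel info
```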
That’s interesting. I do use page bundles on another site, but I view their use case as much narrower, e.g. my “cookbook” site, where each bundle is a recipe plus a picture of the outcome.
But in the case of a long-running blog, since the images are put under revision control as well, this approach would bloat the repo as they’re added to the `.git/objects` store. Over 18 years of “pictures from Maui (high resolution and thumbnail),” “screenshot of something,” “video of something,” etc., that’s not ideal.
Obviously I can think of fairly advanced approaches (e.g. during local development, deploy the assets to a server on the internet so they can be accessed locally, then rebuild the site and deploy it to production), but this feels unnecessarily convoluted.
Git has something called LFS (Large File Storage), where you keep larger files: files that you add once and never change. I think that makes sense even for a “nice photos of my last holiday in highest quality” site. Especially then.
Convolution belongs only between your server and your site’s visitor; keep it simple and small there. Everywhere else, what counts is speed and preparedness.
I did a screen recording after I switched from Wordpress to Hugo, and the workflow hasn’t changed much for me since then.
There are a few other posts that talk a bit about how I manage the site and publish content: