Is this a separate run or using watch?
Also how large is the output directory?
Just “hugo”, no watch.
Running du -h tells me the public/ folder is 609 MB.
My “version” of your blog:
271M ./public
Ha! Fixed!
You are right @bjornerik, that’s the right size. I had the .7z copy of the blog inside public/. I deleted it, then went and deleted every orphan file I had lying around inside public/, and now, after many months, it seems to be fixed! The build was done before I had even finished lifting my finger from the Enter key.
Maybe having a hundred random files in subfolders inside public/ causes a big delay the first time you run hugo?
It tries to synchronize the directories. If there are a lot of files or really large files it will take some time to copy them over.
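I’m not sure exactly how Hugo implements this internally, but a naive one-way sync from static/ to public/ might look something like the sketch below: copy anything that is new or differs by size/modification time, and never delete extras. The function names here are made up for illustration.

```go
// Sketch of a naive one-way sync from static/ to public/.
// This is NOT Hugo's actual code, just an illustration of the idea.
package main

import (
	"fmt"
	"io"
	"os"
	"path/filepath"
)

// needsCopy reports whether dst is missing or differs from src by
// size or modification time (a cheap, metadata-only comparison).
func needsCopy(src, dst string) bool {
	si, err := os.Stat(src)
	if err != nil {
		return false
	}
	di, err := os.Stat(dst)
	if err != nil {
		return true // destination missing
	}
	return si.Size() != di.Size() || si.ModTime().After(di.ModTime())
}

// syncDir walks srcDir and copies new or changed files into dstDir.
// Note that it never deletes files that exist only in dstDir.
func syncDir(srcDir, dstDir string) error {
	return filepath.Walk(srcDir, func(path string, info os.FileInfo, err error) error {
		if err != nil || info.IsDir() {
			return err
		}
		rel, _ := filepath.Rel(srcDir, path)
		dst := filepath.Join(dstDir, rel)
		if !needsCopy(path, dst) {
			return nil
		}
		if err := os.MkdirAll(filepath.Dir(dst), 0755); err != nil {
			return err
		}
		in, err := os.Open(path)
		if err != nil {
			return err
		}
		defer in.Close()
		out, err := os.Create(dst)
		if err != nil {
			return err
		}
		defer out.Close()
		_, err = io.Copy(out, in)
		return err
	})
}

func main() {
	if err := syncDir("static", "public"); err != nil {
		fmt.Fprintln(os.Stderr, err)
	}
}
```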
But that would not explain why it’s slow only once after booting, even if the files did not change, would it?
The synchronizing only seems to add new and changed files, but does not delete extra files. My impression is that it was those extra files that caused the slowness.
I just tried again with hugo 0.14. The more verbose output is useful. I can see that it’s stuck at “INFO: 2015/07/16 syncing from /home/xxx/www/yyy.com/static/ to /home/xxx/www/yyy.com/public/” for about half a minute. static/ is 245 MB.
@spf13 - How does hugo determine if a file has changed? A brief explanation and a pointer to that code would be useful.
I’m curious if that interaction is part of the problem. Hashing large files or some such…
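To get a feel for whether content hashing could explain a ~30 second pause, one could time a full hash of static/ with something like the rough sketch below. The path and the choice of MD5 are just for illustration, not a claim about what Hugo actually does.

```go
// Rough benchmark: hash every file under static/ and report how long it takes.
// Purely illustrative -- not Hugo's change-detection code.
package main

import (
	"crypto/md5"
	"fmt"
	"io"
	"os"
	"path/filepath"
	"time"
)

func main() {
	start := time.Now()
	var total int64
	err := filepath.Walk("static", func(path string, info os.FileInfo, err error) error {
		if err != nil || info.IsDir() {
			return err
		}
		f, err := os.Open(path)
		if err != nil {
			return err
		}
		defer f.Close()
		h := md5.New()
		n, err := io.Copy(h, f) // read and hash the whole file
		total += n
		return err
	})
	if err != nil {
		fmt.Fprintln(os.Stderr, err)
		os.Exit(1)
	}
	fmt.Printf("hashed %d bytes in %s\n", total, time.Since(start))
}
```

On a cold cache, most of that time would be disk I/O rather than hashing, which would also line up with the build only being slow once after boot.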