Does having a lot of static files cause this? If so, I'm curious why, and is there any workaround?

Lately, hugo server takes several minutes to finish building my site locally and consumes up to 10 GB of RAM on my laptop. The Hugo version doesn’t seem to matter; I even downgraded to an older version and it didn’t change anything.

However, today, while copying an old site into a new directory and repo, I started experimenting with my static folder moved outside of the un-ignored (git) directories, and killing and restarting hugo server was instantaneous again … but when I moved the static folder back to where it should be, builds again took many minutes and gigabytes of RAM to finish.

For what it’s worth, my static folder holds about 5 GB of audio, video, and image files.

If a lot of static files is in fact what slows down hugo server and makes it consume so much RAM, is there any way around this other than not having static files?

Try hugo server --renderToDisk.

We have a planned new flag called --renderStaticToDisk, but that was harder than we (I) first thought.
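For reference, a minimal sketch of the suggested workaround (output goes to Hugo's regular destination directory, public/ by default):

```sh
# Write the build to disk and serve from there, rather than keeping the
# rendered site and copied static files in memory.
hugo server --renderToDisk
```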


Hi @ddg

I’m not sure what you mean by “slow” because there’s no mention of the time it takes to build. I experienced slow build times previously, when a wedding photography website with a lot of videos and photographs (60+ GB of data) took a very long time to build.

I’ve seen 2x and 3x faster builds for the same project since upgrading to Hugo 0.96.0. I would advise upgrading rather than downgrading, because the most recent version has a number of major benefits. Have you tested the same project with the latest version to see whether the build time improves?

Thanks


@pitifi9191 I’m pretty sure his problem is many/large files in /static – a known issue with the default server, which renders everything to memory.


I had a lot of unserved files in a folder (used for processing/converting video and audio into the final files served on the site). That folder was in my .gitignore, so while the files were never committed to my repo, the local hugo server command was still reading all of them. The render-to-disk flag is interesting, but since those files already live on one part of my disk, it seemed like it would just duplicate them elsewhere. Instead I moved the folder outside of the repo so the local Hugo server never even sees it, which sped things up a lot and also used far less RAM. I probably should have done that in the first place, but I didn’t realize the local Hugo server was loading all of those files.

I think another interesting convenience/architecture idea would be something like a .hugo_ignore file, so I could keep content that is still in progress closer to its final, served location (a rough approximation with today’s config is sketched after this list). For example:

  • I don’t have to cd so much to use ffmpeg
  • assets do not get created locally in the resources folder until it’s actually time to commit them in git (resources is not in my .gitignore, so git status shows new files there before I’ve moved the relevant in-progress page bundle folders out of my .gitignore'd content-in-progress subfolders)
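There isn’t a .hugo_ignore file today, but as a partial workaround you may be able to exclude paths via module mounts in your site config so the server never reads them. A minimal sketch, assuming an in-progress/ subfolder inside static/ (the folder name and glob are placeholders you’d adjust to your layout):

```toml
# Declaring a mount for a target root replaces Hugo's default mount for
# that root, so static -> static is restated here with an exclusion.
[[module.mounts]]
  source = "static"
  target = "static"
  excludeFiles = ["in-progress/**"]
```

A similar excludeFiles pattern on a content mount should, in principle, keep in-progress page bundles from being built (and from generating files in resources/) until you remove them from the exclusion.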
