Slow to work with a large number of static files

Hi,

First, Hugo is awesome and works really well, but now I’m running into an issue. It looks related to this older post.

In my case, my site has 89,687 images in the static/img folder, which amounts to 4.6GB of data:

[website]$ find static/ -name "*.jpg" | grep -c jpg
89687
[website]$ du -sh static/
4.6G    static/

The files in the static/img folder don’t change while I’m developing the site. I get fresh images once a week, but when I change a CSS or JS file, Hugo seems to re-sync every file in the static folder, which takes about 10 minutes.

Is there a way to tell Hugo to ignore the static/img folder?

I found that cp -r static public/ takes about 4 minutes, so I could do that initial sync on my own and then just have Hugo update the files in static/css and static/js, which are the files I’m changing.
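For example, this is roughly what I have in mind (just a sketch; I’m assuming rsync here instead of cp because it skips files that are already up to date, so repeat runs should be fast):

[website]$ rsync -a static/img/ public/img/   # one-time bulk copy of the images
[website]$ rsync -a static/css/ public/css/   # re-run only for the folders I actually edit
[website]$ rsync -a static/js/ public/js/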

Let me know if you need more details

[website]$ hugo version
Hugo Static Site Generator v0.14 BuildDate: 2015-05-25T21:29:16-04:00

Thanks

Diego

There is an IgnoreFiles regexp setting for /content, and it should probably ignore files in static too … Create an issue for that.
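For reference, the /content version is a list of regexes in the site config, something like this in config.toml (the pattern here is just an example):

ignoreFiles = ["\\.jpg$"]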

The obvious workaround for you is to keep /img somewhere else on the file system, and just copy it directly to /public (or wherever you build your site).
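Something along these lines (an untested sketch; the ../images path and rsync are just examples):

[website]$ mv static/img ../images            # keep the big image tree outside the Hugo project
[website]$ hugo                               # build the site; static/ now only holds css and js
[website]$ rsync -a ../images/ public/img/    # copy the images straight into public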

Thank you! I entered the issue.

My suggestion is to move all of the static content to a separate subdomain like cdn.example.com. It will help speed up your site overall.

You can create a separate repository for that, without using any static site generator.

Don’t forget to use the exact URL, like cdn.example.com/images/pic.jpg, in your posts; otherwise you won’t see the images properly on your localhost server.