Content adapters: examples and performance

This really isn’t a “tips & tricks” topic, but rather some notes about the content adapter feature introduced in v0.126.0.

Here’s an example site to explore capabilities:

git clone --single-branch -b hugo-github-issue-12440 hugo-github-issue-12440
cd hugo-github-issue-12440
hugo server

This site uses content adapters to create pages from remote data sources.
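For context, the core of such a setup is a `_content.gotmpl` template in the section's content directory. Below is a minimal sketch, not the example site's actual template; the data URL and the field names (`title`, `slug`, `body`) are assumptions:

```go-html-template
{{/* content/posts/_content.gotmpl — hypothetical sketch; URL and field names are assumptions */}}
{{ $data := dict }}
{{ $url := "https://example.org/posts.json" }}
{{ with resources.GetRemote $url }}
  {{ with .Err }}
    {{ errorf "%s" . }}
  {{ else }}
    {{/* Unmarshal the remote JSON into a slice of maps */}}
    {{ $data = . | transform.Unmarshal }}
  {{ end }}
{{ else }}
  {{ errorf "Unable to get remote resource %q" $url }}
{{ end }}

{{ range $data }}
  {{/* One page per record in the data file */}}
  {{ $content := dict "mediaType" "text/markdown" "value" .body }}
  {{ $page := dict "content" $content "path" .slug "title" .title }}
  {{ $.AddPage $page }}
{{ end }}
```

Hugo caches the remote resource between builds, which is why subsequent builds skip the download.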

On my very average laptop, with a terrible network connection, the initial build is 20 seconds, or about 0.001 seconds/page. Of that, approximately 9 seconds is spent downloading the 49MB remote data file. Subsequent builds are 11 seconds (the data file is cached), or about 0.0006 seconds/page.

Note that the “posts” section is paginated at 10 posts per pager, creating 2,000 pagers in addition to the 20,000 posts. Counting all 22,000 pages, the figure is closer to 0.0005 seconds per page.

While this example site creates pages from remote data, you can also create pages from local data files and global resources (e.g., create a page for each image in the assets directory).
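As a rough sketch of the images case (the section name and glob pattern here are assumptions, not taken from the example site):

```go-html-template
{{/* content/gallery/_content.gotmpl — hypothetical sketch */}}
{{ range resources.Match "images/*.jpg" }}
  {{ $name := path.BaseName .Name }}
  {{/* One page per image in assets/images */}}
  {{ $content := dict "mediaType" "text/markdown" "value" "" }}
  {{ $page := dict "content" $content "path" $name "title" $name }}
  {{ $.AddPage $page }}
  {{/* Attach the image itself as a resource of that page */}}
  {{ $resourceContent := dict "mediaType" .MediaType.Type "value" . }}
  {{ $resource := dict "content" $resourceContent "path" (printf "%s/%s" $name .Name) }}
  {{ $.AddResource $resource }}
{{ end }}
```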


One benefit of local files (e.g., in assets) is that edits are picked up better when running the server (we need to improve this for remote files, but that requires some thought).


It builds in 5.5 seconds on my MacBook Pro M1 32 GB.


It also scales in a linear fashion. If I duplicate the “posts” section 4 times to generate 100,000 pages, the build time per page remains the same.


On a Mac Studio M2 64 GB with a 2.5 Gbit connection, it built the first time in 9.753 seconds and the second time in 5.070 seconds. Remarkable.


I feel I’m being tech-shamed. :smile: In my defense, I’m testing in a VM that (obviously) shares resources.


Your data is perhaps more applicable to other users and more real-world business environments, however.


Thanks for posting this example! This is a brilliant new feature in Hugo and the performance is impressive indeed.

Now I more than ever want to try this out on a real project.


So, I suspect my 4-year-old laptop is the bottleneck. While the build was running, I saw random spikes of 100% CPU and disk usage in the Task Manager. RAM usage by Git peaked at around 1200 MB. (Also, I’m on a “terrible” 10 Mbps connection.)

$ git clone --single-branch -b hugo-github-issue-12440 hugo-github-issue-12440
cd hugo-github-issue-12440
hugo server
Cloning into 'hugo-github-issue-12440'...
remote: Enumerating objects: 65, done.
remote: Counting objects: 100% (52/52), done.
remote: Compressing objects: 100% (45/45), done.
remote: Total 65 (delta 9), reused 33 (delta 3), pack-reused 13
Receiving objects: 100% (65/65), 15.72 KiB | 58.00 KiB/s, done.
Resolving deltas: 100% (10/10), done.
port 1313 already in use, attempting to use an available port
Watching for changes in C:\Users\arif\sites\hugo-github-issue-12440\{archetypes,assets,content,layouts,static}
Watching for config changes in C:\Users\arif\sites\hugo-github-issue-12440\hugo.toml
Start building sites …
hugo v0.126.0-32c967551be308fbd14e5f0dfba0ff50a60e7f5e+extended windows/amd64 BuildDate=2024-05-14T13:24:11Z VendorInfo=gohugoio

                   |  EN
  Pages            | 20021
  Paginator pages  |  1999
  Non-page files   |     4
  Static files     |     1
  Processed images |     4
  Aliases          |     7
  Cleaned          |     0

Built in 309229 ms
Environment: "development"
Serving pages from disk
Running in Fast Render Mode. For full rebuilds on change: hugo server --disableFastRender
Web Server is available at http://localhost:53957/ (bind address
Press Ctrl+C to stop
$ hugo env
hugo v0.126.0-32c967551be308fbd14e5f0dfba0ff50a60e7f5e+extended windows/amd64 BuildDate=2024-05-14T13:24:11Z VendorInfo=gohugoio
Device name	HP-Elitebook-820-G2
Processor	Intel(R) Core(TM) i5-5300U CPU @ 2.30GHz   2.30 GHz
Installed RAM	8.00 GB (7.88 GB usable)
System type	64-bit operating system, x64-based processor
(HDD 500GB)

I noticed that you are running Windows. Based on a recent topic on this forum: is there any chance you have a virus scanner running?

Second run: Built in 172424 ms.

Windows Defender is running…and now that you mention it, I also see some spikes in the Task Manager when hugo server is running…I am beginning to suspect it has been behind my slow builds in recent months.

Add an exclusion and see what happens.

We changed the default in Hugo 0.123.0 from writing to memory to writing to disk when running the server. This is almost always a good thing (especially if you’re low on memory), but obviously not if you have a virus scanner checking every file written to disk.

You could try running hugo server --renderToMemory and see if that makes a difference.

@jmooring I am assuming the whole hugo/bin folder is to be excluded?

@bep hugo --renderToMemory first build is 39674 ms, the second 12337 ms, and the third 11290 ms. Impressive! I had already cloned the project, so I just deleted the public and resources folders and then ran the server command.

So, the antivirus is indeed to blame! But hugo server --renderToMemory is also faster! (And maybe hugo server -M should be an alias for the latter)


Yea, this particular example may be that. But once you throw some big files into the mix (lots of images, etc.), you start getting memory constrained.

With Microsoft Defender Antivirus:

Start > Settings > Privacy & security > Windows Security > Open Windows Security > Virus & threat protection > Manage settings > Add or remove exclusions > Add an exclusion > Process

Then type hugo.exe and press the Add button.

I tested before and after.

Adding hugo.exe as a process exclusion reduced build times by 75%, and the build times are about the same as I reported earlier (16 seconds initial build, 9 seconds subsequent builds).


Thanks. I have just gotten one of my data sources working with the new method (and it was a learning experience). I am impressed! Congrats to you all.


Virus scanning and performance:


@bep @jmooring & co

Great work! :+1:

Took me 10 s to build the Content Adapters example on an M1 MacBook Pro (heavily loaded, with hundreds of open tabs in Safari as well as other layout and graphics software).

Pretty good performance.

Also the implementation is much more elegant and straightforward than what was being proposed in the past.

Thanks! I will be using this in production, as soon as time permits.