Large Data Files - Timeout

Hello,
I am running into an issue with system resources and large data files. For context, we are scaling up the number of pages we build by 10x at a time until we reach hundreds of thousands. Every page must be accompanied by a data file (we are using the .json format). The problem, as I understand it, is that when Hugo builds the site it tries to load all (or most) of the data files into memory, which causes our build to be killed.
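
Roughly what our templates do today looks something like this (the folder name and slug-based key are simplified/made up for illustration):

```go-html-template
{{/* Simplified sketch of our current lookup: each page's record lives in
     data/products/<slug>.json and is read out of .Site.Data.
     "products" and the slug-derived key are made up for illustration. */}}
{{ $key := .File.BaseFileName }}
{{ $record := index .Site.Data.products $key }}
{{ with $record }}
  <h1>{{ .title }}</h1>
  <p>{{ .description }}</p>
{{ end }}
```

Even though each page only needs its own entry, my understanding is that the whole data/ tree gets loaded before rendering starts.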

Is there a way to lazy load the data files, process them serially instead of in parallel, or otherwise reduce the system requirements for building a site like this?
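
By "lazy load" I mean something along these lines, where each page only reads and parses its own file, e.g. by keeping the JSON as a page resource instead of under data/ (just a sketch of the idea, not something I have verified actually reduces memory use):

```go-html-template
{{/* Sketch of the "lazy" pattern I'm hoping for: each page bundle carries
     its own data.json as a page resource, so only the file belonging to
     the page currently being rendered is read and unmarshalled.
     The file name "data.json" is made up for illustration. */}}
{{ with .Resources.GetMatch "data.json" }}
  {{ $record := . | transform.Unmarshal }}
  <h1>{{ $record.title }}</h1>
  <p>{{ $record.description }}</p>
{{ end }}
```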

I have also been reading this for more context: github bug

According to the docs, I don’t think there is a way to build other than in parallel. Is there something that can be set before running Hugo? I know macOS has ulimit.