Index long pages using Hugo templates

I need to index my site and generate some JSON data for Algolia search. My pages are mostly long pages, so I need to break them up into smaller chunks, as described here: Indexing long documents | Algolia

I have been looking at using Hugo templates to do the indexing, e.g., something like this:

{{- /* build one Algolia record per regular page */ -}}
{{- $index := slice -}}
{{- range $page := $.Site.RegularPages -}}
    {{- /* objectID must be unique, so use the page file's unique ID */ -}}
    {{- $index = $index | append (dict "objectID" $page.File.UniqueID "title" $page.Title "date" $page.Date "href" $page.Permalink "content" $page.Plain "tags" $page.Params.tags) -}}
{{- end -}}
{{- $index | jsonify -}}
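
For completeness, I am planning to register this template as a custom output format on the home page, roughly like this in config.toml (the Algolia format name and algolia base name are just placeholders I picked; not sure this is the cleanest way):

[outputFormats.Algolia]  # "Algolia" is just my placeholder name for the format
  mediaType = "application/json"
  baseName = "algolia"
  isPlainText = true
  notAlternative = true

[outputs]
  home = ["HTML", "RSS", "Algolia"]

with the template above saved as layouts/index.algolia.json.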

But how can I break the content of each page up into smaller chunks, e.g. based on paragraphs or headings in the page? I cannot see any variables or functions that would help me do this. All I see is page.Content, page.Plain and page.RawContent.

Any help much appreciated…

You could split the .Content at h2 elements.
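
For example, something like this (a rough, untested sketch; the objectID suffix is just one way I would keep the IDs unique per chunk, and you can add date, tags, etc. back in as in your original):

{{- $index := slice -}}
{{- range $page := $.Site.RegularPages -}}
    {{- /* split the rendered HTML at each opening h2 tag */ -}}
    {{- $chunks := split $page.Content "<h2" -}}
    {{- range $i, $chunk := $chunks -}}
        {{- /* split strips the delimiter, so restore it on all but the first chunk */ -}}
        {{- if gt $i 0 -}}{{- $chunk = printf "<h2%s" $chunk -}}{{- end -}}
        {{- /* one record per chunk; the -$i suffix is my own scheme for unique IDs */ -}}
        {{- $index = $index | append (dict
            "objectID" (printf "%s-%d" $page.File.UniqueID $i)
            "title" $page.Title
            "href" $page.Permalink
            "content" ($chunk | plainify)) -}}
    {{- end -}}
{{- end -}}
{{- $index | jsonify -}}

plainify then strips the remaining HTML tags, so each record holds the plain text of one h2 section.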

Thanks, will try it.

I also found your previous answer to another post, which works well for my case.