Algolia, SEO and pre-render pages on the backend

I’m in the process of implementing Algolia using their instantsearch.js library in my Hugo project. My project is a basic site that showcases products.

As SEO is important to the site’s success, I’ve been reading up on the subject and came across this article by Algolia, The Impact of Algolia on SEO.

As all search results will come through the front end, rendered by JavaScript, they recommend that you pre-render pages on the backend and serve them up to the search engine crawlers.

This is in line with Google's best practices for building indexable progressive web apps: “Use server-side or hybrid rendering so users receive the content in the initial payload of their web request.”

My question is: how do I do this hybrid rendering in Hugo?


Question: if I search for product A in your search field, is the result delivered by Algolia the only place on your website where product A appears, or do you also have a single post for product A?

If you have a public post with product A, there is nothing to worry about. Google is able to follow JavaScript, and if you add a link to the post, Google will find and index it.

If, however, you have your product database only in the Algolia index, I would turn that around and make the products into posts if you want Google to rank them.

Other than that, there are no SEO implications from a site search. It’s a function, not content. In my own projects I would “noindex, follow” the search page and forget about it.
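For reference, “noindex, follow” is just a robots meta tag in the head of the search page. A minimal sketch for a Hugo head partial, assuming the search page lives at /search/ (the path is illustrative, not from the thread):

```go-html-template
{{/* Only the search page gets noindex; everything else stays indexable */}}
{{ if eq .RelPermalink "/search/" }}
  <meta name="robots" content="noindex, follow">
{{ end }}
```

With this in place, Google skips the search page itself but can still follow links from it into the product pages.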

I think Algolia search is usable with more than just static sites, and in those use cases, such as a product database that exists only in the Algolia index, there are things to think about if you want the content indexed.

It seems to me that you don’t have to implement any pre-rendering in this case: Hugo has already created the content, and you are just hooking in Algolia to index it for local site search.

It’s different, however, if you have a PWA where the content is served from an API, etc.
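As an aside, the usual pattern for feeding Hugo content into Algolia is to have Hugo emit a JSON index at build time and push that file to Algolia during deploy. A minimal sketch of such a template, assuming the products live in a `products` section and that a custom `Algolia` JSON output format has been registered for the home page in the site config (all names here are illustrative, not from the thread):

```go-html-template
{{- /* layouts/index.algolia.json: one record per product page */ -}}
{{- $records := slice -}}
{{- range where .Site.RegularPages "Section" "products" -}}
  {{- $records = $records | append (dict "objectID" .File.UniqueID "title" .Title "url" .Permalink "summary" (.Summary | plainify)) -}}
{{- end -}}
{{- $records | jsonify -}}
```

The generated public/algolia.json can then be uploaded with Algolia’s API client or CLI as a deploy step.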

Thanks for the reply.

On my website, all products have a public post; a single.html page is generated for each product. However, not all products link to their public post; some link directly to the manufacturer’s website.


Search for product A --> click on product A -->
Search for product B --> click on product B -->

But product B does have a page on my website -->

There is a reason I do this even though it seems strange. I assume from your post that I should be fine. I also submit a sitemap to Google.

Can you please elaborate on the following:

In my own projects I would “noindex, follow” the search page and forget about it.

Does Google automatically index search result pages, and is that why “noindex, follow” is required?

As an example, if you create a search query URL, will this page be indexed by Google?

Also, is it beneficial to add a no-js class so that if JavaScript is disabled, my website will still show products, even though the user won’t be able to search?

Appreciate the help. As you can probably tell, I’m quite new to this.

Google indexes everything it is not forbidden to index. So if somebody links to your search with a term (http://domain.ext/search/?q=searchterm), Google might index that URL. Depending on whether you want this or not, you add some meta tags or disallow /search/ via robots.txt. The “follow” part still allows the Google bot to follow the search results, which can then in turn be indexed.
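The robots.txt variant would look something like this (the /search/ path is an assumption based on the example URL above):

```text
# robots.txt: keep crawlers out of the search results pages
User-agent: *
Disallow: /search/
```

One caveat worth knowing: if a URL is blocked in robots.txt, Google cannot crawl it to see a noindex meta tag on the page, so it is best to pick one mechanism or the other rather than combining both.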

Regarding disabled JS: that’s kind of last century’s pet peeve :wink: I’ve never thought deeply about it, but I assume somebody with JavaScript disabled must be used to non-functioning websites. I would use a noscript tag with a nice text, or maybe a sitemap link, for this case. I wonder if anybody has insight/analytics into how many visitors come with JavaScript disabled. If it’s below 1% on your site, put a sitemap link into a noscript tag and concentrate on the 99% of remaining visitors. But that’s just my opinion…
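A noscript fallback along those lines could be as simple as the following (the sitemap path is an assumption; a human-readable product listing page would work just as well):

```html
<!-- Shown only when JavaScript is disabled; the search UI never renders -->
<noscript>
  <p>Search on this site requires JavaScript.
     You can still browse all products via the
     <a href="/sitemap.xml">sitemap</a>.</p>
</noscript>
```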

Thanks for the insight. Appreciate the help.