Llms.txt for Generative Engine Optimization (GEO) with Hugo

I wanted to share a workflow for making your Hugo site AI-friendly using the llms.txt standard. It is similar in spirit to robots.txt or RSS, but designed specifically for Large Language Models (LLMs). It provides a Markdown-formatted summary of your site, including the website title, base URL, description, and an index of content (articles/pages) with Markdown summaries.

llms.txt is human-readable and machine-parsable. It is placed in the site root to help AI agents quickly understand your site's content, which improves your Generative Engine Optimization (GEO).
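For context, a generated llms.txt for a small site might look like this (the site name, description, and URLs below are invented for illustration):

```markdown
# Example Site

> A short description of what the site covers.

Base URL: https://example.com/

## Pages

- [First Article](https://example.com/posts/first-article/): One-sentence summary of the article.
- [About](https://example.com/about/): What this site is and who runs it.
```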

I created a minimal Hugo repo with instructions on how to generate llms.txt automatically:
GitHub repo: https://github.com/roverbird/llms-hugo

It includes:

  • config.toml setup with a custom llms output format

  • Example homepage content

  • Template layouts/index.llms.txt to generate a structured Markdown output
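The custom output format comes down to a few lines of configuration. The exact settings are in the repo; a minimal sketch of the config.toml part looks like this:

```toml
# Register a plain-text "llms" output format rendered as llms.txt
[outputFormats.llms]
  mediaType = "text/plain"   # built-in Hugo media type, suffix "txt"
  baseName = "llms"          # output file name: llms.txt
  isPlainText = true

# Render the homepage in this format in addition to the defaults
[outputs]
  home = ["HTML", "RSS", "llms"]
```

With this in place, Hugo renders the homepage through layouts/index.llms.txt and writes the result to public/llms.txt.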

You can see a live example on a cybersecurity-focused Hugo site here:
Live llms.txt: https://kibervarnost.si/llms.txt
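The template itself is ordinary Go templating. A minimal layouts/index.llms.txt that emits output in this shape might look like the following (this is a sketch, not the repo's exact template; .Site.Params.description assumes a description set in your site params):

```go-html-template
# {{ .Site.Title }}

> {{ .Site.Params.description }}

Base URL: {{ .Site.BaseURL }}

## Pages
{{ range .Site.RegularPages }}
- [{{ .Title }}]({{ .Permalink }}): {{ .Summary | plainify }}
{{- end }}
```

The plainify filter strips any HTML from the summaries, so the output stays plain Markdown.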

This is a simple way to make your Hugo site ready for AI SEO (GEO).

It is an interesting approach, but just a heads-up.

John Mueller from Google recently advised in a video that llms.txt is not needed for a website to be discovered and crawled. Also, AI crawlers ignore it, even if you use it to tell them to stop using your website for AI training.

Watch from 5:15


He also compared it to the keywords meta element, which has been ignored by everyone for a very long time due to abuse.

Basically, if you have to crawl a site to verify the content of an llms.txt file, why read the llms.txt file in the first place?

https://www.reddit.com/r/TechSEO/comments/1k0kcx9/comment/mnev9c0/
