Do you have a single domain with /de/robots.txt and /en/robots.txt? If so, that’s incorrect; the robots.txt has to be a single file at the root of the site.
From that tree you shared, I think the expectation would be to upload those into two separate folders that are the root of the de and en sites, mapping to http://example.de and http://example.com.
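In other words, each language folder becomes the document root of its own domain, roughly this shape (the file names are just illustrative):

    example.de document root (the contents of the de folder)
    ├── index.html
    ├── sitemap.xml
    └── robots.txt

    example.com document root (the contents of the en folder)
    ├── index.html
    ├── sitemap.xml
    └── robots.txt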
Thank you for your answer. Which solution would you prefer? example.de would show the German site and example.com the English site. For the French and Swiss sites I would order two new domains, the .ch and .fr domains. At the moment I have the first solution, example.com/de and example.com/en, and example.de redirects to example.com.
In Japan, people accept the local .co.jp domains as canonical for companies, so I prefer separating. It’s up to you, but it looks like you’re set on separate domains. The point is, if you’ve configured Hugo as multihost you need to put each of the subfolders of public on its own domain. I can’t see what you are doing because you haven’t linked the repo, so I cannot say for sure.
I have configured Hugo as multihost. The domain example.com is linked to the public folder, and the example.de domain redirects to example.com. I have changed the structure to:
public/
│
├── robots.txt
├── sitemap.xml
│
├── de
│   ├── index.html
│
└── en
    ├── index.html
Your idea is to move the /de folder to example.de and the /en folder to example.com? The problem is the sitemap.xml. Hugo produces this sitemap.xml:
If you configure Hugo as multihost, it expects a different domain per language. You need to publish your public/de to your German domain and your public/en to your English domain, however you do that. There should be a robots.txt and a sitemap.xml in each of those folders. Whatever is in de is accessible at the German domain, and the same for en.
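For reference, a sketch of what drives multihost mode: each language gets its own baseURL in the site config. This assumes your two domains are example.de and example.com; names and options are illustrative, not your actual config:

    # config.toml – minimal multihost sketch
    defaultContentLanguage = "de"
    enableRobotsTXT = true   # have Hugo generate a robots.txt per site

    [languages.de]
      baseURL = "https://example.de/"
      languageName = "Deutsch"
      weight = 1

    [languages.en]
      baseURL = "https://example.com/"
      languageName = "English"
      weight = 2

With per-language baseURLs like these, Hugo builds one complete site per language into public/de and public/en, each with its own sitemap.xml (and robots.txt, if enabled) at its root, and that is what you publish to the matching domain.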
I don’t understand where the ssl.webpack.com domain comes in. Your domains and baseURLs are example.de and example.com, right? If so, the files related to those are what you should be submitting to Google.
I am not sure what the purpose of the SSL proxy is supposed to be vis-à-vis your other sites, or what the problem is.
If you need to produce a site that has to be accessible under a different domain, you can specify both the config file and the baseURL on the command line when you generate it with hugo.
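For example (the alternate config file name is a placeholder; --config and --baseURL are the relevant hugo flags):

    hugo --config config-proxy.toml --baseURL "https://ssl.webpack.de/"

The generated sitemap.xml and other absolute URLs then use the baseURL you passed rather than the one in your main config.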
The sitemaps are alright (it’s XML that the browser renders; check the source and it will look fine).
The robots.txt MUST be in the site root, e.g. https://ssl.webpack.de/robots.txt. The way you do it (in a subfolder), the robots.txt is not accepted.
The way you host your domains will result in your robots.txt not being available. I am not sure the SSL is configured correctly; it should be https://developer.kwpse.com instead of the ssl.webpack.de that shows up in it.
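A quick way to check, once each language folder is served from its own domain root (domains assumed from the thread; curl -I just fetches the response headers):

    curl -I https://example.de/robots.txt
    curl -I https://example.com/robots.txt

Both should come back 200, served directly from those domains.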
I suggest you drop this portion for the purposes of troubleshooting your issue. That is confusing folks, and isn’t really needed, correct? We are just checking files, so it is okay if it isn’t over an encrypted connection.