I have a site that offers two languages, English and Spanish. When a user navigates to the home page, say www.site.com, they are redirected to /es if their browser language is Spanish, or to /en otherwise.
At the moment my robots.txt is:

    User-agent: *
    Allow: /
    Sitemap: https://www.site.com/sitemap_index.xml
because I’m defining all alternate URLs in sitemap_languages.xml, and all URLs are also listed in sitemap.xml. My question is really about the robots.txt configuration: I’m not sure whether I should be allowing any user agent to crawl the / page at all. Since that page always redirects to either /en or /es, I believe it should be disallowed.
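For reference, this is roughly how the alternate URLs are declared: hreflang alternates in a sitemap are usually expressed with xhtml:link elements. A minimal sketch (the URLs are the placeholders from my question, and the exact structure of my sitemap_languages.xml may differ):

```
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9"
        xmlns:xhtml="http://www.w3.org/1999/xhtml">
  <url>
    <loc>https://www.site.com/en</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.site.com/en"/>
    <xhtml:link rel="alternate" hreflang="es" href="https://www.site.com/es"/>
  </url>
  <url>
    <loc>https://www.site.com/es</loc>
    <xhtml:link rel="alternate" hreflang="en" href="https://www.site.com/en"/>
    <xhtml:link rel="alternate" hreflang="es" href="https://www.site.com/es"/>
  </url>
</urlset>
```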
Should I then do:
    User-agent: *
    Disallow: /
    Allow: /es
    Allow: /en
    Sitemap: https://www.site.com/sitemap_index.xml
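One way to sanity-check rules like these locally is Python's standard-library urllib.robotparser. A caveat: that parser applies rules in file order (first match wins), unlike Google's most-specific-match behavior, so in this sketch the Allow lines are placed before Disallow: / to make both interpretations agree; the site URL is just the placeholder from my question.

```python
from urllib import robotparser

# Proposed rules, with Allow before Disallow so first-match and
# most-specific-match parsers give the same answer.
rules = """\
User-agent: *
Allow: /es
Allow: /en
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

print(rp.can_fetch("*", "https://www.site.com/"))         # False: root is disallowed
print(rp.can_fetch("*", "https://www.site.com/es"))       # True
print(rp.can_fetch("*", "https://www.site.com/en/page"))  # True
```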
I’m not sure whether that could cause crawl issues, or whether there is another way to achieve the same result.
Thanks in advance!