Hello everyone,
I am working on an application built with Astro.js, deployed on Netlify with the Algolia Crawler integration. I am running into an issue where indexing after automatic builds seems to ignore the robots.txt and the sitemap specified inside it.
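For context, my Astro config uses the official sitemap integration, roughly like this (simplified sketch; `example.com` stands in for my real domain):

```js
// astro.config.mjs -- simplified; example.com is a placeholder for my real domain
import { defineConfig } from 'astro/config';
import sitemap from '@astrojs/sitemap';

export default defineConfig({
  // "site" is required for @astrojs/sitemap to generate absolute URLs
  site: 'https://example.com',
  integrations: [sitemap()],
});
```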
Since I am using Astro (and its sitemap integration, @astrojs/sitemap), the sitemap file is called sitemap-index.xml. This file is also referenced in the robots.txt file, but the crawler seems to ignore it and instead tries some default locations like sitemap_index and others. Since my site doesn't have a "/" route, only "/de" and "/en", and the crawler can't find the sitemap, it doesn't index any of the pages.
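For reference, my robots.txt looks roughly like this (again with `example.com` as a placeholder):

```
User-agent: *
Allow: /

Sitemap: https://example.com/sitemap-index.xml
```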
However, if I trigger a rebuild of the same commit, the crawler runs again and this time indexes everything without a problem. It somehow (presumably from the robots.txt file) knows to look for the sitemap-index.xml file, and indexes all pages.
Has anyone had something like this happen to them? Is there any way to specify the sitemap file name for the plugin?
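From the Algolia Crawler docs it looks like a standalone crawler configuration can point at sitemaps explicitly via the `sitemaps` option, something like the trimmed sketch below (credentials and URLs are placeholders), but I can't tell whether or where the Netlify integration exposes this:

```js
// Hypothetical, trimmed crawler config -- appId/apiKey/URLs are placeholders
new Crawler({
  appId: 'YOUR_APP_ID',
  apiKey: 'YOUR_API_KEY',
  indexPrefix: 'netlify_',
  startUrls: ['https://example.com/de', 'https://example.com/en'],
  // Point the crawler directly at the Astro-generated sitemap
  // instead of letting it guess default locations
  sitemaps: ['https://example.com/sitemap-index.xml'],
  actions: [ /* ... */ ],
});
```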
Thanks in advance for any help.