We want to use Query Rules (great feature by the way!) to promote a specific webpage result.
We are crawling our webpages and indexing them as records in Algolia - Algolia creates the objectID. We plan to re-crawl each night, delete the index content and re-index with the new results of the crawl. Much of the data won’t have changed, but we’re doing it this way because it’s more straightforward than having to keep track, across our many websites, of when new webpages are created or their content is updated.
When we reindex, the Algolia-generated objectIDs mapping to the webpage records will of course differ from what they were before, so the existing Query Rules will no longer point at the right records. This means we would have to delete those Query Rules and recreate them after every crawl.
We could generate our own objectIDs and map them to specific webpages; then when we reindex we’d have to ensure that the same ID maps to the same URL. But rather than generate our own IDs and keep track of their mapping to particular URLs, why not use the webpage URL itself as the objectID? Has anyone else done this? Any gotchas using a URL as an objectID?
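For what it’s worth, here’s roughly what we have in mind (a minimal sketch; the page data and function names are hypothetical, not from our actual pipeline). One variant we’re also considering is deriving the objectID from a stable hash of the URL instead of using the raw URL, in case long URLs or unusual characters cause problems:

```python
import hashlib

def make_object_id(url: str) -> str:
    # Option A would be to return the URL itself as the objectID.
    # This sketch shows option B: a stable SHA-1 hash of the URL,
    # which stays the same across crawls as long as the URL doesn't change.
    return hashlib.sha1(url.encode("utf-8")).hexdigest()

# Hypothetical crawled pages; in reality these come from the nightly crawl.
pages = [
    {"url": "https://example.com/about", "title": "About us"},
    {"url": "https://example.com/contact", "title": "Contact"},
]

# Build the records we'd push to Algolia, with our own deterministic objectID.
records = [{"objectID": make_object_id(p["url"]), **p} for p in pages]
```

Because the objectID is now deterministic, re-crawling and re-indexing the same URL always produces the same record ID, so Query Rules that promote a record by objectID should keep working across nightly reindexes.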