I’m keeping our Algolia index up to date with content changes by running a script from our publishing platform that clears the index and re-uploads the content. The script also chunks the content, following Algolia’s chunking recommendation. But each time it runs, key information about each object is lost from the analytics system.
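To illustrate the failure mode, here’s a minimal sketch (not our actual script) that simulates a clear-and-reupload where objectIDs are auto-generated on every run — which, as far as I can tell, is what orphans the analytics records:

```python
import uuid

def reindex(chunks):
    # Simulates clearing the index and re-uploading with auto-generated
    # objectIDs: every run mints fresh IDs for the same content, so any
    # analytics events recorded against the old IDs no longer resolve.
    return {str(uuid.uuid4()): chunk for chunk in chunks}

chunks = ["intro paragraph", "body paragraph"]
first = reindex(chunks)
second = reindex(chunks)

# Same content both times, yet the two runs share no objectIDs.
assert sorted(first.values()) == sorted(second.values())
assert set(first).isdisjoint(second)
```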
The above behaviour implies that we’re supposed to map and then maintain the Algolia objectIDs against the source material, which is something we’d rather avoid. Are there other strategies I could pursue to preserve the integrity of the Algolia analytics data?
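(One workaround I’ve considered is deriving the objectID deterministically from the source — e.g. hashing a source URL plus chunk index, as in the hypothetical helper below — so a re-upload reuses the same IDs without any stored mapping. But I’d prefer a fix on the analytics side if one exists.)

```python
import hashlib

def stable_object_id(source_url: str, chunk_index: int) -> str:
    # Deterministic objectID: the same source chunk always hashes to the
    # same ID, so analytics events keep pointing at a live record even
    # after a clear-and-reupload.
    key = f"{source_url}#chunk-{chunk_index}"
    return hashlib.sha1(key.encode("utf-8")).hexdigest()

# Re-running the upload regenerates identical IDs; nothing is stored.
assert stable_object_id("https://example.com/guide", 0) == \
       stable_object_id("https://example.com/guide", 0)
```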
The Algolia documentation says the analytics system is technically a separate instance of the index, so it seems odd that the deletion is replicated into the analytics data. Surely analytics should be a point-in-time snapshot that a reindexing action can’t retrospectively alter.