I’m currently using Elasticsearch to search, rank, and filter a large index.
By large, I mean:
- 10,000+ entities
- 9 fields per entity
- plus one extra field holding an array of 100 to 3,000 keywords.
A friend told me about Algolia, which seems great, but I’ve run into problems with the size of my records. What is the best strategy for my case? I’ve read that the most common solution for big records is to split them into smaller ones, but that would be much more expensive…
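For context, here is the kind of splitting I mean, as a minimal sketch. The function, field names, and `chunk_size` are my own illustrative assumptions, not anything from the Algolia API; the idea is that each sub-record shares a parent identifier so results can later be de-duplicated (e.g. with Algolia's `distinct` feature on that attribute):

```python
def split_entity(entity, chunk_size=100):
    """Split one large entity into several smaller records by chunking
    its big keywords array. All sub-records share a `parent_id` so they
    can be de-duplicated at query time. (Hypothetical helper, not an
    Algolia API call.)"""
    base = {k: v for k, v in entity.items() if k != "keywords"}
    keywords = entity.get("keywords", [])
    records = []
    for i in range(0, len(keywords), chunk_size):
        record = dict(base)
        record["parent_id"] = entity["id"]
        # one objectID per chunk, derived from the parent id
        record["objectID"] = f"{entity['id']}-{i // chunk_size}"
        record["keywords"] = keywords[i:i + chunk_size]
        records.append(record)
    return records

# Example: an entity with 250 keywords becomes 3 sub-records.
entity = {"id": "e42", "title": "Example", "keywords": [f"kw{n}" for n in range(250)]}
records = split_entity(entity, chunk_size=100)
print(len(records))  # 3
```

This is the pattern I'm worried about cost-wise, since every entity would turn into several records.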
Thanks for your help, and sorry for the bad English!