Indexing records with a huge array of keywords

Hi!

I’m currently using Elasticsearch to search, rank, and filter a huge index.
By huge, I mean:

  • 10,000+ entities
  • 9 fields per entity
  • plus 1 field holding an array of 100 to 3,000 keywords.

A friend told me about Algolia, which seems great, but I’ve run into some trouble related to the size of my records. What is the best strategy for my case? I’ve heard that the most common solution for big records is to split them into smaller ones, but that will be much more expensive…

Thanks for your help, and sorry for the bad English!

Christopher.

Hi there!

The Essential plan gives you 50k records and 100k operations for $35/month, so your index will definitely fit! More info on pricing here: https://www.algolia.com/pricing

However, this plan has a limit of 20KB per record.
If your records exceed this limit, the solution is to split them into smaller records and use our “Distinct” feature to de-duplicate them at query time: https://www.algolia.com/doc/guides/ranking/distinct/?language=javascript#distinct-for-de-duplication
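For reference, here is a minimal sketch of that split-and-distinct setup using the v4 JavaScript API client (written in TypeScript). The index name (`entities`), the attribute names (`entityID`, `keywords`), and the chunk size of 500 keywords are hypothetical placeholders; adjust them to your own schema and record sizes:

```ts
// Sketch only: splitting oversized records and enabling Distinct.
// "entities", "entityID", "keywords", and chunkSize are assumed names/values.
import algoliasearch from 'algoliasearch';

const client = algoliasearch('YourApplicationID', 'YourAdminAPIKey');
const index = client.initIndex('entities');

type Entity = {
  entityID: string;
  keywords: string[];
  [field: string]: unknown; // the other fields of the entity
};

// Split one entity into several records, each carrying a slice of the
// keyword array, so every record stays under the 20KB limit.
function splitEntity(entity: Entity, chunkSize = 500) {
  const { keywords, ...rest } = entity;
  const records = [];
  for (let i = 0; i < keywords.length; i += chunkSize) {
    records.push({
      ...rest,
      objectID: `${entity.entityID}-${i / chunkSize}`,
      keywords: keywords.slice(i, i + chunkSize),
    });
  }
  return records;
}

async function indexEntities(entities: Entity[]) {
  // Declare the attribute shared by all parts of an entity, then enable
  // distinct so only one record per entity appears in search results.
  await index.setSettings({
    attributeForDistinct: 'entityID',
    distinct: true,
  });
  await index.saveObjects(entities.flatMap((e) => splitEntity(e)));
}
```

With this setup, a query that matches several chunks of the same entity returns only the best-ranked one, so splitting stays invisible to your users.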

Even if splitting doubles your record count (20k records from your 10k entities), you’ll still fit comfortably in the Essential plan!