We want to sync a table with more than 2M rows. We have integrated the Algolia Django app with our project, but the initial sync is very slow. How can we bulk update all the records quickly?
We’re sorry you’re experiencing slowness with our Django integration.
Before moving forward, we need a bit more information:
- What’s the version of the algoliasearch dependency in your project?
- How much time does it currently take to sync your 2 million records?
I think it's updating 20K records per day. Please give me your number so I can call you.
We don't provide phone support; we offer email support on plans that include it, and this forum for everyone.
Under the hood, our Django integration uses our Python client. The latest major version of the Python client implements the saveObjects method (which is used to index records) with automatic batching. If your transitive dependency on this client isn't up to date, it may be worth updating it.
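To illustrate what automatic batching means, here is a minimal, self-contained sketch of the idea: records are split into fixed-size chunks so that each API call stays small. The `chunk` helper and the batch size of 1000 below are illustrative assumptions, not the client's actual code:

```python
def chunk(records, batch_size=1000):
    """Yield successive fixed-size batches of records.

    Hypothetical sketch of client-side batching; the real client
    handles this internally when you call saveObjects.
    """
    for i in range(0, len(records), batch_size):
        yield records[i:i + batch_size]

# 2500 fake records get split into 3 API-sized batches.
records = [{"objectID": str(n), "title": f"Item {n}"} for n in range(2500)]
batches = list(chunk(records))
print(len(batches), len(batches[0]), len(batches[-1]))
```

With batching in place, indexing time is dominated by the number of API round trips rather than the number of records, which is why an up-to-date client matters for a 2M-row table.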
From what I'm reading in your message, you're currently updating 20K records per day (so, 20K over the course of 24 hours) when initiating an indexing run, is this correct? If so, that's definitely not normal, but I would first recommend profiling your application to see where the bottlenecks are. If the network calls to Algolia aren't abnormally long, the slowness may come from your implementation.
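A simple way to start profiling is to time each phase (database fetch, serialization, network push) separately. Below is a generic timing sketch; the commented-out `Article` model and `index` names are placeholders for whatever your project actually uses:

```python
import time
from contextlib import contextmanager

@contextmanager
def timed(label):
    """Print how long the wrapped block took."""
    start = time.perf_counter()
    yield
    print(f"{label}: {time.perf_counter() - start:.2f}s")

# Hypothetical usage in your Django project (names are placeholders):
# with timed("DB fetch"):
#     rows = [serialize(a) for a in Article.objects.all().iterator()]
# with timed("Algolia push"):
#     index.save_objects(rows)

# Runnable demo of the helper itself:
with timed("demo"):
    total = sum(range(1_000_000))
```

If the "DB fetch" phase dominates, the bottleneck is on your side (query or serialization); if the push phase dominates, it is worth looking at the client version and batching behavior discussed above.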
Let me know.