Hope you all are doing well. We have been testing an application for a while and have finally started putting our production data onto Algolia. We are using Laravel Scout.
I have a few questions.
Is there a rate limit on how many requests scout:import can make to the API? We have around 500K records, and it has been about 10 hours since we invoked the scout:import command, yet only about 300K have been transferred so far. Is it normal for indexing to take that long?
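For context, we are running the import with Scout's defaults, roughly as below. The model name is a placeholder for one of ours, and the chunk/queue keys are just what we understand from the Scout docs, so correct me if I'm misreading them:

```php
// config/scout.php — the relevant settings as we currently have them (untuned defaults)
'queue' => env('SCOUT_QUEUE', false), // false for us, so imports run synchronously

'chunk' => [
    'searchable'   => 500, // records sent to Algolia per batch by scout:import
    'unsearchable' => 500,
],
```

We invoke it as `php artisan scout:import "App\Models\Record"` (again, `Record` is a stand-in). If bumping the batch size via the command's `--chunk` option, or queueing the sync work, is the expected fix for throughput, I'd appreciate confirmation before we change anything.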
We are coming from a Sphinxsearch background, where indexing was a matter of minutes (although the network was closer).
I'm getting a ton of requests rejected because the object size is too large. This is mission-critical now, and I cannot afford to restructure how the application runs at the moment. We will restructure the application with a proper plan, but in the meantime, how can we temporarily increase the limit placed on the object size — say for 30 or 45 days?
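In case it helps to see where the size is coming from: right now our models index essentially every attribute. For the proper restructure we're planning to trim the payload by overriding Scout's `toSearchableArray()`, along these lines (field names are placeholders, and the truncation length is an arbitrary guess, not something we've validated against Algolia's limits):

```php
// App\Models\Record — sketch of a trimmed search payload (hypothetical fields)
public function toSearchableArray(): array
{
    return [
        'id'    => (string) $this->id,
        'title' => $this->title,
        // The full body text is what we believe pushes records over the size
        // limit today; indexing only an excerpt is our planned workaround.
        'excerpt' => mb_substr($this->body ?? '', 0, 500),
    ];
}
```

Until we can roll that out everywhere, though, a temporary limit increase on the Algolia side would buy us the time we need.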
I'd appreciate help from anyone who has had a similar experience, or any internal expert advice.