The Magento extension used to work just fine, but I started getting 400 errors: "Record is too big".
So I enabled the Algolia Search queue runner, and as a result 80% of my product descriptions were removed from the index, and I can't get more than 200 products indexed.
I read that objects that are too big are automatically broken down into smaller chunks, but instead I can see that every product batch containing any oversized records has been removed from the index, and I can't add these products back.
Can you please advise?
I found a temporary solution: instead of indexing 100 elements per job, I am indexing 1 element per job. This lets me push most of my products to the search index.
However, products with long descriptions are still completely excluded from the search index (error 400: record too big).
How can I get records larger than 10,000 bytes indexed as well?
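In the meantime, one common workaround is to shrink oversized records yourself before they are pushed, for example by truncating the description until the JSON-encoded record fits under the limit. Here is a minimal Python sketch of that idea; `shrink_record`, the 10,000-byte limit, and the sample product are all assumptions for illustration, not part of the Algolia Magento extension:

```python
import json

RECORD_LIMIT_BYTES = 10_000  # assumed per-record limit, based on the 400 error


def shrink_record(record, limit=RECORD_LIMIT_BYTES):
    """Trim the 'description' attribute until the JSON-encoded record
    fits under the size limit. Hypothetical helper for illustration."""
    record = dict(record)
    while len(json.dumps(record).encode("utf-8")) > limit:
        desc = record.get("description", "")
        if not desc:
            break  # nothing left to trim; record stays oversized
        # drop the last 10% of the description and re-check the size
        record["description"] = desc[: int(len(desc) * 0.9)]
    return record


product = {"objectID": "42", "name": "Widget", "description": "x" * 20_000}
small = shrink_record(product)
```

The trade-off is that truncated descriptions lose searchable text at the end; the proper fix (splitting one product into several records sharing a `distinct` attribute) needs support in the extension itself.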
Currently it's not possible to index products larger than 10 KB. The extension doesn't support splitting records.
But what shouldn't happen is that the whole batch of products gets skipped when a single product is bigger than 10 KB. I'll take a look at that.
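The fix being described, filtering out only the oversized records so the rest of the batch still gets indexed, can be sketched like this. This is an illustrative Python sketch of the idea, not the extension's actual PHP code; `split_batch` and the size limit are assumptions:

```python
import json

MAX_BYTES = 10_000  # assumed record size limit


def split_batch(records, limit=MAX_BYTES):
    """Separate records that fit under the limit from oversized ones,
    so one big product no longer causes the whole batch to be dropped.
    Illustrative sketch only."""
    ok, too_big = [], []
    for rec in records:
        size = len(json.dumps(rec).encode("utf-8"))
        (ok if size <= limit else too_big).append(rec)
    return ok, too_big


batch = [
    {"objectID": "1", "description": "small"},
    {"objectID": "2", "description": "x" * 20_000},
]
ok, too_big = split_batch(batch)
# only the oversized record is excluded; the rest can still be indexed
```

With this approach the oversized records can also be logged or reported separately instead of silently disappearing from the index.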
@vivaroltd, can I ask which Magento version and extension version you are using? I tried to reproduce the batch skipping but wasn't able to. Thanks for your help!
I am using Magento ver. 188.8.131.52 and Algolia extension 1.12.0 with the indexing queue enabled.