Re-indexing issue with the replaceAllObjects method

I am using the Java API, and I am trying to reindex all the products from my database. There are only 3068 products in this database.

I process them in the normal way, as instructed in the documentation, by creating a List of POJOs.
I get the client object, then the index object from the client, so that I can use the replaceAllObjects() method.
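For reference, the flow looks roughly like this. The `Product` POJO and index name here are placeholders, and the actual Algolia client calls are shown as comments because they require credentials and the client library on the classpath; this is a sketch of the setup, not my exact code:

```java
import java.util.ArrayList;
import java.util.List;

// Hypothetical POJO; Algolia requires an objectID field on each record.
class Product {
    String objectID;
    String name;

    Product(String objectID, String name) {
        this.objectID = objectID;
        this.name = name;
    }
}

class ReindexSketch {
    // Build the list of POJOs to send (in my case, loaded from MySQL).
    static List<Product> buildProducts(int n) {
        List<Product> products = new ArrayList<>();
        for (int i = 1; i <= n; i++) {
            products.add(new Product(String.valueOf(i), "product-" + i));
        }
        return products;
    }

    // The actual Algolia calls (commented out; require an API key and the
    // algoliasearch dependency):
    //
    // SearchClient client = DefaultSearchClient.create("APP_ID", "ADMIN_API_KEY");
    // SearchIndex<Product> index = client.initIndex("products", Product.class);
    // index.replaceAllObjects(buildProducts(3068), true); // safe mode waits on tasks
}
```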

When I call the method, no errors come back, and there are no errors reported on the dashboard, but I only have 2068 records in the index. I have performed this same reindexing a few times and always end up with the same number of records.

I have thoroughly debugged my application, and at the point of sending the List of POJOs I have 3068 to send.

I thought this might be a transaction cap, but the documentation puts that limit at 1 GB of data.
The 2068 records that landed in Algolia have an average size of 683.82 B, which means only about 1.41 MB arrived in total.

Has anyone else seen something similar and know what the issue could be?
It's tempting to keep trying different things, but those attempts just add to your operations count.

Thanks for any help.

Hi @vectriccdn,

My first thought is that some of your objects may share the same objectID and are overwriting each other.

Note that the replaceAllObjects method uses a temporary index. First, it copies your index’s settings, synonyms, and query rules to the temporary index. Then, it adds the objects you passed to the temporary index. Finally, it replaces your index with the temporary one.

Is the temporary index still in your application? If so, this may mean that the operation failed.

Note also that this method can be expensive: using it can significantly increase your indexing operations count. It costs the number of new records plus 2 operations (copySettings and moveIndex). For example, replacing all objects of an index with a new set of a million objects costs one million (and two) operations. If you're on a Free plan, make sure you don't exceed your record limit. If you're on a paid plan, be careful of the impact on your operations count.
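That cost formula is simple enough to express directly (the `operations` method name is just for illustration):

```java
class ReplaceAllCost {
    // Per the docs: replaceAllObjects costs one operation per new record,
    // plus two extra operations (copySettings and moveIndex).
    static long operations(long recordCount) {
        return recordCount + 2;
    }
}
```

So a full reindex of 3068 records would cost 3070 operations, and a million-record reindex would cost 1,000,002.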

Hi Cindy,

Thanks for your reply.
All of the objectIDs I set on a record come directly from a MySQL database, using the unique IDs assigned within that database, so I would have thought this was not the issue. I will double-check now and get back to you.

I have just built a Set of the objectIDs I wish to index into Algolia, and they are all unique: I successfully added all 3068 product IDs to the set, and its size still matched the list's. In Java, a Set silently rejects duplicates, so equal sizes mean there are no collisions.
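The uniqueness check can be sketched like this (`allUnique` is a hypothetical helper; note that `HashSet.add()` returns `false` on a duplicate rather than throwing):

```java
import java.util.HashSet;
import java.util.List;
import java.util.Set;

class ObjectIdCheck {
    // Returns true if every objectID in the list is unique.
    static boolean allUnique(List<String> objectIds) {
        Set<String> seen = new HashSet<>();
        for (String id : objectIds) {
            if (!seen.add(id)) {
                return false; // add() returns false when the ID was already seen
            }
        }
        return true;
    }
}
```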

Also, I do still have the temporary index in my indices, as you mentioned. The temporary index contains only 1.00k records, not the 2.07k records my main index has. Strangely enough, I am missing exactly 1000 records from my main index, given that I am trying to reindex 3068 records.
I am going to check whether this temporary index contains the records missing from my main index.

I can confirm that the leftover temporary index, which holds 1000 records, contains the remaining 1000 records that haven't made it across into my main index, judging by a quick analysis.
The problem is that no error messages were returned with this operation, which would have been a great help in determining what the issue is.

Hi @vectriccdn,

Can you check the logs in your dashboard for the temporary index as well as the main index to see whether there are any errors? Can you tell if the transfer may still be processing? It should have finished by now, but it's possible something is slowing it down. If you are still having issues, could you write in so that we can get your application ID and try to track down your issue?