How to update only new or changed records using the Docker Algolia crawler?

I am following the guide here: Run your own | DocSearch by Algolia

I am on the free tier. When I run:

$ docker run -it --env-file=.env -e "CONFIG=$(cat config.json | jq -r tostring)" algolia/docsearch-scraper
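
For context, config.json is a standard DocSearch scraper config that looks roughly like this (the index name, start URL, and selectors below are placeholders), and .env just holds my APPLICATION_ID and API_KEY:

  {
    "index_name": "my-docs",
    "start_urls": ["https://example.com/docs/"],
    "selectors": {
      "lvl0": "h1",
      "lvl1": "h2",
      "lvl2": "h3",
      "text": "p, li"
    }
  }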

The first time, it uploads the records successfully and I get the response:

Nb hits: 5,719

Now I add a single Markdown file to my static site and run the same command again:

$ docker run -it --env-file=.env -e "CONFIG=$(cat config.json | jq -r tostring)" algolia/docsearch-scraper

I was hoping it would just add a few more records and that Nb hits would increase slightly.

Instead, I get this response:

  File "/root/.local/share/virtualenvs/root-BuDEOXnJ/lib/python3.6/site-packages/algoliasearch/http/transporter.py", line 92, in retry
    raise RequestException(content, response.status_code)
algoliasearch.exceptions.RequestException: You have exceeded your Record quota. You’ll need to change your plan for more capacity, or delete records. See more details at https://www.algolia.com/account/billing/overview?applicationId=SO040MI8P2

Nb hits: 4341

This leads me to believe the records are all getting duplicated, which is not ideal.

My current workaround is to delete the index from the Algolia dashboard and then re-run the Docker command each time, which is not very convenient.
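
If it helps, I could probably script that deletion step instead of going through the dashboard, something like this sketch (the clear call is the Algolia REST API's clear-index endpoint; "my-docs" is a placeholder for the index_name from config.json, and APPLICATION_ID / API_KEY come from .env):

# "my-docs" is a placeholder; use the index_name from config.json
$ source .env
$ curl -X POST \
    -H "X-Algolia-Application-Id: ${APPLICATION_ID}" \
    -H "X-Algolia-API-Key: ${API_KEY}" \
    "https://${APPLICATION_ID}.algolia.net/1/indexes/my-docs/clear"
$ docker run -it --env-file=.env -e "CONFIG=$(cat config.json | jq -r tostring)" algolia/docsearch-scraper

But that still re-uploads everything from scratch on every run.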

What’s the correct way to make the crawler work without duplicating records? Thanks.