Best practice for using DocSearch on a closed-source site?

The latest DocSearch docs note that v3 uses Algolia’s Crawler instead of the legacy docsearch-scraper. This is great for open-source projects, but we’re rolling out a closed-source project (using Docusaurus for the doc site), and it’s entirely unclear how we should best use Algolia.

  1. Is our only option the deprecated docsearch-scraper? If so, what is the correct configuration? The legacy configuration isn’t indexing our content properly.
  2. Are we forced to use the Algolia Crawler?
  3. Is there a non-obvious third option?

Really want to use Algolia but frustrated there’s no clear path on how to do so for closed-source docs.

Hi Ray! DocSearch is indeed for open-source and not-for-profit documentation sites.
For businesses, the Algolia Crawler is the correct tool.
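For a Docusaurus site, a Crawler action usually builds DocSearch-compatible records via the `helpers.docsearch` extractor. A rough sketch of what that config looks like (the app ID, index name, URLs, and CSS selectors below are placeholders, not your actual values):

```javascript
// Sketch of an Algolia Crawler config for a Docusaurus-style doc site.
// appId, apiKey, indexName, startUrls, and selectors are illustrative placeholders.
new Crawler({
  appId: 'YOUR_APP_ID',
  apiKey: 'YOUR_CRAWLER_API_KEY',
  startUrls: ['https://docs.example.com/'],
  actions: [
    {
      indexName: 'your_docs_index',
      pathsToMatch: ['https://docs.example.com/**'],
      recordExtractor: ({ helpers }) =>
        // helpers.docsearch emits records in the shape the DocSearch UI expects
        helpers.docsearch({
          recordProps: {
            lvl0: {
              selectors: '.menu__link--active',
              defaultValue: 'Documentation',
            },
            lvl1: 'header h1',
            lvl2: 'article h2',
            lvl3: 'article h3',
            content: 'article p, article li',
          },
          indexHeadings: true,
        }),
    },
  ],
});
```

The `lvl0`–`lvl3` selectors define the record hierarchy shown in the search dropdown, so they need to match your site’s actual markup.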

What is your use case? Is this API docs? What sort of tool is it?

Also, what issues are you seeing with the legacy scraper? The most common one is the configured selectors not matching the heading hierarchy of your site. Are you using a docs framework like Docusaurus or Slate?
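If it is Docusaurus, that hierarchy mismatch usually comes down to the `selectors` block in the legacy docsearch-scraper config. A minimal sketch, assuming default Docusaurus class names (the index name and exact selectors here are illustrative, not a canonical preset):

```json
{
  "index_name": "your_docs_index",
  "start_urls": ["https://docs.example.com/"],
  "selectors": {
    "lvl0": {
      "selector": ".menu__link--sublist.menu__link--active",
      "default_value": "Documentation"
    },
    "lvl1": "header h1",
    "lvl2": "article h2",
    "lvl3": "article h3",
    "content": "article p, article li"
  }
}
```

If your theme customizes those class names or heading tags, the `lvl0`/`lvl1` selectors won’t match anything and records come out flat, which looks like “not properly indexing.”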