Is there a way to add single pages to the crawler?

Our crawler runs once a week, but often we have mid-week releases where we want to update the index. We use a static-site generator and have solely relied on the crawler to update the index. Our options appear to be:

  1. Develop an integration with our build process to push updates to the index (large effort).
  2. Rerun the crawler manually after a mid-week release (seems excessive for minor updates).
  3. ??? Is there a way to point the crawler at individual URLs to update the index? (ideal)

Yes, this is usually possible. Most crawlers let you seed a run with an explicit list of URLs instead of the whole site, typically via a seed/start-URL list, a CLI flag, or an API call. After a mid-week release you can point a one-off run at just the pages that changed, and the crawler will refresh those documents in the index without re-crawling everything. Make sure link-following (recursion) is disabled for that run so it stays limited to the URLs you listed. The exact mechanism varies by tool, so check your crawler's documentation for the relevant option.
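If your indexer exposes a per-document update API, option 3 can also be approximated with a tiny script that fetches only the changed URLs and submits them. The sketch below is hypothetical: the document shape (`url`/`title`/`body`) and the injectable `fetch`/`submit` hooks are assumptions, since your actual crawler and index will have their own client and schema. It uses only the Python standard library.

```python
# Sketch: re-index a handful of URLs after a mid-week release.
# Assumes your search index accepts per-document updates; the
# document shape and the submit hook below are hypothetical --
# adapt them to your crawler/indexer's actual API.
from html.parser import HTMLParser
from urllib.request import urlopen


class _TextExtractor(HTMLParser):
    """Collect the <title> and visible body text from an HTML page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self._in_title = False
        self._skip_depth = 0  # > 0 while inside <script>/<style>
        self._chunks = []

    def handle_starttag(self, tag, attrs):
        if tag == "title":
            self._in_title = True
        elif tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False
        elif tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if self._in_title:
            self.title += data
        elif not self._skip_depth:
            self._chunks.append(data)

    @property
    def text(self):
        # Collapse runs of whitespace left over from markup.
        return " ".join(" ".join(self._chunks).split())


def page_to_document(url: str, html: str) -> dict:
    """Turn one fetched page into an index document."""
    parser = _TextExtractor()
    parser.feed(html)
    return {"url": url, "title": parser.title.strip(), "body": parser.text}


def reindex(urls, fetch=None, submit=print):
    """Fetch each changed URL and hand the document to the index.

    `fetch` and `submit` are injectable so you can plug in your
    indexer's real client; by default pages are fetched over HTTP
    and the resulting documents are just printed.
    """
    fetch = fetch or (lambda u: urlopen(u).read().decode("utf-8"))
    for url in urls:
        submit(page_to_document(url, fetch(url)))
```

With this shape, a post-release hook in the static-site build could call `reindex()` with only the URLs that changed, leaving the weekly full crawl in place as a safety net.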