Single letters and partial postcodes showing in popular searches and no results


We are running Algolia on our site, but we are seeing strange results in the analytics. We consistently get single-letter queries, the same single letters, every day. We know what results are served for those queries, and there is no way real users are looking at that list of 10 results and finding the product they need.

The same goes for partial postcodes that return no results: it is always the same partials (W1, LS1, RG1, etc.).

We have blocked bots in robots.txt, added a check for bots against the user agent, and stopped the search form from rendering when that check matches.
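For what it's worth, here is a minimal sketch of the kind of user-agent check described above. The pattern list is illustrative only (the original post doesn't say which patterns were used), and the big caveat is that many bots spoof a normal browser UA string, so a check like this only catches the well-behaved ones:

```javascript
// Returns true if the user agent string looks like a known crawler.
// The pattern list is an assumption for illustration, not exhaustive.
function isLikelyBot(userAgent) {
  const botPatterns = /bot|crawl|spider|slurp|headless|python-requests|curl/i;
  return botPatterns.test(userAgent || "");
}

// In the page you would gate the search box on this check, e.g.:
//   if (!isLikelyBot(navigator.userAgent)) { attachSearchBox(); }
```

Bots that present a standard Chrome or Firefox user agent will sail straight past this, which would be consistent with the junk queries still appearing in analytics.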

Is anyone else having problems like this? This traffic makes up more than 70% of our searches, and if we switch to the new pricing model we would be paying for rubbish.

Any advice is welcome!

Indeed, bots are a common problem across the internet, and there is no single "right way" to deal with them. You have already tried the robots.txt route.
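For reference, a minimal robots.txt that asks crawlers to stay away from a search endpoint might look like the following (the `/search` path is an assumption about the site's URL structure). Note that robots.txt is purely advisory: well-behaved crawlers honour it, but the bots generating junk queries typically ignore it.

```
User-agent: *
Disallow: /search
```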

Your web host or infrastructure provider is also the right point of contact for mitigating bot traffic: blocking bots at that level, before requests reach your application, is usually where they can help most.

It is definitely worth trying a few other approaches as well.