
Re: baido.com
On Wed, May 15, 2019 at 09:29:15AM -0400, Gene Heskett wrote:
> Greetings all;
> 
> I just caught the baido-spider crawling my site for about 15 minutes.
> 
> Is putting www.baido.com in my hosts.deny enough to shut that down?  Or 
> is there a deny function in apache2 I should be using instead?

https://www.google.com/search?q=robots.txt+baido

In general, use a robots.txt file served by your web server first and
foremost.  If that doesn't work, THEN move on to heavier solutions such as
Apache access rules or firewall blocks.
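As a sketch, a minimal robots.txt placed in the document root could look
like this (the user-agent string "Baiduspider" is what Baidu's crawler
actually sends; adjust it if the bot you see identifies differently):

```
# Disallow Baidu's crawler from the whole site
User-agent: Baiduspider
Disallow: /

# Everyone else may crawl normally
User-agent: *
Disallow:
```

Note that robots.txt is purely advisory: well-behaved crawlers honor it,
but a bot that ignores it has to be blocked at the web server or firewall
level instead.  Also, hosts.deny only affects services linked against
TCP wrappers, which Apache generally is not, so that route is unlikely to
help here.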