
Table of Contents

General Information

What are bad bots?

Bad bots are automated agents that attack websites at the application level. They crawl the website, creating unwanted and avoidable load.
To avoid this load, we recommend blocking all bots that are not needed.

Difference between good and bad bots

Good bots observe the Robots Exclusion Standard: they read the robots.txt and only scan the areas they are allowed to access.
Bad bots ignore the robots.txt and may steal data, break into user accounts, submit junk data through online forms, or perform other malicious activities.
Types of bad bots include credential stuffing bots, content scraping bots, spam bots, and click fraud bots.

Robots.txt

Advantages and disadvantages

The advantage of the robots.txt is that good bots stick to the Robots Exclusion Standard: they read the robots.txt and only scan the areas they are allowed to access.
You can use your robots.txt to prevent duplicate content from appearing in search engine result pages or to keep entire sections of a website private from good bots.

The disadvantage is that bad bots ignore the robots.txt.
Therefore the robots.txt can only be used to restrict the access of good bots to the site.

How to use the robots.txt

Create a file called robots.txt in your document root like in the following example, which manages access for some bots that are identified by their user agent strings:

Code Block
# examplebot: slow down crawling and keep it out of some sections
User-agent: examplebot
Crawl-delay: 120
Disallow: /mobile/
Disallow: /api/
Disallow: /plugin/

# foobot: exclude from the entire site
User-agent: foobot
Disallow: /

# barbot: only slow down crawling, allow everything
User-agent: barbot
Crawl-delay: 30
Disallow:

To see how to use the robots.txt, visit the official robotstxt.org site.


Block bots when using Apache: deny list via .htaccess

If your environment uses Apache as its web server, you can block specific bots via your .htaccess files like in the following example:

Code Block
#====================================================================================================
# Block BadBots
#====================================================================================================
RewriteEngine On
# Block requests whose User-Agent contains one of the listed bot names
RewriteCond %{HTTP_USER_AGENT} ^.*(SCspider|Textbot|s2bot).*$ [OR]
# The second condition matches case-insensitively ([NC])
RewriteCond %{HTTP_USER_AGENT} ^.*AnotherBot.*$ [NC]
# Return 403 Forbidden and stop processing further rewrite rules
RewriteRule .* - [F,L]

Block bots when using nginx

If you use nginx as your web server, you cannot use .htaccess files, so we have to block these bots for you.
To do this, please create a ticket in our ticket system or send an email to service@root360.de with a list of user agents to block. We already have a ready-made pattern for this scenario, which we can adapt and set up for your environment.
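
To illustrate the approach: such a pattern usually boils down to a user agent check inside the server block of the nginx configuration. The following is only a minimal sketch; the bot names are placeholders, not the actual pattern we deploy for your environment:

Code Block
# Sketch only: return 403 Forbidden for requests whose User-Agent
# matches one of the listed (placeholder) bot names, case-insensitively.
if ($http_user_agent ~* (SCspider|Textbot|s2bot)) {
    return 403;
}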

Projects with a Content Delivery Network


Block bots when using CloudFront as a full page cache

When you use CloudFront as a full page cache to deliver all of your application’s assets (in contrast to only using it with a subdomain as a media content delivery network), we advise using AWS’ Web Application Firewall (WAF) to block these bots and to protect your application against other attacks such as SQL injection or cross-site scripting. To find out about the full range of functions of the WAF, visit AWS WAF.

A WAF is our preferred solution for this scenario and has the advantage that blocked requests cannot reach the application instances.

If you are interested in setting up a WAF, create a ticket or send an email to service@root360.de. We will advise you about the WAF service and build an optimal solution for your project.
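
For orientation only, a single user-agent rule in AWS WAF (v2) can look roughly like the following sketch from the JSON rule editor. The rule name, metric name and the bot string "SCspider" are placeholders, not the rule set we would deploy for you:

Code Block
{
  "Name": "block-bad-bots",
  "Priority": 0,
  "Statement": {
    "ByteMatchStatement": {
      "FieldToMatch": { "SingleHeader": { "Name": "user-agent" } },
      "PositionalConstraint": "CONTAINS",
      "SearchString": "SCspider",
      "TextTransformations": [ { "Priority": 0, "Type": "NONE" } ]
    }
  },
  "Action": { "Block": {} },
  "VisibilityConfig": {
    "SampledRequestsEnabled": true,
    "CloudWatchMetricsEnabled": true,
    "MetricName": "block-bad-bots"
  }
}

In practice such rules are combined with further conditions and managed rule groups, which is exactly what we set up and maintain for your project.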

Application Plugins

WordPress

If you use WordPress, you can use Blackhole for Bad Bots.
With this tool you can block bad bots directly via the application.

Magento 2

If you use Magento 2, you can use Spam Bot Blocker by Nublue.
With this blocker you can block user agents, single IP addresses or ranges of IP addresses.
You can manage it from the backend and don't have to use the command line.

Info

These solutions work on the application level. Please note that the request has already passed through the web server and in most cases was processed by PHP-FPM. This means that resources have already been used up before the request is blocked.
We do not recommend these solutions, but want to mention them for the sake of completeness.


Verify blocking rules

To check whether a bot is blocked successfully, you can use curl to access your website with the blocked user agent.
See how to do this in the following example:

Blocked user agent:

Code Block
curl -I https://www.domain.tld -A "SemrushBot-BA"

HTTP/2 403 
date: Tue, 01 Sep 2020 15:03:24 GMT
content-type: text/html
content-length: 162
server: nginx
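
For comparison, the same request with a user agent that is not on the block list (here a generic browser string as an example) should return a normal response, assuming the page is publicly accessible:

Code Block
curl -I https://www.domain.tld -A "Mozilla/5.0"

HTTP/2 200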
