Preventing Site Cloning and DoS with Fail2Ban

This one came up as a result of a DoS attack last week, when a site I administer was hit with page requests at a rate of two or three a second for a couple of hours. The same principle applies to sites being cloned by a recursive wget or some other scraping tool.

The usual protections for this centre on Fail2Ban and Apache’s mod_evasive. Both require you to define an allowable limit for repeated downloads. I chose to go with F2B because it’s already in use elsewhere on the blog in question, and after a bit of fiddling I settled on 30 page downloads per minute as an acceptable limit – that’s full pages, not all the elements that make up a page (CSS, JS, images).

Here’s the jail.local entry:

[apache-dos]
action   = %(action_mwl)s
enabled  = true
port     = http,https
filter   = apache-dos
logpath  = /var/log/apache*/*access.log
# more than 30 matches (maxretry) within 60 seconds (findtime)
# earns a 3600-second (one hour) ban
maxretry = 30
findtime = 60
bantime  = 3600
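
With that in place, reload Fail2Ban and check that the new jail is live – fail2ban-client ships with Fail2Ban, and the status output also lists currently banned addresses, which is handy while tuning the limit:

fail2ban-client reload
fail2ban-client status apache-dos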

I use a custom action that sends the ban notification via sendEmail rather than plain old sendmail, and bans the visitor for one hour.
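
For anyone wanting to do something similar, here’s a minimal sketch of such a notification action – the file name, addresses and SMTP host are placeholders, and you’d still pair it with a banning action (the stock action_mwl interpolation used above combines the ban with mail and whois reporting):

/etc/fail2ban/action.d/sendemail-notify.conf

[Definition]

actionstart =
actionstop =
actioncheck =

# <ip> and <failures> are standard Fail2Ban tags, substituted at ban time
actionban = sendEmail -f fail2ban@example.com -t admin@example.com -s localhost:25 -u "[Fail2Ban] apache-dos: banned <ip>" -m "Banned <ip> after <failures> matching requests."

actionunban =

The filter looks like this: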

/etc/fail2ban/filter.d/apache-dos.conf

# Fail2Ban configuration file
#
# Author: Mark White
#
# $Revision: 728 $
#

[Definition]

# Option:  failregex
# Notes.:  match any full-page GET request (URL ending in /);
#          <HOST> is the tag Fail2Ban uses to capture the client IP
#
failregex = ^<HOST> -.*"GET.*/ HTTP.*"

# Option:  ignoreregex
# Notes.:  regex to ignore. If this regex matches, the line is ignored.
# Values:  TEXT
#
ignoreregex =
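
Before relying on the jail, it’s worth dry-running the filter against a real log with the fail2ban-regex tool that ships with Fail2Ban – it reports how many lines match, so you can confirm that CSS, JS and image requests aren’t being counted (substitute whatever your logpath glob actually expands to):

fail2ban-regex /var/log/apache2/access.log /etc/fail2ban/filter.d/apache-dos.conf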

So this only tracks full page requests – the trailing slash before "HTTP" in the regex means requests for JS, CSS, images and the other elements that make up the full page response don’t match. It seems to be working fine, and even though it’s matching against the live access.log, the fail2ban process is ticking along at about 0.3% CPU.
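
One thing to watch with a limit like this is false positives – an enthusiastic but legitimate visitor, or your own monitoring. A ban can be lifted by hand with fail2ban-client, and known-good addresses excluded up front with the standard ignoreip jail option (the address below is a placeholder):

fail2ban-client set apache-dos unbanip 203.0.113.5

and in the jail entry:

ignoreip = 127.0.0.1/8 203.0.113.5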