
I would like to allow each IP to use up to, say, 1GB of traffic per day, and if that limit is exceeded, drop all requests from that IP until the next day. However, a simpler solution where the connection is dropped after a certain number of requests would suffice.

Is there already some sort of module that can do this? Or perhaps I can achieve this through something like iptables?

Thanks

4 Answers


This is my iptables solution for this kind of issue. Adjust --seconds and --hitcount as you need, as well as the iptables chain (I use FORWARD here).

iptables -A FORWARD -m state --state NEW -m recent --rcheck --seconds 600 --hitcount 5 --name ATACK --rsource -j REJECT --reject-with icmp-port-unreachable
iptables -A FORWARD -d 192.168.0.113/32 -o eth1 -p tcp -m tcp --dport 80 -m recent --set --name ATACK --rsource -j ACCEPT

Explained:

  1. iptables checks whether the source IP is listed in /proc/net/ipt_recent/ATACK with 5 or more hits in a 600-second interval and whether it is a NEW connection. If so, it rejects the packet; otherwise

  2. iptables checks whether the request is destined for port 80. If so, it records the IP and timestamp in /proc/net/ipt_recent/ATACK and forwards the packet.

It's working fine for my needs.
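For the per-day cap in the original question, the hashlimit match may be a closer fit than recent, since it keeps a per-source rate counter. A sketch with illustrative numbers (note that hashlimit enforces an average rate with a burst allowance, not an exact daily counter):

```shell
# Drop NEW connections from any source IP exceeding an average of
# 10000 requests/day to port 80 (rate, name, and chain are illustrative).
iptables -A INPUT -p tcp --dport 80 -m state --state NEW \
    -m hashlimit --hashlimit-name webquota --hashlimit-mode srcip \
    --hashlimit-above 10000/day -j DROP
```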

GregL
Gustavo Feijo

If you want a pure Apache solution, there is bw_mod for Apache 2.0 and mod_bandwidth for Apache 1.3. They can throttle the bandwidth of your server to limit bandwidth usage.

There is also mod_limitipconn, which prevents one user from making lots of connections to your server. mod_cband is another option, but I have never used it.

If you don't want to mess with your Apache installation you can put a Squid proxy in front of Apache. It also gives you more control over the throttling.
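Squid's delay pools are the usual mechanism for this; a minimal squid.conf sketch, assuming a class 2 pool so the limit applies per client IP (all numbers illustrative):

```
# One delay pool of class 2: no aggregate limit (-1/-1), each client IP
# throttled to ~32 KB/s after a 1 MB burst.
delay_pools 1
delay_class 1 2
delay_parameters 1 -1/-1 32000/1000000
delay_access 1 allow all
```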

However, when you want to limit bandwidth per IP, the problem is in most cases a few large objects, and you want to give a sane error message when a user pulls too much data and you block them. In that case it might be easier to write a PHP script and store the access information in a temporary table in a database.
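As a rougher alternative to the database approach, per-IP transfer can also be totalled from the access log itself. A sketch, assuming the combined log format (the log path and the 1GB quota are illustrative; it prints the iptables commands rather than running them, so review the output and pipe it to sh as root):

```shell
#!/bin/sh
# Sum bytes served per client IP from a combined-format access log
# (field 10 = response size) and emit an iptables DROP rule for each
# IP over the quota. Clear the rules from cron at midnight to reset
# the daily quota.
LOG=${1:-/var/log/apache2/access.log}
QUOTA=$((1024 * 1024 * 1024))   # 1 GB per IP per day

awk -v quota="$QUOTA" '
    $10 ~ /^[0-9]+$/ { bytes[$1] += $10 }
    END { for (ip in bytes) if (bytes[ip] > quota) print ip }
' "$LOG" |
while read -r ip; do
    echo "iptables -I INPUT -s $ip -j DROP"
done
```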

pehrs
  • Thanks for the suggestions. I've already looked into bw_mod and mod_limitipconn, but neither (as far as I can tell) does what I want. mod_limitipconn simply limits them to one connection at a time, and bw_mod only allows me to limit the download rate per IP. I want to block them after a certain amount of data transfer/requests. I'm actually trying to defend against certain users who feel the need to crawl my entire site and download everything. I'll take a look into the Squid proxy; sounds interesting. If that doesn't work out, I think I'll resort to modifying the bw_mod source. –  Apr 11 '10 at 13:52
  • Have you set your robots.txt to disallow spiders? – pehrs Apr 11 '10 at 14:22
  • The problem with robots.txt is that (much like RFC 3514) only nice robots respect it. – Scott Pack Apr 11 '10 at 14:41
  • True, but you will find that the majority of the people spidering your site use standard tools. And many of them, like wget, respect robots.txt. Robots.txt is also the correct way to inform your users that you don't want them to spider. – pehrs Apr 11 '10 at 14:51
  • I've tried that. At first, robots.txt was enough; then they told wget to ignore robots.txt, so I resorted to blocking "unrecognized" user agents, but then they spoofed the user agent. They tend to make a lot of HEAD requests, whereas legitimate browsers do not, so I may look into limiting HEAD requests or disabling them altogether (less desirable). –  Apr 11 '10 at 16:24
  • Squid is probably the next step. That or simply banning them. If they bypass robots.txt I don't see any reason to service them. – pehrs Apr 11 '10 at 21:25

Have you looked at a tool like fail2ban? It might be a bit heavy-handed for you, but it lets you limit the number of requests any given IP is allowed. It works by scanning the logs, and you set rules for how many violations per time period are allowed, which for you might be requests per day. Once an IP goes over that, fail2ban can do things like block it using iptables.

I've used it very successfully to block DDoS attacks against a mail server. It can consume a significant amount of processor power, though.
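A jail for this case might look roughly like the sketch below; the apache-quota filter name is hypothetical (you would need to write a failregex matching every request line in the access log), and the limits are illustrative:

```
# /etc/fail2ban/jail.local (sketch)
[apache-quota]
enabled  = yes
filter   = apache-quota
logpath  = /var/log/apache2/access.log
# 10000 matched requests within 24 hours => ban for 24 hours
findtime = 86400
maxretry = 10000
bantime  = 86400
action   = iptables-allports
```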


Try mod_dosevasive or mod_security.

mod_dosevasive can be configured to ban an IP after a specified number of page requests to a site in a specified time frame.
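An httpd.conf sketch of that configuration (all thresholds illustrative; the module registers as mod_evasive20.c in the Apache 2.x builds I've seen):

```
<IfModule mod_evasive20.c>
    # Block a client requesting the same page 10 times within 1 second,
    # or making 100 requests site-wide within 1 second...
    DOSPageCount      10
    DOSPageInterval   1
    DOSSiteCount      100
    DOSSiteInterval   1
    # ...and keep the block in place for a day (value in seconds)
    DOSBlockingPeriod 86400
</IfModule>
```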

Lucas Kauffman