limit_req per subnet?
Grant
emailgrant at gmail.com
Wed Dec 14 21:58:06 UTC 2016
> I'm no fail2ban guru. Trust me. I'd suggest going on serverfault. But my other post indicates semrush resides on AWS, so just block AWS. I doubt there is any harm in blocking AWS since no major search engine uses them.
>
> Regarding search engines, the reality is only Google matters. Just look at your logs. That said, I allow Google, Yahoo, and Bing. But Yahoo/Bing isn't even 5% of Google's traffic. Everything else I block. Majestic (MJ12) is just ridiculous. I allow the anti-virus companies to poke around, though I can't figure out what exactly their probes accomplish. Often Intel/McAfee just pings the server, perhaps to survey hosting software and revision. Good advertising for nginx!
I would really prefer not to block cloud services. It sounds like an
admin headache down the road.
nginx limit_req works great against a single-IP attacker, but all it
takes is 3 IPs for an attacker to triple his allowable rate, even if
the IPs are sequential. I'm surprised there's no way to combat this.
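
Something along these lines is what I was hoping limit_req could do,
i.e. key the zone on the /24 instead of the full address (untested
sketch, IPv4 only; the zone name, rate, and burst are placeholders):

    # in the http{} context
    # collapse e.g. 192.0.2.57 -> 192.0.2 so a whole /24 shares one key
    map $remote_addr $subnet24 {
        default                           $remote_addr;
        "~^(?P<net>\d+\.\d+\.\d+)\.\d+$"  $net;
    }
    limit_req_zone $subnet24 zone=per_subnet:10m rate=10r/s;

    # in the server/location being protected
    limit_req zone=per_subnet burst=20 nodelay;

With something like that, the 3 sequential IPs would still share one
bucket instead of tripling the rate.
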
- Grant
>> Did you see if the IPs were from an ISP? If not, I'd ban the service, using the Hurricane Electric BGP toolkit as a guide. At a minimum, you should be blocking the major cloud services, especially OVH. They offer free trial accounts, so of course the hackers abuse them.
>
>
> What sort of sites run into problems after doing that? I'm sure some
> sites need to allow cloud services to access them. A startup search
> engine could be run from such a service.
>
>
>> If the attack was from an ISP, I can visualize a fail2ban scheme that blocks on the last octet not being too hard to implement, that is, block xxx.xxx.xxx.0/24. Or maybe just let a typical fail2ban setup do your limiting and don't get fancy about the IP range.
>>
>> I try "traffic management" at the firewall first. As I discovered with "deny" in nginx, much CPU work is still done prior to ignoring the request. (I don't recall the details exactly, but there is a thread I started on the topic in this list.) Better to block via the firewall since you will be running one anyway.
>
>
> It sounds like limit_req in nginx does not have any way to do this.
> How would you accomplish this in fail2ban?
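
The closest thing I can picture is a custom fail2ban action that
widens the ban to the /24, roughly like this (untested sketch; the
file name is made up and the exact action syntax may differ between
fail2ban versions):

    # /etc/fail2ban/action.d/iptables-subnet24.conf
    [Definition]
    actionstart =
    actionstop  =
    actioncheck =
    actionban   = iptables -I INPUT -s <ip>/24 -j DROP
    actionunban = iptables -D INPUT -s <ip>/24 -j DROP

paired with whatever filter already matches the flood in the nginx
error log. Still, I'd prefer to solve this inside limit_req if it
can be done.
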
>
>
>> I recently suffered a DoS from a series of 10 sequential IP addresses.
>> limit_req would have dealt with the problem if a single IP address had
>> been used. Can it be made to work in a situation like this, where a
>> series of sequential IP addresses is in play? Maybe per subnet?