Slow down, but not stop, serving pages to bots that don't respect the robots.txt crawl delay
rastrano
nginx-forum at nginx.us
Thu Aug 11 15:37:22 UTC 2011
Hi all.
I want to apply nginx's awesome request rate limiting differently per client, in order to force bots to respect my directives (but not to block them), so I created these limit_req_zone entries:
limit_req_zone $binary_remote_addr zone=antiddosspider:1m rate=1r/m;
limit_req_zone $binary_remote_addr zone=antiddosphp:1m rate=1r/s;
limit_req_zone $binary_remote_addr zone=antiddosstatic:1m rate=10r/s;
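For reference, the delay I'm trying to enforce is the one in my robots.txt (Crawl-delay is not part of the original robots.txt standard, which I suppose is why some bots ignore it), something like:

User-agent: *
Crawl-delay: 60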
Now I ask: is it possible to configure something like this?
if ( $http_user_agent ~* (?:bot|spider) ) {
    limit_req zone=antiddosspider burst=1;
}

location / {
    limit_req zone=antiddosphp burst=100;
    proxy_pass http://localhost:8181;
    include /etc/nginx/proxy.conf;
}
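If limit_req is not allowed inside an if block, I was also wondering whether a map could do the trick instead: key the spider zone on a variable that is empty for normal visitors, since, if I understand the docs correctly, requests with an empty key are not counted against the zone. Just a sketch, untested (and I'm not sure whether two limit_req directives are allowed in the same location):

# in the http{} block
map $http_user_agent $spider_addr {
    default              "";
    ~*(?:bot|spider)     $binary_remote_addr;
}

# an empty $spider_addr means the request is never counted in this zone
limit_req_zone $spider_addr        zone=antiddosspider:1m rate=1r/m;
limit_req_zone $binary_remote_addr zone=antiddosphp:1m   rate=1r/s;

location / {
    limit_req zone=antiddosspider burst=1;    # bots: 1 page per minute
    limit_req zone=antiddosphp    burst=100;  # everybody else: 1 request per second
    proxy_pass http://localhost:8181;
    include /etc/nginx/proxy.conf;
}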
I don't know every spider that is crawling my site, and I don't want to lose the chance of being indexed by the ones that are not malware! ...But one page per minute, please ;)
Best regards,
Stefano
Posted at Nginx Forum: http://forum.nginx.org/read.php?2,213704,213704#msg-213704