location /robots.txt conflict / issue
c0nw0nk
nginx-forum at forum.nginx.org
Thu Sep 29 17:09:47 UTC 2016
So this is one of those issues that is most likely just a bad configuration: my
robots.txt file is returning a 404 because of another location block. I am
disallowing access to any text files, but I do want to allow only robots.txt
to be accessed.
location /robots.txt {
    root 'location/to/robots/txt/file';
}
# This is to stop people digging into any directories looking for
# files that only PHP etc. should read or serve
location ~* \.(txt|ini|xml|zip|rar)$ {
    return 404;
}
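From what I have read about location matching, that regex location gets picked
ahead of my /robots.txt prefix location, which would explain the 404. If that
is right, I am guessing the ^~ modifier would stop the regex check for that
URI. An untested sketch (the root path is just a placeholder and would have to
be the directory holding robots.txt, not the file itself):

location ^~ /robots.txt {
    # ^~ should stop nginx from checking the regex locations once this prefix matches
    root /path/to/directory/containing/robotstxt;  # placeholder directory
}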
I have not tested it, but I believe I could also fix the above by changing my
config to this.
location /robots.txt {
    allow all;
    root 'location/to/robots/txt/file';
}
# This is to stop people digging into any directories looking for
# files that only PHP etc. should read or serve
location ~* \.(txt|ini|xml|zip|rar)$ {
    deny all;
}
Is that the only possible way to deny access to all text files apart from
robots.txt?
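Or would an exact-match location be another option? I believe exact matches
are checked before any regex locations. Another untested sketch with a
placeholder path:

location = /robots.txt {
    # exact match should win before the regex location below is even considered
    root /path/to/directory/containing/robotstxt;  # placeholder directory
}

location ~* \.(txt|ini|xml|zip|rar)$ {
    deny all;
}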
Posted at Nginx Forum: https://forum.nginx.org/read.php?2,269960,269960#msg-269960