Serving an alternate robots.txt for SSL requests.
Nick Pearson
nick.pearson at gmail.com
Wed Jan 7 18:23:26 MSK 2009
Hi Juan,
Try using two server blocks -- one for http and one for https. Nginx picks
the server block based on the port the request arrives on. Something
like this:
server {
    listen      80;   # plain http
    server_name www.yoursite.com;
    [...]
    # http clients get the normal robots.txt; "break" stops rewrite processing here
    location /robots.txt {
        break;
    }
}
server {
    listen      443;  # https
    server_name www.yoursite.com;
    [...]
    # https clients get robots_ssl.txt instead of the regular file
    location /robots.txt {
        rewrite (.*) /robots_ssl.txt;
    }
}
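Since the point is to keep crawlers away from pages served over SSL,
robots_ssl.txt would presumably just disallow everything. A minimal sketch
(assuming a blanket disallow is what you want; adjust the rules to taste):

# robots_ssl.txt -- returned for /robots.txt requests on port 443
User-agent: *
Disallow: /

Put it in the same document root the https server block uses, since the
rewrite above only changes the URI, not where nginx looks for the file.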
On Wed, Jan 7, 2009 at 9:06 AM, Juan Fco. Giordana <juangiordana at gmail.com> wrote:
> Hello list,
>
> I'm using nginx-0.6.34 with the try_files patch applied, and I'm trying to
> serve an alternate robots.txt for requests on port 443 so that pages served
> over secure connections are not indexed by web crawlers.
>
> I've tried many different approaches and couldn't get any of them to work
> as I expected:
>
> if ($server_port = 443)
> if ($remote_port = 443)
> if ($scheme = https)
> With and without the location block.
> if () blocks inside @try_files rule.
> redirect flags: break, last, permanent.
> All rewrite rules disabled except the one in question.
>
> server {
>     [...]
>     location /robots.txt {
>         if ($server_port = 443) {
>             rewrite ^robots\.txt$ robots_ssl.txt last;
>         }
>     }
>     [...]
> }
>
> Most of these approaches always returned robots.txt for both SSL and non-SSL
> requests, while others returned a 404 error under SSL. None of them returned
> robots_ssl.txt.
>
> Am I doing something wrong?
>
> Thanks!
>
>