Serving an alternate robots.txt for SSL requests.

Juan Fco. Giordana juangiordana at gmail.com
Wed Jan 7 18:06:00 MSK 2009


Hello list,

I'm using nginx-0.6.34 with the try_files patch applied, and I'm trying 
to serve an alternate robots.txt for requests on port 443 so that pages 
served over secure connections are not indexed by web crawlers.

I've tried many different approaches and couldn't get any of them to 
work as I expected:

  if ($server_port = 443)
  if ($remote_port = 443)
  if ($scheme = https)
  With and without the location block.
  if () blocks inside the @try_files location.
  rewrite flags: break, last, permanent.
  All rewrite rules disabled except the one in question.

server {
     [...]
     location /robots.txt {
         if ($server_port = 443) {
             rewrite ^robots\.txt$ robots_ssl.txt last;
         }
     }
     [...]
}

Most of these approaches always returned robots.txt on both SSL and 
non-SSL requests, while others returned a 404 error under SSL. None of 
them served robots_ssl.txt.
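
For reference, this is the variant I would expect to work if the rewrite 
pattern is matched against the full request URI (leading slash included), 
so the expression has to be ^/robots\.txt$ and the replacement has to 
start with a slash as well:

location = /robots.txt {
    if ($scheme = https) {
        # The URI carries its leading slash, so match "/robots.txt"
        # and rewrite to an absolute internal URI.
        rewrite ^/robots\.txt$ /robots_ssl.txt last;
    }
}

The exact-match location (=) is only there so the block applies to 
nothing but /robots.txt; robots_ssl.txt is assumed to sit next to 
robots.txt in the document root.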

Am I doing something wrong?
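
The only other layout I can think of is giving the HTTPS side its own 
server {} block and overriding /robots.txt there, without any if at all; 
roughly (the alias path is made up):

server {
    listen 443;
    ssl on;
    [...]
    location = /robots.txt {
        # Serve the alternate file directly from the SSL server block.
        alias /var/www/example/robots_ssl.txt;
    }
}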

Thanks!




