loading a different robots.txt file for a different sub domain? (Ray)

Igor Sysoev igor at sysoev.ru
Sat May 22 10:03:26 MSD 2010


On Sat, May 22, 2010 at 10:12:45AM +0800, zhys99 wrote:

> load different robots.txt files in the same "server" block
>  
>    location ~ /robots.txt
>      {
>        if ($host = 'first.domain') {
>          rewrite ^/robots\.txt /path/to/another/robots.txt last;
>        }
>      }

This is a highly unrecommended way to handle different servers.

I use it only once, on a site with more than a hundred locations
that has some unofficial but public names, to prevent crawling:

        location = /robots.txt {
            if ($http_host ~* ^.......$) {
                root   /data/w3;
            }
        }

Otherwise these sites should be identical.
If you have fewer sites or they have more differences, you should
do it the right way, with separate server blocks:

     server {
        ...
     }

     server {
        ...
     }
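
For example, a minimal sketch of this approach (the server names and
robots.txt paths below are only placeholders, not taken from the
original setup):

     # default virtual host, e.g. the official www name
     server {
        listen 80 default;
        server_name www.example.com;

        location = /robots.txt {
            alias /data/w3/robots.www.txt;
        }
     }

     # a subdomain that should get a different robots.txt
     server {
        listen 80;
        server_name dev.example.com;

        location = /robots.txt {
            alias /data/w3/robots.dev.txt;
        }
     }

Each virtual host then serves its own robots.txt, and no if/rewrite
tricks are needed.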


> ------------------ Original ------------------
> From:  "nginx-request"<nginx-request at nginx.org>;
> Date:  Sat, May 22, 2010 01:33 AM
> To:  "nginx"<nginx at nginx.org>; 
> 
> Subject:  nginx Digest, Vol 7, Issue 47
> 
>  
>  Send nginx mailing list submissions to
> 	nginx at nginx.org
> 
> To subscribe or unsubscribe via the World Wide Web, visit
> 	http://nginx.org/mailman/listinfo/nginx
> or, via email, send a message with subject or body 'help' to
> 	nginx-request at nginx.org
> 
> You can reach the person managing the list at
> 	nginx-owner at nginx.org
> 
> When replying, please edit your Subject line so it is more specific
> than "Re: Contents of nginx digest..."
> 
> 
> Today's Topics:
> 
>    1. Re: loading a different robots.txt file for a different sub
>       domain? (Ray)
>    3. Re: loading a different robots.txt file for a different sub
>       domain? (Ilan Berkner)
>    4. Re: loading a different robots.txt file for a different sub
>       domain? (Igor Sysoev)
>    5. nginx 0day exploit for nginx + fastcgi PHP (Avleen Vig)
>    6. Re: nginx 0day exploit for nginx + fastcgi PHP (Avleen Vig)
>    7. Re: nginx 0day exploit for nginx + fastcgi PHP (Michael Shadle)
>    9. Re: nginx 0day exploit for nginx + fastcgi PHP (Igor Sysoev)
> 
> 
> ----------------------------------------------------------------------
> 
> Message: 1
> Date: Fri, 21 May 2010 21:53:45 +0800
> From: Ray <gunblad3 at gmail.com>
> To: nginx at nginx.org
> Cc: nginx at sysoev.ru
> Subject: Re: loading a different robots.txt file for a different sub
> 	domain?
> Message-ID:
> 	<AANLkTilS4qcF-LEDsGXyml1zu2YspTj6wSrRQHGq48-N at mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
> 
> Yes.
> 
> server {
>     listen 80 default;
>     server_name www;
> 
>     location /robots.txt {
>         alias /path/to/the/file1;
>     }
> }
> 
> server {
>     listen 80;
>     server_name server2;
> 
>     location /robots.txt {
>         alias /path/to/the/file2;
>     }
> }
> 
> 
> Ray.
> 
> 
> On Fri, May 21, 2010 at 9:43 PM, Ilan Berkner <iberkner at gmail.com> wrote:
> > Hi All,
> > We have 2 sub-domain groups setup for processing incoming requests:
> > 1. "server2"
> > 2. all others, for example: "www"
> > The 2 sub-domains share the same directory for delivery of static files
> > (html, images, swf, etc.) but use different PHP backends.
> > Is there a way, using nginx configuration to load a different robots.txt
> > file when requested for one group vs. the other?
> > Thanks!
> >
> >
> > _______________________________________________
> > nginx mailing list
> > nginx at nginx.org
> > http://nginx.org/mailman/listinfo/nginx
> >
> >
> 
> 
> 
> ------------------------------
> 
> Message: 3
> Date: Fri, 21 May 2010 10:03:10 -0400
> From: Ilan Berkner <iberkner at gmail.com>
> To: nginx at nginx.org
> Subject: Re: loading a different robots.txt file for a different sub
> 	domain?
> Message-ID:
> 	<AANLkTikoFCyB0tcDBCeXAeO-fhYM290irFASrg9h90Q1 at mail.gmail.com>
> Content-Type: text/plain; charset="iso-8859-1"
> 
> Worked like a charm, thanks!
> 
> 
> On Fri, May 21, 2010 at 9:53 AM, Ray <gunblad3 at gmail.com> wrote:
> 
> > Yes.
> >
> > server {
> >    listen 80 default;
> >    server_name www;
> >
> >    location /robots.txt {
> >        alias /path/to/the/file1;
> >    }
> > }
> >
> > server {
> >    listen 80;
> >    server_name server2;
> >
> >    location /robots.txt {
> >        alias /path/to/the/file2;
> >    }
> > }
> >
> >
> > Ray.
> >
> >
> > On Fri, May 21, 2010 at 9:43 PM, Ilan Berkner <iberkner at gmail.com> wrote:
> > > Hi All,
> > > We have 2 sub-domain groups setup for processing incoming requests:
> > > 1. "server2"
> > > 2. all others, for example: "www"
> > > The 2 sub-domains share the same directory for delivery of static files
> > > (html, images, swf, etc.) but use different PHP backends.
> > > Is there a way, using nginx configuration to load a different robots.txt
> > > file when requested for one group vs. the other?
> > > Thanks!
> > >
> > >
> > > _______________________________________________
> > > nginx mailing list
> > > nginx at nginx.org
> > > http://nginx.org/mailman/listinfo/nginx
> > >
> > >
> >
> > _______________________________________________
> > nginx mailing list
> > nginx at nginx.org
> > http://nginx.org/mailman/listinfo/nginx
> >
> -------------- next part --------------
> An HTML attachment was scrubbed...
> URL: <http://nginx.org/pipermail/nginx/attachments/20100521/dc7ff991/attachment-0001.html>
> 
> ------------------------------
> 
> Message: 4
> Date: Fri, 21 May 2010 18:24:01 +0400
> From: Igor Sysoev <igor at sysoev.ru>
> To: nginx at nginx.org
> Subject: Re: loading a different robots.txt file for a different sub
> 	domain?
> Message-ID: <20100521142401.GD72328 at rambler-co.ru>
> Content-Type: text/plain; charset=koi8-r
> 
> On Fri, May 21, 2010 at 10:03:10AM -0400, Ilan Berkner wrote:
> 
> > Worked like a charm, thanks!
> 
> You may also use:
>    location = /robots.txt {
>  
> > On Fri, May 21, 2010 at 9:53 AM, Ray <gunblad3 at gmail.com> wrote:
> > 
> > > Yes.
> > >
> > > server {
> > >    listen 80 default;
> > >    server_name www;
> > >
> > >    location /robots.txt {
> > >        alias /path/to/the/file1;
> > >    }
> > > }
> > >
> > > server {
> > >    listen 80;
> > >    server_name server2;
> > >
> > >    location /robots.txt {
> > >        alias /path/to/the/file2;
> > >    }
> > > }
> > >
> > >
> > > Ray.
> > >
> > >
> > > On Fri, May 21, 2010 at 9:43 PM, Ilan Berkner <iberkner at gmail.com> wrote:
> > > > Hi All,
> > > > We have 2 sub-domain groups setup for processing incoming requests:
> > > > 1. "server2"
> > > > 2. all others, for example: "www"
> > > > The 2 sub-domains share the same directory for delivery of static files
> > > > (html, images, swf, etc.) but use different PHP backends.
> > > > Is there a way, using nginx configuration to load a different robots.txt
> > > > file when requested for one group vs. the other?
> > > > Thanks!
> > > >
> > > >
> > > > _______________________________________________
> > > > nginx mailing list
> > > > nginx at nginx.org
> > > > http://nginx.org/mailman/listinfo/nginx
> > > >
> > > >
> > >
> > > _______________________________________________
> > > nginx mailing list
> > > nginx at nginx.org
> > > http://nginx.org/mailman/listinfo/nginx
> > >
> 
> > _______________________________________________
> > nginx mailing list
> > nginx at nginx.org
> > http://nginx.org/mailman/listinfo/nginx
> 
> 
> -- 
> Igor Sysoev
> http://sysoev.ru/en/
> 
> 
> 
> ------------------------------
> 
> Message: 5
> Date: Fri, 21 May 2010 10:07:00 -0700
> From: Avleen Vig <avleen at gmail.com>
> To: nginx at sysoev.ru
> Subject: nginx 0day exploit for nginx + fastcgi PHP
> Message-ID:
> 	<AANLkTilDMa5NUSbwpGgBn3TDeG46ft-2fPwU9obxN8hA at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
> 
> This is currently doing the rounds, so I thought it pertinent to post
> it here too.
> 
> http://www.webhostingtalk.com/showthread.php?p=6807475#post6807475
> 
> I don't know what nginx should do to fix this, but there are two
> workarounds given.
> If you allow file uploads (especially things like images) and use PHP
> FastCGI in the back end, you should take a look at this now.
> The exploit allows any arbitrary uploaded file to be
> executed as PHP.
> 
> 
> 
> ------------------------------
> 
> Message: 6
> Date: Fri, 21 May 2010 10:27:14 -0700
> From: Avleen Vig <avleen at gmail.com>
> To: nginx at sysoev.ru
> Subject: Re: nginx 0day exploit for nginx + fastcgi PHP
> Message-ID:
> 	<AANLkTik8cJNceX3z-E7NLs4ZYEZ11Y51XUETLVO_7MAA at mail.gmail.com>
> Content-Type: text/plain; charset=ISO-8859-1
> 
> On Fri, May 21, 2010 at 10:07 AM, Avleen Vig <avleen at gmail.com> wrote:
> > This is currently doing the rounds, so I thought it pertinent to post
> > it here too.
> >
> > http://www.webhostingtalk.com/showthread.php?p=6807475#post6807475
> >
> > I don't know what nginx should do to fix this, but there are two
> > workarounds given.
> > If you allow file uploads (especially things like images) and use PHP
> > FastCGI in the back end, you should take a loot at this now.
> > The exploit allows for any arbitrary file which is uploaded, to be
> > executed as PHP.
> 
> I should add that this isn't a bug in the traditional broken-code sense.
> It's more that this is a gaping configuration hole which is now widely
> published and could lead to many people being exploited.
> 
> 
> 
> ------------------------------
> 
> Message: 7
> Date: Fri, 21 May 2010 10:28:16 -0700
> From: Michael Shadle <mike503 at gmail.com>
> To: nginx at nginx.org
> Cc: nginx at sysoev.ru
> Subject: Re: nginx 0day exploit for nginx + fastcgi PHP
> Message-ID:
> 	<AANLkTimrR_aMpB2hhatNaDRkixYO-8LQDOmw5zPOoRrH at mail.gmail.com>
> Content-Type: text/plain; charset=UTF-8
> 
> The question is: what functionality is lost by setting
> 
> cgi.fix_pathinfo = 0
> 
> Looks like the other workaround is something like this:
> 
> if ( $fastcgi_script_name ~ \..*\/.*php ) {
>  return 403;
> }
> 
> Which is basically saying what, exactly? That if there is a period and a slash
> somewhere prior to the last "filename", return a 403?
> 
> Ideally, while this is being thought out, it would be cool to fix the
> common "no input file specified" issue that a lot of people have -
> have it return a 404 instead. Not sure if it's a simple php.ini change
> (perhaps the path info?) or a change to fastcgi_param REDIRECT_STATUS 200?
> 
> 
> On Fri, May 21, 2010 at 10:07 AM, Avleen Vig <avleen at gmail.com> wrote:
> > This is currently doing the rounds, so I thought it pertinent to post
> > it here too.
> >
> > http://www.webhostingtalk.com/showthread.php?p=6807475#post6807475
> >
> > I don't know what nginx should do to fix this, but there are two
> > workarounds given.
> > If you allow file uploads (especially things like images) and use PHP
> > FastCGI in the back end, you should take a loot at this now.
> > The exploit allows for any arbitrary file which is uploaded, to be
> > executed as PHP.
> >
> > _______________________________________________
> > nginx mailing list
> > nginx at nginx.org
> > http://nginx.org/mailman/listinfo/nginx
> >
> 
> 
> 
> ------------------------------
> 
> Message: 9
> Date: Fri, 21 May 2010 21:33:02 +0400
> From: Igor Sysoev <igor at sysoev.ru>
> To: nginx at nginx.org
> Subject: Re: nginx 0day exploit for nginx + fastcgi PHP
> Message-ID: <20100521173302.GF72328 at rambler-co.ru>
> Content-Type: text/plain; charset=koi8-r
> 
> On Fri, May 21, 2010 at 10:07:00AM -0700, Avleen Vig wrote:
> 
> > This is currently doing the rounds, so I thought it pertinent to post
> > it here too.
> > 
> > http://www.webhostingtalk.com/showthread.php?p=6807475#post6807475
> > 
> > I don't know what nginx should do to fix this, but there are two
> > workarounds given.
> > If you allow file uploads (especially things like images) and use PHP
> > FastCGI in the back end, you should take a loot at this now.
> > The exploit allows for any arbitrary file which is uploaded, to be
> > executed as PHP.
> 
> I do not see why this is treated as an nginx bug?
> Why is anyone able to upload images to the /scripts directory at all?
> Why does PHP have the cgi.fix_pathinfo option?
> BTW, I'm just curious how lighttpd resolves this issue?
> 
> Also instead of
> 
> if ( $fastcgi_script_name ~ \..*\/.*php ) {
>     return 403;
> }
> 
> it should be worked around as
> 
> location ~ \..*/.*\.php$ {
>     return 403;
> }
> 
> location ~ \.php$ {
>     return 403;
> }
> 
> 
> -- 
> Igor Sysoev
> http://sysoev.ru/en/
> 
> 
> 
> ------------------------------
> 
> _______________________________________________
> nginx mailing list
> nginx at nginx.org
> http://nginx.org/mailman/listinfo/nginx
> 
> 
> End of nginx Digest, Vol 7, Issue 47
> ************************************

> _______________________________________________
> nginx mailing list
> nginx at nginx.org
> http://nginx.org/mailman/listinfo/nginx


-- 
Igor Sysoev
http://sysoev.ru/en/


