Allow internal redirect to URI x, but deny external request for x?

j94305 nginx-forum at
Tue Sep 10 18:46:54 UTC 2019

Robots exclusion is generally quite unreliable, and exclusions based on user
agents are not much better. You can apply every robots-exclusion option
available and still get undesired crawlers on your site.

The only reliable way to keep robots out is to require authentication for the
parts of the site you don't want crawled.
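A minimal sketch of that approach with nginx's standard auth_basic module; the /private/ prefix, hostname, and password-file path are placeholders, not anything from the original post:

```nginx
server {
    listen 80;
    server_name example.com;   # placeholder hostname

    # Anything under /private/ demands credentials; crawlers
    # without them get a 401 and cannot index the content.
    location /private/ {
        auth_basic           "Restricted";
        auth_basic_user_file /etc/nginx/.htpasswd;  # e.g. created with htpasswd
    }
}
```

Unlike robots.txt or user-agent checks, this enforces the restriction server-side regardless of whether the client honors any convention.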


Posted at Nginx Forum:,285463,285599#msg-285599
