Allow internal redirect to URI x, but deny external request for x?
nginx-forum at forum.nginx.org
Tue Sep 10 18:46:54 UTC 2019
Robots exclusion (robots.txt) is generally unreliable, and exclusions based on
User-Agent matching are no better: you can put every exclusion option in place
and still get undesired crawlers on your site, since compliance is entirely
voluntary. The only way you can reliably keep robots out is to require
authentication for the parts you don't want crawled.
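As a minimal sketch, requiring HTTP basic authentication for one section of a site might look like the following. The location path `/private/` and the htpasswd file path are assumptions for illustration; adjust them to your layout.

```nginx
# Hypothetical example: protect /private/ with basic auth so
# crawlers (which cannot authenticate) are denied with 401.
location /private/ {
    auth_basic           "Restricted";
    # Assumed path; create with: htpasswd -c /etc/nginx/.htpasswd user
    auth_basic_user_file /etc/nginx/.htpasswd;
}
```

Any request without valid credentials, including one from a crawler that ignores robots.txt, receives a 401 response and never sees the content.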
Posted at Nginx Forum: https://forum.nginx.org/read.php?2,285463,285599#msg-285599