Problems with large object sets?
David Miller
dmiller at metheus.org
Wed Apr 23 15:39:02 UTC 2014
I’m having trouble with an nginx setup built to serve search engines.
Based on the user agent, all bots are served only from cache. We populate the cache with our own set of spiders so we can control the overall load.
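For context, the routing is roughly along these lines; the user-agent pattern, zone name, and upstream below are simplified stand-ins rather than our exact config:

    map $http_user_agent $human {
        default                            1;
        ~*(googlebot|bingbot|yandex)       0;
    }

    server {
        location / {
            proxy_cache        my-cache;
            proxy_cache_key    $scheme$proxy_host$request_uri;  # the default key
            proxy_cache_bypass $human;   # humans always go to the backend
            proxy_no_cache     $human;   # and their responses are not stored
            proxy_pass         http://backend;
        }
    }

So bot traffic (including our spiders) is answered from and populates the cache, while human traffic goes straight through.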
Total cache size is ~450 GB in ~12 million files.
The problem is that about a third of the live bot requests are misses, even though our own spider fetched the same page only an hour earlier.
The configured limits should be safe:
proxy_cache_path /var/www/cache levels=1:2 keys_zone=my-cache:2500m max_size=800000m inactive=800h;
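As far as I can tell, those numbers check out: the docs say one megabyte of keys_zone stores about eight thousand keys, so 2500m should hold roughly 20 million keys against our ~12 million files; max_size=800000m is about 780 GB against ~450 GB actually on disk; and inactive=800h is about 33 days, far longer than the one-hour gap I'm seeing. So neither the key zone nor the cache manager should be evicting entries.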
Where should I be looking to find out why these requests were misses?
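So far the only diagnostic I've found is logging $upstream_cache_status, so I can at least tell MISS apart from EXPIRED and BYPASS; something like this (the format name and log path are my own):

    log_format cachestatus '$remote_addr "$request" "$http_user_agent" '
                           'cache=$upstream_cache_status';
    access_log /var/log/nginx/cache.log cachestatus;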
Thanks,
— David