How can the number of parallel/redundant open streams/temp_files be controlled/limited?

Maxim Dounin mdounin at mdounin.ru
Tue Jul 1 16:40:01 UTC 2014


Hello!

On Tue, Jul 01, 2014 at 10:15:47AM -0400, Paul Schlie wrote:

> Then how could multiple streams and corresponding temp_files 
> ever be created upon successive requests for the same $uri with 
> "proxy_cache_key $uri" and "proxy_cache_lock on", if all 
> subsequent requests are locked to the same cache_node created by 
> the first request even prior to its completion?

Quoting documentation, http://nginx.org/r/proxy_cache_lock:

: When enabled, only one request at a time will be allowed to 
: populate a new cache element identified according to the 
: proxy_cache_key directive by passing a request to a proxied 
: server. Other requests of the same cache element will either wait 
: for a response to appear in the cache or the cache lock for this 
: element to be released, up to the time set by the 
: proxy_cache_lock_timeout directive.
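
For reference, a minimal configuration exercising these directives
might look like the following; the cache path, zone name, and
upstream address are only placeholders for illustration:

    proxy_cache_path /var/cache/nginx levels=1:2 keys_zone=my_cache:10m;

    server {
        listen 80;

        location / {
            proxy_pass                http://127.0.0.1:8080;  # illustrative upstream
            proxy_cache               my_cache;
            proxy_cache_key           $uri;
            proxy_cache_lock          on;
            proxy_cache_lock_timeout  5s;   # 5s is the default
        }
    }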

So, there are at least two explicitly documented cases in which 
this can happen "prior to its completion":

1. If the cache lock is released - this happens, e.g., if the 
   response isn't cacheable according to the response headers.

2. If proxy_cache_lock_timeout expires.
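
If the goal is to limit parallel upstream requests caused by 
case 2, one option is to raise the lock timeout above its 5s 
default, for example:

    # waiting requests are held longer before being passed
    # to the proxied server themselves
    proxy_cache_lock_timeout 30s;

The appropriate value depends on how long the upstream normally 
needs to produce a cacheable response.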

-- 
Maxim Dounin
http://nginx.org/


