proxy_cache_lock allows multiple requests to the remote server in some cases
mdounin at mdounin.ru
Fri Jan 29 13:47:21 UTC 2016
On Thu, Jan 28, 2016 at 07:13:11PM -0500, jeeeff wrote:
> My understanding of proxy_cache_lock is that only one request should be
> passed to the proxied server for a given uri, even if many requests from the
> same uri/key are hitting nginx while it is being refreshed.
> When the cache folder specified in proxy_cache_path is empty, it works
> well and behaves as I described above.
> However, if the element in the cache already exists, but is expired
> (according to proxy_cache_valid configuration), all concurrent requests will
> hit the proxied server and the resource will be downloaded multiple times.
That's expected, see http://nginx.org/r/proxy_cache_lock:
: When enabled, only one request at a time will be allowed to
: populate a new cache element identified according to the
: proxy_cache_key directive by passing a request to a proxied
: server.
Note "a new cache element". It does not apply while updating existing
cache elements; that is not implemented.
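To illustrate the case where the lock does apply, here is a minimal
configuration sketch (the zone name "one", the cache path, the upstream
name "backend", and the timings are all illustrative, not taken from the
original question):

```nginx
proxy_cache_path /var/cache/nginx keys_zone=one:10m;

server {
    location / {
        proxy_pass http://backend;
        proxy_cache one;
        proxy_cache_valid 200 10m;

        # Only one request at a time is allowed to populate a *new*
        # cache element; other requests for the same key wait, up to
        # proxy_cache_lock_timeout (5 seconds by default).
        proxy_cache_lock on;
    }
}
```

With this configuration the lock takes effect only while the element is
being created for the first time, which matches the behaviour described
above for an empty cache folder.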
> Using "proxy_cache_use_stale updating" is also not an option since I want
> all requests that are coming simultaneously to wait and use the new resource
> when there is a new one returned from the proxied server.
> Is there something I am doing wrong, or is this the expected behavior? Is
> there a way to do what I am trying to do with nginx?
See above, this is expected behaviour - as of now proxy_cache_lock
does nothing while updating cache elements. Using
"proxy_cache_use_stale updating" is recommended if you need to
reduce concurrency while updating cache elements.
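The recommended workaround can be sketched as follows (again, the zone
name and timings are illustrative assumptions):

```nginx
proxy_cache one;
proxy_cache_valid 200 10m;
proxy_cache_lock on;

# While one request refreshes an expired element, serve the stale
# copy to concurrent requests instead of passing them all upstream.
proxy_cache_use_stale updating;
```

Note the trade-off: concurrent requests get the stale response rather
than waiting for the refreshed one, which is exactly the behaviour the
original poster wanted to avoid.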
If "proxy_cache_use_stale updating" doesn't work for you, you may
try extending "proxy_cache_lock" to also cover updating. The current
implementation doesn't do this, to keep the code simple.