nginx with caching
Maxim Dounin
mdounin at mdounin.ru
Thu Sep 1 13:31:41 UTC 2016
Hello!
On Thu, Sep 01, 2016 at 01:34:39PM +0200, Łukasz Tasz wrote:
> Hi all,
> for some time I've been using nginx as a reverse proxy with caching for
> serving image files.
> It works pretty well, since there is a proxy per location.
>
> But I noticed problematic behaviour: when the cache is empty and a lot of
> requests pop up at the same time, nginx doesn't recognize that all the
> requests are the same and could be fetched from upstream only once and
> served to the rest; instead, all requests are handed over to upstream.
> Side effects:
> - the upstream server rate-limits us, since there are too many connections
> from one client,
> - in some cases there are issues with temp space - not enough space to
> finish all requests.
>
> Any ideas?
> Is this a known problem?
>
> I know the problem can be solved by warming up the caches, but since there
> are a lot of locations, I would like to keep it transparent.
There is the proxy_cache_lock directive to address such use cases,
see http://nginx.org/r/proxy_cache_lock.
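For example, a minimal sketch (the cache zone name "images", the paths,
and the upstream "backend" are just placeholders, not taken from your
configuration):

    proxy_cache_path /var/cache/nginx/images keys_zone=images:10m
                     max_size=10g inactive=60m;

    location /images/ {
        proxy_pass         http://backend;
        proxy_cache        images;
        proxy_cache_valid  200 10m;

        # with the lock enabled, only one request per cache element is
        # passed to the upstream at a time; the other requests wait for
        # it (up to proxy_cache_lock_timeout) and are then served from
        # the newly populated cache
        proxy_cache_lock          on;
        proxy_cache_lock_timeout  5s;
    }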
Additionally, for updating cache items there is
"proxy_cache_use_stale updating", see
http://nginx.org/r/proxy_cache_use_stale.
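In the same location block this would look like (again just a sketch):

        # serve a stale cached response while a new one is being fetched
        proxy_cache_use_stale  updating;

so that, once an element is already cached, its expiry does not trigger
the same burst of upstream requests: one request refreshes the element
while the others get the stale copy.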
--
Maxim Dounin
http://nginx.org/