Serve multiple requests from a single proxy request

Roman Arutyunyan arut at nginx.com
Fri Aug 31 12:07:13 UTC 2018


Hi,

On Fri, Aug 31, 2018 at 05:02:21AM -0400, traquila wrote:
> Hello,
> I'm wondering if nginx is able to serve multiple requests from a single
> proxy request before it completes.
> 
> I am using the following configuration:
> 
>     proxy_cache_lock on;
>     proxy_cache_lock_timeout 5s;
>     proxy_cache ram;
>     proxy_pass myUpstream;
> 
> My upstream uses chunked transfer encoding and serves the request in 10 sec.
> Now if I send 2 requests to nginx, the first one starts responding
> immediately, but the second one starts 5 sec later (lock timeout) and then
> performs a second request to my upstream.

This is all normal considering you have proxy_cache_lock enabled.  Your second
request waits until the first request completes and fills the cache entry,
which is then used to serve the second request.  But the wait is limited by
proxy_cache_lock_timeout (5 seconds in your case), after which the second
request is sent to the upstream itself.
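
If the goal is simply to avoid the second upstream request (rather than to
respond immediately), a lock timeout longer than the upstream response time
should be enough.  A rough sketch based on your snippet (the 15s value and the
http:// prefix on the upstream name are my assumptions):

    proxy_cache               ram;
    proxy_cache_lock          on;
    # keep waiting requests locked longer than the ~10s upstream response time
    proxy_cache_lock_timeout  15s;
    proxy_pass                http://myUpstream;

Note that a request released by the lock timeout is still passed to the
upstream, but its response is not cached.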

> Is there a way to configure nginx to immediately respond to multiple
> requests with a single request to my upstream?

There is no proxy/cache option to enable that behavior.  However, if your
response is big enough, you can try using the slice module:

http://nginx.org/en/docs/http/ngx_http_slice_module.html

With this module you can slice the response into pieces and cache each one
separately.  The same logic as above still applies to each slice, but since
the slices are usually much smaller than the entire response, it looks as if
the response is proxied to multiple clients simultaneously.
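
A minimal sketch of such a configuration, based on the example in the slice
module documentation and adapted to the directive names from your snippet
(the location, the 1m slice size and the http:// prefix on the upstream name
are my assumptions):

    location / {
        slice              1m;
        proxy_cache        ram;
        # each slice is cached under its own key
        proxy_cache_key    $uri$is_args$args$slice_range;
        # ask the upstream for one slice at a time
        proxy_set_header   Range $slice_range;
        # slice subrequests return 206, so it must be cacheable
        proxy_cache_valid  200 206 1h;
        proxy_cache_lock   on;
        proxy_pass         http://myUpstream;
    }

Each slice is fetched from the upstream as a range request (so the upstream
must support byte ranges) and cached under its own key, which lets a later
client be served the slices that are already cached while the remaining ones
are still being downloaded.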

-- 
Roman Arutyunyan

