Serve multiple requests from a single proxy request
nginx-forum at forum.nginx.org
Fri Aug 31 15:29:42 UTC 2018
Thank you for your answer.
This means nginx is, as it stands, not well suited to CMAF and low-latency streaming, since concurrent requests cannot share a single in-flight upstream response.
I tried the slice module and read its code, but it does not cover my needs.
I guess I will have to develop a new proxy module.
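For context, a slice-module setup of the kind suggested below looks roughly like this; the cache zone name, slice size, and cache validity here are illustrative values, not the exact configuration from this thread:

```nginx
# Sketch: cache the upstream response in 1 MB slices.
# "ram" matches the zone name used in the original config;
# the path, slice size, and validity are illustrative.
proxy_cache_path /dev/shm/nginx keys_zone=ram:10m;

server {
    listen 8080;

    location / {
        slice              1m;
        proxy_cache        ram;
        # The slice range must be part of the cache key,
        # so each slice is cached as its own entry.
        proxy_cache_key    $uri$is_args$args$slice_range;
        proxy_set_header   Range $slice_range;
        proxy_cache_valid  200 206 1h;
        proxy_pass         http://myUpstream;
    }
}
```

Note that slice relies on the upstream honoring byte-range requests, which an upstream streaming a chunked response of unknown length generally cannot do; that is consistent with the conclusion above that the module does not fit this use case.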
Roman Arutyunyan Wrote:
> On Fri, Aug 31, 2018 at 05:02:21AM -0400, traquila wrote:
> > Hello,
> > I'm wondering if nginx is able to serve multiple requests from a
> > single proxy request before it completes.
> > I am using the following configuration:
> > proxy_cache_lock on;
> > proxy_cache_lock_timeout 5s;
> > proxy_cache ram;
> > proxy_pass myUpstream;
> > My upstream uses chunked transfer encoding and serves the request in
> 10 sec.
> > Now if I try to send 2 requests to nginx, the first one starts
> > immediately but the second starts 5 sec later (the lock timeout)
> and then
> > performs a second request to my upstream.
> This is all normal considering you have proxy_cache_lock enabled.
> Your second
> request waits until the first request completes and fills up the cache.
> That cache entry is supposed to be used to serve the second request.
> But the lock
> expires after 5 seconds, and the second request then goes to the upstream itself.
> > Is there a way to configure nginx to immediately respond to multiple
> > requests with a single request to my upstream?
> There is no proxy/cache option to enable that behavior. However, if the
> response is big enough, you can try using the slice module:
> With this module you can slice the response into pieces and cache each
> piece separately. While the same logic as above applies to the slices too,
> the fact that they are usually much smaller than the entire response
> makes it look like the response is proxied to multiple clients
> simultaneously.
> Roman Arutyunyan
> nginx mailing list
> nginx at nginx.org
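The lock timeout described above can also be tuned directly; a sketch of the relevant directives follows, where the 30s value is illustrative, chosen only because it exceeds the 10 sec upstream response time mentioned in the thread:

```nginx
# Sketch: let the second request wait longer than the upstream takes
# to respond, so it can be served from the now-filled cache entry
# instead of issuing its own upstream request.
location / {
    proxy_cache              ram;
    proxy_cache_lock         on;
    proxy_cache_lock_timeout 30s;   # was 5s; must exceed upstream time
    proxy_cache_lock_age     30s;   # how long one request may hold the lock
    proxy_pass               http://myUpstream;
}
```

This still serializes the clients (the second request is not answered until the first completes and the cache is populated), which is exactly the limitation the thread is about.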
Posted at Nginx Forum: https://forum.nginx.org/read.php?2,281058,281062#msg-281062