Forward single request to upstream server via proxy_store !!
shahzaib shahzaib
shahzaib.cb at gmail.com
Mon Sep 29 13:06:11 UTC 2014
Also, removing the arguments after "?" disabled the pseudo streaming as well, so
I think I can't apply this method !!
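
For context, the method in question is removing the arguments after "?" on the
Varnish side so that every request maps to a single object. A minimal sketch of
how that is typically done (regsub in vcl_recv, Varnish 3 syntax assumed), and
dropping the arguments is exactly what breaks the ?start= pseudo streaming
handled by nginx:

sub vcl_recv {
    # drop everything after "?" so all seek positions collapse into one object
    set req.url = regsub(req.url, "\?.*$", "");
}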
On Mon, Sep 29, 2014 at 6:05 PM, shahzaib shahzaib <shahzaib.cb at gmail.com>
wrote:
> @RR, I would like to inform you that the issue with the failed stream on the
> 1st request is solved. Varnish was removing the Content-Length header on the
> 1st request; enabling ESI processing resolved the issue.
>
> set beresp.do_esi = true;
>
>
> http://stackoverflow.com/questions/23643233/how-do-i-disable-transfer-encoding-chunked-encoding-in-varnish
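>
> For completeness, a minimal sketch of where that line sits (Varnish 3 syntax
> assumed; the rest of vcl_fetch is omitted here):
>
> sub vcl_fetch {
>     # reported fix: enable ESI processing so the first request is no longer
>     # delivered without a Content-Length header (see the link above)
>     set beresp.do_esi = true;
> }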
>
> thanks !!
>
> On Sat, Sep 27, 2014 at 10:41 AM, shahzaib shahzaib <shahzaib.cb at gmail.com> wrote:
>
>> >>In general it shouldn’t since the ‘?start=’ is handled by nginx and not
>> varnish, but I’m not exactly sure how the mp4 module of nginx handles a
>> proxied request.
>> You have to test it.
>>
>> Sure, I'll test it.
>>
>> sub vcl_fetch {
>>     return (pass);
>> }
>>
>> You're right about return(pass); coalescing doesn't work with pass.
>>
>> >>In the worst-case scenario, imho, only the first request (before landing on
>> the proxy_store server) will “fail”, e.g. play from the beginning instead of
>> from the time set.
>> Well, I am facing an even worse scenario: the first request always fails to
>> stream and the player (HTML5) keeps on loading.
>>
>> I'm already checking whether there's some config issue with varnish or whether
>> this is the default behaviour (which I don't think it is).
>>
>> Thanks @RR
>>
>> Shahzaib
>>
>>
>> On Fri, Sep 26, 2014 at 2:36 AM, Reinis Rozitis <r at roze.lv> wrote:
>>
>>>> It will also prevent users from seeking the video, because the arguments after
>>>> "?" will be removed whenever the user tries to seek the video stream, won't it?
>>>>
>>>
>>> In general it shouldn’t since the ‘?start=’ is handled by nginx and not
>>> varnish, but I’m not exactly sure how the mp4 module of nginx handles a
>>> proxied request.
>>> You have to test it.
>>>
>>> In the worst-case scenario, imho, only the first request (before landing on
>>> the proxy_store server) will “fail”, e.g. play from the beginning instead of
>>> from the time set.
>>>
>>>
>>>
>>>> Well, only proxy_store is able to fulfill my requirements, which is why I'll
>>>> have to stick with it.
>>>>
>>>
>>> Well, you can try to use varnish as the streamer; you just need a (web) player
>>> that supports byte-range requests for seeking (
>>> http://flash.flowplayer.org/plugins/streaming/pseudostreaming.html ).
>>>
>>>
>>>> I am a bit confused about varnish. Actually, I don't need any kind of caching
>>>> within varnish, as nginx is already doing that via proxy_store. I just need
>>>> varnish to merge the subsequent requests into one and forward it to nginx, and
>>>> I think varnish is doing that pretty well. Nevertheless, I am wondering whether
>>>> malloc caching will have any odd effect on the stream behaviour?
>>>>
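>>>
>>> For reference, the setup described above (varnish sitting in front of the
>>> nginx/proxy_store box and forwarding requests to it) is just a normal backend
>>> definition on the Varnish side. A minimal sketch, with host and port as
>>> placeholders:
>>>
>>> backend nginx_store {
>>>     # assumption: the nginx instance doing proxy_store listens here
>>>     .host = "127.0.0.1";
>>>     .port = "8080";
>>> }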
>>>
>>>
>>> You can try to pass the request without caching:
>>>
>>> sub vcl_fetch {
>>>     return (pass);
>>> }
>>>
>>> (maybe even do it in the vcl_recv stage, but again I'm not exactly sure whether
>>> request coalescing still works in that case).
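>>>
>>> A sketch of that vcl_recv variant (the .mp4 match is only illustrative;
>>> Varnish 3 syntax assumed). Passing in vcl_recv skips the cache lookup
>>> entirely, so request coalescing most likely will not happen, which matches
>>> the concern above:
>>>
>>> sub vcl_recv {
>>>     if (req.url ~ "\.mp4") {
>>>         # hand these requests straight to the backend, uncached
>>>         return (pass);
>>>     }
>>> }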
>>>
>>>
>>>
>>> rr
>>>
>>
>>
>