proxy_cache
Ryan Malayter
malayter at gmail.com
Tue Apr 26 18:24:03 MSD 2011
On Mon, Apr 25, 2011 at 3:13 PM, Richard Kearsley
<Richard.Kearsley at m247.com> wrote:
> Ideal situation is initiate one request to back-end then for each additional request start sending the (still incomplete) cached file at the same rate it's being downloaded into the cache (I cache after just 1 hit)
>
> I'm talking video files 400-500mb crossing the Atlantic.... Not fast at all and can get seriously congested once it starts getting the same file over and over in this "first cache" syndrome
>
> Anything I can do to help/test please let me know :)
> Keep up the good work
If this is an immediate problem (i.e. it is killing your monthly
bandwidth bill), I would suggest setting up Varnish behind nginx and
adding a specific location for the video files in question. That will
give you behavior closer to what you need (only one request for the
file over the slow link, with other clients waiting on it). You could
then choose to expand Varnish usage to other files after careful
testing. This will complicate your stack, but such is life. Varnish
simply meets your particular use case better at this point.
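A rough sketch of how that split might look. The port number, hostnames, and file extensions below are assumptions for illustration, not anything from your setup:

```nginx
# Hypothetical front-end nginx: hand only the large video files to a
# local Varnish instance (assumed to listen on 127.0.0.1:6081).
# Varnish coalesces concurrent misses for the same object into a
# single fetch over the slow link.
location ~* \.(mp4|flv)$ {
    proxy_pass http://127.0.0.1:6081;
    proxy_set_header Host $host;
}

# Everything else keeps using nginx's own proxy_cache as before.
location / {
    proxy_pass http://origin.example.com;
    proxy_cache my_cache;
}
```

Varnish does this request coalescing by default for cacheable objects, so the Varnish side needs little beyond pointing its backend at your real origin.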
I'm no C developer, just an nginx user, but I suspect any change in
this area would be a lot of work and slow in coming for nginx. It
might be possible to do something all-nginx by having nginx proxy to
itself for these files, using something like
http://wiki.nginx.org/HttpLimitReqModule on the "backend" nginx server,
which then proxies to your real origin... but it would be kind of
hackish.
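Roughly what that all-nginx hack might look like. The zone name, rate, port, and origin hostname here are made up for illustration; the idea is just to throttle how many requests per file can cross the slow link while the front cache fills:

```nginx
# "Front" server: caches, and proxies to a second server block on the
# same box (port 8080 is an assumption).
server {
    listen 80;
    location /videos/ {
        proxy_cache my_cache;
        proxy_pass http://127.0.0.1:8080;
    }
}

# Rate-limit zone keyed on the request URI, so each file is limited
# independently (must sit in the http context).
limit_req_zone $uri zone=perfile:10m rate=1r/s;

# "Backend" server: applies the per-file limit before requests go
# over the transatlantic link to the real origin.
server {
    listen 127.0.0.1:8080;
    location / {
        limit_req zone=perfile burst=5;
        proxy_pass http://origin.example.com;
    }
}
```

This doesn't truly coalesce requests the way Varnish does; excess requests for the same file are delayed or rejected rather than served from the in-flight download, which is why I call it hackish.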
--
RPM