<div dir="ltr">Hi,<div><br></div><div> Thanks for the help, guys. Regarding @ryd994's suggestion: the reason we don't want to deploy that structure is that the caching node would have to respond to every client request. Even if it only proxies most requests (without caching them), high I/O would still be needed to stream the large proxied files (700 MB mp4s) to users, so the caching node would eventually become the bottleneck between users and the storage node, wouldn't it?</div><div><br></div><div>@steve, thanks for the tmpfs pointer, but our caching node has 1 TB+ of SSD storage and we'd prefer an SSD cache over RAM (RAM is faster, but nowhere near as large).</div><div><br></div><div>With the redirect-URL approach, we believe only specific (hot) requests would be pointed at the caching node, and that node would then fetch the requested file using proxy_cache.</div><div><br></div><div>Regards.</div><div>Shahzaib.</div><div><br></div><div><br></div></div><div class="gmail_extra"><br><div class="gmail_quote">On Mon, Jun 15, 2015 at 9:13 AM, ryd994 <span dir="ltr"><<a href="mailto:ryd994@163.com" target="_blank">ryd994@163.com</a>></span> wrote:<br><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div dir="ltr">Does an nginx reverse proxy with cache fit your need?<div><br></div><div>Client -> Caching server (with SSD and nginx proxy cache configured) -> Storage server(s) (slow)</div><div><br></div><div>You can add even more storage servers by utilizing the nginx upstream module.</div><div dir="ltr"><div><br><div class="gmail_quote"><div><div class="h5"><div dir="ltr">On Sun, Jun 14, 2015 at 1:12 PM shahzaib shahzaib <<a href="mailto:shahzaib.cb@gmail.com" target="_blank">shahzaib.cb@gmail.com</a>> wrote:<br></div></div></div><blockquote class="gmail_quote" style="margin:0 0 0 .8ex;border-left:1px #ccc solid;padding-left:1ex"><div><div class="h5"><div dir="ltr">Hi,<div><br></div><div> We're using Nginx to serve videos on one of 
our storage servers (it contains mp4 videos). Due to the high volume of requests, we're planning to add a separate caching node with fast SSD drives to serve "hot" content and reduce the load on storage. The caching method we have in mind:<br><br></div><div>If there are more than 1K requests for <a href="http://storage.domain.com/test.mp4" target="_blank">http://storage.domain.com/test.mp4</a>, nginx should construct a redirect URL for the remaining requests for test.mp4, i.e. <a href="http://cache.domain.com/test.mp4" target="_blank">http://cache.domain.com/test.mp4</a>, and serve those requests from the caching node, while the long tail would still be served from storage.</div><div><br></div><div>So, can we achieve this approach with nginx, or with another tool like Varnish?<br><br>Thanks in advance.<br><br>Regards.</div><div>Shahzaib<br><br><br></div></div></div></div><span class="">
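A minimal sketch of what the caching node's nginx config could look like for this, using the hostnames from this thread. nginx's built-in <code>proxy_cache_min_uses</code> directive only starts caching a URI after it has been requested N times on this node, so the long tail is proxied straight through without being written to the SSD. The cache path, sizes, and threshold below are illustrative assumptions, not tested values:

```nginx
# Hypothetical config for the caching node (cache.domain.com),
# pulling from storage.domain.com. Paths, sizes and the request
# threshold are illustrative only.
proxy_cache_path /mnt/ssd/nginx_cache levels=1:2 keys_zone=hot:100m
                 max_size=900g inactive=7d use_temp_path=off;

server {
    listen 80;
    server_name cache.domain.com;

    location / {
        proxy_pass http://storage.domain.com;
        proxy_cache hot;

        # Store an mp4 only after this node has seen it requested
        # this many times; colder files are proxied but not cached.
        proxy_cache_min_uses 1000;

        proxy_cache_key $uri;
        proxy_cache_valid 200 7d;

        # Keep serving a stale copy if storage is slow or down.
        proxy_cache_use_stale error timeout updating;
    }
}
```

Note that the counting happens on whichever node receives the requests, so this covers the "cache only hot content" part; deciding on the storage node when to emit the redirect to cache.domain.com would still need separate logic.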
_______________________________________________<br>
nginx mailing list<br>
<a href="mailto:nginx@nginx.org" target="_blank">nginx@nginx.org</a><br>
<a href="http://mailman.nginx.org/mailman/listinfo/nginx" rel="noreferrer" target="_blank">http://mailman.nginx.org/mailman/listinfo/nginx</a></span></blockquote></div></div></div></div>
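The layered setup suggested above (client -> caching tier -> multiple slow storage servers) could be sketched on the caching server roughly as follows; the backend names (storage1/storage2) are hypothetical placeholders:

```nginx
# Sketch of the suggested architecture: one caching tier in front
# of several slow storage backends, balanced via the upstream module.
upstream storage_pool {
    # "hash ... consistent" pins each URI to one backend, so a given
    # mp4 tends to stay warm in that backend's page cache.
    hash $uri consistent;
    server storage1.domain.com;
    server storage2.domain.com;
}

proxy_cache_path /mnt/ssd/nginx_cache levels=1:2 keys_zone=videos:100m max_size=900g;

server {
    listen 80;

    location / {
        proxy_pass http://storage_pool;
        proxy_cache videos;
        proxy_cache_valid 200 7d;
    }
}
```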
</blockquote></div><br></div>