streaming large nginx generated page

Sorin Manole sorin.v.manole at gmail.com
Sun Nov 15 11:55:22 UTC 2015


If the data is generated during the request and there is no way to
precompute it, I guess your best bet would be to generate and feed the data
from a filter handler. Basically, the content handler for a particular
location could be the empty_gif module, and while the response is being sent
to the client you ignore the data generated by empty_gif and instead send
your stream data to the client. When all of your generated data has been
sent to the client, mark the empty_gif buffers as read.
empty_gif can be replaced with static file serving or anything else, really.
I'm not sure this would work at all, or how you would deal with the
headers, but I'm looking forward to seeing your progress.

Also, why not consider generating this data in a separate daemon and just
proxying to it with the proxy module (ngx_http_proxy_module) or something
similar?
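For the proxy approach, something like this minimal config sketch would do
(the upstream address and location are placeholders, adjust to your setup;
proxy_buffering off makes nginx pass the upstream data through as it
arrives instead of buffering the whole response):

```
location /stream {
    proxy_pass http://127.0.0.1:8080;
    proxy_buffering off;
    proxy_http_version 1.1;
}
```

Your daemon can then emit the response with chunked transfer encoding and
nginx will stream it to the client without holding it all in memory.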

2015-11-09 20:35 GMT+02:00 Maksim Yevmenkin <maksim.yevmenkin at gmail.com>:

> hello,
>
> suppose i need to export a large amount of nginx-generated data (a static
> page). i have used a content handler (or content phase handler) to
> create the whole in-memory chain and feed it to ngx_http_output_filter().
> however, i would very much like to avoid allocating a large chunk of
> memory to hold all the data before sending.
>
> is there a way to stream the generated page (potentially applying a
> different transfer encoding)?
>
> thanks!
> max
>
> _______________________________________________
> nginx-devel mailing list
> nginx-devel at nginx.org
> http://mailman.nginx.org/mailman/listinfo/nginx-devel
>