Gzip compression and Transfer-Encoding
dave at cheney.net
Mon Oct 22 14:52:27 MSD 2007
Thanks for your reply
> What problem with chunked encoding ?
No real problem with well-behaved browsers, but I have heard that
older versions of IE can't pipeline chunked requests (I may be
working from old information).
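For reference, chunked transfer coding frames each chunk with its size
in hex, terminated by a zero-size chunk (RFC 2616, section 3.6.1). A
minimal encoder sketch, not tied to any particular server:

```python
def chunked_encode(chunks):
    """Yield the on-the-wire form of Transfer-Encoding: chunked."""
    for c in chunks:
        if c:  # a zero-length chunk would terminate the body early
            # chunk-size in hex, CRLF, chunk data, CRLF
            yield b"%x\r\n" % len(c) + c + b"\r\n"
    yield b"0\r\n\r\n"  # last-chunk marker, no trailers
```

So `b"".join(chunked_encode([b"hello"]))` produces
`b"5\r\nhello\r\n0\r\n\r\n"`.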
>> Is there a buffer setting that specifies the size of the initial
>> buffer to be compressed, the idea being that if the whole response
>> body fits in that one buffer, it can be compressed in one go, and the
>> resulting content length discovered.
> The problem is that gzip is a filter: the header is sent to the client
> before compression even starts. However, it is possible to postpone
> header processing until the compressed size is known.
The way lighttpd does it is to have a small buffer, 8k by default,
into which the compressed representation is written. If the compressed
body fits in this buffer completely, then the content length is known;
otherwise chunked encoding is activated and compression continues in
8k blocks (I think).
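That scheme can be sketched roughly like this (a hypothetical helper in
Python using zlib; the 8k default and the fall-back to chunked encoding
come from the description above, everything else is my assumption):

```python
import zlib

BUFFER_SIZE = 8192  # lighttpd's reported default

def compress_for_http(source, buffer_size=BUFFER_SIZE):
    """source: iterable of bytes chunks (the uncompressed body).

    Returns (headers, body_iter). If the whole compressed body fits
    in buffer_size, the headers carry Content-Length; otherwise the
    caller must use Transfer-Encoding: chunked and stream body_iter.
    """
    comp = zlib.compressobj(wbits=31)  # wbits=31 selects the gzip container
    buf = bytearray()
    it = iter(source)
    overflowed = False
    for chunk in it:
        buf.extend(comp.compress(chunk))
        if len(buf) > buffer_size:
            overflowed = True
            break
    if not overflowed:
        # Entire compressed body fits: length is known up front.
        buf.extend(comp.flush())
        headers = {"Content-Encoding": "gzip",
                   "Content-Length": str(len(buf))}
        return headers, iter([bytes(buf)])
    # Buffer overflowed: stream the rest with chunked encoding.
    headers = {"Content-Encoding": "gzip",
               "Transfer-Encoding": "chunked"}
    def stream():
        yield bytes(buf)
        for chunk in it:
            yield comp.compress(chunk)
        yield comp.flush()
    return headers, stream()
```

The point is that the header decision is postponed only until the
first buffer is full, so small responses get a Content-Length while
large ones stream without buffering everything.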
> Igor Sysoev