<html><head><meta http-equiv="content-type" content="text/html; charset=utf-8"></head><body dir="auto"><div>Hi</div><div><br></div><div>On 4 Jun 2015, at 08:16, Xavier Noria <<a href="mailto:fxn@hashref.com">fxn@hashref.com</a>> wrote:<br><br></div><blockquote type="cite"><div><div dir="ltr">I have used gzip_static for some years without any issue that I am aware of with the default gzip_vary off.<div><br></div><div>My reasoning is that the HTTP spec says in</div><div><br></div><div> <a href="http://tools.ietf.org/html/rfc2616#page-145">http://tools.ietf.org/html/rfc2616#page-145</a></div><div><br></div><div>that "the Vary field value advises the user agent about the criteria that were used to select the representation", and my understanding is that compressed content is not a representation per se. The representation would be the result of undoing what Content-Encoding says.</div></div></div></blockquote><div><br></div><div>This is fine to do. However, there's a chance a proxy may end up caching only an uncompressed version: if the first client does not support compression, its uncompressed response lands in the proxy cache. Any subsequent client behind that cache, even one that accepts compression, would then be served the uncompressed version in most cases.</div><br><blockquote type="cite"><div dir="ltr"><div>So, given the same .html endpoint you could for example serve content in a language chosen according to Accept-Language. That's a representation that depends on headers in my understanding. If you serve the same .css over and over again no matter what, the representation does not vary. 
The compressed thing that is transferred is not the representation itself, so no Vary needed.</div><div><br></div><div>Do you guys agree with that reading of the spec?</div></div></blockquote><div><br></div><div>I think this bit of the spec (at the bottom of the same page) explains it better:</div><div><br></div><div><pre class="newpage" style="margin-top: 0px; margin-bottom: 0px; page-break-before: always;"><font face="UICTFontTextStyleBody"><span style="white-space: normal; background-color: rgba(255, 255, 255, 0);">An HTTP/1.1 server SHOULD include a Vary header field with any
cacheable response that is subject to server-driven negotiation.
Doing so allows a cache to properly interpret future requests on that
resource and informs the user agent about the presence of negotiation on that resource.</span></font></pre><pre class="newpage" style="margin-top: 0px; margin-bottom: 0px; page-break-before: always;"><font face="UICTFontTextStyleBody"><span style="white-space: normal; background-color: rgba(255, 255, 255, 0);"><br></span></font></pre><pre class="newpage" style="margin-top: 0px; margin-bottom: 0px; page-break-before: always;"><font face="UICTFontTextStyleBody"><span style="white-space: normal; background-color: rgba(255, 255, 255, 0);">I would say compression is a server-driven negotiation. I would also say, based on my understanding, that when the spec says "representation" it includes the encoding, such as compression. That is, you can represent a resource with gzip or without gzip.</span></font></pre></div><div><br></div><br><blockquote type="cite"><div dir="ltr"><div>Then, you read posts about buggy proxy servers. Have any of you found a real (modern) case in which the lack of "Vary: Accept-Encoding" resulted in compressed content being delivered to a client that didn't support it? Or are those proxies mythical creatures as of today?</div></div></blockquote><div><br></div><div>Proxies are bound by the spec too, so yes, that would be a buggy proxy. They can't send Content-Encoding: gzip unless the client sends Accept-Encoding. I'm not entirely sure what would happen, though: I guess the proxy would either bypass the compressed cached version or replace it with an uncompressed one. Most likely it's up to the proxy implementation.</div><div><br></div>Jason</body></html>