How to disable output buffering with PHP and nginx

Ben Johnson ben at
Thu Oct 10 16:11:58 UTC 2013

On 10/10/2013 11:26 AM, Maxim Dounin wrote:
> Hello!
> On Thu, Oct 10, 2013 at 11:13:40AM -0400, Ben Johnson wrote:
> [...]
>> Well, after all of the configuration changes, both to nginx and PHP, the
>> solution was to add the following header to the response:
>> header('Content-Encoding: none;');
> Just in case: this is very-very wrong, there is no such 
> content-coding.  Never use this in real programs.
> But the fact that it helps suggests you actually have gzip enabled 
> somewhere in your nginx config - as gzip doesn't work if it sees 
> Content-Encoding set.
> All this probably doesn't matter due to you only used it as a 
> debugging tool.
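
[Editorial aside: if the goal is only to keep nginx from buffering one particular response, a cleaner approach than a bogus Content-Encoding is the X-Accel-Buffering response header, which nginx consumes itself (it is not forwarded to the client); FastCGI support for it is fairly recent. A minimal sketch, assuming a PHP script served through fastcgi_pass:]

```php
<?php
// Ask nginx to skip buffering for this response only.
// X-Accel-* headers are interpreted and stripped by nginx.
header('X-Accel-Buffering: no');

echo "starting...\n";
flush(); // also flush PHP's own output buffer
```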
> [...]
>> The whole reason for which I was seeking to disable output buffering is
>> that I need to test nginx's ability to handle multiple requests
>> simultaneously. This need is inspired by yet another problem, about
>> which I asked on this list in late August: "504 Gateway Time-out when
>> calling curl_exec() in PHP with SSL peer verification."
>> Some folks suggested that the cURL problem could result from nginx not
>> being able to serve more than one request for a PHP file at a time. So,
>> that's why I cooked up this test with sleep() and so forth.
>> Now that output buffering is disabled, I am able to test concurrency.
>> Sure enough, if I request my concurrency test script in two different
>> browser tabs, the second tab will not begin producing output until the
>> first tab has finished. I set the test time to 120 seconds and at
>> exactly 120 seconds, the second script begins producing output.
>> Also, while one of these tests is running, I am unable to request a
>> "normal PHP web page" from the same server (localhost). The request
>> "hangs" until the concurrency test in the other tab is finished.
>> I even tried requesting the test script from two different browsers, and
>> the second browser always hangs until the first completes.
>> These observations lend credence to the notion that my cURL script is
>> failing due to dead-locking of some kind. (I'll refrain from discussing
>> this other problem here, as it has its own thread.)
>> Is this inability to handle concurrent requests a limitation of nginx on
>> Windows? Do others on Windows observe this same behavior?
> Your problem is that you only have one PHP process running - and 
> it can only service one request at a time.  AFAIK, php-cgi can't 
> run more than one process on Windows (on Unix it can, with 
> PHP_FCGI_CHILDREN set).  Not sure if there are good options to run 
> multiple PHP processes on Windows.

Thank you for clarifying this crucial point, Maxim. I believe that this
is indeed the crux of the issue.

> Quick-and-dirty solution would be to run multiple php-cgi 
> processes on different ports and list them all in an upstream{} 
> block.
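
[Editorial aside: Maxim's quick-and-dirty workaround might look something like the following sketch; the port numbers, paths, and upstream name are placeholders, not from the original thread.]

```nginx
# Start one php-cgi process per port first, e.g. on Windows:
#   start /B php-cgi.exe -b 127.0.0.1:9001
#   start /B php-cgi.exe -b 127.0.0.1:9002
#   start /B php-cgi.exe -b 127.0.0.1:9003

upstream php_workers {
    server 127.0.0.1:9001;
    server 127.0.0.1:9002;
    server 127.0.0.1:9003;
}

server {
    listen      80;
    server_name localhost;
    root        C:/www;

    location ~ \.php$ {
        include        fastcgi_params;
        fastcgi_param  SCRIPT_FILENAME $document_root$fastcgi_script_name;
        # Requests are distributed round-robin across the pool, so up to
        # three PHP requests can now run concurrently.
        fastcgi_pass   php_workers;
    }
}
```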
>> I did see the Windows limitation, "Although several workers can be
>> started, only one of them actually does any work", but that isn't the
>> problem here, right? One nginx worker does not mean that only one PHP
>> request can be satisfied at a time, correct?
> Correct.  One nginx process can handle multiple requests, it's one 
> PHP process which limits you.

Understood. The lack of support for multiple simultaneous PHP processes
on Windows is so hard to believe that I had overlooked it as a possible
cause.

Now that you've explained the problem, finding corroborating evidence
is much easier. An interesting excerpt from one such thread:

"I did look deeper into the PHP source code after that and found that
the section of code which responds to PHP_FCGI_CHILDREN has been
encapsulated by #ifndef WIN32. So the developers must be aware of the
issue."
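
[Editorial aside: on Unix, by contrast, a single php-cgi invocation can pre-fork a pool of workers via that environment variable; a sketch, with the port and counts as placeholders:]

```shell
# Unix only: php-cgi forks 4 worker children sharing one listening socket.
# (The #ifndef WIN32 guard above is why this has no effect on Windows.)
PHP_FCGI_CHILDREN=4 PHP_FCGI_MAX_REQUESTS=1000 php-cgi -b 127.0.0.1:9000
```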

For the time being, I'll have to run these cURL scripts using Apache
with mod_php, instead of nginx. Not the end of the world.

Thanks again for your valuable time and for clearing up this major
limitation of PHP (NOT nginx) on Windows.

Best regards,

