How to disable output buffering with PHP and nginx
Maxim Dounin
mdounin at mdounin.ru
Thu Oct 10 15:26:26 UTC 2013
Hello!
On Thu, Oct 10, 2013 at 11:13:40AM -0400, Ben Johnson wrote:
[...]
> Well, after all of the configuration changes, both to nginx and PHP, the
> solution was to add the following header to the response:
>
> header('Content-Encoding: none;');
Just in case: this is very, very wrong; there is no such
content-coding. Never use this in real programs.
But the fact that it helps suggests you actually have gzip enabled
somewhere in your nginx config, as the gzip filter is skipped when
it sees a Content-Encoding header already set.
None of this probably matters, since you only used it as a
debugging tool.
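For reference, the supported way to switch off response buffering on the nginx side is the fastcgi_buffering directive (added in nginx 1.5.6), or the X-Accel-Buffering response header emitted by the backend; a bogus Content-Encoding is not needed. A minimal sketch, assuming a typical fastcgi_pass setup on 127.0.0.1:9000 (the address and location pattern are illustrative):

```nginx
location ~ \.php$ {
    include fastcgi_params;
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
    fastcgi_pass 127.0.0.1:9000;

    # Disable buffering of backend responses for this location
    # (requires nginx 1.5.6 or newer).
    fastcgi_buffering off;
}
```

Alternatively, the backend can disable buffering per-response by sending "X-Accel-Buffering: no". Note that PHP's own output buffering (output_buffering in php.ini, or flush()/ob_flush() in the script) is a separate layer and has to be handled on the PHP side.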
[...]
> The whole reason for which I was seeking to disable output buffering is
> that I need to test nginx's ability to handle multiple requests
> simultaneously. This need is inspired by yet another problem, about
> which I asked on this list in late August: "504 Gateway Time-out when
> calling curl_exec() in PHP with SSL peer verification
> (CURLOPT_SSL_VERIFYPEER) off".
>
> Some folks suggested that the cURL problem could result from nginx not
> being able to serve more than one request for a PHP file at a time. So,
> that's why I cooked up this test with sleep() and so forth.
>
> Now that output buffering is disabled, I am able to test concurrency.
> Sure enough, if I request my concurrency test script in two different
> browser tabs, the second tab will not begin producing output until the
> first tab has finished. I set the test time to 120 seconds and at
> exactly 120 seconds, the second script begins producing output.
>
> Also, while one of these tests is running, I am unable to request a
> "normal PHP web page" from the same server (localhost). The request
> "hangs" until the concurrency test in the other tab is finished.
>
> I even tried requesting the test script from two different browsers, and
> the second browser always hangs until the first completes.
>
> These observations lend credence to the notion that my cURL script is
> failing due to dead-locking of some kind. (I'll refrain from discussing
> this other problem here, as it has its own thread.)
>
> Is this inability to handle concurrent requests a limitation of nginx on
> Windows? Do others on Windows observe this same behavior?
Your problem is that you only have one PHP process running, and
it can only service one request at a time. AFAIK, php-cgi can't
run more than one process on Windows (on Unix it can, with
PHP_FCGI_CHILDREN set). I'm not sure whether there are good
options for running multiple PHP processes on Windows.
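On Unix, for example, a pool of PHP workers can be started with something like the following (the port, child count, and request limit are illustrative):

```shell
# Start php-cgi listening on 127.0.0.1:9000 with 4 worker children.
# PHP_FCGI_CHILDREN makes the master process fork that many workers;
# PHP_FCGI_MAX_REQUESTS recycles each worker after N requests to
# guard against leaks in long-running processes.
PHP_FCGI_CHILDREN=4 PHP_FCGI_MAX_REQUESTS=1000 php-cgi -b 127.0.0.1:9000
```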
A quick-and-dirty solution would be to run multiple php-cgi
processes on different ports and list them all in an upstream{}
block.
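Sketched as an nginx config fragment (ports are illustrative; a separate php-cgi process would be started on each of them):

```nginx
# One php-cgi process per port; nginx load-balances requests
# across them, so concurrent PHP requests no longer serialize
# behind a single backend process.
upstream php_backend {
    server 127.0.0.1:9000;
    server 127.0.0.1:9001;
    server 127.0.0.1:9002;
}

server {
    listen       80;
    server_name  localhost;

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass php_backend;
    }
}
```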
> I did see the Windows limitation, "Although several workers can be
> started, only one of them actually does any work", but that isn't the
> problem here, right? One nginx worker does not mean that only one PHP
> request can be satisfied at a time, correct?
Correct. One nginx process can handle multiple requests; it's the
single PHP process that limits you.
--
Maxim Dounin
http://nginx.org/en/donation.html