FastCGI PHP - unable to prematurely close connection to browser

Rob Schultz rschultz7 at
Sun Feb 7 20:05:08 MSK 2010

	Not really trying to change your method of doing things, but you will still encounter slow response times for users eventually, since that PHP process is *still* processing after the user is done and won't be ready for the next user until it finishes the script. So, theoretically, if you had 5 max PHP processes to handle client requests, and each PHP script takes 500ms to complete but only 100ms passes until your Connection: close, then if 6 users hit your server at once, the sixth user has to wait the full 500ms until a free PHP process is available, plus the additional 100ms to see content. So his experience is that the page takes almost 600ms to load.
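A back-of-the-envelope sketch of that arithmetic (the 5-worker / 500ms / 100ms numbers are the hypothetical ones from the paragraph above, not measurements):

```php
<?php
// Hypothetical numbers from above: 5 PHP workers, each request
// occupies a worker for 500ms total, but the client only needs the
// first 100ms of that before the Connection: close.
$workers   = 5;
$holdMs    = 500; // how long a worker stays busy per request
$visibleMs = 100; // how long until the client has its response

// With 6 simultaneous requests, the 6th must queue until the first
// worker frees up (500ms), then spend its own 100ms before the user
// sees content:
$queueDelayMs = $holdMs;
$perceivedMs  = $queueDelayMs + $visibleMs;
echo $perceivedMs, "\n"; // 600
```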

Something I would suggest is using a message queue and a few PHP processes that are designed just to process the queue. This separates your user-facing processing from your offline processing, and will also let you offload the work to another server if it starts growing. Here is a website with some links to some message queues.
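As a minimal sketch of that split (the file-backed spool and the function names here are purely illustrative, not from any particular queue product; a real deployment would use a proper message queue such as RabbitMQ or beanstalkd with per-job acknowledgement):

```php
<?php
// Illustrative file-backed job spool. The web request only appends a
// job and returns, so the FastCGI worker is free again right away; a
// separate CLI worker drains the spool and does the slow work.
define('SPOOL', sys_get_temp_dir() . '/jobs.spool');

// Called from the web request: cheap append, returns immediately.
function enqueue(array $job) {
    file_put_contents(SPOOL, json_encode($job) . "\n", FILE_APPEND | LOCK_EX);
}

// Run from a long-lived CLI worker (php worker.php), not the web tier.
function drain($handler) {
    if (!is_file(SPOOL)) return 0;
    $lines = file(SPOOL, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    unlink(SPOOL); // naive: loses jobs on a crash; real queues ack per job
    foreach ($lines as $line) {
        $handler(json_decode($line, true));
    }
    return count($lines);
}

// Demo:
enqueue(array('task' => 'send_mail', 'id' => 42));
$processed = drain(function ($job) {
    // heavy post-processing would happen here, on the worker box
});
echo $processed, "\n"; // 1
```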

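On the early-close question itself: PHP-FPM (a FastCGI process manager for PHP, merged into PHP core as of 5.3.3) exposes fastcgi_finish_request(), which flushes the response and closes the FastCGI connection while the script keeps running. A hedged sketch, with the Content-Length/flush trick from the quoted test case as a fallback; whether the fallback actually closes the connection depends on the SAPI, as this thread shows:

```php
<?php
ob_start();
ignore_user_abort(true);
echo 'Lets output something.';

if (function_exists('fastcgi_finish_request')) {
    // PHP-FPM only: sends everything buffered so far and closes the
    // FastCGI connection; the code below runs with no client attached.
    fastcgi_finish_request();
} else {
    // Fallback: the Content-Length / flush trick from the test case in
    // this thread (works under mod_php, not reliably under FastCGI).
    header('Connection: close');
    header('Content-Length: ' . ob_get_length());
    ob_end_flush();
    flush();
}

// heavy post-processing, invisible to the visitor
// (stand-in for the sleep(5) in the quoted script):
sleep(1);
```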

On Feb 7, 2010, at 8:50 AM, cactus wrote:

> First of all - thank you for your answer, I really appreciate it. 
> This is interesting - I have re-checked and it takes me 32ms to get the result on my development machine:
> Server	Apache/2.2.9 (Debian) PHP/5.2.6-1+lenny4 with Suhosin-Patch
> X-Powered-By	PHP/5.2.6-1+lenny4
> (PHP is installed as mod_php)
> Still, it is very important for me to be able to postpone processing. If I can't do it on NginX I will probably have to find another server that can do this (overall load on the server is less important than quick response times in this case). I'm really hoping it's just a config thing... :)
> From what I understand:
> - PHP offers no way to prematurely close the connection except by setting and reaching Content-Length
> - it is the web server (NginX / Apache) that can (should?) close the connection in this case
> I can't make PHP close the connection, so my only hope is NginX. Can it be told to do so? Am I missing something?
> Thanks again!
> Hoang Hoang Wrote:
> -------------------------------------------------------
>> Hi,
>> I just tested your script and found that I needed to wait 10s to see the
>> response. I am using Windows XP, Apache 2.1.11 worker MPM, PHP 5.3.1.
>> It did not work asynchronously as you expected. Am I missing something here?
>> Regards
>> Regards
>> cactus wrote:
>>> Hi all!
>>> I am optimizing a few of the PHP scripts by:
>>> - doing all that generates output to browser
>>> - closing connection to browser
>>> - doing some more processing
>>> The processing bit cannot be avoided, but the speed of execution (from
>>> visitors' point of view) is awesome this way. The problem is that we
>>> have tried migrating the script to NginX + FastCGI (it works on Apache +
>>> mod_php) but it doesn't close the connection anymore, it keeps it open
>>> until the end... Which makes scripts slow again.
>>> I am using Content-Length to allow the server to figure out that all content
>>> was already generated. This is the test case:
>>> <?
>>>  header("Connection: close"); // not sure we need this one
>>>  header("Content-Encoding: none");
>>>  ignore_user_abort(true);
>>>  ob_start();
>>>  echo ('Lets output something.');
>>>  // output Content-Length and flush buffers:
>>>  $size = ob_get_length();
>>>  header("Content-Length: $size");
>>>  ob_end_flush();
>>>  flush();
>>>  // ****************
>>>  // do some heavy processing here:
>>>  sleep(5);
>>>  function postproc() {
>>>    flush();
>>>    sleep(5);
>>>  }
>>>  register_shutdown_function('postproc');
>>> ?>
>>> The browser shows this page immediately in Apache+mod_php but it waits
>>> for 10 seconds in NginX + FastCGI.
>>> I am a bit stuck here and would appreciate some help... Is this a
>>> problem with FastCGI? Is there another way to do it?
>>> Thanks!
>>> Posted at Nginx Forum: 51310
> Posted at Nginx Forum:,51310,51672#msg-51672
> _______________________________________________
> nginx mailing list
> nginx at

More information about the nginx mailing list