nginx, fcgi+perl forking a new process per request ...
nunomagalhaes at eu.ipp.pt
Mon Mar 15 19:23:34 MSK 2010
On Mon, Mar 15, 2010 at 12:43, Stefan Parvu
<sparvu at systemdatarecorder.org> wrote:
> In my case nginx-fcgi currently does the job, but it is not
> a very solid solution. Correct?
A single page load can - and most often does - involve more than one
request, and a webserver gets hit by more than one person. The higher
the traffic, the higher the chances of fork-bombing your own
server: you might hit the maximum allowed number of processes, if
one is defined, or whatever the ceiling on PIDs is (wild-guessing
this one). Nice academic exercise, though.
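To see the ceilings the fork-per-request setup would run into, you can check the per-user process limit and (on Linux) the kernel PID ceiling; a small sketch, the exact values obviously vary per system:

```shell
#!/bin/sh
# Report the limits a fork-per-request CGI setup could exhaust under load.
# "max user processes" is the per-user fork ceiling (ulimit -u);
# pid_max is the Linux kernel's PID ceiling (absent on other systems).
echo "max user processes: $(ulimit -u 2>/dev/null || echo unknown)"
echo "kernel pid ceiling: $(cat /proc/sys/kernel/pid_max 2>/dev/null || echo 'n/a (non-Linux)')"
```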
Anyway, you should look for a FastCGI daemon instead of a script;
anything that won't spawn one process per request. On Debian I use the
php-cgi package and a FastCGI script in /etc/init.d to spawn the
daemon. It uses 4-6 processes tops; it's attached.
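For readers whose archive copy lost the attachment, here is a minimal sketch of that kind of /etc/init.d spawn script. The paths, port, and pool size are assumptions, not the contents of the attached file, and the actual daemon invocations are left commented out:

```shell
#!/bin/sh
# Sketch: run php-cgi as a persistent FastCGI daemon with a fixed,
# pre-forked worker pool, instead of forking one process per request.

DAEMON=/usr/bin/php-cgi          # assumed path (Debian php-cgi package)
BIND=127.0.0.1:9000              # assumed address nginx will fastcgi_pass to
export PHP_FCGI_CHILDREN=4       # php-cgi pre-forks this many workers
export PHP_FCGI_MAX_REQUESTS=1000  # recycle each worker after N requests

spawn_fcgi_pool() {
    case "$1" in
        start)
            echo "starting php-cgi pool on $BIND"
            # start-stop-daemon --start --background \
            #     --chuid www-data --exec "$DAEMON" -- -b "$BIND"
            ;;
        stop)
            echo "stopping php-cgi pool"
            # start-stop-daemon --stop --quiet --exec "$DAEMON"
            ;;
        *)
            echo "Usage: $0 {start|stop}" >&2
            return 1
            ;;
    esac
}

spawn_fcgi_pool "${1:-start}"
```

On the nginx side you would then point `fastcgi_pass 127.0.0.1:9000;` at the pool, so every hit is handled by one of the long-lived workers.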
() ascii-rubanda kampajno - kontraŭ html-a retpoŝto
/\ ascii ribbon campaign - against html e-mail
-------------- next part --------------
A non-text attachment was scrubbed...
Size: 4042 bytes
Desc: not available