nginx, fcgi+perl forking a new process per request ...

Stefan Parvu sparvu at
Mon Mar 15 19:38:54 MSK 2010

> A single page load can - and most often does - have more than one
> request; a webserver gets hit by more than one person... The higher
> the traffic, the higher the chances of fork-bombing your own
> server. It might reach the maximum allowed number of processes, if
> defined, or whichever the maximum value of the PID counter is (wild-guessing
> this one)? Nice academic exercise.

sure, makes sense.

> Anyway, you should look for a fastcgi daemon instead of a script;
> anything that won't spawn 1 process per 1 request. On Debian I use the
> php-cgi package and a fastcgi script to spawn the daemon from
> /etc/init.d. It uses 4-6 processes tops; it's attached.
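For anyone else reading the archive: the attached script wasn't preserved here, but a spawner of that kind usually boils down to something like the sketch below. It pre-forks a fixed php-cgi pool instead of forking per request; the address, pool size, and user are illustrative assumptions (spawn-fcgi ships with lighttpd / as its own Debian package, and PHP_FCGI_CHILDREN / PHP_FCGI_MAX_REQUESTS are the environment variables php-cgi honors).

```shell
#!/bin/sh
# Sketch of an /etc/init.d-style spawner (illustrative, not the
# attached script): php-cgi pre-forks a fixed worker pool, so nginx
# never pays a fork+exec per request.

# php-cgi reads these from the environment:
PHP_FCGI_CHILDREN=4        # size of the pre-forked worker pool (assumed)
PHP_FCGI_MAX_REQUESTS=500  # recycle a worker after this many requests
export PHP_FCGI_CHILDREN PHP_FCGI_MAX_REQUESTS

# Bind the pool to the address nginx's fastcgi_pass directive points at.
spawn-fcgi -a 127.0.0.1 -p 9000 -u www-data -- /usr/bin/php-cgi
```

On the nginx side this pairs with `fastcgi_pass 127.0.0.1:9000;` in the relevant location block.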

Thanks. Probably a FastCGI daemon written in C or Perl. I found
Catalyst::Engine::FastCGI and FCGI::Engine. I will stick with the
current nginx-fcgi script for the time being.

It would be nice to have a standard FastCGI daemon written in C
as part of NGINX, but most likely that is outside the scope of
the project.
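As a starting point for such a daemon, here is a minimal sketch using the reference libfcgi library (Debian package libfcgi-dev; build with `cc daemon.c -lfcgi`), which is portable C and should build on both Solaris and Linux. The `:9000` socket is an assumption matching a typical `fastcgi_pass` setting.

```c
/* Minimal single-process FastCGI responder using libfcgi.
 * One long-lived process accepts requests in a loop, so the
 * webserver never forks a new process per request. */
#include <fcgiapp.h>

int main(void)
{
    FCGX_Request req;

    FCGX_Init();
    /* Listen where nginx's fastcgi_pass points, e.g. 127.0.0.1:9000
     * (port and backlog are illustrative). */
    int sock = FCGX_OpenSocket(":9000", 128);
    FCGX_InitRequest(&req, sock, 0);

    /* Accept loop: each iteration serves one request in the same process. */
    while (FCGX_Accept_r(&req) >= 0) {
        FCGX_FPrintF(req.out,
                     "Content-Type: text/plain\r\n\r\n"
                     "hello from %s\n",
                     FCGX_GetParam("SCRIPT_NAME", req.envp));
        FCGX_Finish_r(&req);
    }
    return 0;
}
```

To get a small pool instead of a single worker, one could fork N children after `FCGX_OpenSocket` and let each run the same accept loop on the shared socket.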

If you guys have any ideas about a simple C-based FastCGI daemon,
let me know. It would be cool if it compiled across Solaris, Linux, ...


More information about the nginx mailing list