Nginx can't handle more concurrent Perl CGI requests
bhavik
nginx-forum at forum.nginx.org
Mon Jun 27 05:36:50 UTC 2016
Hello,
We are using the FastCGI module in nginx with Perl.
We can send 200 concurrent Perl CGI requests, but after increasing to 250
concurrent requests to a specific script, we get the following error in the
nginx log file.
[error] 23526#0: *3291 connect() to unix:/var/run/fcgiwrap.socket failed
(11: Resource temporarily unavailable) while connecting to upstream, client:
::1, server: _, request: "POST /cgi-bin/example/example-xml.cgi HTTP/1.1",
upstream: "fastcgi://unix:/var/run/fcgiwrap.socket:", host: "localhost"
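For reference, errno 11 is EAGAIN: the connect() to the unix socket is being refused because fcgiwrap is not accepting connections fast enough and its listen backlog is full. The load we generate looks roughly like the following sketch (the payload file name is illustrative, not from our actual test):

```shell
# Reproduce the failure: 250 concurrent POSTs to the CGI script.
# payload.xml is a placeholder for the request body; adjust as needed.
ab -n 2000 -c 250 -p payload.xml -T 'text/xml' \
   http://localhost/cgi-bin/example/example-xml.cgi
```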
My nginx configuration file contains the following:
location /cgi-bin/ {
    gzip off;

    # Set the root to /usr/lib (inside this location this means that we are
    # giving access to the files under /usr/lib/cgi-bin)
    root /usr/lib/;

    # FastCGI socket
    fastcgi_pass unix:/var/run/fcgiwrap.socket;
    # fastcgi_pass 127.0.0.1:8999;

    # FastCGI parameters; include the standard ones
    include /etc/nginx/fastcgi_params;

    # Adjust non-standard parameters (SCRIPT_FILENAME)
    fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
}
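For context, fcgiwrap serves requests with a fixed pool of prefork worker processes (one by default), so under load new connections queue on the socket until connect() fails with EAGAIN. One approach is to start fcgiwrap with a larger pool; a sketch assuming it is launched via spawn-fcgi (user, group, mode, and pool size are illustrative):

```shell
# Spawn fcgiwrap with a larger prefork pool:
#   -s  socket to listen on (must match fastcgi_pass above)
#   -M  socket mode; -u/-g owner, so the nginx worker can connect
#   -c  (fcgiwrap option) number of worker processes, default 1
spawn-fcgi -s /var/run/fcgiwrap.socket -M 0660 -u www-data -g www-data \
    -- /usr/sbin/fcgiwrap -c 16
```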
Apache can handle 500+ concurrent requests to the same scripts without any
extra configuration.
Can anyone suggest how to achieve this, or is there a mistake in my
configuration that prevents it from handling more concurrent requests?
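On the nginx side, the per-worker connection limit is also worth checking, since each proxied CGI request consumes both a client connection and an upstream connection. A sketch of the relevant top-level settings (values are illustrative, not a recommendation):

```nginx
# /etc/nginx/nginx.conf (main context) -- illustrative values
worker_processes  auto;        # one worker per CPU core
events {
    # Each proxied request uses two connections (client + upstream),
    # so 250 concurrent CGI requests need at least ~500 in total.
    worker_connections  1024;
}
```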
If you require more logs, feel free to ask.
Posted at Nginx Forum: https://forum.nginx.org/read.php?2,267885,267885#msg-267885