Too many open files...
dave at cheney.net
Fri Jan 16 00:40:58 MSK 2009
Each FastCGI connection will use two file descriptors: one for the
client, one for the proxy.
As others have advised, make sure your ulimit settings at the OS level
allow that many file descriptors.
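A quick sketch of how you might check and raise the limit (the 65536 figure and the nginx.conf path are illustrative assumptions, not values from this thread):

```shell
# Show the current soft limit on open file descriptors for this shell
ulimit -n

# Show the hard limit (the ceiling the soft limit can be raised to)
ulimit -Hn

# Raise the soft limit for the current session (assumed target: 65536;
# raising it beyond the hard limit requires root / limits.conf changes)
ulimit -n 65536

# In nginx.conf you can also raise the limit for worker processes
# directly with the worker_rlimit_nofile directive, e.g.:
#   worker_rlimit_nofile 65536;
# Since each proxied connection costs two descriptors, keep this at
# least double your worker_connections setting.
```

For a permanent change you would typically also edit /etc/security/limits.conf (or your init system's equivalent) so the limit survives reboots and new sessions.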
On 14/01/2009, at 3:15 AM, Ilan Berkner wrote:
> This morning, our server (www.spellingcity.com) went down and I
> can't figure out why. The Nginx log file has this error in it (all
> over the place and is growing):
> 2009/01/13 10:07:24 [alert] 26159#0: accept() failed (24: Too many
> open files) while accepting new connection on 184.108.40.206:80
> Which of course follows up with endless:
> 2009/01/13 10:14:56 [error] 26159#0: *13007 upstream timed out (110:
> Connection timed out) while connecting to upstream, client:
> 220.127.116.11, server: www.spellingcity.com, request: "GET / HTTP/
> 1.0", upstream: "fastcgi://127.0.0.1:9000", host: "www.spellingcity.com"