Too many open files...

István Szukács leccine at
Wed Jan 14 17:11:12 MSK 2009

First you have to isolate the layer where the problem occurs. During
the last week I ran a small test and was able to reach 50K req/s on a
single host using CentOS and nginx.
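
A quick first check is whether you are hitting the system-wide descriptor
ceiling or the per-process one (a sketch; these are standard Linux paths):

    # system-wide ceiling and current usage (allocated, free, max)
    $ cat /proc/sys/fs/file-max
    $ cat /proc/sys/fs/file-nr

    # per-process ceiling for the current shell
    $ ulimit -n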

Linux level (these are /etc/security/limits.conf entries; the leading *
is the domain wildcard):

    *    hard    nofile    10000
    *    soft    nofile    10000

It might solve your problem; a fuller sketch of the related nginx-side
settings follows below.
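
For completeness, a minimal sketch of how the raised limit fits together
with the nginx side (the user name "nginx" and the exact numbers are only
examples, not from this thread):

    # check the limit the nginx user actually gets (assumes nginx runs
    # as user "nginx"; substitute your own)
    $ su -s /bin/sh nginx -c 'ulimit -n'

    # nginx.conf -- let nginx raise its own descriptor limit and keep
    # worker_connections safely below it (a proxied request can hold
    # two sockets at once)
    worker_rlimit_nofile 10000;
    events {
        worker_connections 4096;
    }

Note that limits.conf is applied by PAM to login sessions, so a service
started at boot may not pick it up; worker_rlimit_nofile lets the nginx
master set the workers' limit directly, which is why it is worth setting
both.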


On Wed, Jan 14, 2009 at 12:36 PM, Thomas <iamkenzo at> wrote:

> On Tue, Jan 13, 2009 at 5:50 PM, Ilan Berkner <iberkner at> wrote:
> > Thanks for the fast response.  Our site is back up :-).  Our tech support
> > (dedicated server support) did something to fix this issue; I will find
> > out later what.  I'll keep an eye on the open files, as we currently
> > have it set pretty high.
> >
> I remember Zed Shaw talking about such an issue back in the days when
> people were running Rails through FastCGI. It had something to do with
> keep-alive connections: the connections would never actually close
> themselves.
> --
> Self training videos in the field of IT:
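
If leaked keep-alive connections are what is eating the descriptors here
too, one way to check (a sketch, not from this thread; run as root and
adjust the port for your setup) is to compare per-worker descriptor counts
with the number of established connections, and to shorten how long nginx
holds idle ones:

    # open descriptors per nginx worker process
    $ for pid in $(pgrep -f 'nginx: worker'); do
          echo "$pid: $(ls /proc/$pid/fd | wc -l)"
      done

    # established connections held open on port 80
    $ ss -tan state established '( sport = :80 )' | wc -l

    # nginx.conf -- drop idle keep-alive connections sooner
    keepalive_timeout 15;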

the sun shines for all
