Too many open files...
leccine at gmail.com
Wed Jan 14 17:11:12 MSK 2009
First you have to isolate the layer where the problem occurs. During the last
week I ran a small test with nginx and was able to reach 50K req/s on a
single host using CentOS and nginx with these limits:
* hard nofile 10000
* soft nofile 10000
It might solve your problem.
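For reference, those two lines use the /etc/security/limits.conf syntax, minus the domain column that file requires. A fuller sketch (assuming the worker processes run as user nginx; adjust the user name and the value to your setup):

```
# /etc/security/limits.conf
# <domain>  <type>  <item>   <value>
nginx       soft    nofile   10000
nginx       hard    nofile   10000
```

You can verify the limit with `ulimit -n` in a shell running as that user. nginx also has its own worker_rlimit_nofile directive, which raises the limit for the worker processes without editing limits.conf.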
On Wed, Jan 14, 2009 at 12:36 PM, Thomas <iamkenzo at gmail.com> wrote:
> On Tue, Jan 13, 2009 at 5:50 PM, Ilan Berkner <iberkner at gmail.com> wrote:
> > Thanks for the fast response. Our site is back up :-). Our tech support
> > (dedicated server support) did something to fix this issue, I will find
> > later what. I'll keep an eye on the open files as we currently have it
> > pretty high.
> I remember Zed Shaw talking about this issue back in the days when
> people were running Rails through FastCGI. It had something to do with
> keep-alive connections: the connections would actually never close.
Self-training videos in the field of IT: http://www.digiprof.fr
the sun shines for all