Too many open files...
István Szukács
leccine at gmail.com
Wed Jan 14 17:11:12 MSK 2009
At first you have to isolate the layer where the problem occurs. During
the last week I ran a small test with nginx and was able to reach 50K req/s
on a single host using CentOS and nginx.

Linux level:
/etc/security/limits.conf
* hard nofile 10000
* soft nofile 10000
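To go with the limits.conf snippet above, here is a quick way to inspect and exercise the limit from a shell (a minimal sketch, assuming bash; note that limits.conf changes only take effect for new login sessions):

```shell
# Show the current soft and hard open-file limits for this shell
ulimit -Sn
ulimit -Hn

# Try to raise the soft limit for this session, up to the hard limit;
# if the hard limit is below 10000, edit limits.conf and log in again
ulimit -n 10000 2>/dev/null || echo "hard limit below 10000; raise it in limits.conf first"

# Verify what the session actually got
ulimit -n
```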
It might solve your problem.
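One more thing worth checking on the nginx side itself: the main-context worker_rlimit_nofile directive lets workers raise their own descriptor limit. The numbers below are only an illustration, not a recommendation:

```nginx
# nginx.conf, main context: allow each worker up to 10000 open files
worker_rlimit_nofile  10000;

events {
    # keep worker_connections below worker_rlimit_nofile, since a
    # proxied connection can consume more than one file descriptor
    worker_connections  4096;
}
```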
Regards,
Istvan
On Wed, Jan 14, 2009 at 12:36 PM, Thomas <iamkenzo at gmail.com> wrote:
> On Tue, Jan 13, 2009 at 5:50 PM, Ilan Berkner <iberkner at gmail.com> wrote:
> > Thanks for the fast response. Our site is back up :-). Our tech support
> > (dedicated server support) did something to fix this issue, I will find
> > out later what. I'll keep an eye on the open files as we currently have
> > it set pretty high.
> >
>
> I remember Zed Shaw talking about such an issue back in the days when
> people were running Rails through fastcgi. It had something to do with
> keep-alive connections. The connections would actually never close
> themselves.
>
>
> --
> Self-training videos in the field of IT: http://www.digiprof.fr
>
>
--
the sun shines for all
More information about the nginx mailing list