Too many open files

Davy Campano dcampano at gmail.com
Thu May 1 07:27:14 MSD 2008


After a little more research, it looks like the php-cgi processes that
nginx is proxying to may be the problem: they had a large number of files
left open.
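In case it helps anyone reproduce this, here's a rough way to see the
per-process counts. This assumes a Linux box with /proc, that pgrep is
available, and that the workers show up under the name "php-cgi" (adjust
the name for your setup):

    # count open descriptors for each php-cgi worker
    for pid in $(pgrep php-cgi); do
        echo "$pid $(ls /proc/$pid/fd | wc -l)"
    done

    # or, roughly, for a single worker (lsof output includes a header line)
    lsof -p <pid> | wc -l

If the counts keep climbing over time, setting PHP_FCGI_MAX_REQUESTS so
the php-cgi children get recycled periodically might be worth a try, but
that's just a guess on my part.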

I ran `lsof | wc -l` and the result was 139566

After a restart of the php-cgi processes, the result was only 12045
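On the limits question quoted below: as I understand it, ulimit -n is the
per-process limit and fs.file-max is the system-wide ceiling, and nginx can
raise its own limit with worker_rlimit_nofile. A rough sketch of what I'm
trying; the numbers are placeholders for this box, not recommendations:

    # system-wide ceiling, e.g. in /etc/sysctl.conf (reload with sysctl -p)
    fs.file-max = 300000

    # per-process limit for the shell that starts nginx / php-cgi
    # (needs root or a matching hard limit)
    ulimit -n 65536

    # in nginx.conf (main context), let the workers raise their own limit
    worker_rlimit_nofile 65536;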

On Wed, Apr 30, 2008 at 10:46 PM, Davy Campano <dcampano at gmail.com> wrote:

> Davy Campano <dcampano at ...> writes:
>
> >
> > I'm getting a "Too many open files in system" error after running for
> > about 10 minutes.  I'm in the process of testing out nginx and it's a
> > pretty busy server doing around 200 requests/sec.  My current open file
> > limit is 131072.  Does anyone know a safe amount to set this to for a
> > 32-bit linux system with 6GB ram?
>
> ulimit -n actually returns 1024; I'm thinking this should be higher.
>
> cat /proc/sys/fs/file-max   returns 131072

