Too many open files errors

Arash Ferdowsi arash at getdropbox.com
Thu Jan 3 02:21:13 MSK 2008


You're probably hitting your OS's file descriptor limit:

To check the current system-wide limit:
% cat /proc/sys/fs/file-max

To temporarily increase this to 65535 (as root):
# echo "65535" > /proc/sys/fs/file-max

If you want this new value to survive across reboots, you can add it to
/etc/sysctl.conf:

# Maximum number of open files permitted
fs.file-max = 65535
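
You can then load the settings from /etc/sysctl.conf without rebooting
(as root):

# sysctl -p

Keep in mind fs.file-max is only the system-wide limit; each nginx worker
is still bound by its per-process limit, so you may also need to raise
"ulimit -n" for the nginx user or set worker_rlimit_nofile in nginx.conf.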



On Jan 2, 2008 3:15 PM, Mustafa Toraman <lists at ruby-forum.com> wrote:
> Hello all,
> nice to see an nginx topic on here :) also greetings to mail-listers!
>
> I have a problem with nginx + FastCGI. Sometimes CPU usage climbs above
> 100%. I checked error.log and found something.
>
> On nginx startup, an error like this appears more than 5 times:
>
> 2008/01/03 01:07:54 [alert] 12755#0: sendmsg() failed (9: Bad file
> descriptor)
>
> A few minutes later (10 or 15 minutes), I start getting the "Too many
> open files" error as well. Then CPU usage stays over 100% for a few
> minutes... also HTTP 500 errors...
>
> I am using CentOS 5 (64-bit) with nginx-0.5.34 + FastCGI. This is a
> BitTorrent tracker.
>
> I have reinstalled my box and I am still having the same problem.
>
> Hoping for an answer.
> Happy new year all!
> Regards!
> --
> Posted via http://www.ruby-forum.com/.
>
>



-- 
Arash Ferdowsi
CTO, Dropbox
913.707.5875 (m)




