nginx serving large files - performance issues with more than ~800-1000 connections

Tomasz Chmielewski mangoo at
Thu May 24 07:28:56 UTC 2012


I have a cluster of 10 nginx 1.2.0 servers running on Linux. They primarily serve large files.

Whenever the number of ESTABLISHED connections to nginx goes above ~800-1000, things get very slow.
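
For reference, such a count can be taken on the server with something like:

netstat -ant | grep ESTABLISHED | wc -l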

That is, it can take a minute or more before nginx starts serving such a connection, and then the file itself is served very slowly (the wget below was started from a server in the same rack):

wget -O /dev/null http://server/content/7a35859b7d91ca48fef7a3e2a9bc6fc8.dat

I've tried various tuning parameters (nginx, sysctl, etc.), but they don't seem to make much of a difference.
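
For illustration, the kind of knobs I mean are roughly the following (treat both the directives and the values as examples only, not as what was actually in use):

# nginx main config / http block (values are examples)
worker_processes      8;
worker_rlimit_nofile  65536;

events {
    worker_connections 8192;
}

http {
    sendfile       on;
    tcp_nopush     on;
    output_buffers 1 512k;
}

# sysctl, e.g. in /etc/sysctl.conf (values are examples)
net.core.somaxconn = 4096
net.core.netdev_max_backlog = 4096
net.ipv4.tcp_max_syn_backlog = 4096
fs.file-max = 200000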

The only thing that helps is starting one more nginx instance on a different port.

This second instance then serves the files just fine. That is, with the number of established connections above 800-1000, the original instance on port 80 is slow:

PORT=80; wget -O /dev/null http://server:$PORT/content/7a35859b7d91ca48fef7a3e2a9bc6fc8.dat

The second instance, running on port 82, replies quickly and serves the same file quickly:

PORT=82; wget -O /dev/null http://server:$PORT/content/7a35859b7d91ca48fef7a3e2a9bc6fc8.dat
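
For completeness, a second instance like this can be started from a copy of the main config that only changes the listen port and the pid file, roughly along these lines (the paths here are placeholders):

# /etc/nginx/nginx-82.conf is identical to the main config, except for:
#     listen 82;
#     pid    /var/run/nginx-82.pid;
nginx -c /etc/nginx/nginx-82.conf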

Does this suggest an nginx issue, given that the second nginx instance serves the files fine?

Or could it be some system / sysctl parameter?

Tomasz Chmielewski
