nginx in high concurrency setups
rasmus at notion.se
Tue Dec 15 18:56:35 MSK 2009
Richard Jones of Last.fm fame has written about this kind of testing:
On Mon, Dec 14, 2009 at 21:16, merlin corey <merlincorey at dc949.org> wrote:
> On Mon, Dec 14, 2009 at 11:16 AM, Dennis J. <dennisml at conversis.de> wrote:
>> I'm currently experimenting how many concurrent connections nginx can
>> handle. The problem I'm running into is that for each request I send to the
>> server I get a connection in TIME_WAIT state. If I do this using
>> benchmarking tools like httperf or ab I quickly seem to hit a ceiling. Once
>> the number of TIME_WAIT connections reaches about 16000 the benchmarking
>> tools just freeze and I have to wait until that number comes down again.
>> What is the reason for these TIME_WAIT connections and how can I get rid of
>> them faster? I'm only serving small static files and the delivery is not
>> supposed to take longer than say 300ms, so any connection that takes longer
>> than that can be aborted if that is necessary to make room for new incoming
>> connections.
>> Does anyone have experience with serving lots of small static requests using
>> nginx?
>> nginx mailing list
>> nginx at nginx.org
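On the TIME_WAIT question above: TIME_WAIT is normal TCP behaviour. The side that closes the connection first keeps the socket around for twice the maximum segment lifetime (about 60 seconds on Linux) so that stray packets from the old connection can die off. You can watch the count on Linux with ss from iproute2; a small sketch, assuming a Linux box with iproute2 installed:

```shell
# Count sockets currently in TIME_WAIT (Linux; ss ships with iproute2).
# Watching this number climb toward the ephemeral port limit shows the
# ceiling the benchmarking client is hitting.
ss -tan state time-wait | tail -n +2 | wc -l
```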
> You will need to tune your OS's TCP and socket settings, I believe.
> Exactly what you must do depends on your OS.
> Also, keep in mind that when you run these tests, you should ideally
> send the test load from multiple machines other than the one doing the
> serving. This rules out the benchmarking program fighting nginx for
> resources, and rules out any single machine's ceilings.
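As a concrete starting point for that OS tuning, here is a sketch of the Linux sysctls that usually come up when TIME_WAIT is the bottleneck. The knob names are the standard Linux ones, but the values are illustrative only, not a recommendation for any particular box:

```shell
# Illustrative Linux TCP tuning for heavy benchmark loads
# (values are examples, not recommendations; requires root).

# Let new outgoing connections reuse ports stuck in TIME_WAIT when it
# is safe to do so; this mostly helps the benchmarking client side.
sysctl -w net.ipv4.tcp_tw_reuse=1

# Widen the ephemeral port range so the client does not run out of
# local ports while thousands of connections sit in TIME_WAIT.
sysctl -w net.ipv4.ip_local_port_range="1024 65535"

# Raise the listen backlog ceiling on the server side.
sysctl -w net.core.somaxconn=4096

# Allow more open file descriptors (one per connection); also raise
# worker_connections in nginx.conf to match.
ulimit -n 65535
```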