Hi,

We recently added a 'thread_pool' directive to our main configuration. A few hours later we saw a huge increase in the connections_writing stat reported by the stub_status module. The number reached ~3800 and has been stuck there ever since. The server in question is operating normally, but this is very strange.

Any hints on what this could be?


Some info:

- Here is a graph of the stats reported, for a server with thread_pool and another without: http://imgur.com/a/lF2EL

- I don't have older data anymore, but the jump from <100 to ~3800 connections_writing happened in two sharp steps, the first one right after a reload.

- The machines' hardware and software are identical except for the thread_pool directive in their nginx.conf (see the snippet after this list). They live in two different data centers.

- Both machines are performing normally: nothing unusual in CPU or RAM usage, and nginx performance is about the same.

- Reloading nginx with 'nginx -s reload' does nothing; restarting the process brings connections_writing back down.
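
For reference, the thread pool setup is essentially the stock example from the nginx docs. The snippet below is from memory, so the pool name, sizes and paths are approximate rather than our exact values:

# nginx.conf (main context) -- illustrative values
thread_pool default threads=32 max_queue=65536;

events {}

http {
    server {
        listen 80;
        location / {
            root /srv/www;   # placeholder docroot
            aio threads;     # offload blocking reads to the 'default' pool
        }
    }
}
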
Debug stuff:

mallmann# uname -a
Linux xxx 3.8.13-98.5.2.el6uek.x86_64 #2 SMP Tue Nov 3 18:32:04 PST 2015 x86_64 x86_64 x86_64 GNU/Linux

mallmann# nginx -V
nginx version: nginx/1.8.0
built by gcc 4.4.7 20120313 (Red Hat 4.4.7-16) (GCC)
built with OpenSSL 1.0.1e-fips 11 Feb 2013
TLS SNI support enabled
configure arguments: --prefix=/usr/share/nginx --sbin-path=/usr/sbin/nginx --conf-path=/etc/nginx/nginx.conf --error-log-path=/var/log/nginx/error.log --http-log-path=/var/log/nginx/access.log --http-client-body-temp-path=/var/lib/nginx/tmp/client_body --http-proxy-temp-path=/var/lib/nginx/tmp/proxy --http-fastcgi-temp-path=/var/lib/nginx/tmp/fastcgi --http-uwsgi-temp-path=/var/lib/nginx/tmp/uwsgi --http-scgi-temp-path=/var/lib/nginx/tmp/scgi --pid-path=/var/run/nginx.pid --lock-path=/var/lock/subsys/nginx --user=nginx --group=nginx --with-ipv6 --with-http_ssl_module --with-http_realip_module --with-http_addition_module --with-http_xslt_module --with-http_image_filter_module --with-http_geoip_module --with-http_sub_module --with-http_flv_module --with-http_mp4_module --with-http_gunzip_module --with-http_gzip_static_module --with-http_random_index_module --with-http_secure_link_module --with-http_degradation_module --with-http_stub_status_module --with-http_perl_module --with-mail --with-mail_ssl_module --with-pcre --with-google_perftools_module --add-module=/builddir/build/BUILD/nginx-1.8.0/headers-more-nginx-module-0.25 --add-module=/builddir/build/BUILD/nginx-1.8.0/ngx_http_bytes_filter_module --add-module=/builddir/build/BUILD/nginx-1.8.0/echo-nginx-module-0.55 --with-threads --with-debug --with-cc-opt='-O2 -g -pipe -Wall -Wp,-D_FORTIFY_SOURCE=2 -fexceptions -fstack-protector --param=ssp-buffer-size=4 -m64 -mtune=generic' --with-ld-opt=' -Wl,-E'


Affected server:

mallmann# lsof -n -u nginx | awk '{print $5}' | sort | uniq -c | sort -nr
 4172 REG
 2140 IPv4
  100 unix
   30 CHR
   20 DIR
   20 0000
    3 sock
    1 TYPE
mallmann# curl http://127.0.0.1/status
Active connections: 5924
server accepts handled requests
 5864099 5864099 15527178
Reading: 0 Writing: 3883 Waiting: 2040


Normal server:

mallmann# lsof -n -u nginx | awk '{print $5}' | sort | uniq -c | sort -nr
 4454 REG
 1967 IPv4
  100 unix
   30 CHR
   20 DIR
   20 0000
    1 unknown
    1 TYPE
    1 sock
mallmann# curl http://127.0.0.1/status
Active connections: 2096
server accepts handled requests
 1136132 1136132 3464904
Reading: 0 Writing: 107 Waiting: 1989
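
For completeness, the /status endpoint queried above is just a plain stub_status location, roughly like this (the allow/deny lines are from memory):

location /status {
    stub_status on;   # nginx 1.8.0 still accepts the 'on' argument
    access_log off;
    allow 127.0.0.1;  # only ever queried locally
    deny all;
}
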
-- 
Marcelo Mallmann Dias