[Probable bug] Memory usage when streaming data from an FCGI application
Pierre Bourdon
delroth at gmail.com
Thu Jul 1 03:12:10 MSD 2010
Hello,
I'm relatively new to nginx, so I may be completely wrong about what I
assume here. Please excuse me if that is the case.
I'm streaming large quantities of data (several gigabytes) from an FCGI
application through nginx. While streaming, nginx uses more and more
memory, with the worker process taking more than a gigabyte of RAM at
one point. The more I stream, the more memory the nginx process uses.
I think this should be categorized as a bug, for several reasons:
- It should not happen: nginx should not use that much memory when
streaming data from an FCGI application.
- It may even lead to a DoS in a shared hosting situation. Someone on
the server could create an FCGI application that streams a lot of data,
making the nginx worker consume a lot of memory until it is eventually
killed by the kernel's OOM killer. I guess the worker would be restarted
by the master process in that case, but by then it would already have
exhausted memory and probably swap, which is never good.
A simple Python + flup script that reproduces the problem:
def data():
    while True:
        yield ('x' * 4096)

def myapp(environ, start_response):
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return data()

if __name__ == '__main__':
    from flup.server.fcgi import WSGIServer
    WSGIServer(myapp, bindAddress=('127.0.0.1', 31337)).run()
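
For reference, a minimal nginx configuration along these lines should be
enough to reproduce the behaviour (the fastcgi_pass address matches the
bindAddress in the script above; the listen port and everything else are
arbitrary, with all fastcgi buffering directives left at their defaults):

events { }

http {
    server {
        listen 8080;

        location / {
            # forward every request to the flup FastCGI server above
            include      fastcgi_params;
            fastcgi_pass 127.0.0.1:31337;
        }
    }
}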
When requesting the data from localhost (curl http://localhost:port/
> /dev/null), I see the worker's memory usage increasing by more than
10 MB/s, which is really a lot for something that should not be
exhausting memory in the first place.
As I said, I'm new to nginx and this behaviour may be intended ("by
design"), but I personally think it is a bug.
Regards,
--
Pierre "delroth" Bourdon <delroth at gmail.com>
Étudiant à l'EPITA / Student at EPITA
http://bitbucket.org/delroth/