A question of fastcgi_pass method

SanCao Jie sancaojie at gmail.com
Sun Oct 17 17:22:29 MSD 2010

Thanks for the reply.

Now, I have two new questions.

(1)  spawn-fcgi -f FILE -a IP -p PORT -F 6
       The -F flag spawns 6 test.py processes.

       Even so, webbench shows that multiple test.py processes do not
perform better than a single test.py process.
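
For reference, the nginx side of test (1) hands requests to the spawn-fcgi listener with fastcgi_pass, roughly like this (the IP:PORT is a placeholder for whatever was given to spawn-fcgi -a/-p):

```nginx
location / {
    include fastcgi_params;
    # placeholder address: the IP and PORT passed to spawn-fcgi -a/-p
    fastcgi_pass 127.0.0.1:9000;
}
```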


(2)  The same kind of multi-process test as in question 1, but this time
      using nginx's reverse proxy and upstream.

      Like this:


     upstream test {
             # backends reconstructed from the ports used below
             server 127.0.0.1:8080;
             server 127.0.0.1:8081;
             server 127.0.0.1:8082;
     }

     location / {
             proxy_pass http://test;
             proxy_redirect off;
             proxy_set_header X-Real-IP $remote_addr;
             proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
     }
   Copy test.py twice, naming the copies test1.py and test2.py,
   and run all three:
  python test.py
  python test1.py 8081
  python test2.py 8082
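
For context, test.py is essentially a trivial app that returns plain text. A minimal stand-in using only the standard library (wsgiref) would look like this; the real script uses web.py, which likewise takes the listening port as its first command-line argument:

```python
import sys
from wsgiref.simple_server import make_server

def app(environ, start_response):
    # Trivial handler: return a short plain-text body, as test.py does.
    start_response('200 OK', [('Content-Type', 'text/plain')])
    return [b'hello']

def serve(argv=sys.argv):
    # Port comes from the first command-line argument, defaulting to 8080
    # (web.py's default); pass 0 to let the OS pick a free port.
    port = int(argv[1]) if len(argv) > 1 else 8080
    return make_server('127.0.0.1', port, app)

# serve().serve_forever() would run it, e.g. `python test1.py 8081`
```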

   This situation is totally different from question 1: each process
   uses web.py's built-in web server, with one port per test*.py copy.

   But this time, webbench still shows that the performance does not
   increase.


   Why don't multiple processes perform better than a single one?

   Is my Python web app too simple? It just renders some plain text,
   so perhaps one single process is already enough?

   Today I tested the web app with the ab tool. It shows about 110
requests per second. Is that number too low?
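
For what it's worth, the ab number can be cross-checked with a small script that fires sequential requests and reports requests per second (the URL and request count below are placeholders):

```python
import time
import urllib.request

def measure_rps(url, n=100):
    # Fire n sequential GET requests and return requests per second,
    # roughly what ab reports at concurrency 1.
    start = time.perf_counter()
    for _ in range(n):
        urllib.request.urlopen(url).read()
    elapsed = time.perf_counter() - start
    return n / elapsed

# e.g. print(measure_rps('http://127.0.0.1:8080/', 200))
```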