How to wait for upstream server to start

kevin gill kevin at movieextras.ie
Wed Feb 24 13:27:57 MSK 2010


> could you describe how you got this working? i need something similar
> to make sure that a django app doesn't die.

I assume that you are modifying a working nginx/django configuration.

Mine is a zope server. Static content goes through varnish; for everything
else I use proxy_pass to reach zope. I inserted haproxy in front of zope.

nginx -> varnish -> haproxy -> zope
nginx ------------> haproxy -> zope

nginx is on port 80
varnish is on port 8082
zope is on port 8081
haproxy is on port 8085

#------------------------------------------------------------
nginx server configuration...

upstream varnish1 {
    server 127.0.0.1:8082;
}
upstream haproxy {
    server 127.0.0.1:8085;
}
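
For completeness, the proxy_pass wiring looks something like this (the
location paths here are illustrative, not copied from my real config;
adjust them to your own layout):

server {
    listen 80;

    # static content is served through varnish
    location /static/ {
        proxy_pass http://varnish1;
    }

    # everything else goes to zope via haproxy
    location / {
        proxy_pass http://haproxy;
    }
}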

#------------------------------------------------------------
haproxy configuration....

global
    log 127.0.0.1   local0
    log 127.0.0.1   local1 notice
    maxconn 4096
    user haproxy
    group haproxy
    daemon

    # enable debug to run in the foreground
    debug

defaults
    mode    http
    retries 100
    option redispatch
    timeout connect 60s

    # I have some very slow requests - generate 1000's of emails, printouts
    timeout queue 300s
    timeout client 1200s
    timeout server 1200s

    # this is just monitoring stuff
    monitor-uri /haproxy-ping
    stats enable
    stats uri /haproxy-status
    stats refresh 5s
    stats realm Haproxy\ statistics

listen  zope3 0.0.0.0:8085
    # allow at most 3 concurrent connections; further requests wait
    # (up to 'timeout queue') instead of failing immediately with a 502
    maxconn 3
    dispatch 127.0.0.1:8081

#------------------------------------------------------------
Install haproxy...

$ apt-get install haproxy
edit /etc/default/haproxy and enable it
$ /etc/init.d/haproxy start
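
To sanity-check that haproxy is up, you can hit the monitoring URIs
configured above (exact output will vary with your haproxy version):

$ curl -i http://127.0.0.1:8085/haproxy-ping
$ curl http://127.0.0.1:8085/haproxy-status

The monitor-uri should answer 200 OK directly from haproxy without touching
zope, and the stats page shows the current queue and session counts.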

Regards,

Kevin

>
> --timball
>
> On Tue, Feb 23, 2010 at 6:59 PM, kevin gill <kevin at movieextras.ie> wrote:
>> I got this working using haproxy with the 'dispatch' option.
>>
>> Thanks
>>
>>
>>> I am running a low traffic site using nginx. I use one upstream
>>> multi-threadded server to serve the content. I use the proxy module to
>>> forward the requests.
>>>
>>> Occasionally, I need to restart the upstream server. It takes 20-30
>>> seconds to start up.
>>>
>>> I want requests to queue until the server is started, i.e. configure a
>>> 30 second timeout before I get the "502 Bad Gateway" response.
>>>
>>> I am running nginx/0.7.64 on Ubuntu.
>>>
>>> Is there an nginx configuration option to make it wait and retry while
>>> the server restarts? Alternatively, is there another product I should
>>> put in the middle between nginx and my upstream server which provides
>>> this functionality?
>>>
>>> Thanks,
>>>
>>> Kevin
>>>
>>>
>>> _______________________________________________
>>> nginx mailing list
>>> nginx at nginx.org
>>> http://nginx.org/mailman/listinfo/nginx
>>>
>>>
>>>
>>
>
>
>
> --
>         GPG key available on pgpkeys.mit.edu
> pub  1024D/511FBD54 2001-07-23 Timothy Lu Hu Ball <timball at tux.org>
> Key fingerprint = B579 29B0 F6C8 C7AA 3840  E053 FE02 BB97 511F BD54
>