SSL proxy slow....

Gabriel Ramuglia gabe at vtunnel.com
Tue Sep 9 06:41:59 MSD 2008


Varnish can't act as an SSL server; I'm not sure about it acting as an SSL client.
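
If you did want varnish in the picture, nginx would have to terminate the
SSL itself and hand plain HTTP to varnish on a local port. A rough sketch
(127.0.0.1:6081 is just a guess at where a local varnish instance might
listen; adjust as needed):

location / {
    proxy_set_header X-FORWARDED_PROTO https;
    # plain HTTP to a local varnish, which can keep persistent
    # connections open to the real backends
    proxy_pass http://127.0.0.1:6081;
}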

On Mon, Sep 8, 2008 at 9:41 PM, James <thenetimp at gmail.com> wrote:
> Thanks Dave.  I'll look into both of those.
>
> Thanks,
> James
>
>
> On Sep 8, 2008, at 9:05 PM, Dave Cheney wrote:
>
>> The dog slowness you are seeing is probably nginx renegotiating SSL on
>> every backend request. At the moment nginx will issue a connection close
>> after each request.
>>
>> If you are using nginx as an SSL load balancer you might need to use
>> something else (varnish? squid?) that can maintain persistent connections
>> to your backend; this might help, a bit.
>>
>> Cheers
>>
>> Dave
>>
>> On Mon, 8 Sep 2008 20:36:04 -0400, James <thenetimp at gmail.com> wrote:
>>>
>>> I do need to pass SSL back to my app from the front nginx server. We
>>> are using EC2 for our servers, so I do need to encrypt the traffic
>>> between the 2 front end servers and the app servers, as it travels
>>> over a public network.
>>>
>>> James
>>>
>>>
>>> On Sep 8, 2008, at 8:05 PM, Dave Cheney wrote:
>>>
>>>> Hi James,
>>>>
>>>> If nginx is acting as your SSL handler then you don't need to pass SSL
>>>> back to your app. This should be sufficient:
>>>>
>>>> location / {
>>>>  proxy_set_header X-FORWARDED_PROTO https;
>>>>  proxy_pass http://givvymain;
>>>> }
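>>>>
>>>> Applied to the 443 server in the config quoted below, that might end up
>>>> looking roughly like this (cert paths left as placeholders); it reuses
>>>> the existing port 80 upstream rather than the SSL one, so there is no
>>>> SSL handshake on every backend request:
>>>>
>>>> server {
>>>>     listen 443;
>>>>     server_name prod.givvy.com;
>>>>
>>>>     ssl on;
>>>>     ssl_certificate /####PATH TO CERT###/;
>>>>     ssl_certificate_key /####PATH TO KEY###/;
>>>>     keepalive_timeout 70;
>>>>
>>>>     location / {
>>>>         proxy_set_header X-FORWARDED_PROTO https;
>>>>         # plain HTTP between nginx and the app servers
>>>>         proxy_pass http://givvymain;
>>>>         proxy_next_upstream error timeout;
>>>>     }
>>>> }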
>>>>
>>>> Cheers
>>>>
>>>> Dave
>>>>
>>>> On Mon, 8 Sep 2008 19:50:30 -0400, James <thenetimp at gmail.com> wrote:
>>>>>
>>>>> Here is my server config.  When I go to http://prod.givvy.com  the
>>>>> result is normal.  When I go to https://prod.givvy.com it's dog slow.
>>>>>
>>>>> Any idea as to how to speed up the SSL side of it? (Right now I am
>>>>> using a local hosts file change to point at the right IP address, as
>>>>> prod.givvy.com points to a maintenance page.) We want to launch the
>>>>> site tomorrow, but this is a huge problem for us. I'd hate to launch
>>>>> it with only one server.
>>>>>
>>>>> Thanks
>>>>> James
>>>>>
>>>>> http {
>>>>>
>>>>>   upstream givvymain {
>>>>>       server 75.101.150.160:80  max_fails=1 fail_timeout=30s;
>>>>>       server 67.202.3.21:80     max_fails=1 fail_timeout=30s;
>>>>>   }
>>>>>
>>>>>   upstream givvymainssl {
>>>>>       server 75.101.150.160:443 max_fails=1 fail_timeout=30s;
>>>>>       server 67.202.3.21:443    max_fails=1 fail_timeout=30s;
>>>>>   }
>>>>>
>>>>>   server {
>>>>>       listen 80;
>>>>>       server_name prod.givvy.com;
>>>>>       location / {
>>>>>           proxy_pass http://givvymain;
>>>>>           proxy_next_upstream error timeout;
>>>>>       }
>>>>>   }
>>>>>
>>>>>
>>>>>   server {
>>>>>       listen 443;
>>>>>       server_name prod.givvy.com;
>>>>>
>>>>>       ssl on;
>>>>>       ssl_certificate /####PATH TO CERT###/;
>>>>>       ssl_certificate_key /####PATH TO KEY###/;
>>>>>       keepalive_timeout 70;
>>>>>
>>>>>       location / {
>>>>>           proxy_set_header X-FORWARDED_PROTO https;
>>>>>           proxy_pass https://givvymainssl;
>>>>>       }
>>>>>   }
>>>>> }
>>>>>
>>>>
>>>
>>
>
>
>




