Program hangs when modifying u->buffer while using proxy module to talk to upstream backend

Ashish S ashishs.dev at gmail.com
Tue Apr 10 21:03:09 UTC 2012


Hi, I figured out the issue.  Maxim's intuition was right: u->length was
not being set correctly (the u->length value set in the code I copy-pasted
earlier was later overwritten by a filter method). After fixing that, the
code works correctly and there are no more hangs.
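
For anyone who hits the same hang: in the non-buffered upstream path, nginx
keeps reading from the backend until u->length drops to zero (or the
connection closes), and it is the input filter's job to decrement that
counter as body bytes arrive. Below is a rough sketch of such a filter,
with a made-up name (ngx_http_my_input_filter) and logic that simply mirrors
the stock non-buffered / memcached-style filter rather than my actual module
code:

============
static ngx_int_t
ngx_http_my_input_filter(void *data, ssize_t bytes)
{
    ngx_http_request_t   *r = data;      /* passed via u->input_filter_ctx */
    ngx_http_upstream_t  *u = r->upstream;
    ngx_buf_t            *b = &u->buffer;
    ngx_chain_t          *cl, **ll;

    /* append a buffer covering the newly received bytes to u->out_bufs,
       so they are sent on to the client */
    for (cl = u->out_bufs, ll = &u->out_bufs; cl; cl = cl->next) {
        ll = &cl->next;
    }

    cl = ngx_chain_get_free_buf(r->pool, &u->free_bufs);
    if (cl == NULL) {
        return NGX_ERROR;
    }

    *ll = cl;

    cl->buf->flush = 1;
    cl->buf->memory = 1;

    cl->buf->pos = b->last;
    b->last += bytes;
    cl->buf->last = b->last;
    cl->buf->tag = u->output.tag;

    /* the part that matters for the hang: keep u->length in sync so the
       upstream code knows when the response is complete */
    if (u->length != -1) {
        u->length -= bytes;
    }

    return NGX_OK;
}
============

The filter is installed from the module handler with
u->input_filter = ngx_http_my_input_filter and u->input_filter_ctx = r;
if a module installs no filter at all, nginx falls back to its own
non-buffered filter, which already does this decrement.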

Thanks,
Ashish


On Tue, Apr 10, 2012 at 11:16 AM, Ashish S <ashishs.dev at gmail.com> wrote:
> Hi Maxim,
>
> Here is some simple test code I wrote (for my process_header method)
> which shows the same behavior.
>
> ============
> ngx_int_t
> ngx_http_my_process_header(ngx_http_request_t *r)
> {
>     ngx_http_upstream_t *u = r->upstream;
>
>     /* replace the upstream response in u->buffer with a fixed string */
>     std::string tmpString("Test string");
>
>     /* make sure the replacement actually fits into the upstream buffer */
>     if ((size_t) (u->buffer.end - u->buffer.start) < tmpString.length()) {
>         return NGX_ERROR;
>     }
>
>     ngx_memcpy(u->buffer.start, tmpString.c_str(), tmpString.length());
>     u->buffer.pos = u->buffer.start;
>     u->buffer.last = u->buffer.pos + tmpString.length();
>
>     /* tell the upstream code how many body bytes to expect */
>     u->length = u->headers_in.content_length_n = tmpString.length();
>     r->headers_out.content_length_n = tmpString.length();
>
>     u->headers_in.status_n = NGX_HTTP_OK;
>     u->state->status = NGX_HTTP_OK;
>
>     return NGX_OK;
> }
> ============
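
The process_header above is only half of the picture: if the module also
installs an input_filter_init / input_filter pair, those run after
process_header and can overwrite u->length, which matches what I described
above (the value being modified later by a filter method). A minimal,
made-up filter_init that leaves the value alone would look like this:

============
static ngx_int_t
ngx_http_my_filter_init(void *data)
{
    /* runs once after process_header, before the body is read;
       re-assigning u->length here overrides whatever process_header
       stored, which is the kind of mistake that caused my hang */
    ngx_http_request_t   *r = data;
    ngx_http_upstream_t  *u = r->upstream;

    /* only adjust u->length if the protocol really has trailing
       bytes to account for (the memcached module, for example, adds
       the length of its "END" trailer here) */
    (void) u;

    return NGX_OK;
}
============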
>
> Thanks,
> Ashish
>
>
> On Tue, Apr 10, 2012 at 5:13 AM, Maxim Dounin <mdounin at mdounin.ru> wrote:
>> Hello!
>>
>> On Mon, Apr 09, 2012 at 07:02:10PM -0700, Ashish S wrote:
>>
>>> Hi,
>>>
>>> I currently use upstream to talk to a back-end, which sends me a
>>> response. In my module I am trying to re-format the plain-text
>>> upstream response as XML, and I also add some new data to it,
>>> based on an in-memory lookup within my module.  What would be the
>>> best way to do this?
>>
>> Unless you are working on a custom protocol module, you may want to
>> use a filter module to re-format the data instead.
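
For completeness, a body filter sits in the output chain that the upstream
modules feed the response body through, so it sees the proxied body no
matter which upstream module produced it. A bare-bones skeleton (all names
made up) might look like:

============
static ngx_http_output_body_filter_pt  ngx_http_next_body_filter;

static ngx_int_t
ngx_http_my_body_filter(ngx_http_request_t *r, ngx_chain_t *in)
{
    /* inspect / rewrite the buffers in "in" here (e.g. wrap the
       plain-text upstream response in XML), then pass them along */
    return ngx_http_next_body_filter(r, in);
}

static ngx_int_t
ngx_http_my_body_filter_init(ngx_conf_t *cf)
{
    /* postconfiguration hook: push the filter onto the top of the
       output body filter chain */
    ngx_http_next_body_filter = ngx_http_top_body_filter;
    ngx_http_top_body_filter = ngx_http_my_body_filter;

    return NGX_OK;
}
============

A real re-formatting filter would normally also register a header filter so
it can drop or recalculate Content-Length before the headers go out.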
>>
>>> In my setup, I am able to parse the response from upstream (u->buffer),
>>> construct a modified string, and assign it back to u->buffer, and I am
>>> setting u->headers_in.content_length_n and
>>> r->headers_out.content_length_n accordingly. The issue I face is that
>>> every *second* request hangs when I make the request through a
>>> browser. (Requests made from the command line with curl work every
>>> time.) And I always see this error message in the logs: "client
>>> prematurely closed connection, so upstream connection is closed too
>>> while sending to client, client: 127.0.0.1".  What am I doing wrong?
>>
>> It's hard to say anything without seeing the code, but most likely
>> you fail to set u->length properly.  See e.g. the memcached module for
>> a simple example of a non-buffered protocol handler.
>>
>> Maxim Dounin
>>
>> _______________________________________________
>> nginx mailing list
>> nginx at nginx.org
>> http://mailman.nginx.org/mailman/listinfo/nginx


