Event based implementation in http module
vivek goel
goelvivek2011 at gmail.com
Sat May 12 07:56:50 UTC 2012
Hi agentzh,
Thanks for the detailed information. The IO operation we are doing doesn't
support non-blocking calls.
I am considering two approaches:
1. Pre-forking multiple nginx workers (e.g. 50),
or
2. Moving the IO operation into a thread inside my module, so I can use the
nginx event-based API and keep only 2 nginx worker processes.
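For concreteness, the two options could be sketched as nginx.conf fragments (the numbers here are illustrative, not recommendations):

```nginx
# Approach 1: many pre-forked workers, so a blocking call stalls
# only one of 50 processes at a time.
worker_processes  50;

# Approach 2: few workers, with the blocking IO moved into a
# thread inside the module.
# worker_processes  2;
```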
What do you suggest? Which would be the better approach?
Will having 50 or more nginx workers add extra latency to client
connections?
regards
Vivek Goel
On Tue, May 8, 2012 at 11:16 AM, vivek goel <goelvivek2011 at gmail.com> wrote:
> Sorry, just clearing up my doubt.
> I have one more question.
> The work I am doing in clcf->handler is a blocking IO call.
> Now, if I am running nginx with 2 worker processes, and the function I am
> calling in clcf->handler takes 200 ms to generate a response,
> does that mean I will not be able to serve other clients from the same
> worker process within that 200 ms?
>
> If yes ,
> How can I make it non-blocking so that I can serve multiple clients?
>
>
> Thanks in advance for your reply.
>
>
> regards
> Vivek Goel
>
>
>
> On Mon, May 7, 2012 at 10:57 PM, vivek goel <goelvivek2011 at gmail.com> wrote:
>
>> @Maxim
>> And what about the handler function specified by clcf->handler?
>> Is it also blocking?
>> And what about my other questions: can I serve multiple clients using one
>> worker process?
>>
>> regards
>> Vivek Goel
>>
>>
>>
>> On Mon, May 7, 2012 at 8:19 PM, vivek goel <goelvivek2011 at gmail.com> wrote:
>>
>>> I am working on http module using nginx.
>>> I have one question.
>>>
>>> 1. Will the function specified in ngx_command_t be a blocking call?
>>>
>>> If not:
>>> My module works as follows:
>>> it reads a file, which is a blocking call. Does that mean that, during
>>> the read, the worker process can't serve other clients?
>>>
>>> The solution I am thinking of is to do the blocking operation in a
>>> separate thread and invoke a callback to send the response when it is
>>> ready. Is there a way to tell the worker process to keep accepting
>>> connections and to serve the response for the old request once it is
>>> ready for that client?
>>>
>>> Can you please suggest a better way to serve multiple clients with
>>> blocking calls in an nginx http module?
>>>
>>>
>>>
>>> regards
>>> Vivek Goel
>>>
>>>
>>
>