stream ajax events, how do I handle long persistent connections?

Emit Sorrels emit.sorrels at gmail.com
Fri Feb 27 03:21:26 MSK 2009


Hi guys,

To get familiar with nginx's module API, I wrote a quick and
dirty handler module that for now simply spits out a text/html
response based on the URI.
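
(For reference, the whole thing currently boils down to roughly
this -- trimmed, error checks omitted:)

```c
/* gist of my current handler -- one-shot response, nothing async */
static ngx_int_t
emit_handler(ngx_http_request_t *r)
{
    ngx_buf_t   *b;
    ngx_chain_t  out;

    r->headers_out.status = NGX_HTTP_OK;
    r->headers_out.content_type.len = sizeof("text/html") - 1;
    r->headers_out.content_type.data = (u_char *) "text/html";
    ngx_http_send_header(r);

    b = ngx_create_temp_buf(r->pool, 128);
    b->last = ngx_sprintf(b->last, "<p>uri: %V</p>", &r->uri);
    b->last_buf = 1;

    out.buf = b;
    out.next = NULL;
    return ngx_http_output_filter(r, &out);
}
```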

Now, to go one step further, I want to send the client a
series of strings with delays in between them.
(My intermediate goal is to implement something like
http://ajaxify.com/run/streaming/ --- basically print out
javascript on certain events and have the client update some div...
nothing fancy.)

To get things started I just want to print "hello1" .. "hello50"
with 1 second delays in between.

Obviously, if I just loop inside my handler callback with calls
to sleep() in between the ngx_http_output_filter() calls,
it works, but the worker process won't accept any other clients
while it's in the loop... I don't expect more than 30 simultaneous
clients/conns, but I don't think having 30 worker processes is the
answer :P

What is the proper way to do this? Should I spawn a separate
thread inside the handler? (But then how do I coordinate the life
cycle of the connection/request? That seems pretty dangerous, and
the locking looks hairy.)

Is there some way to return from the handler with a flag saying 
"there's more data, so call me again with the same request context"?
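
(From skimming ngx_event.h and ngx_event_timer.h, I was imagining
something along these lines -- pure speculation on my part, untested,
and the request-lifetime part is exactly what I don't know how to do:)

```c
/* Wild guess, untested: instead of sleeping, re-arm a 1 s timer and
 * emit one chunk per tick; per-request counter/ctx handling is
 * hand-waved here. */
static void
emit_tick(ngx_event_t *ev)
{
    ngx_http_request_t *r = ev->data;

    /* build a chain with "helloN", set b->flush = 1,
       then ngx_http_output_filter(r, &out); */

    if (/* fewer than 50 sent */ 1) {
        ngx_add_timer(ev, 1000);    /* "call me again in 1 s" */
    } else {
        ngx_http_finalize_request(r, NGX_OK);
    }
}

static ngx_int_t
emit_handler(ngx_http_request_t *r)
{
    ngx_event_t *ev = ngx_pcalloc(r->pool, sizeof(ngx_event_t));

    ev->handler = emit_tick;
    ev->data    = r;
    ev->log     = r->connection->log;
    ngx_add_timer(ev, 1000);

    /* presumably the request has to be pinned somehow so nginx
       doesn't free it out from under the timer -- this is the part
       I can't figure out */
    return NGX_DONE;
}
```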

I'm also looking at the upstream-type module API; it seems very
interesting, and maybe I could hook into that instead?
I suppose I could write a separate standalone process and hook it up
with fastcgi, but I want to see if I can keep everything inside the module.

I'm still studying the source for clues... 
would appreciate any hints. TIA

-Emit.

More information about the nginx mailing list