apparent deadlock with fastcgi
Maxim Dounin
mdounin at mdounin.ru
Wed Oct 10 22:48:46 UTC 2012
Hello!
On Wed, Oct 10, 2012 at 06:18:59PM -0400, jkl wrote:
> I have a simple fastcgi responder that converts a JSON post to a CSV
> download. It works in a streaming fashion, writing the response before
> finishing reading the request. According to the FastCGI specification, such
> operation is allowed:
>
> http://www.fastcgi.com/devkit/doc/fcgi-spec.html
>
> "The Responder application sends CGI/1.1 stdout data to the Web server over
> FCGI_STDOUT, and CGI/1.1 stderr data over FCGI_STDERR. The application sends
> these concurrently, not one after the other. The application must wait to
> finish reading FCGI_PARAMS before it begins writing FCGI_STDOUT and
> FCGI_STDERR, but it needn't finish reading from FCGI_STDIN before it begins
> writing these two streams."
>
> Using a debugger I observe that my responder blocks while reading from
> FCGI_STDIN before I have received the whole request, which I think implies
> there is a bug on the nginx side.
>
> If I buffer the entire response in memory before sending it back to
> nginx, I don't see the problem with the blocking read on the request.
>
> I only observe this problem when the request is "large enough", but I'm not
> sure what exact size triggers the problem.
>
> Does nginx support this sort of behavior for fastcgi responders? If not, is
> that fact documented somewhere that I missed?
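
For reference, the pattern the spec excerpt describes might look roughly
like this with Go's standard net/http/fcgi package (a sketch only; the
listen address, the assumed JSON layout of an array of string arrays, and
all names here are illustrative, not your actual code):

package main

import (
	"encoding/csv"
	"encoding/json"
	"log"
	"net"
	"net/http"
	"net/http/fcgi"
)

// streamHandler starts writing CSV to FCGI_STDOUT while it is still
// decoding the JSON request body arriving on FCGI_STDIN.
func streamHandler(w http.ResponseWriter, r *http.Request) {
	w.Header().Set("Content-Type", "text/csv")

	dec := json.NewDecoder(r.Body)
	cw := csv.NewWriter(w)

	// Consume the opening '[' of the (assumed) top-level JSON array.
	if _, err := dec.Token(); err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}

	// Decode one row at a time and emit it as soon as it is ready.
	for dec.More() {
		var row []string
		if err := dec.Decode(&row); err != nil {
			log.Printf("decode: %v", err)
			return
		}
		cw.Write(row)
		cw.Flush()
		if f, ok := w.(http.Flusher); ok {
			f.Flush() // push the record out instead of buffering it
		}
	}
}

func main() {
	// The address is an assumption; it has to match fastcgi_pass in nginx.
	ln, err := net.Listen("tcp", "127.0.0.1:9000")
	if err != nil {
		log.Fatal(err)
	}
	log.Fatal(fcgi.Serve(ln, http.HandlerFunc(streamHandler)))
}
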
It's not supported. As soon as nginx sees the full response headers,
it considers the rest of the request body unneeded and stops sending it.
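
In other words, a responder that has to work behind nginx should finish
reading FCGI_STDIN before it emits any response headers. In terms of the
sketch above, roughly the following handler (same setup plus an "io"
import; writeCSV is a hypothetical JSON-to-CSV helper):

// bufferedHandler drains the request body completely before producing
// any output, which avoids the early cut-off described above.
func bufferedHandler(w http.ResponseWriter, r *http.Request) {
	body, err := io.ReadAll(r.Body) // finish FCGI_STDIN first
	if err != nil {
		http.Error(w, err.Error(), http.StatusBadRequest)
		return
	}
	w.Header().Set("Content-Type", "text/csv")
	writeCSV(w, body) // hypothetical JSON-to-CSV conversion
}
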
--
Maxim Dounin
http://nginx.com/support.html