Re: problems when using fastcgi_pass to deliver requests to the backend
林谡
linsu at feinno.com
Fri May 29 07:58:46 UTC 2015
(src/http/modules/ngx_http_fastcgi_module.c, lines 2573-2592:
http://trac.nginx.org/nginx/browser/nginx/src/http/modules/ngx_http_fastcgi_module.c#L2573)

    /* we support the single request per connection */

    case ngx_http_fastcgi_st_request_id_hi:
        if (ch != 0) {
            ngx_log_error(NGX_LOG_ERR, r->connection->log, 0,
                          "upstream sent unexpected FastCGI "
                          "request id high byte: %d", ch);
            return NGX_ERROR;
        }
        state = ngx_http_fastcgi_st_request_id_lo;
        break;

    case ngx_http_fastcgi_st_request_id_lo:
        if (ch != 1) {
            ngx_log_error(NGX_LOG_ERR, r->connection->log, 0,
                          "upstream sent unexpected FastCGI "
                          "request id low byte: %d", ch);
            return NGX_ERROR;
        }
        state = ngx_http_fastcgi_st_content_length_hi;
        break;
Reading the source code, I can see the reason. Could nginx support multiple requests per connection in the future?
From: 林谡
Sent: May 29, 2015 11:37
To: 'nginx-devel at nginx.org'
Subject: problems when using fastcgi_pass to deliver requests to the backend
Hi,
I wrote a FastCGI server and use nginx to pass requests to it. It has worked fine so far.
But I have found a problem: nginx always sets requestId = 1 when sending FastCGI records.
This is a little disappointing, because according to the FastCGI protocol a web server may send FastCGI records belonging to different requests simultaneously, with requestIds that are distinct and unique. I really need this feature, since it allows requests to be handled concurrently over a single connection.
Is there a way around this?
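For what it is worth, demultiplexing by request id on the application side only takes a few lines. The sketch below is just an illustration; my_request_lookup and my_request_feed are hypothetical helpers, not part of any existing code:

    #include <stdint.h>
    #include <stddef.h>

    typedef struct my_request_s  my_request_t;

    /* Hypothetical helpers: look up per-request state by request id,
     * and feed record payload into that request's handler. */
    my_request_t *my_request_lookup(uint16_t request_id);
    void my_request_feed(my_request_t *req, const uint8_t *data, size_t len);

    /* Dispatch one complete FastCGI record: hdr points at the 8-byte
     * record header, body at the contentLength bytes that follow it. */
    int
    dispatch_record(const uint8_t *hdr, const uint8_t *body)
    {
        uint16_t request_id  = (uint16_t) ((hdr[2] << 8) | hdr[3]);  /* requestIdB1/B0 */
        uint16_t content_len = (uint16_t) ((hdr[4] << 8) | hdr[5]);  /* contentLengthB1/B0 */

        my_request_t *req = my_request_lookup(request_id);
        if (req == NULL) {
            return -1;   /* record for an unknown request id */
        }

        my_request_feed(req, body, content_len);
        return 0;
    }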