nginx tries to allocate a huge amount of memory and fails
Maxim Dounin
mdounin at mdounin.ru
Thu Aug 4 06:29:36 UTC 2011
Hello!
On Thu, Aug 04, 2011 at 11:47:08AM +0800, Mauro Stettler wrote:
> Hi,
>
> I'm having a strange case where nginx is trying to allocate a
> really huge amount of memory. We created a new vhost on nginx to
> provide an internal interface to flush the APC cache of PHP. In most
> cases I can request this and get what I expect; only in
> around 20% of the requests nginx tries to allocate a huge amount of
> memory and fails, so it returns a 500 Internal Server Error.
>
> 2011/08/04 05:14:00 [emerg] 41529#0: *89012062 malloc()
> 18446744073709545066 bytes failed (12: Cannot allocate memory),
> client: 192.168.1.195, server: *.kaufmich.vpn, request: "GET
> /web1/apc/flushApc.php HTTP/1.1", host: "static.kaufmich.vpn"
>
>
>
> Does anybody have an idea why that would happen? I thought this almost
> has to be a bug, because the number of bytes it is trying to allocate
> is just so huge that it can't be right.
[...]
> # nginx -V
> nginx version: nginx/0.7.65
[...]
> if ($request_uri ~* /(.+)/(.+\.php)) {
> set $prod_server $1;
> set $action $2;
> set $script $3;
> set $parameters "";
This is a problem with using uninitialized $N variables. Note that the
regexp above has only two captures, but the following "set" uses the
variable "$3".
This problem is fixed in 0.8.25+. Please upgrade.
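
As a sketch only (guessing at the intent, since it's not clear what
$script was supposed to hold), one way to make $3 defined is to add a
third capture group, for example:

    if ($request_uri ~* /(.+)/((.+)\.php)) {
        # $1 = first path segment, $2 = script file name,
        # $3 = file name without ".php" (assumed meaning of $script)
        set $prod_server $1;
        set $action      $2;
        set $script      $3;
        set $parameters  "";
    }

The nested group is only an example; the point is that every $N you
assign must correspond to a capture in the regexp that matched.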
Maxim Dounin