Redis storage for cache

Ragnar Rova rr at mima.x.se
Sun Dec 29 16:21:28 UTC 2019


Hello Sergey, the same wishes for you.

Thanks for the link to the redis module. My use case is that I want to
cache dynamic responses from an origin server in case the origin becomes
unhealthy. The origin server already uses redis as a cache; I just wanted
to put nginx in front in case of a total failure of that component. Your
suggestion gave me some ideas: since the app server already writes query
responses to redis and caches internally, I can use your module to read
those responses from redis, on the same keys, directly from nginx.

So, how do I configure nginx to do the following:

For each request:

1. Try to proxy each request directly to the origin using proxy_pass.
2. If the origin server responds with HTTP 5xx, or a timeout or connection
refused occurs, serve the data from redis (which the origin server has
previously written there itself).


Also, I need to calculate the same redis key in nginx as the origin server
does. Currently the redis key is a stringified JSON object... I might need
some additional modules to construct this key.
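
For GET requests, if the key really is built from a fixed, known set of
query parameters, I might be able to assemble it with plain nginx variables
alone. A rough sketch of what I have in mind ($arg_q and $arg_page are just
placeholders for whatever parameter names the app actually uses):

                location @fallback {
                        # Hypothetical sketch: rebuild the stringified-JSON
                        # key from known query parameters; $arg_q and
                        # $arg_page are placeholders, not the real names.
                        set $redis_key  '{"path":"$uri","q":"$arg_q","page":"$arg_page"}';
                        redis_pass      127.0.0.1:6379;
                }

For POST requests, where the key comes from the request body, I suspect I
would need something like njs or the Lua module, since core nginx config
cannot easily extract fields from the body when building a variable.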

I will test the following based on your docs at
https://github.com/osokin/ngx_http_redis.

http {
        ...
        server {
                location / {
                        # "backend" is assumed to be defined in an
                        # upstream block elsewhere.
                        proxy_pass      http://backend;
                        # Does this cover timeouts and connection
                        # refused as well?
                        error_page      500 502 503 504 = @fallback;
                }

                location @fallback {
                        # This key needs to be calculated differently: some
                        # query parameters form a JSON key for GET requests;
                        # for POST requests it is taken from the POST body.
                        set $redis_key  "$uri?$args";
                        redis_pass      127.0.0.1:6379;
                }
        }
}
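
One thing I am not sure about: if I read the ngx_http_proxy_module docs
correctly, error_page only gets a chance to intercept 5xx responses that the
origin actually returns when proxy_intercept_errors is enabled (the
nginx-generated 502/504 from connection failures and timeouts are handled by
error_page either way), so I would probably also add:

                location / {
                        proxy_pass              http://backend;
                        # Without this, a 5xx returned by the origin is
                        # passed straight to the client instead of being
                        # redirected to error_page.
                        proxy_intercept_errors  on;
                        error_page              500 502 503 504 = @fallback;
                }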



On Sun, Dec 29, 2019 at 9:58 AM Sergey A. Osokin <osa at freebsd.org.ru> wrote:
>
> Hi Ragnar,
>
> hope you're doing well.
>
> On Sat, Dec 28, 2019 at 10:58:06AM +0100, Ragnar Rova wrote:
> > I want to use redis as the storage for the cache instead of the
> > filesystem.
> >
> > I found a third-party module which seems to offer this:
> > https://github.com/openresty/srcache-nginx-module#caching-with-redis, is
> > this the recommended solution? Hot requests should be served from
> > memory, with redis as a fallback and used to populate the cache on
> > startup.
>
> The solution you've mentioned uses the ngx_http_redis module to get data
> from a redis database.  Usually, it's possible to increase the stability
> and performance of a web service by pushing static content (html, graphics,
> binary files) to a redis database and rerouting requests from an
> application server to redis.
>
> Please visit https://github.com/osokin/ngx_http_redis, Example 1, for
> details.
>
> --
> Sergey Osokin

