Question about proxy cache when it expires
Maxim Dounin
mdounin at mdounin.ru
Wed May 20 17:42:12 MSD 2009
Hello!
On Wed, May 20, 2009 at 04:36:12PM +0400, Igor Sysoev wrote:
> On Wed, May 20, 2009 at 02:32:39PM +0200, Jérôme Loyet wrote:
>
> > OK
> >
> > I'll try to look into the code to see what I can do. Do you have any
> > leads to guide me on this quest? :)
>
> This is a complex thing. It requires sending notifications from one worker
> to another when a busy lock is freed.
BTW, what about something like "in-process" busy locks? This would
effectively limit the number of requests simultaneously sent to
backends to the number of worker processes. At least it looks much
better than nothing, and it should be simpler.
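
To make the idea concrete, here is a rough sketch (illustrative only:
cache_node_t, the "updating" flag and the function name are hypothetical
and not actual nginx structures) of what an in-process busy lock could
look like:

    #include <time.h>

    typedef struct {
        unsigned  updating;   /* set while a request in this worker
                                 is refreshing the entry */
        time_t    expires;    /* expiry time of the cached response */
    } cache_node_t;

    /* Returns 1 if the caller should go to the backend, 0 if it should
     * serve the cached copy (still fresh) or wait a bit and re-check. */
    static int
    cache_node_try_update(cache_node_t *node, time_t now)
    {
        if (node->expires > now) {
            return 0;         /* still fresh: serve from cache */
        }

        if (node->updating) {
            return 0;         /* another request in this worker is already
                                 refreshing it: wait and re-check later */
        }

        node->updating = 1;   /* this request refreshes the entry; with N
                                 worker processes, at most N refreshes of
                                 the same key can run at once */
        return 1;
    }

A real implementation would also have to clear the flag once the backend
response (or an error) arrives and put an upper bound on how long waiters
keep re-checking, but it would need no cross-worker notification at all.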
Maxim Dounin
>
> > 2009/5/20 Igor Sysoev <is at rambler-co.ru>:
> > > On Tue, May 19, 2009 at 05:14:11PM +0200, Jérôme Loyet wrote:
> > >
> > >> > On Tue, May 12, 2009 at 10:38:04AM +0200, Jérôme Loyet wrote:
> > >> >
> > >> >> Hello Igor,
> > >> >>
> > >> >> I have a question about the cache behaviour in proxy mode.
> > >> >>
> > >> >> I have nginx as a front end which proxies to an Apache back end. Nginx
> > >> >> caches everything for M minutes.
> > >> >>
> > >> >> If I have a large number of requests for the same page and this page
> > >> >> is cached: nginx returns the cached page ... no problem.
> > >> >> After M minutes, the cached page expires.
> > >> >> The first request coming after the expiration makes nginx ask the
> > >> >> backend for a refresh.
> > >> >> When nginx receives the fresh response from the backend, it's saved to
> > >> >> the cache and then nginx serves the fresh cached page.
> > >> >>
> > >> >> But what happens between the start of the request to the backend and
> > >> >> the end of the response from the backend? (Let's assume that the
> > >> >> backend serves the page in 5s ... and in 5s I can get a lot of
> > >> >> requests for this page.)
> > >> >> - Are the requests queued, waiting for the backend response?
> > >> >> - Does every request try to refresh the cache from the backend? (In
> > >> >> this case, I'd have multiple requests for the same page going to the
> > >> >> backend ... I could get a burst of requests and my Apache could be
> > >> >> overwhelmed by them -- that's why I'm using nginx with a cache.)
> > >> >> - Are the requests served the cached page, even though it has expired,
> > >> >> until the backend response has been received?
> > >> >> - Maybe something else :)
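
(For reference, the setup described above corresponds roughly to a
configuration along the following lines. This is only an illustrative
sketch: the cache path, the zone name "cache" and the 5-minute lifetime
are placeholders, not values taken from this thread.)

    proxy_cache_path  /var/cache/nginx  levels=1:2  keys_zone=cache:10m;

    server {
        listen 80;

        location / {
            proxy_pass         http://127.0.0.1:8080;  # the Apache back end
            proxy_cache        cache;
            proxy_cache_valid  200  5m;                # "M minutes"
        }
    }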
> > >> >
> > >> > Currently, all requests which find that a cached response has expired
> > >> > are proxied to the backend. I plan to implement busy locks to pass a
> > >> > single request to the backend and leave the others waiting for the
> > >> > response for up to a specified time.
> > >> >
> > >>
> > >> Hi Igor,
> > >>
> > >> About this feature: do you know when you plan to implement it? I
> > >> really need it. If you don't have enough time, I can look
> > >> into it if you briefly explain to me how you want to do it.
> > >
> > > This is a complex thing that I plan to implement in 0.8.
> > >
> > >
> > > --
> > > Igor Sysoev
> > > http://sysoev.ru/en/
> > >
> > >
>
> --
> Igor Sysoev
> http://sysoev.ru/en/
>