How to use BIG data in my module?
Igor Sysoev
igor at sysoev.ru
Fri May 6 08:44:10 MSD 2011
On Fri, May 06, 2011 at 12:38:55PM +0800, XueHJ wrote:
> ----- Original Message -----
> From: "Piotr Sikora" <piotr.sikora at frickle.com>
> To: <nginx-devel at nginx.org>
> Sent: Thursday, May 05, 2011 8:26 PM
> Subject: Re: How to use BIG data in my module?
>
>
> > Hi,
> >
> >> in which I need to load BIG data (about 800MB) into memory,
> >
> > 800MB isn't really "big data" [0].
> >
> >> the processing of each HTTP request must use the data,
> For me it really is big data, heh heh.
> >
> > Does each request use all of the data, or just parts of it?
> Yes, just parts of the data.
> >
> >> how do I store the data? What structure to use?
> >
> > That of course depends on the data and the processing.
> > There is no general answer.
> The reason for loading the data into memory is to respond to requests faster.
> But how should I achieve this? Should I use ngx_shared_memory? Please give me a suggestion.
If this is read-only data, you can read it at the configuration phase
and it will be shared among all workers due to fork().
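For example, here is a minimal sketch of loading the file in a configuration
directive handler; the directive name "big_data_file" and the conf struct are
illustrative only, not something from this thread:

/*
 * Sketch (assumptions noted above): load a read-only data file while the
 * configuration is being parsed.  Memory allocated in the master process
 * before fork() is shared copy-on-write with all workers, and since the
 * data is never written, the pages stay shared.
 */
#include <ngx_config.h>
#include <ngx_core.h>
#include <ngx_http.h>

typedef struct {
    u_char  *data;   /* loaded file contents */
    size_t   len;
} ngx_http_bigdata_conf_t;

static char *
ngx_http_bigdata_file(ngx_conf_t *cf, ngx_command_t *cmd, void *conf)
{
    ngx_http_bigdata_conf_t  *bcf = conf;
    ngx_str_t                *value = cf->args->elts;
    ngx_fd_t                  fd;
    ngx_file_info_t           fi;
    ssize_t                   n;

    fd = ngx_open_file(value[1].data, NGX_FILE_RDONLY, NGX_FILE_OPEN, 0);
    if (fd == NGX_INVALID_FILE) {
        return "cannot open data file";
    }

    if (ngx_fd_info(fd, &fi) == NGX_FILE_ERROR) {
        ngx_close_file(fd);
        return "cannot stat data file";
    }

    bcf->len = (size_t) ngx_file_size(&fi);

    bcf->data = ngx_palloc(cf->pool, bcf->len);
    if (bcf->data == NULL) {
        ngx_close_file(fd);
        return NGX_CONF_ERROR;
    }

    n = ngx_read_fd(fd, bcf->data, bcf->len);
    ngx_close_file(fd);

    if (n != (ssize_t) bcf->len) {
        return "incomplete read of data file";
    }

    return NGX_CONF_OK;
}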
Also, you can just mmap() the file.
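A plain POSIX sketch of that, not nginx-specific and with minimal error
handling, could look like this:

/*
 * mmap() the data file read-only.  The kernel pages the file in on demand,
 * and the pages are shared among the workers through the page cache, so the
 * data occupies physical memory only once.
 */
#include <sys/mman.h>
#include <sys/stat.h>
#include <fcntl.h>
#include <unistd.h>

static void *
map_data_file(const char *path, size_t *len)
{
    int          fd;
    struct stat  st;
    void        *p;

    fd = open(path, O_RDONLY);
    if (fd == -1) {
        return NULL;
    }

    if (fstat(fd, &st) == -1) {
        close(fd);
        return NULL;
    }

    p = mmap(NULL, (size_t) st.st_size, PROT_READ, MAP_SHARED, fd, 0);
    close(fd);                      /* the mapping stays valid after close() */

    if (p == MAP_FAILED) {
        return NULL;
    }

    *len = (size_t) st.st_size;
    return p;
}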
--
Igor Sysoev