How to use BIG data in my module?
allegro at mychinamap.com
Fri May 6 10:14:37 MSD 2011
----- Original Message -----
From: "Igor Sysoev" <igor at sysoev.ru>
To: <nginx-devel at nginx.org>
Sent: Friday, May 06, 2011 12:44 PM
Subject: Re: How to use BIG data in my module?
> On Fri, May 06, 2011 at 12:38:55PM +0800, XueHJ wrote:
>> ----- Original Message -----
>> From: "Piotr Sikora" <piotr.sikora at frickle.com>
>> To: <nginx-devel at nginx.org>
>> Sent: Thursday, May 05, 2011 8:26 PM
>> Subject: Re: How to use BIG data in my module?
>> > Hi,
>> >> in which to load into memory a BIG data (about 800MB),
>> > 800MB isn't really "big data".
>> >> the processing of each HTTP request must use the data,
>> For me it really is a lot of data, heh.
>> > Each request must use all or just parts of the data?
>> Yes, just parts of the data.
>> >> how do I store the data? What structure to use?
>> > That of course depends on the data and the processing.
>> > There is no general answer.
>> The data is loaded into memory in the hope of responding to requests faster.
>> But how do I achieve that? Should I use ngx_shared_memory? Please give me a suggestion.
> If this is read-only data, you can read it at configuration phase
> and it will be shared among all workers due to fork().
> Also, you can just mmap() the file.
Yes, the data is read only.
For the first solution, can you give a rough guide to the structure?
Before this I have only written some simple modules.
> Igor Sysoev
> nginx-devel mailing list
> nginx-devel at nginx.org