guzman.braso at gmail.com
Tue Mar 20 16:43:19 UTC 2012
I have one simple question: how is performance affected by opening a
small text file (a small database) on every request?
I know that if the file is small, Linux itself will cache it in
memory, but I'm worried about the per-request overhead: opening the
file, reading the whole file again (even if from memory), querying it,
closing the handle, and so on. All in vain, because this file won't
change more than once per hour.
My question is:
Is there any way to store information in memory and share it between
different requests within Perl?
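One common pattern under a persistent Perl interpreter (as in nginx-perl or mod_perl, where a worker process keeps package variables alive between requests) is to slurp the file into a variable once and reload it only when its mtime changes. A minimal sketch, assuming such a persistent interpreter; the package name and file path are hypothetical:

```perl
package My::Cache;
use strict;
use warnings;

# These survive across requests as long as the worker process lives.
my $cached_data;
my $cached_mtime = 0;

sub get_data {
    my ($path) = @_;

    # stat returns the empty list on failure; slot 9 is mtime.
    my $mtime = (stat $path)[9];
    return undef unless defined $mtime;

    # Reload only on first use or when the file has been touched.
    if (!defined $cached_data || $mtime > $cached_mtime) {
        open my $fh, '<', $path or return undef;
        local $/;                    # slurp mode: read whole file at once
        $cached_data  = <$fh>;
        close $fh;
        $cached_mtime = $mtime;
    }

    # On later requests this is served straight from memory.
    return $cached_data;
}

1;
```

Since the file changes at most hourly, the stat call per request is the only remaining filesystem work, and each worker process pays the read cost once per change. Note that each worker keeps its own copy of the cache.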
On Thu, Mar 15, 2012 at 4:55 PM, Alexandr Gomoliako <zzz at zzz.org.ua> wrote:
> Hello, everyone.
> It's been a while since my last announcement; it was 220.127.116.11 roughly
> two months ago. And now it's 18.104.22.168.
> There were a couple of bug fixes and new features, but most of the work
> was done on automated testing, to make it work properly on things like
> cpantesters and travis-ci.
> So, it is now possible to use Nginx::Perl as a dependency for Perl
> modules and to test against it on cpantesters and travis-ci. Here's an
> example of such a module and its tests:
> Also wanted to mention one example, just to show how much you can do:
> "In nginx-perl every request object $r is created and destroyed with
> nginx's own request. This means, that it is possible to reorder
> natural request flow in any way you want. It can be very helpful in
> case of DDOS, unusual load spikes or anything else you can think of."
> nginx mailing list
> nginx at nginx.org
Guzmán Brasó Núñez
Senior Perl Developer / Sysadmin
Mobile: +598 98 674020