Can Nginx handle millions of static pages or pictures ?
Peter Booth
peter_booth at me.com
Fri Nov 2 15:02:31 UTC 2018
Having too many files in a directory can be a pain in the backside once you get into the hundreds of thousands - but it’s up to you to create relevant subdirectories.
Imagine that your website was a retail store selling millions of possible products.
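One common way to keep directories small is to shard the files by a prefix of the product ID. A minimal sketch, assuming a URL scheme and on-disk layout that are purely illustrative (not from the site described above):

```nginx
# Map /product/1234567.html to /data/pages/12/34/1234567.html so that
# no single directory ever holds millions of files.
location ~ "^/product/(?<p1>\d\d)(?<p2>\d\d)(?<rest>\d+)\.html$" {
    root /data/pages;
    try_files /$p1/$p2/$p1$p2$rest.html =404;
}
```

With two levels of two-digit prefixes, ten million pages spread across 10,000 directories of roughly a thousand files each.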
For search results it depends upon whether results vary per user.
For one site I worked on, if I searched for “green jeans” I would get the same list of pages as you,
and so these pages would be cached so that if we both requested green jeans, nginx would only request the page from the back-end once.
This is very useful for protecting against denial-of-service attacks as, for example, you can configure nginx to send
only one request at a time for the same URL to the back-end, with other requests waiting and then receiving the same content.
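A minimal sketch of this request-collapsing behaviour using nginx’s proxy cache (the upstream name and cache paths here are assumptions):

```nginx
proxy_cache_path /var/cache/nginx/pages levels=1:2
                 keys_zone=pages:100m max_size=10g inactive=60m;

server {
    listen 80;

    location / {
        proxy_cache pages;
        proxy_cache_valid 200 10m;
        # Collapse concurrent misses: only the first request for a given
        # cache key is sent to the backend; the others wait for that
        # response to be cached and are served from it.
        proxy_cache_lock on;
        proxy_cache_lock_timeout 5s;
        proxy_pass http://backend;
    }
}
```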
The logic for this caching can be very subtle - if I was logged in and had the preference “show all prices in Australian dollars” set,
then I’d expect a different page than you for the same item. It’s also possible that your pages might mix filtering and search -
so I might click predefined categories - cocktail dresses/size 4/red - then add free text to search with.
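One way to express that per-user subtlety is to fold the preference into the cache key, so users with different display currencies get different cached copies. A hedged sketch - the cookie name “currency” and the paths are assumptions, not anything from the site above:

```nginx
proxy_cache_path /var/cache/nginx/search levels=1:2
                 keys_zone=search:50m max_size=5g inactive=10m;

server {
    listen 80;

    location /search {
        proxy_cache search;
        # Vary the cached copy by URL *and* the user's currency
        # preference; two users with the same query but different
        # currencies hit different cache entries.
        proxy_cache_key "$scheme$host$request_uri$cookie_currency";
        proxy_cache_valid 200 5m;
        proxy_pass http://backend;
    }
}
```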
nginx’s caching features are tremendously powerful and can be extended with Lua code using, for example, the OpenResty bundle of nginx.
I was amazed that I never found a use case that couldn’t be solved with nginx functionality, to the point where a TV show
could invite viewers to visit a specific URL at a given moment, and the hundreds of thousands of requests ended up
generating only one request to the backend - the site stayed up under such spiky loads.
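One way to get that kind of behaviour - sometimes called microcaching - is to cache dynamic responses for just a second and combine that with the request-collapsing lock. A sketch under assumed names:

```nginx
proxy_cache_path /var/cache/nginx/micro keys_zone=micro:10m;

location /live {
    proxy_cache micro;
    # Cache for one second: under a spike of hundreds of thousands of
    # requests, the backend still sees roughly one request per URL per
    # second.
    proxy_cache_valid 200 1s;
    proxy_cache_lock on;       # coalesce concurrent misses as well
    proxy_pass http://backend;
}
```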
My tip is to start simple and add one feature at a time and understand your web server logs, which contain lots of information.
Peter
> On 2 Nov 2018, at 10:45 AM, yf chu <cyflhn at 163.com> wrote:
>
>
> Thank you for your advice. But may I ask how you store your static web pages on your server? If there are too many pages in a directory, is it possible that the process of looking up the pages could affect the performance of the web server?
> Here I have another question.
> For a dynamic website, some dynamic content can be generated as static pages in advance, e.g. the page showing the detailed information for a certain product. We know how many products there are on our website,
> but some dynamic content is difficult to generate in advance, such as the contents of search results. There are many possible search terms for a website, each with many result items. How should we handle this issue?
>
>
>
> At 2018-11-02 21:16:18, "Peter Booth via nginx" <nginx at nginx.org> wrote:
> So this is a very interesting question. I started writing dynamic websites in 1998. Most developers don’t want to generate static sites. I think their reasons are more emotional than technical. About seven years ago I had two jobs - the day job was a high-traffic retail fashion website; the side job was a very similar site, implemented as a static site that was recreated whenever content changed. The dynamic site had (first-request) latencies of about 2 sec. The static site had typical latencies of 250 ms. That’s almost 10x faster. It also cost about 2% of what the dynamic site cost to run.
>
> Sounds like you’re planning to do things the smart way. You haven’t said how busy your site is. Assuming that your hardware runs Linux, your content will all be sitting in Linux’s page cache, so on a recent-model server a well-tuned nginx can serve well over 100,000 requests per sec. The key is to use the browser cache whenever possible. As well as making good use of compute resources, a website like this is much more reliable than a dynamic site. There are few moving parts that can go wrong. Have fun!
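> A minimal sketch of leaning on the browser cache for static pages (the path and lifetime are assumptions):
>
> ```nginx
> location /product/ {
>     root /data/pages;
>     # Let browsers cache each page for ten minutes; a changed product
>     # simply replaces the file on disk and ages out of browser caches
>     # shortly afterwards.
>     expires 10m;
> }
> ```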
>
> Pete
>
> Sent from my iPhone
>
> On Nov 2, 2018, at 1:18 AM, yf chu <cyflhn at 163.com <mailto:cyflhn at 163.com>> wrote:
>
>> I have a website with tens of millions of pages. The content of each page is stored in a database, but the data does not change very frequently, so for the sake of improving the performance of the website and reducing the cost of deploying web applications, I want to generate static pages for the dynamic content and refresh them whenever the content changes. But I am very concerned about how to manage this large number of pages. How should I store them? I plan to use Nginx to manage these pages. Is it possible that this will cause IO problems when the web server handles many requests? What is Nginx's request-handling capacity? Are there any better solutions for this issue?
>>
>>
>>
>>
>> _______________________________________________
>> nginx mailing list
>> nginx at nginx.org <mailto:nginx at nginx.org>
>> http://mailman.nginx.org/mailman/listinfo/nginx <http://mailman.nginx.org/mailman/listinfo/nginx>
>
>