<div style="line-height:1.7;color:#000000;font-size:14px;font-family:Arial"><br><div>Thank you for your advice. But may I ask how you store your static web pages on your server? If there are too many pages in a directory, could the process of looking up a page affect the performance of the web server?</div><div>Here I have another question.</div><div>For a dynamic website, some dynamic content can be generated as static pages in advance, e.g. the page showing the details of a certain product: we know how many products there are on our website.</div><div>But some dynamic content is difficult to generate in advance, such as the content of search results: a website receives many distinct search terms, each of which can return many result items. How should we handle this issue?</div><br><br><div style="position:relative;zoom:1"></div><div id="divNeteaseMailCard"></div><br>At 2018-11-02 21:16:18, "Peter Booth via nginx" <nginx@nginx.org> wrote:<br> <blockquote id="isReplyContent" style="PADDING-LEFT: 1ex; MARGIN: 0px 0px 0px 0.8ex; BORDER-LEFT: #ccc 1px solid"><div dir="ltr"><span></span></div><div dir="ltr"><div dir="ltr"><span></span></div><div dir="ltr">So this is a very interesting question. I started writing dynamic websites in 1998. Most developers don't want to generate static sites; I think their reasons are more emotional than technical. About seven years ago I had two jobs - the day job was a high-traffic retail fashion website. The side job was a very similar site, implemented as a static site that was recreated when content changed. The dynamic site had (first request) latencies of about 2 sec. The static site had typical latencies of 250ms. That's almost 10x faster. It also cost about 2% of what the dynamic site cost to run.<div><br></div><div> Sounds like you're planning to do things the smart way. You haven't said how busy your site is. 
Assuming your servers run Linux, your content will all be sitting in Linux's page cache, so on a recent-model server a well-tuned nginx can serve well over 100,000 requests per sec. The key is to make use of the browser cache whenever possible. As well as making good use of compute resources, a website like this is much more reliable than a dynamic site: there are fewer moving parts that can go wrong. Have fun!</div><div><br></div><div>Pete<br><br><div id="AppleMailSignature" dir="ltr">Sent from my iPhone</div><div dir="ltr"><br>On Nov 2, 2018, at 1:18 AM, yf chu <<a href="mailto:cyflhn@163.com">cyflhn@163.com</a>> wrote:<br><br></div><blockquote type="cite"><div dir="ltr"><div style="line-height:1.7;color:#000000;font-size:14px;font-family:Arial"><p class="MsoPlainText"><span lang="EN-US">I have a website with tens of millions of pages.
The content of the pages is stored in a database, but the data does not change very
frequently. So, to improve the performance of the website and reduce the cost of
deploying web applications, I want to generate static pages for the dynamic content
and refresh those pages whenever the content changes. But I am very concerned about
how to manage such a large number of pages. How should I store them? I plan to use
Nginx to serve these pages. Could this cause IO problems when the web server handles
many requests? How many requests can Nginx handle? Is there a better solution for this issue? <o:p></o:p></span></p></div><br><br><span title="neteasefooter"><p> </p></span></div></blockquote><blockquote type="cite"><div dir="ltr"><span>_______________________________________________</span><br><span>nginx mailing list</span><br><span><a href="mailto:nginx@nginx.org">nginx@nginx.org</a></span><br><span><a href="http://mailman.nginx.org/mailman/listinfo/nginx">http://mailman.nginx.org/mailman/listinfo/nginx</a></span></div></blockquote></div></div></div></blockquote></div><br><br><span title="neteasefooter"><p> </p></span>