Strategies for large-file upload?

Jeffrey Walton noloader at gmail.com
Tue Feb 8 18:48:53 UTC 2022


On Tue, Feb 8, 2022 at 9:27 AM Alva Couch <ACouch at cuahsi.org> wrote:
>
> I’m new to this list but have been running an NGINX community stack including NGINX community/Gunicorn/Django for several years in production.
>
> My site is a science data repository. We have a need to accept very large files as uploads: 10 GB and above, with a ceiling of 100 GB or so.
>
> What strategies have people found to be successful for uploading very large files? Please include both “community” and “plus” solutions.

I hope I don't sound like a heretic, but I would consider another
solution, such as scp or sftp.

The reason for the suggestion: web servers serve web pages; they are
not file transfer agents. Use the right tool for the job.

And I am happy to concede that web servers do a fine job with small
files, say 4 KB or 4 MB. But upload size limits (in nginx,
client_max_body_size, which defaults to 1 MB and returns 413 Request
Entity Too Large when exceeded) are in place to protect the web server
and keep it on track with its primary mission of serving content, not
receiving it.

Jeff
