Downloading large (5 GB+) files in a multipart fashion from S3

stuartweir nginx-forum at
Tue Aug 29 09:13:36 UTC 2017

Original Problem domain:

The Rails/Rack send_data method loads the whole file into memory on the
server before sending it to the client. With large files this failed because
the server ran out of memory.

Original Solution:

Use Passenger NGINX to create a proxy_pass between the client and the S3
bucket in a secure way (the client never sees the actual S3 bucket URL; I
opted for this over timed URLs because the clients might need the URL after
a timed URL would have expired).
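
For context, a minimal sketch of the proxy block (the bucket name and
location path here are placeholders, not my real config):

```nginx
location /downloads/ {
    # Client requests /downloads/<key>; NGINX fetches the object from S3,
    # so the real bucket URL is never exposed to the client.
    proxy_pass        https://example-bucket.s3.amazonaws.com/;
    proxy_set_header  Host example-bucket.s3.amazonaws.com;
}
```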

New Problem domain:

NGINX handles the download properly, but once the download reaches about
5 GB, it "fails" in Chrome and has to be "retried" in order to continue
downloading the file. This is due to a restriction S3 has on downloads
larger than 5 GB.

What I'm hoping to have answered:

Is there a way to initiate a multipart-like download using only NGINX? I
know there is such a thing as a multipart upload, but I would like a
multipart download of some sort. Because Rails makes a single response to a
request (without some very clunky magic to make it do otherwise), I'd like
to use something like the Range header with NGINX, except I don't want to
specify the exact ranges myself, because that would mean making several
responses (which appears to be how the header works, and forces me back
into the clunky Rails issue).
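
Something like ngx_http_slice_module looks close to what I'm imagining: as
I understand it, the slice directive splits a single client download into a
series of internal Range subrequests to the upstream. I haven't verified
this against S3, and the bucket URL and slice size below are placeholders:

```nginx
location /downloads/ {
    # Split the upstream fetch into 1 GB Range subrequests
    # (ngx_http_slice_module), keeping each one under S3's limit.
    slice             1g;
    # Forward each subrequest's byte range to the upstream.
    proxy_set_header  Range $slice_range;
    # Placeholder bucket URL; the client still never sees it.
    proxy_pass        https://example-bucket.s3.amazonaws.com/;
}
```

Is this (or something like it) a sane way to keep each upstream request
small while the client sees one continuous response?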

Thanks for any and all help!


Posted at Nginx Forum: 276177,276177#msg-276177

More information about the nginx mailing list