It is unclear whether Alamofire supports chunked transfer for large or progressively generated data sets. This is a much-needed feature for my application; otherwise I may have to look into alternatives. The Alamofire GitHub page mentions "Progress Closure & NSProgress", but I'm not sure what that entails.
Per the Wikipedia description of chunked transfer encoding: "Senders can begin transmitting dynamically-generated content before knowing the total size of that content."
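For reference, a chunked HTTP response looks roughly like the sketch below (the header and JSON values are made up for illustration): each chunk is preceded by its size in hex, and a zero-length chunk terminates the body, so no Content-Length header is needed up front.

```
HTTP/1.1 200 OK
Content-Type: application/json
Transfer-Encoding: chunked

1b
[{"part":1,"data":"first"},
1b
{"part":2,"data":"second"}]
0

```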
For clarity's sake, let me explain why I need this. Basically, I have a very large JSON file that is partially cached; the full file is composed of smaller JSON objects. I am using io.js/Node.js with Express to send the chunked data via res.write(), which knows not to send the Content-Length header and to send the response as chunked data instead. I have verified this works via HTML/JS. Let me know if you would like me to provide the code to demonstrate this!
Alamofire definitely supports Transfer-Encoding: chunked data, since it is already supported by NSURLSession. Here is a quick example of downloading a chunked image from httpwatch.com. Since the content length of the image is not available, totalBytesExpected will always be reported as -1, because it is unknown. The bytesReceived and totalBytesReceived values are reported properly, though. That said, chunked downloads are probably not the best candidate for presenting download progress to the user, since the total length is undefined.

Another possible feature that could be of use is the new stream functionality on a request. It allows you to store each data chunk as it is downloaded.

If these options don't suit all your needs, please file an issue on our GitHub project describing the problems you are running into so we can investigate further.
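The code for the example referenced above did not survive, so here is a sketch of both approaches using the Alamofire 1.x-era API that was current at the time (Swift 1.x, hence println; the exact httpwatch.com URL path is an assumption for illustration):

```swift
import Alamofire

// Chunked image endpoint from httpwatch.com (assumed path; the response
// carries Transfer-Encoding: chunked and no Content-Length header).
let URLString = "http://www.httpwatch.com/httpgallery/chunked/chunkedimage.aspx"

// Progress closure: totalBytesExpectedToRead stays -1 because the response
// is chunked, but bytesRead and totalBytesRead are reported accurately.
Alamofire.request(.GET, URLString)
    .progress { bytesRead, totalBytesRead, totalBytesExpectedToRead in
        println("\(bytesRead) - \(totalBytesRead) - \(totalBytesExpectedToRead)")
    }
    .response { request, response, data, error in
        println("Download finished: \(response)")
    }

// Stream closure: hands you each chunk as it arrives, so you can append it
// to a buffer or file instead of waiting for the whole body to complete.
Alamofire.request(.GET, URLString)
    .stream { data in
        println("Received chunk of \(data.length) bytes")
    }
```

With the stream approach, progress reporting is up to you, but it is the better fit here since the chunks can be parsed or persisted incrementally rather than held in memory until completion.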