I want to download large files (~500 MB) with hyper, and be able to resume if the download fails. Is there any way with hyper to run some function for each chunk of data received? The `send()` method returns a `Result<Response>`, but I can't find any methods on `Response` that return an iterator over chunks. Ideally I'd be able to do something like:
```rust
client.get(&url.to_string())
    .send()
    .map(|mut res| {
        let mut chunk = String::new();
        // write this chunk to disk
    });
```
Is this possible, or will `map` only be called once hyper has downloaded the entire file?
Hyper's `Response` implements `Read`. That means `Response` is a stream, and you can read arbitrary chunks of data from it as you would with any other stream.

For what it's worth, here's a piece of code I use to download large files from ICECat. I'm using the `Read` interface in order to display the download progress in the terminal. The variable `response` there is an instance of hyper's `Response`.
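The original snippet isn't reproduced here, but a minimal sketch of the same approach, written against any `std::io::Read` (the function name `save_chunks` and the 8 KiB buffer size are my choices, not from the original), might look like:

```rust
use std::io::{Read, Write};

/// Stream a body to `out` in fixed-size chunks, returning the byte count.
/// `body` stands in for anything implementing `Read`, such as hyper's `Response`.
fn save_chunks<R: Read, W: Write>(body: &mut R, out: &mut W) -> std::io::Result<u64> {
    let mut buf = [0u8; 8192]; // read up to 8 KiB per chunk
    let mut total: u64 = 0;
    loop {
        let n = body.read(&mut buf)?;
        if n == 0 {
            break; // end of stream
        }
        out.write_all(&buf[..n])?;
        total += n as u64;
        // a progress display would report `total` here
    }
    Ok(total)
}
```

In a real downloader `out` would be a `std::fs::File`; because both ends are generic, the same function works with an in-memory `Cursor` for testing.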
Resuming a download is usually implemented with the HTTP `Range` header (cf. RFC 7233).
Not every server out there supports the `Range` header, though. I've seen plenty of servers with a custom HTTP stack and no proper `Range` support, or with the `Range` header disabled for some reason. So reading and discarding the already-downloaded part of hyper's `Response` body may be a necessary fallback.

But if you want to speed things up and save traffic, the primary means of resuming a stopped download should be the `Range` header.
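The resume decision itself is independent of hyper's header types (which vary between versions). A small sketch, with the helper name `resume_range` being my own:

```rust
use std::fs;

/// Decide how to resume: if a partial file exists, return the `Range`
/// header value requesting the remainder; otherwise return `None`
/// and do a plain GET from the start.
fn resume_range(path: &str) -> Option<String> {
    match fs::metadata(path) {
        // `bytes=N-` asks the server for everything from offset N onward.
        Ok(meta) if meta.len() > 0 => Some(format!("bytes={}-", meta.len())),
        _ => None,
    }
}
```

Before appending to the partial file, check that the server answered with `206 Partial Content`: a `200 OK` means it ignored the range and is sending the whole body again.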
P.S. With hyper 0.12, the response body returned by hyper is a `Stream`, and to run some function for each chunk of data received we can use the `for_each` stream combinator:
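A sketch of that shape against the hyper 0.12 / futures 0.1 API (the URL is a placeholder, and this needs the `hyper` and `futures` crates to compile):

```rust
extern crate futures;
extern crate hyper;

use futures::{Future, Stream};
use std::io::{self, Write};

fn main() {
    let client = hyper::Client::new();
    // Placeholder URL; hyper's default client speaks plain HTTP.
    let uri: hyper::Uri = "http://example.com/large.bin".parse().unwrap();

    let work = client
        .get(uri)
        .and_then(|res| {
            // The body is a Stream of Chunks; the closure runs once per chunk.
            res.into_body().for_each(|chunk| {
                // A real downloader would write to a file instead of stdout.
                io::stdout()
                    .write_all(&chunk)
                    .map_err(|e| panic!("write error: {}", e))
            })
        })
        .map_err(|e| eprintln!("request error: {}", e));

    hyper::rt::run(work);
}
```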