Unirest GET of a large file fails with an out-of-memory exception


I'm using Unirest to perform a GET request against a server. My problem is that when a large file is downloaded, the request fails with an out-of-memory exception.

import com.mashape.unirest.http.HttpResponse;
import com.mashape.unirest.http.Unirest;
import java.io.InputStream;

HttpResponse<InputStream> responseGet = Unirest.get("http://localhost:8080/BigDataTEst")
        .header("content-type", "*/*")
        .asBinary();

Is there a way to solve this issue using unirest?
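
One way to avoid holding the whole body in memory is to write it straight to disk; the newer unirest-java fork (kong.unirest, 3.x) exposes asFile for this. A minimal sketch, assuming that fork is available; the target path is a placeholder and the URL is the one from the question:

import kong.unirest.HttpResponse;
import kong.unirest.Unirest;

import java.io.File;

// asFile writes the response body to disk as it arrives,
// so the download never has to fit in the heap.
// kong.unirest / unirest-java 3.x assumed; "/tmp/bigdata.bin" is a placeholder.
HttpResponse<File> response = Unirest.get("http://localhost:8080/BigDataTEst")
        .header("accept", "*/*")
        .asFile("/tmp/bigdata.bin");

System.out.println("Status: " + response.getStatus());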

1 Answer

Przemek Nowak

Which JRE version are you running the application on?

I had the same issue (heap space / out-of-memory error) for large files (above 100 MB) when I was using Unirest. The problem lay in the Apache HttpComponents library (specifically the Arrays.copyOf method used under the hood by HttpComponents).

When I started testing on JRE 8 x64 the problem disappeared (I suspect the implementation of copyOf changed, or something like that).

So I suggest trying a different JRE, or you could always use Apache Commons IO and FileUtils.copyURLToFile.
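
For completeness, the Commons IO route looks roughly like this (a sketch; the target file and timeout values are placeholders, the URL is the one from the question):

import org.apache.commons.io.FileUtils;

import java.io.File;
import java.io.IOException;
import java.net.URL;

// copyURLToFile streams the response to disk in chunks,
// so the whole body never has to fit in the heap.
FileUtils.copyURLToFile(
        new URL("http://localhost:8080/BigDataTEst"),
        new File("/tmp/bigdata.bin"),
        10_000,   // connect timeout (ms), placeholder value
        10_000);  // read timeout (ms), placeholder value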