Large File Download causing GC overhead limit exceeded

I have two services: frontend_service and backend_service. frontend_service fetches a large file from backend_service and forwards it to the user via response.getBodyAsStream(), but this causes "java.lang.OutOfMemoryError: GC overhead limit exceeded" in frontend_service.

Code for backend_service:

```java
public static Result downloadLargeFile(String filePath) throws FileNotFoundException {
    File file = new File(filePath);
    InputStream inputStream = new FileInputStream(file);
    // ok(InputStream) lets Play stream the body instead of loading it into memory
    return ok(inputStream);
}
```

Code for frontend_service:

```java
public static F.Promise<Result> downloadLargeFile(String filePath) {
    // this calls backend_service's downloadLargeFile method
    String backEndUrl = getBackEndUrl(filePath);
    return getInputStream(backEndUrl);
}
```

```java
public static Promise<Result> getInputStream(String url) {
    return WS.url(url).get().map(
            response -> {
                InputStream inputStream = response.getBodyAsStream();
                return ok(inputStream);
            }
    );
}
```

I tried the solution suggested here: read a few bytes at a time from the input stream, write them to a tmp file in frontend_service, and send the tmp file as the response from frontend_service.

```java
public static Promise<Result> getInputStream(String url) {
    return WS.url(url).get().map(
            response -> {
                InputStream inputStream = null;
                OutputStream outputStream = null;
                try {
                    inputStream = response.getBodyAsStream();
                    // write the input stream to a tmp file
                    final File tmpFile = new File("/tmp/tmp.txt");
                    outputStream = new FileOutputStream(tmpFile);

                    int read = 0;
                    byte[] buffer = new byte[500];
                    while ((read = inputStream.read(buffer)) != -1) {
                        outputStream.write(buffer, 0, read);
                    }
                    return ok(tmpFile);
                } catch (IOException e) {
                    e.printStackTrace();
                    return badRequest();
                } finally {
                    if (inputStream != null) { inputStream.close(); }
                    if (outputStream != null) { outputStream.close(); }
                }
            }
    );
}
```

The above code also throws java.lang.OutOfMemoryError. I'm testing with a 1 GB file.

1 Answer

Answered by Andriy Kuba:

I do not have the implementation at hand, so I will describe the algorithm. The root cause is that a plain WS call buffers the entire response body in memory before your map callback runs, so response.getBodyAsStream() reads from an in-memory copy rather than streaming from the network.

1. Play's WS API is built on AsyncHttpClient. You need to get the underlying client, or create one as described at https://www.playframework.com/documentation/2.3.x/JavaWS#Using-WSClient

2. Then you need to implement an AsyncCompletionHandler, as shown in the class documentation: https://static.javadoc.io/org.asynchttpclient/async-http-client/2.0.0/org/asynchttpclient/AsyncHttpClient.html

3. In the onBodyPartReceived method of your AsyncCompletionHandler, push each body part into a chunked Play response. Chunked responses are described here: https://www.playframework.com/documentation/2.3.x/JavaStream#Chunked-responses. A sketch combining these three steps follows below.
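
Putting the three steps together, here is a minimal, untested sketch against Play 2.3's Java API and the com.ning AsyncHttpClient 1.x that Play 2.3 bundles. It assumes WS.client().getUnderlying() exposes that client (per the WS documentation linked above; you can also construct a standalone client) and reuses the getBackEndUrl helper from the question. Note the action now returns a plain Result, because the chunked body is fed asynchronously:

```java
import java.io.IOException;

import com.ning.http.client.AsyncCompletionHandler;
import com.ning.http.client.AsyncHttpClient;
import com.ning.http.client.HttpResponseBodyPart;
import com.ning.http.client.Response;

import play.libs.ws.WS;
import play.mvc.Result;
import play.mvc.Results.ByteChunks;
import play.mvc.Results.Chunks;

public static Result downloadLargeFile(String filePath) {
    final String backEndUrl = getBackEndUrl(filePath); // helper from the question

    Chunks<byte[]> chunks = new ByteChunks() {
        @Override
        public void onReady(final Chunks.Out<byte[]> out) {
            // Step 1: the AsyncHttpClient behind Play's WS
            // (or: new AsyncHttpClient() for a standalone client)
            AsyncHttpClient client = (AsyncHttpClient) WS.client().getUnderlying();
            try {
                // Step 2: a completion handler that forwards body parts
                // instead of buffering the whole response
                client.prepareGet(backEndUrl).execute(new AsyncCompletionHandler<Response>() {
                    @Override
                    public STATE onBodyPartReceived(HttpResponseBodyPart bodyPart) {
                        // Step 3: push each part straight into the chunked response;
                        // not calling super keeps the handler from accumulating bytes
                        out.write(bodyPart.getBodyPartBytes());
                        return STATE.CONTINUE;
                    }

                    @Override
                    public Response onCompleted(Response response) {
                        out.close(); // terminates the chunked response
                        return response;
                    }

                    @Override
                    public void onThrowable(Throwable t) {
                        out.close();
                    }
                });
            } catch (IOException e) {
                out.close();
            }
        }
    };

    // Sent with Transfer-Encoding: chunked; memory use is bounded by the
    // part size, not the file size.
    return ok(chunks);
}
```

Because each part is written to the user as soon as it arrives, the frontend never holds more than a single body part in memory at a time.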

P.S.

A discussion of a similar solution in the opposite direction, streaming uploads to the "backend" (Amazon) service through the "frontend" (Play 2) service: https://groups.google.com/d/msg/asynchttpclient/EpNKLSG9ymM/BAGvwl0Wby8J