java.util.zip.DataFormatException: invalid stored block lengths


I have a Spring Boot application with an API accepting a MultipartFile. I am reading the file's InputStream and using XSSFWorkbook to process it.

Locally it works fine, with the same Docker image that runs in the container.

Caused by: java.util.zip.DataFormatException: invalid stored block lengths
    at java.base/java.util.zip.Inflater.inflateBytesBytes(Native Method)
    at java.base/java.util.zip.Inflater.inflate(Inflater.java:378)
    at org.apache.commons.compress.archivers.zip.ZipArchiveInputStream.readFromInflater(ZipArchiveInputStream.java:660)

In the AWS ECS cluster, the stream somehow seems to be corrupted. I even added a dummy API that simply returns the byte array to the response stream as an attachment, saved it to my local machine to check, and, sure enough, the file reported as corrupted when opened.
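One way to pin down where the bytes diverge is to log the size and a SHA-256 digest of the uploaded bytes in both environments: if the digest logged on ECS differs from the one computed locally for the same file, the multipart body is being altered before it reaches the controller. A minimal sketch (the class and method names here are hypothetical, not from the original code):

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.security.NoSuchAlgorithmException;

public class UploadDiagnostics {

    // Returns the SHA-256 digest of the given bytes as a hex string.
    // Comparing this value locally vs. on ECS shows whether the
    // multipart body was altered in transit.
    public static String sha256Hex(byte[] bytes) {
        try {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            StringBuilder sb = new StringBuilder();
            for (byte b : md.digest(bytes)) {
                sb.append(String.format("%02x", b));
            }
            return sb.toString();
        } catch (NoSuchAlgorithmException e) {
            throw new IllegalStateException("SHA-256 not available", e);
        }
    }

    public static void main(String[] args) {
        byte[] sample = "hello".getBytes(StandardCharsets.UTF_8);
        System.out.println(sample.length + " bytes, sha256=" + sha256Hex(sample));
    }
}
```

In the controller you could call something like `sha256Hex(file.getBytes())` before handing the stream to POI and compare the logged values across environments.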

This happens only for Excel files, not for txt files. If you were able to fix a similar issue, please share your solution; I can take suggestions from it.

Tried using

@RequestParam("file") MultipartFile file

with

try (InputStream workbookStream = file.getInputStream();
     Workbook workbook = new XSSFWorkbook(workbookStream)) {
    Sheet sheet = workbook.getSheetAt(0);
    log.info("Sheet Name: " + sheet.getSheetName());
}
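Since an .xlsx file is a ZIP archive, its first bytes must be the ZIP local-file-header signature (`PK\x03\x04`). A quick guard before opening the workbook can confirm whether the payload is still a valid ZIP by the time it reaches the controller; if this check fails on ECS for a file that passes locally, the corruption happens upstream of POI. A minimal sketch (class and method names are hypothetical):

```java
public class XlsxSignatureCheck {

    // An .xlsx file is a ZIP archive and starts with the
    // local-file-header signature 0x50 0x4B 0x03 0x04 ("PK\3\4").
    public static boolean looksLikeZip(byte[] bytes) {
        return bytes.length >= 4
                && bytes[0] == 0x50   // 'P'
                && bytes[1] == 0x4B   // 'K'
                && bytes[2] == 0x03
                && bytes[3] == 0x04;
    }

    public static void main(String[] args) {
        byte[] zipLike = {0x50, 0x4B, 0x03, 0x04, 0x00};
        byte[] textLike = "plain text".getBytes();
        System.out.println(looksLikeZip(zipLike));   // true
        System.out.println(looksLikeZip(textLike));  // false
    }
}
```

You could call this on `file.getBytes()` and log the result before constructing the XSSFWorkbook, to separate a transport problem from a POI problem.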