Memory-efficient FastByteArrayOutputStream with minCapacity greater than the max value of Integer, i.e. 2147483647


FastByteArrayOutputStream has a write method which calls addBuffer. addBuffer accepts a minCapacity as an int and allocates the next block with a size equal to the next power of 2 of minCapacity. So the block size keeps increasing in order to accommodate the file in the buffer.

I have a file greater than the max size (internally I am dividing it into 3 files, pushing them to the output stream, and finally creating a file in Azure Storage). While writing it to the buffer, minCapacity goes over the max integer value of 2147483647 and the block size wraps around to the signed integer -2147483648, which is invalid and throws an exception, as shown in the attached image.
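The wrap-around can be reproduced with a small sketch. This is not Spring's actual implementation, just an illustration of how a "next power of two" capacity computed in an int overflows once the requested capacity passes Integer.MAX_VALUE:

```java
public class OverflowDemo {
    // Sketch of a "next power of two" sizing strategy, as used by
    // buffer-growing streams (hypothetical, not Spring's exact code).
    static int nextPowerOf2(int minCapacity) {
        return Integer.highestOneBit(minCapacity - 1) << 1;
    }

    public static void main(String[] args) {
        // Fine while capacities stay small:
        System.out.println(nextPowerOf2(1000));      // 1024

        // But int arithmetic silently wraps past Integer.MAX_VALUE,
        // producing a negative "capacity" that no array can have:
        int minCapacity = Integer.MAX_VALUE + 1;
        System.out.println(minCapacity);             // -2147483648
    }
}
```

Any attempt to allocate a byte[] with such a negative size fails, which matches the exception described above.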


There is 1 answer

Stephen C On

FastByteArrayOutputStream will not work for your use-case. While it uses a Deque<byte[]> internally, that is just an optimization to reduce the amount of copying. If you look at the source code, you will see that there are a number of places that limit the size to the maximum size of a byte[] ... which is 2^31 - 1 bytes; i.e. 2GB - 1.

I've got a file greater than max size ...

Possible solutions:

  1. If you are outputting the file, write the data directly to the file or socket. There is no obvious reason to use a ByteArrayOutputStream variant for this. It won't improve performance!

  2. Take the source code for FastByteArrayOutputStream and modify it for your own purposes. However, you will run into the problem that the getByteArray and unsafeGetByteArray methods are unimplementable for content of 2GB and larger, and there are similar issues elsewhere.
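Option 1 can be sketched as a plain chunked copy loop: memory use stays constant no matter how large the source is, because nothing is ever buffered in full. The paths and buffer size here are illustrative assumptions only:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.nio.file.Files;
import java.nio.file.Path;

public class StreamCopy {
    // Copies data in fixed-size chunks, writing straight to the
    // destination instead of accumulating it in an in-memory buffer.
    static long copy(InputStream in, OutputStream out) throws IOException {
        byte[] buffer = new byte[8192];
        long total = 0;
        int read;
        while ((read = in.read(buffer)) != -1) {
            out.write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    public static void main(String[] args) throws IOException {
        // Hypothetical temp files standing in for the real source and
        // the Azure Storage destination.
        Path src = Files.createTempFile("src", ".bin");
        Path dst = Files.createTempFile("dst", ".bin");
        Files.write(src, new byte[]{1, 2, 3});
        try (InputStream in = Files.newInputStream(src);
             OutputStream out = Files.newOutputStream(dst)) {
            long copied = copy(in, out);
            System.out.println(copied + " bytes copied");
        }
    }
}
```

Note that the running total is a long, so the copy itself has no 2GB limit; only in-memory byte[] buffers do.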

There may be other solutions, but it is hard to say. You don't explain what you are actually doing.