I've implemented a custom BackupAgent; part of my data consists of images, each about 1 MB in size. When creating the backup, every image is written as a separate entity.
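For context, the backup side does roughly this for each image (simplified; readFileFully and imageFile are placeholders for my actual loading code):

// In onBackup(), data is the BackupDataOutput
byte[] imageBytes = readFileFully(imageFile); // placeholder helper
data.writeEntityHeader(imageFile.getName(), imageBytes.length);
data.writeEntityData(imageBytes, imageBytes.length);

On restoring the images, I wanted to read each entity's data in 4K (BUFFER_SIZE) chunks and write it to a file like this: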
FileOutputStream out = new FileOutputStream(file);
byte[] buffer = new byte[BUFFER_SIZE];
int offset = 0;
int n;
// readEntityData returns 0 once all of the entity's data has been read
while (0 != (n = data.readEntityData(buffer, offset, BUFFER_SIZE))) {
    out.write(buffer, 0, n);
    offset += n; // advance by the number of bytes just read
}
out.close();
However, this only reads the first 4K chunk correctly; on the second call to readEntityData, an IOException with error code 0xffffffff (i.e. -1) is thrown.
When I make the buffer as large as the entity's data size and read all the data in a single call, it works perfectly, but I think it would be safer to use a smaller buffer.
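For reference, the single-read variant that works looks roughly like this, assuming the whole entity fits in memory (getDataSize() reports the entity's data size):

int size = data.getDataSize();
byte[] buffer = new byte[size];
// a single call consumes the whole entity, so offset 0 is always valid
data.readEntityData(buffer, 0, size);
out.write(buffer, 0, size);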
Has anybody experienced something like this? All the examples I've found read the data in a single call rather than in multiple chunks.