I am retrieving a file from the server in chunks and saving it to the local system. The server responds in XML with a base64-encoded buffer. To keep individual requests small, I fetch the data in chunks, accumulate them in browser memory, and then trigger the file download. For larger files, however, this approach crashes the browser due to memory exhaustion. I'm looking for a way to download the file without holding all the chunks in browser memory at once. The request looks like this:
<Request>
<Function>cloud::readFile</Function>
<ArgList>
<Argument name="fileHandle">open-file-handle</Argument>
<Argument name="size">number-of-bytes-to-read</Argument>
</ArgList>
</Request>
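For context, this is roughly how I build that request body in the browser; the transport around it (endpoint URL, headers) is omitted since it will differ per deployment:

```javascript
// Build the cloud::readFile request body. The XML shape follows the
// example above; how it is posted to the server is not shown here.
function buildReadRequest(fileHandle, size) {
  return [
    '<Request>',
    '  <Function>cloud::readFile</Function>',
    '  <ArgList>',
    `    <Argument name="fileHandle">${fileHandle}</Argument>`,
    `    <Argument name="size">${size}</Argument>`,
    '  </ArgList>',
    '</Request>',
  ].join('\n');
}
```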
and the response looks like this:
<Response>
<Status>error-code</Status>
<StatusMessage>error-message</StatusMessage>
<ResultList>
<Result name="status">status-value</Result>
<Result name="buffer">buffer-in-base64</Result>
<Result name="size">number-of-bytes-read</Result>
</ResultList>
</Response>
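Each chunk's buffer result arrives as base64 and has to be decoded to bytes before the chunks can be combined. A minimal sketch of that decode step (extracting the buffer text from the XML is left out):

```javascript
// Decode one chunk's base64 buffer (the <Result name="buffer"> text)
// into a Uint8Array of raw bytes.
function decodeChunk(base64Buffer) {
  const binary = atob(base64Buffer);           // base64 -> binary string
  const bytes = new Uint8Array(binary.length); // one byte per character
  for (let i = 0; i < binary.length; i++) {
    bytes[i] = binary.charCodeAt(i);
  }
  return bytes;
}
```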
I loop over the file size in a while loop, fetching chunks until all bytes are received, and then perform the download like this:
const a = document.createElement('a');
a.href = formatBase64Link(combinedResponse); // base64-encoded file buffer as a data: URL
a.download = name;
a.click();
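Put together, the current (memory-heavy) flow looks roughly like the sketch below; `fetchChunk` here is a hypothetical wrapper around the cloud::readFile round trip above that resolves to a decoded Uint8Array, and the Blob object URL stands in for my base64 link:

```javascript
// Sketch of the current approach: every chunk stays in memory until
// the whole file is assembled, which is what exhausts the browser
// for large files. fetchChunk(offset, size) is a hypothetical
// wrapper around the cloud::readFile request/response shown above.
async function assembleChunks(fetchChunk, totalSize, chunkSize) {
  const parts = [];
  for (let offset = 0; offset < totalSize; offset += chunkSize) {
    const size = Math.min(chunkSize, totalSize - offset);
    parts.push(await fetchChunk(offset, size)); // one Uint8Array per chunk
  }
  return parts; // all chunks held in memory at once
}

// Trigger the download from the accumulated parts. A Blob object URL
// instead of one giant base64 data: URL at least avoids an extra
// base64-encoded copy, but the full file is still resident in memory.
function triggerDownload(parts, name) {
  const blob = new Blob(parts);
  const a = document.createElement('a');
  a.href = URL.createObjectURL(blob);
  a.download = name;
  a.click();
  URL.revokeObjectURL(a.href);
}
```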
Any insights or suggestions on troubleshooting and resolving this problem would be greatly appreciated. Thank you.