I am using the ReadableStream API to stream a large (128 MB) CSV file with more than 300,000 rows of data in chunks. I followed the MDN ReadableStream documentation to create a reader and process the enqueued chunks, but the stream seems to stop reading partway through: I end up with only about 40,000 rows. My code:
const reader = file.stream().getReader();
reader.read().then(function processText({ done, value }) {
  if (done) {
    console.log('done');
    return;
  }
  // console.log('streaming');
  console.log(value);
  return reader.read().then(processText);
});
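For context, one thing worth checking (the row-counting code is not shown here) is that chunk boundaries rarely land on newlines, so a row split across two chunks is easy to undercount, and multi-byte characters can also straddle a chunk boundary. Below is a minimal self-contained sketch of reading a stream to the end while carrying the partial last line over between reads; the small inline `Blob` stands in for the real `file` just so the snippet runs on its own:

```javascript
// Stand-in for the real File; a Blob exposes the same .stream() API.
const csv =
  Array.from({ length: 1000 }, (_, i) => `row${i},value${i}`).join('\n') + '\n';
const file = new Blob([csv]);

async function countRows(file) {
  const reader = file.stream().getReader();
  const decoder = new TextDecoder();
  let leftover = ''; // partial line carried over from the previous chunk
  let rows = 0;

  for (;;) {
    const { done, value } = await reader.read();
    if (done) break;
    // stream: true keeps multi-byte characters split across chunks intact
    const text = leftover + decoder.decode(value, { stream: true });
    const lines = text.split('\n');
    leftover = lines.pop(); // last element may be an incomplete row
    rows += lines.length;
  }

  leftover += decoder.decode(); // flush any bytes buffered in the decoder
  if (leftover.length > 0) rows += 1; // final row without a trailing newline
  return rows;
}

countRows(file).then((n) => console.log(`rows: ${n}`)); // → rows: 1000
```

The `await`-based loop is equivalent to the recursive `.then(processText)` pattern above, just easier to extend with per-chunk bookkeeping like the `leftover` buffer.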