WriteStream doesn't write all data


First, I read in a file line by line (roughly 1,650 lines in the file).

Second, I reformat each line of the file into many lines.

Third, I want to write the output to a new file. Unfortunately, it doesn't write all of the more than 16,800 expected lines; the output varies at around 15,500 lines.

For the third step I use the following code:

var inputArr;        // split input of one line
var Text;            // inputArr transformed into a string with many lines (per input line)
var lineCounter = 0; // counts the expected number of output lines

const fs = require('fs');
const writeStream = fs.createWriteStream('./output.txt');

for (var i = 0; i < inputArr.length; i++) {
  writeStream.write(Text);

  lineCounter = lineCounter + 1;
}

writeStream.end();

What can I do to write all lines into my output file?


There are 2 answers

jfriend00

What can I do to write all lines into my output file?

You can't write large amounts of data without detecting when the stream is full and then waiting for it to say it's OK to write again. There's a pretty detailed example of how to do that in the stream.Writable documentation.

Here's an excerpt from the doc that shows how to do it:

// Write the data to the supplied writable stream one million times.
// Be attentive to back-pressure.
function writeOneMillionTimes(writer, data, encoding, callback) {
  let i = 1000000;
  write();
  function write() {
    let ok = true;
    do {
      i--;
      if (i === 0) {
        // last time!
        writer.write(data, encoding, callback);
      } else {
        // see if we should continue, or wait
        // don't pass the callback, because we're not done yet.
        ok = writer.write(data, encoding);
      }
    } while (i > 0 && ok);
    if (i > 0) {
      // had to stop early!
      // write some more once it drains
      writer.once('drain', write);
    }
  }
}

Basically, you have to pay attention to the return value from stream.write(): when it returns false, the stream's buffer is full, and you have to wait and resume writing on the drain event.
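
Applied to your case, that same pattern might look like the sketch below (lines and writeAllLines are hypothetical stand-ins, since the full reading/transforming code isn't shown):

const fs = require('fs');

// Write an array of lines to a file, pausing whenever the
// stream's buffer is full and resuming on 'drain'.
function writeAllLines(lines, path, callback) {
  const writeStream = fs.createWriteStream(path);
  let i = 0;
  write();
  function write() {
    let ok = true;
    while (i < lines.length && ok) {
      // write() returns false when the internal buffer is full
      ok = writeStream.write(lines[i] + '\n');
      i++;
    }
    if (i < lines.length) {
      // buffer is full; resume once the stream has drained
      writeStream.once('drain', write);
    } else {
      // all lines queued; end() flushes and closes the file
      writeStream.end(callback);
    }
  }
}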


You don't show your whole code for both the reading and the writing. If you're just reading a stream, modifying it, and then writing the result to a different file, you should probably use piping, perhaps with a transform stream; the streams will then handle all the reading, writing, and back-pressure detection for you automatically.

You can read about transform streams in the Node.js documentation, as that sounds like what you really want. You would then pipe the output of the transform stream to your output file stream, and all the back pressure will be handled for you automatically.
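
Here's a rough sketch of what that could look like (expandLine is a hypothetical placeholder for your per-line reformatting; a robust version would also buffer partial lines across chunk boundaries):

const fs = require('fs');
const { Transform, pipeline } = require('stream');

// Hypothetical per-line expansion; substitute your own logic
// that turns one input line into many output lines.
function expandLine(line) {
  return line.split(';').join('\n') + '\n';
}

// Transform stream that splits incoming chunks into lines and
// expands each one. (Simplified: it assumes chunks end on line
// boundaries; real code should buffer the trailing partial line.)
const expand = new Transform({
  transform(chunk, encoding, callback) {
    for (const line of chunk.toString().split('\n')) {
      if (line.length > 0) {
        this.push(expandLine(line));
      }
    }
    callback();
  }
});

// pipeline() wires the streams together and handles back
// pressure and error propagation for you.
pipeline(
  fs.createReadStream('./input.txt'),
  expand,
  fs.createWriteStream('./output.txt'),
  (err) => {
    if (err) console.error('Pipeline failed.', err);
    else console.log('Pipeline succeeded.');
  }
);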

srghma

Here is how I use write with backpressure handling:

const { finished } = require('node:stream')
const { once } = require('events')
const fs = require('fs')

async function writeWithAwait(writable, chunk) {
  if (!writable.write(chunk)) {
    // Handle backpressure
    await once(writable, 'drain')
  }
}

function finishedAwait(stream) {
  return new Promise(resolve => {
    // resolve with the error (or undefined on success) so the
    // caller can inspect it, matching the usage below
    finished(stream, err => resolve(err))
  })
}

const writer = fs.createWriteStream(`/tmp/my.xml`)

writer.on('error', err => console.error(err))

writer.on('open', async function() {
  await writeWithAwait(writer, `text`)
  // writer.on("end", () => {
  //   ...why not called?
  // })
  writer.end() // TODO: do I need to call it?
  const maybeFinishError = await finishedAwait(writer)
  if (maybeFinishError) {
    console.error('Stream failed.', maybeFinishError)
    return
  }
  console.log('Stream is finished.');
  // do other stuff
})
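
For what it's worth, on Node 15+ the hand-rolled finishedAwait above can be replaced by the promise-based finished from node:stream/promises, which resolves on 'finish' and rejects on 'error':

const { finished } = require('node:stream/promises')
const fs = require('fs')

async function main() {
  const writer = fs.createWriteStream(`/tmp/my.xml`)
  writer.write(`text`)
  writer.end() // still required, so the stream can flush and finish
  try {
    await finished(writer)
    console.log('Stream is finished.')
  } catch (err) {
    console.error('Stream failed.', err)
  }
}

main()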