1st I read in a file line by line with my code (about 1,650 lines in the file).
2nd I reformat each line of the file into many lines.
3rd I would like to write the output to a new file. Unfortunately it doesn't write all of the more than 16,800 expected lines; the output varies at around 15,500 lines.
For the 3rd step I use the following code:
var inputArr;        // split input of one line
var Text;            // inputArr transformed into a string with many lines (per input line)
var lineCounter = 0; // counts the expected number of output lines
const fs = require('fs');
const writeStream = fs.createWriteStream('./output.txt');
for (var i = 0; i < inputArr.length; i++) {
    writeStream.write(Text);
    lineCounter = lineCounter + 1;
}
writeStream.end();
What can I do to write all lines into my output file?
You can't write large amounts of data without detecting when the stream is full and then waiting for it to say it's OK to write again. There's a pretty detailed example of how to do that in the stream.Writable doc.
Basically, you have to pay attention to the return value from stream.write(): when it returns false, the stream's internal buffer is full, and you have to stop writing and then resume writing on the 'drain' event.
You don't show your whole code for both the reading and the writing. If you're just reading a stream, modifying it, and then writing the result to a different file, you should probably use piping, perhaps with a transform stream; the streams will then handle all the reading, writing, and back pressure detection for you automatically.
You can read about transform streams here, as that sounds like what you really want. You would then pipe the output of the transform stream to your output file stream, and all the back pressure will be handled for you automatically.