Node.js PassThrough stream not closing properly?


I'm curious about my PassThrough stream and why it isn't closing after the resource I pipe it to closes. I'm using it as a mediator: one resource needs a Readable stream, and I need to hand the user a Writable stream so they can write to the underlying resource. At first a Duplex stream seemed ideal, but it would require some implementation work; then I found the PassThrough stream.
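
For context, here is a minimal sketch of that mediator pattern; consumeReadable is a made-up stand-in for the resource that needs a Readable stream.

const stream = require('stream');

// Hypothetical resource that only accepts a Readable (stand-in for the real one).
function consumeReadable(readable) {
    readable.on('data', (chunk) => {
        // ...hand the chunk to the underlying resource here
    });
}

// The PassThrough is handed to the caller as a Writable...
const mediator = new stream.PassThrough();
// ...while the resource consumes it as a Readable.
consumeReadable(mediator);

mediator.write('caller writes here');
mediator.end();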

EDIT: Best description of this problem here: https://gist.github.com/four43/46fd38fd0c929b14deb6f1744b63026a

Original example: Check this out:

const fs = require('fs');
const stream = require('stream');

const passThrough = new stream.PassThrough({allowHalfOpen: false});
const writeStream = fs.createWriteStream('/tmp/output.txt');

passThrough.pipe(writeStream)
    .on('end', () => console.log('full-end'))
    .on('close', () => console.log('full-close'))
    .on('unpipe', () => console.log('full-unpipe'))
    .on('finish', () => console.log('full-finish'));
passThrough
    .on('end', () => console.log('passThrough-end'))
    .on('close', () => console.log('passThrough-close'))
    .on('unpipe', () => console.log('passThrough-unpipe'))
    .on('finish', () => console.log('passThrough-finish'));

passThrough.end('hello world');

Actual output:

passThrough-finish
passThrough-end
full-unpipe
full-finish
full-close

Seems like the write side does its job, but the "read" side of the PassThrough stream doesn't propagate the close, even though the allowHalfOpen option was passed as false (and I can verify in the debugger that the option took).

Am I going about this all wrong? How would I propagate the close of the writeStream?

Thanks.

Edit: I'm finding the same is true of Transform streams: they just aren't ended once the pipe is closed. Is there a way to manually close them? transform.end() never causes the stream to emit a "close" event, just "finish" and "end" events, which fire before the underlying resource succeeds.
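
One possible workaround (a sketch, not something from the original post): wait for the destination's 'finish' and tear the transform down explicitly with destroy(), which is available on streams in Node.js 8+. Whether 'close' is then emitted depends on the Node version and the emitClose option; the path below is illustrative.

const fs = require('fs');
const { PassThrough } = require('stream');

const passThrough = new PassThrough();
const writeStream = fs.createWriteStream('/tmp/output.txt'); // illustrative path

passThrough.pipe(writeStream);

// Tear the PassThrough down ourselves once the destination has flushed everything.
// On recent Node versions destroy() also causes 'close' to be emitted.
writeStream.on('finish', () => passThrough.destroy());
passThrough.on('close', () => console.log('passThrough-close'));

passThrough.end('hello world');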

Edit2: I put together this Gist: https://gist.github.com/four43/46fd38fd0c929b14deb6f1744b63026a

That shows me that the readable in readable.pipe(writable) is closed down properly when the writable finishes. That would lead me to believe that when I do transform.pipe(writable) it would close the "readable" side of the transform stream, and since I already "closed" the writable side with .end(), it should close the whole stream. Side note of interest: read is tossing events even though we never use it in Test 2. Could be an isolation issue, but I think my timeout wait does a pretty good job.
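
A compressed version of what the gist's first test observes, assuming two plain fs streams (paths are illustrative):

const fs = require('fs');

const read = fs.createReadStream('/tmp/input.txt');    // illustrative path
const write = fs.createWriteStream('/tmp/output.txt'); // illustrative path

read.pipe(write);

// With two fs streams the source really is torn down once the data is through:
read.on('end', () => console.log('read-end'));
read.on('close', () => console.log('read-close'));     // fires once fs closes the fd
write.on('finish', () => console.log('write-finish'));
write.on('close', () => console.log('write-close'));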

1 Answer

Answer from peteb:

If you want to know when writeStream is done writing, just listen for the 'finish' event on writeStream:

const fs = require('fs');
const stream = require('stream');

const passThrough = new stream.PassThrough({allowHalfOpen: false});
const writeStream = fs.createWriteStream('/tmp/output.txt');

passThrough
    .on('error', (err) => console.error(err))
    .on('end', () => console.log('passThrough-end'))
    .on('close', () => console.log('passThrough-close'))
    .on('unpipe', () => console.log('passThrough-unpipe'))
    .on('finish', () => console.log('passThrough-finish'));

writeStream
    .on('error', (err) => console.error(err))
    .on('close', () => console.log('full-close'))
    .on('unpipe', () => console.log('full-unpipe'))
    .on('finish', () => console.log('full-finish'));

// 'passThrough-finish' is emitted because all writes are complete
passThrough.end('hello world');

passThrough.pipe(writeStream);
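
As a side note, on Node.js 10 and later stream.finished() can do this bookkeeping for you; a minimal sketch (the path is just illustrative):

const fs = require('fs');
const stream = require('stream');

const passThrough = new stream.PassThrough();
const writeStream = fs.createWriteStream('/tmp/output.txt'); // illustrative path

passThrough.pipe(writeStream);
passThrough.end('hello world');

// finished() calls back once writeStream has fully flushed or errored,
// which is the signal the question is really after.
stream.finished(writeStream, (err) => {
    if (err) return console.error(err);
    console.log('writeStream fully done');
});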