Nodejs readable-stream vs array.map


In studying streams, the following example was provided:

  const data = ['some', 'data', 'to', 'read']
  const { Readable } = require('stream')
  const readable = Readable.from(data)
  readable.on('data', (data) => { console.log('got data:', data) })
  readable.on('end', () => { console.log('finished reading') })

This of course outputs got data: some, then got data: data, and so on for each array element.

I understand why a tutorial would use a simple example; however, I am struggling to imagine real world use cases for this snippet.

In terms of basic utility, this seems like no more than an obfuscated way to loop over an array:

  // eg.
  const data = ['some', 'data', 'to', 'read']
  data.map(data => console.log('got data:', data));

I see the benefit of the latter being brevity, and the drawback of the former being complexity. However, the second snippet lacks the flexibility of binding stream events. What other information would help me understand when, and in what cases, to implement a read stream this way?

Please share a code snippet that makes use of the readable stream (from an array or otherwise) in a way that demonstrates the power of the flexibility. Also share any notes that explain better why a readable stream would be more useful in this or other cases.

1 answer

Answered by btm me

Here is a fuller example:

  const { Readable, Transform, pipeline } = require('stream');

  // Create an array of objects
  const data = [
    { name: 'Alice', age: 25 },
    { name: 'Bob', age: 30 },
    { name: 'Charlie', age: 28 },
  ];

  // Create a Readable stream from the array
  const readable = Readable.from(data);

  // Create a Transform stream to modify the data
  const transformStream = new Transform({
    objectMode: true,
    transform(chunk, encoding, callback) {
      chunk.age++; // Increment age for each object
      // Stringify the chunk: process.stdout accepts only strings and
      // buffers, so passing the raw object downstream would throw
      callback(null, JSON.stringify(chunk) + '\n');
    },
  });

  // Pipe the readable stream through the transform stream and log the modified data
  pipeline(
    readable,
    transformStream,
    process.stdout, // Output to the console
    (err) => {
      if (err) {
        console.error('Pipeline failed.', err);
      } else {
        console.log('Pipeline succeeded.');
      }
    }
  );