Is Node.js native Promise.all processing in parallel or sequentially?


I would like to clarify this point, as the documentation is not too clear about it:

Q1: Is Promise.all(iterable) processing all promises sequentially or in parallel? Or, more specifically, is it the equivalent of running chained promises like

p1.then(p2).then(p3).then(p4).then(p5)....

or is it some other kind of algorithm where all p1, p2, p3, p4, p5, etc. are being called at the same time (in parallel) and results are returned as soon as all resolve (or one rejects)?

Q2: If Promise.all runs in parallel, is there a convenient way to run an iterable sequentially?

Note: I don't want to use Q, or Bluebird, but all native ES6 specs.

There are 14 answers

16
Bergi On BEST ANSWER

Is Promise.all(iterable) executing all promises?

No, promises cannot "be executed". They start their task as soon as they are created - they represent only the results - and you are already running everything in parallel before you even pass them to Promise.all.

Promise.all only awaits multiple promises. It doesn't care in what order they resolve, or whether the computations are running in parallel.
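
A minimal sketch of that point, using a made-up delay() helper (names and timings are purely illustrative):

const delay = (ms, value) =>
    new Promise(resolve => setTimeout(() => resolve(value), ms));

// Both timers start here, at creation time...
const p1 = delay(1000, "one");
const p2 = delay(1000, "two");

// ...so Promise.all merely waits for work that is already in flight.
// Total wait is about 1 second, not 2.
Promise.all([p1, p2]).then(console.log); // ["one", "two"]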

is there a convenient way to run an iterable sequentially?

If you already have your promises, you can't do much more than Promise.all([p1, p2, p3, …]) (which has no notion of sequence). But if you have an iterable of asynchronous functions, you can indeed run them sequentially. Basically you need to get from

[fn1, fn2, fn3, …]

to

fn1().then(fn2).then(fn3).then(…)

and the solution to do that is using Array::reduce:

iterable.reduce((p, fn) => p.then(fn), Promise.resolve())
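
For instance (the task functions below are invented for illustration), that one-liner chains an array of promise-returning functions so each one starts only after the previous settles:

// Hypothetical tasks; each element is a function returning a promise.
const fns = [
    () => Promise.resolve().then(() => console.log("task 1")),
    () => Promise.resolve().then(() => console.log("task 2")),
    () => Promise.resolve().then(() => console.log("task 3")),
];

fns.reduce((p, fn) => p.then(fn), Promise.resolve())
    .then(() => console.log("all done")); // task 1, task 2, task 3, all done
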
17
david_adler On

In parallel

// All fetches start immediately; await resolves once every one has finished.
await Promise.all(items.map(async (item) => {
  await fetchItem(item)
}))

Advantages: faster. All iterations are started, even if an earlier one fails later on; however, Promise.all "fails fast" and rejects on the first rejection. Use Promise.allSettled to let every iteration finish even if some throw. Technically, these are concurrent invocations, not truly parallel ones.
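
A quick sketch of the Promise.allSettled variant (fetchItem is the same stand-in as above):

// Unlike Promise.all, allSettled never rejects; it reports every outcome.
const results = await Promise.allSettled(items.map((item) => fetchItem(item)))

for (const result of results) {
  if (result.status === "fulfilled") {
    console.log("ok:", result.value)
  } else {
    console.error("failed:", result.reason)
  }
}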

In sequence

for (const item of items) {
  await fetchItem(item)
}

Advantages: Variables in the loop can be shared by each iteration. Behaves like normal imperative synchronous code.
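
For example, shared state carried across iterations (the size field is hypothetical, for illustration only):

// Each iteration can read and update shared variables, like sync code.
let totalBytes = 0
for (const item of items) {
  const result = await fetchItem(item)
  totalBytes += result.size // hypothetical field
}
console.log(`fetched ${totalBytes} bytes, strictly in order`)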

0
Nick Kotenberg On

I've been using for...of to run promises sequentially. I'm not sure if it helps here, but this is what I've been doing.

// arr and someQuery come from the surrounding code
async function run() {
    for (let val of arr) {
        const res = await someQuery(val)
        console.log(res)
    }
}

run().catch(console.error)
1
Deepak Sisodiya On

You can do it with a for loop.

An async function returns a promise:

async function createClient(client) {
    return await Client.create(client);
}

let clients = [client1, client2, client3];

If you write the following code, the clients are created in parallel:

// inside a generator function driven by a runner such as co
const createdClientsArray = yield Promise.all(clients.map((client) =>
    createClient(client)
));

But if you want to create the clients sequentially, you should use a for loop:

const createdClientsArray = [];
for(let i = 0; i < clients.length; i++) {
    const createdClient = yield createClient(clients[i]);
    createdClientsArray.push(createdClient);
}
0
mehrdad salehi On

See this sample of Promise.all working in parallel:

const { range, random, forEach, delay } = require("lodash");
const run = id => {
    console.log(`Start Task ${id}`);
    let prom = new Promise((resolve, reject) => {
        delay(() => {
            console.log(`Finish Task ${id}`);
            resolve(id);
        }, random(2000, 15000));
    });
    return prom;
}


const exec = () => {
    let proms = [];
    forEach(range(1, 10), (id) => {
        proms.push(run(id));
    });
    let allProms = Promise.all(proms);
    allProms.then(
        res => {
            forEach(res, v => console.log(v));
        }
    );
}

exec();
0
tkarls On

Bergi's answer got me on the right track using Array.reduce.

However, to actually get the functions returning my promises to execute one after another I had to add some more nesting.

My real use case is an array of files that I need to transfer in order one after another due to limits downstream...

Here is what I ended up with:

getAllFiles().then( (files) => {
    return files.reduce((p, theFile) => {
        return p.then(() => {
            return transferFile(theFile); //function returns a promise
        });
    }, Promise.resolve()).then(()=>{
        console.log("All files transferred");
    });
}).catch((error)=>{
    console.log(error);
});

As previous answers suggest, using:

getAllFiles().then( (files) => {
    return files.reduce((p, theFile) => {
        return p.then(transferFile(theFile));
    }, Promise.resolve()).then(()=>{
        console.log("All files transferred");
    });
}).catch((error)=>{
    console.log(error);
});

didn't wait for one transfer to complete before starting the next, and the "All files transferred" text appeared before even the first file transfer had started.

Not sure what I did wrong, but wanted to share what worked for me.

Edit: Since I wrote this post I now understand why the first version didn't work: then() expects a function returning a promise, so you should pass in the function name without parentheses! My function takes an argument, so I need to wrap it in an anonymous function that takes no arguments!
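
In other words, the difference boils down to passing a call's result versus passing a function:

// Wrong: transferFile runs immediately, and then() receives its returned
// promise instead of a function, so the chain does not wait for it.
p.then(transferFile(theFile));

// Right: then() receives a function, invoked only once p has resolved.
p.then(() => transferFile(theFile));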

0
Chintan Rajpara On

Parallel

See this example:

const resolveAfterTimeout = async i => {
  return new Promise(resolve => {
    console.log("CALLED");
    setTimeout(() => {
      // resolve() accepts a single value, so fold i into it
      resolve(`RESOLVED ${i}`);
    }, 5000);
  });
};

const call = async () => {
  const res = await Promise.all([
    resolveAfterTimeout(1),
    resolveAfterTimeout(2),
    resolveAfterTimeout(3),
    resolveAfterTimeout(4),
    resolveAfterTimeout(5),
    resolveAfterTimeout(6)
  ]);
  console.log({ res });
};

call();

Running the code logs "CALLED" for all six promises right away, and after the five-second timeout all six resolved responses are logged at the same time.

9
Mark On

You can also process an iterable sequentially by calling an asynchronous function recursively. For example, given an array a to process with the asynchronous function someAsyncFunction():

var a = [1, 2, 3, 4, 5, 6]

function someAsyncFunction(n) {
  return new Promise((resolve, reject) => {
    setTimeout(() => {
      console.log("someAsyncFunction: ", n)
      resolve(n)
    }, Math.random() * 1500)
  })
}

// You can run the array sequentially with:

function sequential(arr, index = 0) {
  if (index >= arr.length) return Promise.resolve()
  return someAsyncFunction(arr[index])
    .then(r => {
      console.log("got value: ", r)
      return sequential(arr, index + 1)
    })
}

sequential(a).then(() => console.log("done"))

0
Jay On

I stumbled across this page while trying to solve a problem in NodeJS: reassembly of file chunks. Basically: I have an array of filenames. I need to append all those files, in the correct order, to create one large file. I must do this asynchronously.

Node's 'fs' module does provide appendFileSync, but I didn't want to block the server during this operation. I wanted to use the fs.promises module and find a way to chain this stuff together. The examples on this page didn't quite work for me, because I actually needed two operations: fsPromises.readFile() to read in the file chunk, and fsPromises.appendFile() to concat it to the destination file. Maybe if I were better with JavaScript I could have made the previous answers work for me. ;-)

In the end I was able to hack together a working solution:

/**
 * sequentially append a list of files into a specified destination file
 */
exports.append_files = function (destinationFile, arrayOfFilenames) {
    return arrayOfFilenames.reduce((previousPromise, currentFile) => {
        return previousPromise.then(() => {
            return fsPromises.readFile(currentFile).then(fileContents => {
                return fsPromises.appendFile(destinationFile, fileContents);
            });
        });
    }, Promise.resolve());
};

And here's a jasmine unit test for it:

const fsPromises = require('fs').promises;
const path = require('path');
const fsUtils = require( ... );
const TEMPDIR = 'temp';

describe("test append_files", function() {
    it('append_files should work', async function() {
        try {
            // setup: create some files
            await fsPromises.mkdir(TEMPDIR);
            await fsPromises.writeFile(path.join(TEMPDIR, '1'), 'one');
            await fsPromises.writeFile(path.join(TEMPDIR, '2'), 'two');
            await fsPromises.writeFile(path.join(TEMPDIR, '3'), 'three');
            await fsPromises.writeFile(path.join(TEMPDIR, '4'), 'four');
            await fsPromises.writeFile(path.join(TEMPDIR, '5'), 'five');

            const filenameArray = [];
            for (var i=1; i < 6; i++) {
                filenameArray.push(path.join(TEMPDIR, i.toString()));
            }

            const DESTFILE = path.join(TEMPDIR, 'final');
            await fsUtils.append_files(DESTFILE, filenameArray);

            // confirm "final" file exists    
            const fsStat = await fsPromises.stat(DESTFILE);
            expect(fsStat.isFile()).toBeTruthy();

            // confirm content of the "final" file
            const expectedContent = Buffer.from('onetwothreefourfive', 'utf8');
            var fileContents = await fsPromises.readFile(DESTFILE);
            expect(fileContents).toEqual(expectedContent);

        }
        catch (err) {
            fail(err);
        }
    });
});
3
Adrien De Peretti On

NodeJS does not run promises in parallel; it runs them concurrently, since it has a single-threaded event-loop architecture. It is possible to run things in parallel by creating new child processes, to take advantage of multiple CPU cores.

Parallel vs. concurrent

In fact, what Promise.all does is stack the promises in the appropriate queue (see event-loop architecture), run them concurrently (call p1, p2, ...), then wait for each result and resolve the Promise.all with all the promises' results. Promise.all will fail at the first promise that fails, unless you manage the rejection yourself.

There is a major difference between parallel and concurrent: the first runs different computations in separate processes at exactly the same time, each progressing at its own rhythm, while the other executes the different computations one after another without waiting for the previous one to finish, so that all of them make progress at the same time without depending on each other.

Finally, to answer your question, Promise.all will execute neither in parallel nor sequentially but concurrently.
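
As a rough sketch of the "separate process" idea, here is a minimal worker_threads version (worker_threads rather than child_process, for brevity; the CPU-bound loop is invented for illustration):

const { Worker, isMainThread, parentPort } = require("worker_threads");

if (isMainThread) {
    // Each Worker runs this same file on its own thread (and CPU core).
    const tasks = [1, 2].map(() =>
        new Promise(resolve => {
            const w = new Worker(__filename);
            w.on("message", resolve);
        })
    );
    Promise.all(tasks).then(results => console.log(results));
} else {
    // CPU-bound work that would block the event loop on the main thread.
    let sum = 0;
    for (let i = 0; i < 1e9; i++) sum += i;
    parentPort.postMessage(sum);
}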

0
TimoSolo On

Just to elaborate on @Bergi's answer (which is very succinct, but tricky to understand ;)

This code will process each item in the array, adding the next "then chain" to the end:

function eachorder(prev, order) {
    return prev.then(function() {
        return get_order(order)
            .then(check_order)
            .then(update_order);
    });
}

orderArray.reduce(eachorder, Promise.resolve());
0
Ayan On

Using async/await, an array of promise-returning functions can easily be executed sequentially:

// each element must be a function that returns a promise when called
let a = [promiseFn1, promiseFn2, promiseFn3];

async function func() {
  for (let i = 0; i < a.length; i++) {
    await a[i]();
  }
}

func();

Note: In the above implementation, if a promise is rejected, the rest won't be executed. If you want all your functions to run regardless, wrap await a[i](); in a try...catch block, as sketched below.
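
A minimal sketch of that try...catch variant, reusing the array a from above:

async function funcAll() {
  for (let i = 0; i < a.length; i++) {
    try {
      await a[i]();
    } catch (err) {
      // log the rejection and keep going so the later functions still run
      console.error(`function ${i} failed:`, err);
    }
  }
}

funcAll();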

0
cestmoi On

Yes, you can chain an array of promise-returning functions as follows (this passes the result of each function to the next). You could of course edit it to pass the same argument (or no arguments) to each function.

function tester1(a) {
  return new Promise(function(done) {
    setTimeout(function() {
      done(a + 1);
    }, 1000);
  })
}

function tester2(a) {
  return new Promise(function(done) {
    setTimeout(function() {
      done(a * 5);
    }, 1000);
  })
}

function promise_chain(args, list, results) {

  return new Promise(function(done, errs) {
    var fn = list.shift();
    if (results === undefined) results = [];
    if (typeof fn === 'function') {
      fn(args).then(function(result) {
        results.push(result);
        console.log(result);
        promise_chain(result, list, results).then(done);
      }, errs);
    } else {
      done(results);
    }

  });
}

promise_chain(0, [tester1, tester2, tester1, tester2, tester2]).then(console.log.bind(console), console.error.bind(console));

1
Nithi On

Bergi's answer helped me make the calls sequential. I have added an example below where each function is called only after the previous one has completed:

function func1 (param1) {
    console.log("function1 : " + param1);
}
function func2 () {
    console.log("function2");
}
function func3 (param2, param3) {
    console.log("function3 : " + param2 + ", " + param3);
}

function func4 (param4) {
    console.log("function4 : " + param4);
}
const param4 = "Kate";

//adding 3 functions to array

const a = [
    () => func1("Hi"),
    () => func2(),
    () => func3("Lindsay", param4)
];

//adding 4th function

a.push(()=>func4("dad"));

//below does func1().then(func2).then(func3).then(func4)

a.reduce((p, fn) => p.then(fn), Promise.resolve());