Resolve promises one after another (i.e. in sequence)?

Consider the following code that reads an array of files in a serial/sequential manner. readFiles returns a promise, which is resolved only once all files have been read in sequence.
var readFile = function(file) {
  … // Returns a promise.
};

var readFiles = function(files) {
  return new Promise((resolve, reject) => {

    var readSequential = function(index) {
      if (index >= files.length) {
        resolve();
      } else {
        readFile(files[index]).then(function() {
          readSequential(index + 1);
        }).catch(reject);
      }
    };

    readSequential(0); // Start!

  });
};

The above code works, but I don’t like having to use recursion to make things happen sequentially. Is there a simpler way this code can be rewritten so that I don’t have to use my weird readSequential function?
Originally I tried to use Promise.all, but that caused all of the readFile calls to happen concurrently, which is not what I want:
var readFiles = function(files) {
  return Promise.all(files.map(function(file) {
    return readFile(file);
  }));
};

Solutions/Answers:

Solution 1:

Update 2017: I would use an async function if the environment supports it:

async function readFiles(files) {
  for (const file of files) {
    await readFile(file);
  }
}
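If you also need the contents, a minimal variant (assuming readFile resolves with the file’s contents) collects the results in order:

async function readFiles(files) {
  const contents = [];
  for (const file of files) {
    // Each read starts only after the previous one has resolved.
    contents.push(await readFile(file));
  }
  return contents; // Results in the same order as files.
}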

If you’d like, you can defer reading the files until you need them using an async generator (if your environment supports it):

async function* readFiles(files) {
  for (const file of files) {
    yield await readFile(file);
  }
}
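A consumer can then pull the results lazily with for await...of; a quick sketch (again assuming readFile resolves with the file’s contents):

async function printFiles(files) {
  // Nothing is read until the loop asks for the next value.
  for await (const contents of readFiles(files)) {
    console.log(contents);
  }
}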

Update: On second thought, I might use a loop instead:

var readFiles = function(files) {
  var p = Promise.resolve(); // Q() in q

  files.forEach(file =>
    p = p.then(() => readFile(file))
  );
  return p;
};

Or more compactly, with reduce:

var readFiles = function(files) {
  return files.reduce((p, file) => {
     return p.then(() => readFile(file));
  }, Promise.resolve()); // initial
};

In other promise libraries (like when.js and Bluebird) there are utility methods for this.


For example, Bluebird would be:

var Promise = require("bluebird");
var fs = Promise.promisifyAll(require("fs"));

// assuming files is an array of paths already in scope
var readAll = Promise.resolve(files).map(fs.readFileAsync, { concurrency: 1 });
// if the order matters, you can use Promise.each instead and omit the concurrency param

readAll.then(function(allFileContents) {
    // do stuff with the file contents
});
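For reference, the Promise.each variant mentioned in the comment above might look like this. Note that Bluebird’s Promise.each runs the iterator one item at a time and resolves with the original array, not the mapped results, so we collect the contents ourselves:

var contents = [];
Promise.each(files, function(file) {
    return fs.readFileAsync(file).then(function(data) {
        contents.push(data); // collect contents in order
    });
}).then(function() {
    // contents now holds each file's data, in order
});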

Although, really, there is no reason not to use async/await today.

Solution 2:

Here is how I prefer to run tasks in series.

function runSerial() {
    var that = this;
    // task1 is a function that returns a promise (and immediately starts executing)
    // task2 is a function that returns a promise (and immediately starts executing)
    return Promise.resolve()
        .then(function() {
            return that.task1();
        })
        .then(function() {
            return that.task2();
        })
        .then(function() {
            console.log(" ---- done ----");
        });
}

What about cases with more tasks? Like, 10?

function runSerial(tasks) {
  var result = Promise.resolve();
  tasks.forEach(task => {
    result = result.then(() => task());
  });
  return result;
}
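For example, assuming task1 through task3 are hypothetical functions that each return a promise:

runSerial([task1, task2, task3])
    .then(function() {
        console.log('all tasks finished, in order');
    });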

Solution 3:

This question is old, but we live in a world of ES6 and functional JavaScript, so let’s see how we can improve.

Because a promise starts executing the moment it is created, we can’t just build an array of promises; they would all fire off in parallel.

Instead, we need to create an array of functions that return a promise. Each function is then executed sequentially, which starts the promise inside it.

We can solve this a few ways, but my favorite way is to use reduce.

It gets a little tricky using reduce in combination with promises, so I have broken down the one liner into some smaller digestible bites below.


The essence of this function is to use reduce starting with an initial value of Promise.resolve([]), a promise that resolves to an empty array.

This promise is then passed into the reduce callback as promise. This is the key to chaining the promises together sequentially. The next promise to execute is produced by func, and when its then fires, the results are concatenated and that promise is returned, continuing the reduce cycle with the next promise-returning function.

Once all promises have executed, the returned promise resolves with an array of all the results, in order.

ES6 Example (one liner)

/*
 * serial executes Promises sequentially.
 * @param {Array} funcs An array of functions that return promises.
 * @example
 * const urls = ['/url1', '/url2', '/url3']
 * serial(urls.map(url => () => $.ajax(url)))
 *     .then(console.log.bind(console))
 */
const serial = funcs =>
    funcs.reduce((promise, func) =>
        promise.then(result => func().then(Array.prototype.concat.bind(result))), Promise.resolve([]))

ES6 Example (broken down)

// broken down for easier understanding

const concat = list => Array.prototype.concat.bind(list)
const promiseConcat = f => x => f().then(concat(x))
const promiseReduce = (acc, x) => acc.then(promiseConcat(x))
/*
 * serial executes Promises sequentially.
 * @param {Array} funcs An array of functions that return promises.
 * @example
 * const urls = ['/url1', '/url2', '/url3']
 * serial(urls.map(url => () => $.ajax(url)))
 *     .then(console.log.bind(console))
 */
const serial = funcs => funcs.reduce(promiseReduce, Promise.resolve([]))

Usage:

// first take your work
const urls = ['/url1', '/url2', '/url3', '/url4']

// next convert each item to a function that returns a promise
const funcs = urls.map(url => () => $.ajax(url))

// execute them serially
serial(funcs)
    .then(console.log.bind(console))

Solution 4:

To do this simply in ES6:

function processFiles(files) {

    // Create a new empty promise (don't do that with real people ;)
    var sequence = Promise.resolve();

    // Loop over each file, and add on a promise to the
    // end of the 'sequence' promise.
    files.forEach(function(file) {

      // Chain one computation onto the sequence
      sequence = sequence.then(function() {
        return performComputation(file);
      }).then(function(result) {
        doSomething(result); // Resolves for each file, one at a time.
      });

    });

    // This will resolve after the entire chain is resolved
    return sequence;
}
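For completeness, here is a sketch of how this might be wired up; performComputation and doSomething are not defined in the answer, so these stubs are purely illustrative placeholders:

// Hypothetical stubs for the helpers used above
function performComputation(file) {
    return Promise.resolve(file.length); // stand-in for real async work
}
function doSomething(result) {
    console.log(result);
}

processFiles(['a.txt', 'b.txt', 'c.txt']);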

Solution 5:

Simple util for standard Node.js promise:

function sequence(tasks, fn) {
    return tasks.reduce((promise, task) => promise.then(() => fn(task)), Promise.resolve());
}
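For example, assuming readFile returns a promise as in the question:

sequence(['1.txt', '2.txt', '3.txt'], readFile)
    .then(function() {
        console.log('all files read in sequence');
    });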

UPDATE


items-promise is a ready-to-use npm package that does the same.

Solution 6:

I’ve had to run a lot of sequential tasks and used these answers to forge a function that would take care of handling any sequential task…

function one_by_one(objects_array, iterator, callback) {
    var start_promise = objects_array.reduce(function (prom, object) {
        return prom.then(function () {
            return iterator(object);
        });
    }, Promise.resolve()); // initial
    if (callback) {
        start_promise.then(callback);
    } else {
        return start_promise;
    }
}

The function takes two required arguments plus an optional third. The first argument is the array we will work on. The second argument is the task itself, a function that returns a promise; the next task starts only when that promise resolves. The third argument is a callback to run when all tasks are done. If no callback is passed, the function returns the promise it created so we can handle completion ourselves.

Here’s an example of usage:

var filenames = ['1.jpg', '2.jpg', '3.jpg'];
var resize_task = function(filename) {
    // return promise of async resizing with filename
};
one_by_one(filenames, resize_task);

Hope it saves someone some time…