2016.10.04

JavaScript: Programming for Complex Asynchronous Use-Cases (Promises and Generators)

Callbacks

Callbacks are the basic pattern for doing things asynchronously in JavaScript. Since a task is asynchronous, we don’t know when it will finish. Callbacks solve this by saying “call this function when you have finished this asynchronous task”. A callback is simply a function passed as a parameter that gets executed later.

A node.js style callback is a pattern where the callback is passed as the last parameter to an asynchronous function. The callback’s first parameter is the reason for the error in the failure case, and its second parameter is the success value.

read-callback.1.js

var fs = require('fs')

var getFileContents = function(filename, callback) {
    setTimeout(() => {
        fs.readFile(filename, 'utf-8', (err, data) => {
            // for simplicity we ignore err here; a full node.js style
            // callback would be invoked as callback(err, data)
            callback(data);
        })
    }, Math.random() * 5000);
}

getFileContents('./files/file1', console.log)
This script shows how we would use the node.js callback pattern to perform a task asynchronously. Here we just read a file after a random delay of up to 5 seconds, but you could imagine instead an external DB call that we would like to make asynchronously. Because our callback specifies exactly where to continue after reading the file, JavaScript can perform other tasks while waiting for the read to finish.

But callbacks have limitations when we want to do anything complicated like multiple requests and combining the results. This is because the callback pattern is quite rigid and not composable. In this article we will discuss various strategies for programming for more complicated asynchronous use-cases. Read on for tips on avoiding what is commonly known as callback hell.

Promises

Promises are composable and chainable, and we can pass them around as values.

Perhaps the most commonly used Promise functions include:
  • Promise.then For handling the success case
  • Promise.catch For handling the failure case
  • Promise.all Takes multiple promises as input and creates a Promise that is resolved when all input Promises are resolved
ECMAScript 6 includes Promises natively, and there are many Promise libraries that implement the basic Promise functionality with similar interfaces.
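Before the file example, here is a minimal sketch of chaining and composition using only timers and plain values. The delayed helper is our own for illustration, not part of any library.

```javascript
// delayed(value, ms) is a hypothetical helper: a Promise that
// resolves with value after ms milliseconds.
var delayed = function(value, ms) {
    return new Promise((resolve) => {
        setTimeout(() => resolve(value), ms)
    })
}

// then() returns a new Promise, so calls chain; each handler
// receives the previous handler's return value.
delayed(1, 10)
    .then((n) => n + 1)                // 2
    .then((n) => delayed(n * 10, 10))  // a handler may return another Promise
    .then((n) => console.log(n))       // 20
    .catch(console.error)              // one catch covers the whole chain

// Promise.all resolves with an array in the same order as its inputs,
// regardless of which promise finishes first.
Promise.all([delayed('a', 30), delayed('b', 10)])
    .then((values) => console.log(values))  // [ 'a', 'b' ]
```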

Let’s look at how we would implement our file reader program with a Promise.

read-promises.1.js

var fs = require('fs')

var getFileContents = function(filename) {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            fs.readFile(filename, 'utf-8', (err, data) => {
                if (err) reject(err)
                else resolve(data)
            })
        }, Math.random() * 5000)
    })
}

getFileContents('./files/file1')
    .then(console.log)
    .catch(console.error)
read-promises.1.js does the same thing as read-callback.1.js, but getFileContents now returns a Promise. Rather than passing a callback to getFileContents to describe what should happen after reading the file, we call then on the promise, passing the success callback (console.log). We also pass console.error to catch to handle an error.
Our then callback receives the file contents as its parameter in the success case; this corresponds to the data passed to resolve inside getFileContents.
Our catch callback receives the cause of the error in the failure case; this corresponds to the err passed to reject.

For this use-case of reading only a single file, promises don’t really offer much advantage. The Promise example is actually a little more complicated, because instead of passing the callbacks directly to the getFileContents method we go through an intermediary Promise object. But hopefully you can see the similarities and differences between promises and callbacks.

Let’s look at a more complicated use-case, where the merits of using Promises should be obvious.

Take read-callback.2.js as an example of a more complicated use-case involving asynchronous logic.

read-callback.2.js

var fs = require('fs')

var getFileContents = function(filename, callback) {
    setTimeout(() => {
        fs.readFile(filename, 'utf-8', (err, data) => {
            callback(data);
        })
    }, Math.random() * 5000);
}

getFileContents('./files/file1', (contents) => {
    console.log(contents)
    getFileContents('./files/file2', (contents) => {
        console.log(contents)
        getFileContents('./files/file3', (contents) => {
            console.log(contents)
        })
    })
})
This reads the contents of 3 files and prints them in the order they are read. Again, each read happens after a random delay of up to 5 seconds.
Note the nested structure: each read sits inside the callback of the previous read. This means the reads run sequentially, not in parallel, so in the worst case we could wait 15 seconds (not counting file read time). We also do no error handling; if we add it inside the callbacks, our code gets much more difficult to understand.

read-promises.3.js

var fs = require('fs')

var getFileContents = function(filename) {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            fs.readFile(filename, 'utf-8', (err, data) => {
                if (err) reject(err)
                else resolve(data)
            })
        }, Math.random() * 5000)
    })
}

// with Promise.all we get parallel execution and nice looking code
Promise.all([
    getFileContents('./files/file1'),
    getFileContents('./files/file2'),
    getFileContents('./files/file3')
]).then((contents) => {
    // order of contents array is same as order of promises input
    contents.forEach(function(data) {
        console.log(data)
    })
}).catch(console.error)
Here’s where promises help. read-promises.3.js reads the contents of the same 3 files as read-callback.2.js using promises. The main thing it shows is Promise composability: informally speaking, Promises are composable because we can pass them around and combine them. Here we pass 3 promises to the Promise.all function, which runs our 3 promises in parallel. So we have a worst case time of 5 seconds, much better than 15 seconds! The contents parameter of the then callback is an array of the 3 files’ contents. The parameter of the catch callback is the cause of any error that occurs; if an error occurs, all other processing ceases.
In addition to running in parallel, the order of results in contents matches the order of the input list of Promises. And the code is clearly more readable than the callback version.

But what happens when we have task dependencies, where one task depends on the outcome of a previous asynchronous task? We can put the dependent task within the Promise.then of the original task. However, then we get back to something which starts to look like our spaghetti callback code, where we read 3 files, each within the callback of the last. Wouldn’t it be nice if we could just write synchronous looking code, where each subsequent line is not executed until the previous line has finished?

Generators

Generators give an asynchronous program the appearance of being synchronous.

The library we will use to run generators is co.

Let’s look at a way to turn a program written with Promises into a sequential looking program.

read-yield.4.js

var fs = require('fs')
var co = require('co')

var getFileContents = function(filename) {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            fs.readFile(filename, 'utf-8', (err, data) => {
                if (err) reject(err)
                else resolve(data)
            })
        }, Math.random() * 5000)
    })
}


co(function* () {
    try {
        // with yield [] we get parallel execution and nice looking code
        var contents = yield [
            getFileContents('./files/file1'),
            getFileContents('./files/file2'),
            getFileContents('./files/file3')
        ]

        // order of contents array is same as order of promises input
        contents.forEach(function(data) {
            console.log(data)
        })
    } catch(err) {
        console.error(err)
    }
})
read-yield.4.js introduces the yield keyword. The function* passed to co is a generator function, and co executes it, always returning a Promise; the value returned from within the generator becomes the resolved value of that Promise.
The yield in read-yield.4.js takes the resolved value from the promise (here, an array of promises) and delays execution of the next line until the asynchronous code has finished.
If the promise rejects, yield transforms the rejection into a thrown error, which gets caught in our catch block. So we have converted the then and catch of a Promise into sequential looking code within a try-catch block.

This gets more useful if you have complicated dependencies that need to be combined before doing something else. For example multiple datasource queries which determine execution of subsequent tasks. These datasource queries could all be yielded individually, or in parallel before continuing.

The co function allows us to use yield, but its code gets executed immediately. What if we want a reference to a function containing yields that can be executed later?

This is what co.wrap does. Whereas the generator passed to co is executed immediately, a co.wrapped generator is only executed when you call the wrapped function.

read-yield.5.js

var fs = require('fs')
var co = require('co')

var getFileContents = function(filename) {
    return new Promise((resolve, reject) => {
        setTimeout(() => {
            fs.readFile(filename, 'utf-8', (err, data) => {
                if (err) reject(err)
                else resolve(data)
            })
        }, Math.random() * 5000)
    })
}


var fileContentsPromise = co.wrap(function* (filenames) {
    let getFileContentsPromises = filenames.map((name) => getFileContents(name))
    console.log('3 promises:', getFileContentsPromises)
    // yielding an array of promises runs them in parallel
    let yieldedContents = yield getFileContentsPromises
    console.log('yielded contents of 3 files:', yieldedContents)
    return yieldedContents
})

fileContentsPromise(['./files/file1', './files/file2', './files/file3'])
    .then((contentsArray) => {
        console.log('contents of 3 files', contentsArray)

        // order of contents array is same as order of promises input
        contentsArray.forEach(function(data) {
            console.log(data)
        })
    }).catch((err) => {
        console.error(err)
    })
read-yield.5.js is an example of such a program.

The fileContentsPromise function is co.wrapped. When we call it with the 3 filenames, it returns a Promise that resolves to the contents of the 3 files.
We access the file contents within the then callback, which receives the resolved array of contents.
As in read-yield.4.js, the yield on the array of promises yields the contents of the 3 files, and order is preserved.

Summary

We started with callbacks as the basic way of performing asynchronous programming in JavaScript. We then looked at an alternative pattern using Promises, and showed how it reduces complexity and adds parallelism in a more complicated asynchronous use-case. We also saw that Promises themselves can get into a nested mess with complicated dependencies, and how generators can flatten out asynchronous Promise code to make it look synchronous. Finally we showed a more practical example of a generator wrapped with co.wrap. With these tools we should be able to write clean asynchronous JavaScript, no matter how complicated the use-case!
