Wednesday, February 25, 2015

High performance animations (a few tricks)

The web browser is capable of very smooth animations, but there are a few gotchas. Here is what you should know.

jQuery

With jQuery, animations went mainstream, but jQuery.animate is possibly the least efficient way to animate elements nowadays: it schedules frames with timers (setInterval) rather than requestAnimationFrame, and it typically animates layout properties such as top, left, width and height.
It can still be useful on old browsers or in specific cases, but it is better not to use it on mobile browsers.

CSS transitions and animations

Native animations usually run faster than JavaScript ones, so it is quite obvious to use them when possible. There are plenty of tutorials on how to use them, so google for it!
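
For example, a fade-in can be a one-line transition (a minimal sketch; the class names are made up):

.fade{
    opacity: 0;
    -webkit-transition: opacity 0.3s ease-out; /*old chrome and safari*/
    transition: opacity 0.3s ease-out;
}

.fade.visible{
    opacity: 1;
}

Toggling the "visible" class starts the transition; no JavaScript animation loop is needed.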

Composite layer css properties

When you change a CSS property you trigger some operations in the browser. These are:

  • recalculating sizes and positions (layout)
  • redrawing elements on the screen (paint)
  • composite all elements together (composite)

Each operation triggers the ones below it: a layout forces a paint and a composite, while a paint forces only a composite. Furthermore, the more elements are involved, the worse the animation performs.
You can visualize and debug this process in the useful timeline panel (inside Chrome developer tools).
The trick here is to use CSS properties that trigger only the composite step: opacity and transform. This article contains what you need to know. For example, moving an element with transform instead of left skips layout and paint entirely.
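
A sketch of the difference (the class names are made up):

/* animating 'left' triggers layout on every frame */
.slide-slow{
    position: relative;
    transition: left 0.3s;
}

/* animating 'transform' can be handled by the compositor alone */
.slide-fast{
    transition: transform 0.3s;
}
.slide-fast.moved{
    transform: translateX(100px);
}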
Sadly, using these properties alone is not enough to get the performance jump: you should also trigger the creation of a new composite layer with these CSS rules:

.will-change{
    transform: translate3d(0, 0, 0);
    perspective: 1000px;
    backface-visibility: hidden;
}

For better support you can add these vendor prefixes:

.will-change{
    -webkit-transform: translate3d(0, 0, 0); /*old chrome and safari*/
    -o-transform: translate3d(0, 0, 0); /*old opera*/
    -moz-transform: translate3d(0, 0, 0); /*old FF*/
    -ms-transform: translate3d(0, 0, 0); /*IE9*/
    transform: translate3d(0, 0, 0);
    -webkit-perspective: 1000px;
    -o-perspective: 1000px; /*old opera*/
    perspective: 1000px;
    -webkit-backface-visibility: hidden;
    -o-backface-visibility: hidden; /*old opera*/
    backface-visibility: hidden;
}

With these rules in place, opacity and transform are managed by the GPU when possible.
This hack can be a bit awkward, and for this reason browser vendors have created a new CSS property: will-change.
It lets the browser know that an element is going to change, so it can be promoted to its own compositing layer (this is not yet widely available, so sadly, for now, it is better to stick with the hack above).
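
For reference, this is the will-change syntax (a minimal sketch; check browser support before relying on it):

.will-change{
    will-change: transform, opacity;
}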

Request Animation Frame


JavaScript animations work by changing a numeric CSS property over time. For a long time the only way to schedule this in JS was setTimeout (and its sibling setInterval); as I mentioned, jQuery still uses them.
A while ago browser vendors introduced "requestAnimationFrame". It is a much better way to do it, as it executes a piece of code right before the next page repaint (approximately 60 times a second).
This is a polyfill for any browser (in the worst case it falls back to setTimeout).
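The idea behind such a polyfill is roughly this (a simplified sketch, not the full linked code):

window.requestAnimationFrame = window.requestAnimationFrame ||
    window.webkitRequestAnimationFrame ||
    window.mozRequestAnimationFrame ||
    function (callback){
        // no native support: aim for roughly 60 frames per second
        return window.setTimeout(function (){
            callback(Date.now());
        }, 1000 / 60);
    };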

If your animation depends on user interactions, it can be a good idea to throttle the changes to the CSS using requestAnimationFrame. In this example I am using a queue with a simple policy that keeps only the last function, discarding the others.

// policy: keep only the most recent function, discarding the others
function lastOne(q){
    return q.length ? [q.pop()] : [];
}

function getRenderingQueue(policy){
    var queue = [];
    var isRunning = false;
    policy = policy || function (q){return q;}; // default policy: run everything

    // executed once per frame by requestAnimationFrame
    var render = function (){
        var f;
        queue = policy(queue); // let the policy filter the queued functions
        isRunning = false; // functions pushed from here on schedule a new frame
        while (f = queue.shift()){
            f();
        }
    };
    return {
        empty: function (){
            queue = [];
        },
        push: function (func){
            queue.push(func);
            // schedule at most one render per frame
            if (!isRunning){
                isRunning = true;
                window.requestAnimationFrame(render);
            }
        }
    };
}


var renderingQueue = getRenderingQueue(lastOne);

renderingQueue.push(function (){
    //changing a piece of CSS
});


Depending on your application you can decide to use a different policy.
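
For example, a hypothetical policy that keeps only the oldest queued function (the name firstOne is made up for this sketch):

// keep only the first queued function, discarding the others
function firstOne(q){
    return q.length ? [q.shift()] : [];
}

var renderingQueue = getRenderingQueue(firstOne);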

This is it, I'll soon put this in context.

Friday, January 30, 2015

urlwalker.js, a context aware router for express

Urlwalker.js is a router middleware for Connect/Express.js.

It is not a substitute for the default Express.js router: it works together with the latter, trying to get an object from a fragment of URL (it literally walks the URL segment by segment, hence the name). You can then use this object as the model in the function called by the default Express.js router.
This process is called URL traversal. The concept is not by any means original: I took the inspiration from other web frameworks such as Zope and Pyramid.

Confused? Let's take a step back.

URL, model and view

Using REST principles, it seems natural to map a URL to a hierarchy of objects:

  • http://www.example.com/roald_dahl/the_chocolate_factory

This URL represents a relation between two objects: the author (roald_dahl) and one of his books (the_chocolate_factory). The latter is the model used by the function. Let's put this together using express.js:

app.get("/:author/:book", function (req, res){
    // getting the book object
    // doing something with the object
    // return the result
});

The plain Express.js way to get the model is to do it directly inside the function (like the previous example) or (better) using app.param.
But this is not flexible enough to manage an arbitrarily deep nested structure. Furthermore, I believe it can be useful to split the URL in two parts: the first part gets an object and the second one transforms it into a representation:

  • http://www.example.com/roald_dahl/the_chocolate_factory/index.json
  • http://www.example.com/roald_dahl/the_chocolate_factory/index.html

Both of these URLs point to the same object but return different representations of it.
Urlwalker.js follows this convention.

How to use it


The middleware is initialized with a function and a "root" object.

var traversal = require('urlwalkerjs').traversal;
var traversal_middleware = traversal(function (obj, segment, cb){
    return cb({ ... new obj ... }); // continue walking with the next object
    // or
    return cb(); // end of traversal
}, root_object);

Then you can use it as a normal middleware and add the regular routing:

app.use(traversal_middleware);

app.get('index.json', function(req, res) {
  res.send(req.context);
});

The routing process starts with an object. I call it the "root object" and it is the second argument passed to the middleware. It can be anything, even undefined.
The function (the first argument of the middleware) is invoked for every URL segment. The first time it is invoked with the first segment and the root object, and it returns an object. The second time it is called with the second segment and the previously returned object. The process is repeated until no match is found; then the last object is returned in "req.context" and control passes to the next middleware.
For this URL:

  • http://www.example.com/roald_dahl/the_chocolate_factory/index.json

The function is invoked twice:

  • from the root object and the segment "roald_dahl" I get an author object
  • from the author object and "the_chocolate_factory" I get a book object

Then the express.js function is called with the book object inside req.context.
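For example, a minimal traversal function over a plain nested object could look like this (a sketch; the shape of root_object is assumed):

var root_object = {
    roald_dahl: {
        the_chocolate_factory: { title: "Charlie and the Chocolate Factory" }
    }
};

var traversal_middleware = traversal(function (obj, segment, cb){
    if (obj && obj[segment]) return cb(obj[segment]); // walk one level down
    return cb(); // no match: end of traversal
}, root_object);
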
To clarify the process I have added an example here.

An example with occamsrazor.js


Defining this function with such a complex behaviour can be difficult and not very flexible.
For this reason you can use occamsrazor.js to dynamically add new behaviours to the function (see example 2).
So it becomes:

var getObject = occamsrazor();
var has_authors = occamsrazor.validator().has("authors");
var has_books = occamsrazor.validator().has("books");

getObject.add(null, function (obj, id, cb){
    return cb(); // this matches when nothing else does
});

getObject.add(has_authors, function (obj, id, cb){
    return cb(obj.authors[id]);
});

getObject.add(has_books, function (obj, id, cb){
    return cb(obj.books[id]);
});

var traversal_middleware = traversal(getObject, data);
app.use(traversal_middleware);
app.get('index.json', function(req, res) {
  res.send(req.context);
});

It might seem a bit cumbersome at the beginning, until you realize how easily you can extend the behaviour:

var has_year = occamsrazor.validator().has("year");

getObject.add(has_year, function (obj, id, cb){
    return cb(obj.year[id]);
});

Plugin all the things

But why stop here? Why not get the view with a similar mechanism (example 3)? Let's replace the Express.js routing completely with this:

...
var view = require('urlwalkerjs').view;
var getView = occamsrazor();

var view_middleware = view(getView);

getView.add(null, function (url, method, context, req, res, next){
    next(); // this matches when nothing else does
});

getView.add(["/index", "GET", has_books], function (url, method, context, req, res, next){
  res.send('this is the author name: ' + req.context.name);
});

getView.add(["/index", "GET", has_authors], function (url, method, context, req, res, next){
  res.send('these are the authors available: ' + Object.keys(req.context.authors));
});

getView.add(["/index", "GET", has_title], function (url, method, context, req, res, next){
  res.send('Book: ' + req.context.title + " - " + req.context.year);
});

app.use(view_middleware);

A plugin architecture is very helpful, even if you don't think you need plugins at all: it allows you to apply the open/closed principle and extend your application safely.

Thursday, January 15, 2015

Why I have stopped using requirejs (and you should too)

I have used requirejs extensively and I have written many posts about it. I think it is very ingenious and well designed.
It tries to solve more than one problem at the same time (in a very elegant way) but nowadays these problems are not so important and they have better solutions.

Loading scripts asynchronously

This was one of the main selling points of requirejs in the past. Now it is not necessary anymore: it is much better to move the script tags (back) to the top and use the async attribute (<script src="app.js" async></script>) as described by this great article. The async attribute is now very well supported!

Loading dependencies

Requirejs can dynamically load dependencies when they are required, but often you want to be in control. Sometimes it is better to include a library when you load the page (bundling more than one library together, so they are saved in the cache) and sometimes you want to load it on demand. In that case it is very easy to do something like:

// dynamically injected scripts do not block parsing
// and execute as soon as they are downloaded
var script = document.createElement('script');
script.src = "http://www.example.com/script.js";
document.head.appendChild(script);

Isolate dependencies

Requirejs is even able to run two different versions of the same library. But it is a feature rarely used, and to be honest, in 99.9% of cases using the module pattern is more than enough.
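
As a reminder, here is a minimal sketch of the module pattern (the counter is a made-up example):

var counter = (function (){
    var count = 0; // private state, invisible from the outside
    return {
        increment: function (){ return ++count; },
        value: function (){ return count; }
    };
}());

counter.increment();
counter.value(); // 1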

The only issue

The only issue with having all these asynchronous bundles (using the async attribute) is managing the execution order. You can use a tiny library like this one:

(function (w){

var go = {}, wait = {};

// run func immediately if the bundle dep has already executed,
// otherwise queue it until later.go(dep) is called
w.later = function (dep, func){
    if (go[dep]) func();
    else {
        wait[dep] = wait[dep] || [];
        wait[dep].push(func);
    }
};

// mark the bundle dep as executed and run all the queued functions
w.later.go = function (dep){
    var funcs = wait[dep] || [], l = funcs.length;
    delete wait[dep];
    go[dep] = true;
    for (var i = 0; i < l; i++){
        try{
            funcs[i]();
        }
        catch (e){
            // avoid a ReferenceError where the console object is missing
            w.console && w.console.error(e);
        }
    }
};
}(window));

This could be the only JS to be loaded synchronously. For maximum performance you can also minify it and inline it in the HTML.
Then you can manage the dependencies at execution time:

later('foo', function (){
   // waiting for the bundle named foo (it is an arbitrary string)
});

You only need to put this instruction at the end of the bundle "foo":

later.go("foo");

There are still valid use cases for requirejs, but I suggest keeping your build process lean: tweak performance by hand, use the async attribute and the module pattern.
Simpler and more performant!

Edited: and what about "defer"?

This article suggests using async and defer together to improve performance on older browsers. I suggest not doing that, unless you know what your script is doing, because of this bug. The bug is even worse than it seems: if you inject a script tag inside a deferred script, the execution will stop, waiting for the injected script to be downloaded and executed. So be careful!