The basic idea is that for many public-facing websites (think NYTimes, natch, or Airbnb's search result listings, for example), the usual Rails convention of "Hey, here comes a request, let me generate a page just for you" is fairly inappropriate. Lots of "publishing" applications will melt very quickly if Rails ever ends up serving dynamic requests. Instead, you cache everything: either on disk with Nginx, in Memcached, or in Varnish.
But you know when the data is changing -- when an article has been updated and republished ... or when you've done another load of the government dataset that's powering your visualization. Waiting for a user request to come in and then caching your response to that (while hoping that the thundering herd doesn't knock you over first) is backwards, right?
I think it would be fun to play around with a Node-based framework built around this inverted publishing model, instead of the usual serving one. The default would be to bake out static resources when data changes, and you'd want to automatically track all of the data flows and dependencies within the application. So when your user submits a change, or your cron picks up new data from the FEC, or when your editor hits "publish", all of the bits that need to be updated get generated right then.
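To make the idea concrete, here's a toy sketch of what that dependency tracking might look like. Everything here (the `definePage`/`publish` names, the in-memory `baked` store standing in for disk or a CDN) is hypothetical, not any existing framework's API:

```javascript
// Toy sketch of the "bake on change" model: each page declares which data
// sources it depends on, and publishing a change to a source re-renders
// only the pages that depend on it. All names are made up for illustration.

const pages = {}; // pageId -> { deps: [sourceId, ...], render: fn }
const data = {};  // sourceId -> current value
const baked = {}; // pageId -> rendered output (stands in for disk / Varnish / CDN)

function definePage(id, deps, render) {
  pages[id] = { deps, render };
}

// The inverted step: instead of caching on request, re-bake when data changes.
function publish(sourceId, value) {
  data[sourceId] = value;
  for (const [id, page] of Object.entries(pages)) {
    if (page.deps.includes(sourceId)) {
      // In a real system this would write a static file or purge a cache.
      baked[id] = page.render(data);
    }
  }
}

definePage('/articles/1', ['article:1'], (d) => `<h1>${d['article:1'].title}</h1>`);
definePage('/front-page', ['article:1', 'article:2'],
  (d) => `${Object.keys(d).length} article(s) published`);

publish('article:1', { title: 'Hello' });
// baked['/articles/1'] is now '<h1>Hello</h1>', and '/front-page'
// was re-baked in the same pass because it also depends on article:1.
```

The hard part in practice isn't this loop, it's inferring the `deps` arrays automatically instead of declaring them by hand.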
It's only a small step from there to pushing down the updates to Backbone models for active users ... but one step at a time, right? No need to couple those things together.
ps. Kudos to you for reading the source. It's always enlightening: https://github.com/angular/angular.js/blob/master/src/ng/roo...