https://developer.mozilla.org/en-US/docs/DOM/Mutation_events
"Adding DOM mutation listeners to a document profoundly degrades the performance of further DOM modifications to that document (making them 1.5 - 7 times slower!). Moreover, removing the listeners does not reverse the damage."
Using the modern MutationObserver API may be a better solution.
http://updates.html5rocks.com/2012/02/Detect-DOM-changes-wit...
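As a minimal sketch of the MutationObserver approach (the id "late-widget" and the helper name are made up for illustration), the pure helper scans mutation records for a newly added node, and the observer wiring only runs in a browser:

```javascript
// Pure helper: scan MutationRecord-like objects for a directly added node
// with a matching id. (Descendants of added subtrees are not checked here.)
function findAddedById(records, id) {
  for (var i = 0; i < records.length; i++) {
    var added = records[i].addedNodes || [];
    for (var j = 0; j < added.length; j++) {
      if (added[j].id === id) return added[j];
    }
  }
  return null;
}

// Browser-only wiring; "late-widget" is a hypothetical id.
if (typeof document !== 'undefined' && typeof MutationObserver !== 'undefined') {
  var observer = new MutationObserver(function (records, obs) {
    var el = findAddedById(records, 'late-widget');
    if (el) {
      obs.disconnect(); // unlike mutation events, this fully detaches
      console.log('element appeared:', el);
    }
  });
  observer.observe(document.documentElement, { childList: true, subtree: true });
}
```

Unlike DOMNodeInserted and friends, disconnecting an observer leaves no lingering performance cost.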
Just as an example, you can initialize your Backbone models and start sending out XHRs without worrying about the DOM.
I worry about the idea of waiting for late-appearing elements when you're selecting on an id. That feature is tricky to use correctly because ids are globally unique: to handle ids sanely in flexible code, you have to generate them automatically (using, say, Underscore's uniqueId), then pass them around so your code is all talking about the same element. Yuck. At that point, we might as well pass around callbacks.
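The generated-id dance looks something like this (a minimal stand-in for Underscore's `_.uniqueId`; the variable names are hypothetical):

```javascript
// Minimal stand-in for Underscore's _.uniqueId: a prefixed global counter.
var idCounter = 0;
function uniqueId(prefix) {
  idCounter += 1;
  return (prefix || '') + idCounter;
}

// The generated id then has to travel with every piece of code that
// cares about the element, which is the awkward part:
var widgetId = uniqueId('widget_'); // "widget_1"
// ...create the element with id widgetId, then hand widgetId to
// whatever code is waiting for that element to exist.
```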
So you could solve most of that if you let waitUntilExists work for class selectors too. Then the user would just refer to the element by its context, passing "itself" as the last parameter.
Which brings me to my last point of feedback: why "itself"? Seems to me that it would make more sense to just have the element be the default context if no last parameter is provided (or if it's undefined).
I'd much rather keep the DOM small, so the ready event can fire quickly, than do something like this.
For an extension I wrote that adds interface elements to the Facebook timeline, I needed a custom solution to monitor when parts of the DOM were changing, since Facebook implements infinite scroll. In this case I couldn't use mutation events even if I wanted to, because of the scoping of Chrome extensions.
Since the Facebook implementation does a lot of work to render infinite scroll quickly, I needed something fast and lightweight. For this I chose a pattern that locates new DOM elements on an interval, appends a class, and then triggers a new event, which looks up that class (which is indexed by the browser) and runs a forEach over only the new elements.
I found that setting the interval to 100ms was a good fit on Facebook, as that's roughly the amount of time a user takes to notice a change. I could have set it lower, to 50ms, 20ms, or even lower, but I found that it caused too much blocking. In many cases the DOM and JavaScript both need access to the thread, and if you run timers at too small an interval, like the 5ms in this library, you are likely to block the DOM from rendering quickly. Also, it's unlikely that the browser will actually fire anything with 5ms precision.
If you are going to follow a pattern like this, I would advise using something in the 25ms+ range for your interval, and also making sure that your interval script does not invalidate layout. On a second async call you can then safely alter each element, which should ensure you don't have any blocking behavior.
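A sketch of that poll, mark, and dispatch pattern (the selector, the "seen" class, and all function names are made up; the real extension's code surely differs):

```javascript
// Never poll faster than 25ms, per the advice above; default to 100ms.
function clampInterval(ms) {
  return Math.max(25, ms || 100);
}

// Poll for unmarked elements, mark them, then hand them off on a
// later turn so the polling tick itself stays cheap.
function watchForNew(selector, onNew, intervalMs) {
  return setInterval(function () {
    // :not(.seen) lets the browser's class index skip processed nodes.
    var fresh = document.querySelectorAll(selector + ':not(.seen)');
    if (!fresh.length) return;
    for (var i = 0; i < fresh.length; i++) fresh[i].classList.add('seen');
    // Second async call: alter/notify later to avoid layout invalidation
    // inside the polling callback.
    setTimeout(function () { onNew(fresh); }, 0);
  }, clampInterval(intervalMs));
}

// Browser usage: watchForNew('.timeline-item', handleBatch, 100);
```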
Basically, this is what makes it so great: everything you stick inside its brackets is ready to go at the earliest possible moment, as soon as the DOM is registered by the browser.
Waiting on the window's load event is actually slower than waiting on the DOM's ready event.
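In other words, prefer DOMContentLoaded (or a readyState check) over window's load event, which also waits for images, stylesheets, and iframes. A minimal ready helper, with the document passed in so it can be exercised outside a browser (the function name is illustrative):

```javascript
// Run cb as soon as the DOM is parsed; don't wait for window "load",
// which blocks on images, stylesheets, and iframes.
function onDomReady(doc, cb) {
  if (doc.readyState !== 'loading') {
    cb(); // document already parsed
  } else {
    doc.addEventListener('DOMContentLoaded', cb);
  }
}

// Browser usage: onDomReady(document, function () { /* ... */ });
```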
Uses CSS3 Animation if available, otherwise falls back to DOMNodeInserted.
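The CSS-animation trick can be sketched like this (the keyframe and class names are illustrative): each newly inserted matching element starts an imperceptible animation, and the resulting animationstart event identifies it without any mutation listeners.

```javascript
// Pure predicate: is this event our insertion-detection animation?
// Testable without a browser.
function isInsertionEvent(e) {
  return e.animationName === 'node-inserted';
}

// Browser-only wiring.
if (typeof document !== 'undefined') {
  // Inject a keyframe animation with no visible effect.
  var style = document.createElement('style');
  style.textContent =
    '@keyframes node-inserted { from { opacity: 0.999; } to { opacity: 1; } }\n' +
    '.watched { animation: node-inserted 0.001s; }';
  document.head.appendChild(style);

  // Every element inserted with class "watched" fires animationstart once.
  document.addEventListener('animationstart', function (e) {
    if (isInsertionEvent(e)) {
      console.log('inserted:', e.target);
    }
  });
}
```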
    var intID = setInterval(function () {
        if ($("#myDOMElement").length || checkForTimeOut()) { // assumes myDOMElement is an id
            clearInterval(intID);
            doSomething();
        }
    }, 100);

...if it starts lagging, just increase the interval time (that last integer).