Using setTimeout to Solve DOMContentLoaded

Stuart Colville pondered the potential for using setTimeout to solve the DOMContentLoaded problem in browsers that don't support it, namely anything but Opera and Mozilla.

His code is deceptively simple:

function DOMReady(f){
  // The regex matches Gecko (a UA starting with "Mozilla" that never
  // mentions "compatible" or "WebKit") or any UA containing "Opera"
  if (/(?!.*?compatible|.*?webkit)^mozilla|opera/i.test(navigator.userAgent)){ // Feeling dirty yet?
    document.addEventListener("DOMContentLoaded", f, false);
  }else{
    window.setTimeout(f,0);
  }
}

Then, to make use of this, simply call DOMReady, passing in the function you want to execute when the page has loaded.
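
For example (a trivial sketch; the alert is just a stand-in for real work):

DOMReady(function(){
  alert("The DOM should be ready now.");
});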

My initial thought was, "Whhaaaat?" It seemed so simple. It turns out to be less so: it will fail under a number of circumstances, but I'll get to why that is and how we might be able to mitigate it.

The first thing I tried to do was go large with the HTML. So, I threw together a simple test case: a 200k file. It still worked. I then tried a 500k file and it failed. Then I tried using Charles to restrict download speed, but even at the slowest speeds (14.4k) the test page worked just fine. At this point, I was thinking that as long as a file isn't 500k, this is a perfectly valid option.

Then I started throwing together other scenarios and realized that IE was waiting until the entire page was received before rendering the page. I was initially baffled since IE should support progressive rendering. That's when I noticed that Gzip compression was enabled. IE was downloading the entire response before it could decompress and display the page. Turning off Gzip meant that IE could progressively render the content, firing off the setTimeout well before the elements had loaded on the page (and obviously generating an error message we don't want).
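
To picture the failure, imagine a script near the top of a large, progressively rendered page (a contrived sketch; "bottom" is a made-up id for an element much further down the document):

<script type="text/javascript">
DOMReady(function(){
  // if the parser hasn't reached #bottom yet, getElementById
  // returns null and this line throws the error described above
  document.getElementById("bottom").style.display = "none";
});
</script>
<!-- ...hundreds of kilobytes of markup... -->
<div id="bottom"></div>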

Why did the 200k file work and the 500k file not? Apache's mod_deflate (the module commonly used for gzip'ing) has a maximum file size, normally set at 500k. My file had been just slightly larger, preventing it from being compressed and thus generating the error.

Could this work?

If you could guarantee that the file would be sent via gzip compression every time then yes, using setTimeout could potentially be a viable way to mimic DOMContentLoaded. In fact, you could forgo using DOMContentLoaded at all and simply rely on setTimeout for all browsers. window.setTimeout could be the new window.onload.
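
Boiled down, that version would be nothing more than this (a sketch, assuming gzip'd delivery every time):

function DOMReady(f){
  // the page arrives (and is parsed) as a single chunk,
  // so anything queued here runs after the DOM is built
  window.setTimeout(f, 0);
}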

I say potentially because, like Stuart, I haven't put this concept through its paces and there may be cases where this doesn't work but it'd be interesting to explore further.

Published February 14, 2008
Categorized as JavaScript
Short URL: https://snook.ca/s/877

Conversation

17 Comments
Bryan said on February 14, 2008

[load the 15k jQuery library first]


<script type="text/javascript">
/*<![CDATA[*/
$(document).ready(function(){

// Do stuff

});
/*]]>*/
</script>

Done.

Jonathan Snook said on February 14, 2008

@Bryan: I can't tell if you're being serious or facetious.

Andrew Dupont said on February 14, 2008

@Jonathan: Perhaps his point is that emulating DOMContentLoaded in IE has been "solved" by all the major libraries already. The solutions are hacky, but battle-tested.

Jonathan Snook said on February 14, 2008

@Andrew Dupont: Certainly, but I'd hate to think that we should just rest on our laurels since the problem is already "solved". If all I have to do is serve gzip and it works in all major browsers, I'd rather just use window.setTimeout than a 15k library (which I'll point out is only 15k when gzipped).

Without gzip, I can't use setTimeout; I have to use a 52k library (that's jQuery minified but not gzipped) to pull it off. (I know, the domready portion of jQuery is only about 1.6k, but Bryan used the full jQuery library as his example.)

Is window.setTimeout a practical option? Can we expect gzip to always be available in current browsers?

Bramus! said on February 15, 2008

Heh, this is something I've used before. Didn't know it was that big of an issue; thought it was my IE flipping out once more...

Jake Archibald said on February 15, 2008

setTimeout is very happy to run before the DOM is ready; it'll just queue the function behind whatever the browser's currently doing. When the whole page arrives in the browser at once, the function is conveniently queued after the DOM is ready.

Obviously the page isn't all arriving at once, but gzipping gives that effect. The parser gets the page as one big complete chunk.

Current IE DOMReady solutions use setTimeout to test properties of an element which throw an error before the DOM is ready. When an error isn't thrown, it is assumed the DOM is ready.
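
Roughly, in code (a sketch; the function name is made up, and doScroll is one property known to throw in IE before the document is ready):

function IEDOMReady(f){
  try {
    // throws in IE until the document is ready for scripting
    document.documentElement.doScroll("left");
  } catch(e) {
    window.setTimeout(function(){ IEDOMReady(f); }, 10);
    return;
  }
  f();
}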

Jake.

Remy said on February 15, 2008

I'm running gzip on my server and have set up a simple test. The culprit behind slow-loading pages in my experience hasn't been so much the user's connection or the size of the page, but often third parties; in particular, externally loaded scripts, for example for ad serving.

My test simulates a slow response from a third party script provider and, sadly, the setTimeout fires before the DOM is ready.

http://remysharp.com/demo/onload.php

Essentially the setTimeout will work, so long as a) you're running gzip, and b) you're confident you're not serving any external scripts (that might be slow in responding).

Andrew Dupont said on February 15, 2008

"I'd rather just use window.setTimeout than a 15k library (which I'll point out is only 15k when gzipped)."

Well, if all you need is the DOMContentLoaded stuff, you can gank that pretty easily. It'd cost you only ~15 lines of JavaScript (not including the obligatory MIT license at the top, of course).

My problems with the setTimeout approach are twofold. First: to know if it'll work, you have to keep track of whether a seemingly unrelated and arbitrary switch (gzip on/off) is in the proper position. I usually try to minimize the number of tiny "be sure to..." maxims when I develop, because some law of the universe dictates that I'll forget about one of them just in time to spend 30 minutes debugging some simple piece of code.

Second: it's no guarantee that the DOM is ready – only that a function got called after having been deferred. An approach that demonstrates the DOM is ready (e.g., "the DOM must be ready because I can now read this DHTML property without an exception being thrown") seems far sturdier.

Oh, I just thought of a third: to require gzip compression means that pages can't be tested locally without running a web server. Can't just drag an HTML file into a browser window, because then all your DOMContentLoaded handlers will fire too early.

Jonathan Snook said on February 15, 2008

@Remy: thanks for that.

@Andrew: good point about local development. I actually use a local web server so it's not something that instantly came to mind. And thanks for the well thought out counterpoints.

Diego Perini said on February 15, 2008

I want to add that you may not see the problem until you hit a long enough page, due to buffering on the server. In Apache/PHP, output buffering is set by default to buffer up to 4KB. The problem shows up more often when pages are served chunked ("Transfer-Encoding: chunked"); dynamic pages are normally served that way.

You have two options to show the problem:

1) Be sure your page is at least three times the output buffer setting (12KB)

and/or

2) Insert some delay in a PHP script and force a flush() on the output buffer

Local cache can also hide problems, so disable it during testing, then enable it again and repeat the same tests. And you are right, even if we have a solution there is no time to rest on our laurels... IE8 is coming :-)

Tim McCormack said on February 16, 2008

I agree with Andrew's first two warnings about the technique.

That said, it's important to keep on innovating. It's still a very valid observation, and one that might be usable at some later date, perhaps with only a little modification.

Brendon Kozlowski said on February 18, 2008

Jonathan, I'm a little curious about your mention of "Charles". I think I've been looking for a tool like that for a while now, but have yet to find one that seems to work when testing locally. Can you elaborate on this software at all, and whether it works when accessing a localhost web server?

Matthias Willerich said on February 19, 2008

Brendon: Charles can be found at charlesproxy.com.
I've been using it for almost 3 years, and it just gets better and better: formatting XML responses, displaying images, reading AMF and so on. It's gold when developing Ajax or Flash-related server calls.

I like this solution, though I agree with the comments saying the current approach is too specific to be used without thinking or testing. But maybe it's a good new starting point. Since it fires once the DOM is ready, if not earlier, you'd have to check whether the page is complete or not. I can't come up with anything creative right now; possibly checking for some element at the bottom, in a loop, or something? Sounds terrible, I know.
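
Something like this, perhaps (a rough sketch; "page-end" would be a marker element you put just before the closing body tag):

function DOMReadyPoll(f){
  // re-queue until the marker at the bottom of the page has been parsed
  if (document.getElementById("page-end")) {
    f();
  } else {
    window.setTimeout(function(){ DOMReadyPoll(f); }, 10);
  }
}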

Jakob Heuser said on February 20, 2008

Well, IE has a few properties that could be useful. For example, the activeElement property doesn't exist until the DOM is "ready" according to IE docs. We could just put f back onto the queue if the DOM doesn't seem ready in a progressive load under IE.


function DOMReady(f) {
  if (/(?!.*?compatible|.*?webkit)^mozilla|opera/i.test(navigator.userAgent)){ // Feeling dirty yet?
    document.addEventListener("DOMContentLoaded", f, false);
  }
  /*@cc_on @*/
  /*@if (@_win32)
  else if (document.activeElement === null) {
    window.setTimeout(function() { DOMReady(f); }, 10); // back onto the event queue
  }
  /*@end @*/
  else{
    window.setTimeout(f,0);
  }
}

There are other ways too, like writing script elements with the IE-specific defer attribute, but they get messy, and the goal seems to be to keep the code short and sweet. That said, I'm really not a fan of that anonymous function in the IE-specific setTimeout(), but my brain's too tired to figure out how to make it go away.
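
For reference, that defer trick looks something like this (a sketch of the snippet Dean Edwards popularized; assume f is the callback passed to DOMReady):

/*@cc_on @*/
/*@if (@_win32)
// in IE, a deferred script doesn't execute until the DOM has been parsed
document.write("<script id=__ie_onload defer src=javascript:void(0)><\/script>");
document.getElementById("__ie_onload").onreadystatechange = function(){
  if (this.readyState == "complete") {
    f(); // the DOM is ready
  }
};
/*@end @*/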

Brendon Kozlowski said on February 20, 2008

@Matthias: Thanks! If anyone else wants to elaborate a bit more on its use, I'd greatly appreciate it! I'll go take a look at it for now though. As for the DOMContentLoaded issue, on most of the projects I've had, I haven't had an excuse *not* to use a JS library that solves the issue, thankfully.

Diego Perini said on September 04, 2008

I believe my IEContentLoaded trick has proven to be more precise and reliable on IE and will work both with compressed and uncompressed pages.

It has been shown by many that setTimeout is unreliable in most situations; the exact moment is hard to detect, especially when navigating back and forward with the browser's buttons. Unfortunately, the combinations we would have to check are too many; better to rely on something documented by the browser vendor itself.

Diego Perini
