2010 update:
Lo, the Web Performance Advent Calendar hath moved
Dec 15: This article is part of the 2009 performance advent calendar experiment. Today's article is a contribution from Ara Pehlivanian, author of two JavaScript books. Please welcome Ara and stay tuned for the articles to come.
JavaScript has a dark side to it that not many people are aware of. It causes the browser to stop everything that it's doing until the script has been downloaded, parsed and executed. This is in sharp contrast to the other dependencies which get loaded in parallel--limited only by the number of connections the browser and server are able to create. So why is this a problem?
Good question! Before I can answer that, I need to explain how the browser goes about building a page. The first thing it does once it receives an HTML document from the server is build the DOM--an object representation of the document in memory. As the browser converts HTML into the DOM, it invariably encounters references to external dependencies such as CSS documents and images. Every time it does so, it fires off a request to the server for that dependency. It doesn't need to wait for one to be loaded before requesting another; it makes as many requests as it's capable of. This way, the page gets built one node at a time, and as the dependencies come in, they're put in their correct placeholders.
What gums up the works, though, is when a JavaScript dependency is encountered. When this happens, the browser stops building the DOM and waits for that file to arrive. Once it receives the file, it parses and executes it. Only once all of that is done does the browser continue building the DOM. I suspect this has to do with wanting to provide as stable a DOM to the script as possible. If things were in flux while the script attempted to access or even modify a DOM node, things could get dicey. Either way, the time it takes before the browser can continue depends entirely on the size and complexity of the script file that's being loaded.
Now imagine loading a 200k JavaScript file right in the <head>
of a document. Say it's a JavaScript file that's not only heavy but also does some fairly complex computing that takes half a second to complete. Imagine now what would happen if that file took a second to transfer. Did you guess? Yup, the page would be blank until that transfer and the computation were complete. A second and a half of a blank page that the visitor has to endure. Given that most people don't spend more than a few seconds on the average web page, that's an eternity of staring at a blank page.
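To make the scenario concrete, here's a minimal sketch of the problem markup; the file name is made up for illustration:

```html
<!DOCTYPE html>
<html>
<head>
  <!-- The browser halts DOM construction right here until huge-app.js
       (our hypothetical 200k file) is downloaded, parsed and executed. -->
  <script type="text/javascript" src="huge-app.js"></script>
</head>
<body>
  <!-- Nothing below renders until the script above is done. -->
  <p>Page content...</p>
</body>
</html>
```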
Reduce
So how can this problem be overcome? Well, the first thing to do is reduce, as much as possible, the amount of data being sent over the pipe. The smaller the JavaScript file, the less waiting the visitor has to do. So what can be done to reduce file size? JavaScript files can be run through a minifier such as YUI Compressor (which removes unnecessary white space and formatting, as well as comments, and is proven to reduce file size by 40-60%). Also, if at all possible, servers should be set up to gzip files before they're sent. This can drastically reduce the number of bytes that get transferred, since JavaScript is plain text, and plain text compresses really well.
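For example, minifying is a one-line command and gzipping is often a one-line server setting. Here's a sketch of both, assuming the YUI Compressor jar is on hand (the jar version and file names are illustrative) and the server is Apache with mod_deflate enabled:

```bash
# Minify: strips whitespace, formatting and comments
java -jar yuicompressor-2.4.2.jar site.js -o site-min.js
```

```apache
# httpd.conf or .htaccess: gzip JavaScript responses on the way out
AddOutputFilterByType DEFLATE application/x-javascript text/javascript
```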
Defer
So, once you've made sure your file is as small as possible, what next? Well, the first thing is to make sure the visitor has something to look at while the script is loading. Instead of loading JavaScript files in the document's <head>
, put your <script>
tags immediately before your page's closing </body>
tag. That way, the browser will have built the DOM and begun inserting images and applying CSS long before it encounters your script tags. This also means your code can start running sooner, because it doesn't need to wait for the page's onload event--which only fires once all of the page's dependencies have finished loading.
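In practice, the markup ends up shaped like this (file names are, again, made up):

```html
<html>
<head>
  <link rel="stylesheet" href="styles.css">
  <!-- no script tags up here -->
</head>
<body>
  <p>Content renders and images load first...</p>

  <!-- scripts come last, just before the closing body tag -->
  <script type="text/javascript" src="mylib.js"></script>
</body>
</html>
```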
So with the script tags placed at the end of the document, when the browser does encounter them, it will still halt operations for however long it needs to, but at this point the visitor is reading your page and unaware of what's going on behind the scenes. You've just bought yourself the time to surreptitiously load your script files.
Go Async
There is another way to load JavaScript files that won't block your browser, and that's to insert the script tags into your page using JavaScript. Dynamically inserting a script tag into the DOM causes the file to be loaded asynchronously. The only trouble is that you can't rely on the code within the script file being available immediately after you've included it. What you'll need is a callback function that is executed once your script is done loading. There are several ways of doing this. A lot of libraries have built-in async script loading functionality, so you're likely better off using that. But if you want to do it yourself, be ready to deal with the idiosyncrasies of different browsers. For example, where one browser will fire an onload event for the script, another will not.
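Here's a minimal do-it-yourself sketch of that pattern. loadScript() and the file name are illustrative, not a library API; the readyState branch covers older versions of IE, which fire readystatechange instead of load:

```javascript
function loadScript(url, callback) {
  var script = document.createElement('script');
  script.type = 'text/javascript';

  if (script.readyState) {
    // IE fires readystatechange rather than load
    script.onreadystatechange = function () {
      if (script.readyState === 'loaded' ||
          script.readyState === 'complete') {
        script.onreadystatechange = null;
        callback();
      }
    };
  } else {
    // most other browsers fire a load event
    script.onload = function () {
      callback();
    };
  }

  script.src = url;
  document.getElementsByTagName('head')[0].appendChild(script);
}

// usage: the callback runs only once the file has arrived
loadScript('fancy-widget.js', function () {
  // code defined in fancy-widget.js is now safe to use
});
```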
Be Lazy
So now that we know how to load scripts behind the scenes, is there anything more we can do to improve performance? Of course.
Say, for example, your page loads up a large script that gives your site a fancy navigation menu. What if the user never uses the navigation menu? What if they only navigate your site through links in your content? Did you really need to load that script in the first place? What if you could load the necessary code only when it was needed? You can. It's a technique called lazy loading. The principle is simple: instead of binding your fancy navigation script to the menu in your page, you bind a simple loader script instead. It detects an onmouseover event, for example, and then inserts a script tag with the fancy nav code into the page. Once the tag is done loading, a callback function wires up all the necessary events and presto bingo, your nav menu starts working. This way, your site doesn't needlessly bog visitors down with code they'll never use.
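Here's a sketch of that loader stub, reusing the loadScript() function from the previous section; the element id, file name and initFancyNav() are all hypothetical:

```javascript
var menu = document.getElementById('nav');

menu.onmouseover = function () {
  // unbind the stub so the download is only triggered once
  menu.onmouseover = null;

  loadScript('fancy-nav.js', function () {
    // assumption: fancy-nav.js defines initFancyNav(),
    // which wires up the real menu behavior
    initFancyNav(menu);
  });
};
```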
Bite Size
In keeping with lazy loading, try to load only the core components needed to make your page work. This is especially the case when it comes to libraries. A lot of the time a library will force you to load a huge amount of code when all you want to do is add an event handler or modify class names. If the library doesn't let you pull down only what you need, try ripping out the part you want and loading only that instead. There's no point in forcing visitors to download 60k of code when all you need is 4k of it.
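For instance, if all a page needs is cross-browser event binding, a few lines of your own code can stand in for a full library download; here's one common sketch of such a helper:

```javascript
// covers standards-based browsers and older IE
function addEvent(el, type, fn) {
  if (el.addEventListener) {
    el.addEventListener(type, fn, false);
  } else if (el.attachEvent) {
    el.attachEvent('on' + type, fn); // IE 8 and below
  }
}
```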
Do You Need It?
Finally, the best way to speed up JavaScript load times is to not include any JavaScript at all. A lot of the time, people go nuts for the latest fad and include it in their site without even asking themselves if they really need it. Does this fancy accordion thing actually help my visitors get to my content more easily? Does fading everything in and out and bouncing things all over the place actually improve my site's usability? So the next time you feel like adding a three-dimensional spinning rainbow tag cloud to your site, ask yourself, "do I really need this?"
I'd like to thank Ara for the great article; it's a pleasure for me to be the blog host!
I also wanted to offer some additional links for your reading pleasure:
- Steve Souders has done extensive research on different options for non-blocking async loading; check out this blog post, the code examples from his book, and another technique
- Deferred eval on the SproutCore blog
- Non-blocking JavaScript downloads on the YUIblog
- Two articles by another JavaScript book author - Nicholas Zakas
- LABjs - on-demand JavaScript loader
- LazyLoad - library-agnostic JS/CSS loader
Please comment if you can think of more good resources on the topic.