Even if a given JavaScript file is already small, reducing repetition in the code should make it smaller still. For example, we've all seen some variation of the following code:
function Object1 () {}
Object1.prototype.blah = "foo"
function Object2 () {}
Object2.prototype.blah = "bar"
function Object3 () {}
Object3.prototype.blah = "zoo"
Minified, assuming we want to keep our object names and properties, nothing changes except perhaps we lose whitespace. So, why not slim down the source?
var PROTO = "prototype",
BLAH = "blah"
function Object1 () {}
Object1[PROTO][BLAH] = "foo"
function Object2 () {}
Object2[PROTO][BLAH] = "bar"
function Object3 () {}
Object3[PROTO][BLAH] = "zoo"
Nothing special and kind of ugly, but what happens if we minify it?
var P="prototype",
B="blah"
function Object1 () {}
Object1[P][B] = "foo"
function Object2 () {}
Object2[P][B] = "bar"
function Object3 () {}
Object3[P][B] = "zoo"
Looks like we saved a few characters! A win, right?
And the Dour Hammer of Reality Strikes Home
You gzip your JavaScript before you send it over the wire, right? Right!? Well guess what? Prior to my premature optimization, the LZ77 + Huffman encoding we all know and love already found the instances of "prototype" and "blah" and put them into a surprisingly efficient bundle during compression. In fact, it found all of the instances of ".prototype.blah = " and tucked them away. In addition, it found all of those pesky function declarations and went one better by grabbing "function Object" for good measure. And don't forget " () {}" too.
Now here I am with some minified rubbish before me. Sure gzip can deal with "[P][B] = ", but now it is completely at a loss for the new variables declared at the top of the script. No compression there. So not only have I introduced code that cannot be effectively compressed, I have now tried to outdo a tried and true compression library with some quick text-based hack. Any guesses about how that one will turn out? If you said, "Your JavaScript Frankenstein ends up larger when gzipped than my original code," you get a gold star.
Manual JavaScript Size Optimization is Counterproductive
When I tried to be too smart for my own good in this case, bad things happened. Let's enumerate:
- My code is larger over the wire.
- My web page takes longer to load.
- My code is harder to understand.
- My code is harder to fix if something breaks!
The Moral
Obviously, this example snippet is so small that compression is a moot point. The story stays the same, though, and in fact becomes even more relevant as file sizes get bigger and bigger.
Write your algorithms as efficiently as possible, of course, but leave the final optimization to the parser/compiler unless you really have a lot of free time... or you write JavaScript parsers and/or compilers. Yes, you could probably hand-tune something that shaves off 3 bytes, just like you could when writing in assembly because you don't trust that shifty C compiler.
Sometimes, you've just got to let it go.
99.999% of the time, you'll do just fine writing clean, manageable code, using Google's Closure Compiler, and transmitting through mod_deflate or equivalent. Some servers even allow you to compress your files ahead of time to save CPU. Imagine that! So spend your free time examining the tools around you rather than banging out bread dough on a rock with your bare hands. The results will speak for themselves.