More on Build Processes for the Web
My last screencast was about iteration and build processes for web development. Here’s a bit more on the build process.
The big 4 for web performance are (in this order):
- gzip text before it leaves your server. All of that CSS, JavaScript, JSON, SVG, etc. should be squished before it travels across the wire.
- Smart caching. The fastest resource request is the one that never happens.
- Image optimization. Images are still the biggest pipe suck for the majority of sites.
- Concatenate and compress. Fewer HTTP requests and smaller file sizes are better.
Chris Coyier gave one of his usual amazing talks on just this subject.
The first two problems are part of your server configuration. If you’re on Apache, grab the .htaccess file in html5boilerplate and put it in your httpd.conf file. It’ll fix the first two problems and a bunch of others. The last two, ye developer, are in your court.
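Once that’s in place, you can sanity-check it from the command line. A quick curl check (the URL here is a placeholder):

# Request with gzip accepted, then inspect the relevant response headers
curl -sI -H "Accept-Encoding: gzip" http://example.com/css/main.css | grep -iE "content-encoding|cache-control|expires"

If you see Content-Encoding: gzip and a far-future Expires or Cache-Control header, you’re set.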
Image Optimization
Image optimization can save a remarkable amount of bandwidth and cut down page load time significantly. The best tool I’ve used is Trimage (Linux), which is lossless, handles PNG and JPG, and can be automated via the command line as part of your build process. You only have to run it on an image once; because the optimization is lossless and already applied, running it again on the same image won’t do anything further. As such I’ll usually comment it out of my build process once the application is mature. Trimage was inspired by ImageOptim, which is what you can use on OS X; I haven’t found anything I like on Windows yet.
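In a build script it’s a one-liner. A sketch, assuming Trimage is on your PATH and your images live in img/ (check trimage --help for the exact flags):

# Losslessly optimize every PNG and JPG under img/
trimage -d img/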
Concatenation and Minification
You’re doing two things here. The first is reducing the number of HTTP requests. The number of simultaneous requests to a server varies by browser, and currently ranges from 2 to 8. Once you hit that number, further requests have to wait in the queue, which slows page load. The second thing you are doing is minifying the concatenated file, removing unnecessary characters and whitespace from the code. That lowers the size of the file over the network.
A common misunderstanding is that you don’t have to minify code if it is getting gzip’d before it leaves the server, as that will smush it up for you anyway. That is not the case; you should do both.
- Original: 28,781 bytes
- Minified: 22,242 bytes
- Gzipped: 6,969 bytes
- Min+Gzip: 5,990 bytes
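Those numbers are easy to reproduce for your own files. A rough sketch, assuming you have the YUI Compressor jar on hand (the jar path and file names are placeholders):

wc -c < js/main.js                    # original size
java -jar yuicompressor-2.4.8.jar js/main.js -o js/main.min.js
wc -c < js/main.min.js                # minified size
gzip -9 -c js/main.js | wc -c         # gzipped size
gzip -9 -c js/main.min.js | wc -c     # min+gzip size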
Side Note: Invalidating Cache on Individual Files
You have far-future expiry dates on your JavaScript. Congrats! You just changed your HTML and JavaScript and now the two are out of sync and your site is busted. Balls!
There are various ways to deal with this. My preferred method is adding a phony GET parameter to the request.
<link rel="stylesheet" href="css/main.css?foo=2735">
HTML isn’t cached, so the browser fetches it every time. When you change the GET argument on the URL, the browser sees a different URL from the one it got last time, skips its cached copy, and fetches the file fresh. This isn’t the only way to do it, and there is some controversy about this approach. It always seems to work though, and it is what Stack Overflow itself uses. Your CSS and JavaScript files of course completely ignore the argument. I usually use foo, short for foobar, which is generally synonymous with a placeholder or garbage.
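You can automate the swap with a one-line text replacement. A sketch, assuming GNU sed and bash’s $RANDOM:

# Replace the old cache-busting token with a fresh random number
sed -i "s/foo=[0-9]*/foo=$RANDOM/g" index.html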
How I Do It
I do this in two stages. I do concatenation as part of my iteration process, so I can point the HTML links directly at what the final file locations will be. I use Guard for this, which also preprocesses my SASS into a single CSS file.
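A minimal Guardfile sketch, assuming the guard-sass, guard-concat, and guard-livereload plugins (file names and paths are placeholders):

# sass to css
guard 'sass', input: 'sass', output: 'css'

# concatenate js into a single file
guard :concat, type: 'js', files: %w(app helpers), input_dir: 'js/src', output: 'js/main'

# livereload the browser on changes
guard 'livereload' do
  watch(%r{.+\.(css|js|html)$})
end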
I don’t minify at this stage. Debugging minified CSS and JavaScript during iteration is a pain in the ass. Instead, my final build process handles minification via YUI Compressor, cache invalidation for CSS and JavaScript via text replacement, and image optimization. As I split time between Linux and Windows, I have separate build scripts for both. If you are using Sublime Text, you can differentiate your build process by platform like this (build files go in your Packages/User folder with a .sublime-build extension):
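A minimal .sublime-build sketch, assuming the two scripts are named build.sh and build.ps1 (both names are placeholders; $project_path is a built-in Sublime Text variable):

{
    "cmd": ["bash", "build.sh"],
    "working_dir": "$project_path",
    "windows": {
        "cmd": ["powershell", "-ExecutionPolicy", "Bypass", "-File", "build.ps1"]
    }
}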
Windows gets a PowerShell script and Linux gets a bash script.
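A minimal sketch of the Linux side (the YUI Compressor jar path and all file names are assumptions; the PowerShell version mirrors it):

#!/bin/bash

# Generate Random Number for Cache Busting
RAND=$RANDOM

# Minify CSS and JavaScript in place
java -jar yuicompressor-2.4.8.jar css/main.css -o css/main.tmp && mv css/main.tmp css/main.css
java -jar yuicompressor-2.4.8.jar js/main.js -o js/main.tmp && mv js/main.tmp js/main.js

# Cache busting: point index.html at the new token
sed -i "s/foo=[0-9]*/foo=$RAND/g" index.html

# Losslessly optimize images
trimage -d img/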
These scripts basically do the same thing: a random number is generated, the CSS and JavaScript files are minified, and some cache busting takes place in index.html via regex text replacement. The only difference is that the bash script also optimizes images. Once the build script is run, the application is ready for deployment on a web server.
Easier Than It Looks
This looks like a lot of work, and setting it up the first time can be. After that it’s a piece of cake. When I want to work on an app, I fire up Guard, which handles my livereload, preprocessing, and concatenation. When I’m ready to deploy, I hit CTRL-B in Sublime Text and it minifies, does cache invalidation, and optimizes images. Getting the iteration and build processes set up for a new project takes minutes. And the most important thing: it’s automated. For a busy dev, no matter how compelling the benefits, if you can’t automate it, it probably isn’t going to happen.