More on Build Processes for the Web

My last screencast was about iteration and build processes for web development. Here’s a bit more on the build process.

The big 4 for web performance are (in this order):

  1. gzip text before it leaves your server. All of that CSS, JavaScript, JSON, SVG, etc. should be squished before it travels across the wire.
  2. Smart caching. The fastest resource request is the one that never happens.
  3. Image optimization. Images are still the biggest pipe suck for the majority of sites.
  4. Concatenate and compress. Fewer HTTP requests and smaller file sizes are better.

Chris Coyier gave one of his usual amazing talks on just this subject.

The first two problems are part of your server configuration. If you’re on Apache, grab the .htaccess file from html5boilerplate and use it as-is (or fold its rules into your httpd.conf). It’ll take care of the first two problems and a bunch of others. The last two, ye developer, are in your court.
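
If you want to make sure the server is actually holding up its end, a quick curl against one of your assets will tell you (a rough check; example.com and the asset path are placeholders for your own):

Checking gzip and Caching Headers
# Request an asset with gzip allowed, keep only the headers, and look for the
# compression and caching ones (example.com and the path are placeholders)
curl -s -D - -o /dev/null -H "Accept-Encoding: gzip" http://example.com/js/main.js |
  grep -iE "content-encoding|cache-control|expires"
# You want to see "Content-Encoding: gzip" plus a far-future Expires or
# Cache-Control: max-age on your static assets.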

Image Optimization

Image optimization can save a remarkable amount of bandwidth and cut down page load time significantly. The best tool I’ve used is Trimage (Linux), which is lossless, handles PNG and JPG, and can be automated via the command line as part of your build process. You only have to run it on an image once; running it again on an already-optimized image won’t change anything. As such I’ll usually comment it out of my build processes once the application is mature. Trimage was inspired by ImageOptim, which you can use on OS X; I haven’t found anything I like on Windows yet.
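
A one-off pass over an image directory looks something like this (just a sketch; public/images stands in for wherever your images live, and it assumes trimage is on your PATH):

One-Off Image Optimization Pass
# Directory size before optimization
du -sh public/images

# Optimize every PNG and JPG in the directory in place (-q quiet, -d directory)
trimage -q -d public/images

# Directory size after - the first pass is where all the savings happen
du -sh public/images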

Concatenation and Minification

You’re doing two things here. The first is reducing the number of HTTP requests. The number of simultaneous requests a browser will make to a server varies by browser, and currently ranges from 2 to 8. Once you hit that limit the remaining requests have to wait in a queue, which slows page load. The second thing you are doing is minifying the concatenated file, removing unnecessary characters and whitespace from the code. That lowers the size of the file going over the network.
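
Guard handles this for me during iteration (the Guardfile is further down), but stripped to its essence it’s nothing fancier than this (a sketch; the file names are illustrative and it assumes yui-compressor is installed):

Concatenation and Minification by Hand
# Concatenate scripts in dependency order into a single file (names are illustrative)
cat assets/scripts/plugins.js assets/scripts/map.js assets/scripts/page.js > public/js/main.js

# Minify the combined file in place
yui-compressor --type js -o public/js/main.js public/js/main.js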

A common misunderstanding is that you don’t have to minify code if it is getting gzip’d before it leaves the server, as that will smush it up for you anyway. That is not the case; you should do both.

Original: 28,781 bytes
Minified: 22,242 bytes
Gzipped: 6,969 bytes
Min+Gzip: 5,990 bytes
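
If you want to run that comparison on your own files, something like this gets you the numbers (a sketch; run it against the unminified file, and gzip -9 only approximates what mod_deflate will produce):

Measuring Minification vs. Gzip
# Byte counts for the raw, minified, and gzipped versions of a script
wc -c < public/js/main.js                                           # original
yui-compressor --type js public/js/main.js | wc -c                  # minified
gzip -9 -c public/js/main.js | wc -c                                # gzipped
yui-compressor --type js public/js/main.js | gzip -9 -c | wc -c     # minified + gzipped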

Side Note: Invalidating Cache on Individual Files

You have far-future expiry dates for your JavaScript. Congrats! You just changed your HTML and JavaScript and now the two are out of sync and your site is busted. Balls!

There are various ways to deal with this. My preferred method is adding a phony GET parameter to the request.

Phony GET Parameter
<link rel="stylesheet" href="css/main.css?foo=2735">
<script src="js/main.js?foo=2735"></script>

The HTML itself isn’t cached (or at least not for long), so the browser fetches it every time. When you change the GET argument on the URL, the browser sees a different URL from the one it got last time, which bypasses the cached copy and fetches the file fresh. This isn’t the only way to do it, and there is some controversy about this approach. It always seems to work though, and it is what Stack Overflow itself uses. Your CSS and JavaScript files of course completely ignore the argument. I usually use foo (as in foobar), the classic placeholder name for something whose value doesn’t matter.

How I Do It

I do this in two stages. Concatenation happens as part of my iteration process so I can point the HTML links directly at what will be their final location. I use Guard for this, which also preprocesses my SASS into a single CSS file.

My Generic Guard File for Iteration
# sass to css
guard 'sass', :input => 'assets/sass', :output => 'public/css', :line_numbers => true, :all_on_start => true

# javascript concatenation to js/main.js
guard :concat, type: "js",
  files: %w(vendor/bootstrap/bootstrap-tooltip vendor/bootstrap/bootstrap-modal vendor/bootstrap/bootstrap-transition
            vendor/bootstrap/bootstrap-button vendor/bootstrap/bootstrap-popover vendor/bootstrap/bootstrap-alert
            vendor/jquery-ui-1.10.0.custom.min vendor/jquery.ui.slider.labels vendor/jquery.placeholder
            vendor/jquery.debounce plugins map page),
  input_dir: "assets/scripts", output: "public/js/main"

# live reload
guard 'livereload' do
  watch(%r{public/js/.+\.(js)$})
  watch(%r{public/css/.+\.(css)})
  watch(%r{public/.+\.(html)$})
end
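
For reference, this setup leans on the guard, guard-sass, guard-concat, and guard-livereload gems. Getting it running is roughly this (assuming Ruby and RubyGems are already set up):

Getting Guard Running
# Install Guard plus the plugins the Guardfile above relies on
gem install guard guard-sass guard-concat guard-livereload

# Start watching from the project root; Guard rebuilds on save
guard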

I don’t minify at this stage. Debugging minified CSS and JavaScript during iteration is a pain in the ass. Instead, my final build process handles minification via YUI Compressor, cache invalidation for CSS and JavaScript via text replacement, and image optimization. As I split time between Linux and Windows, I have separate build scripts for both. If you are using Sublime Text, you can differentiate your build process by platform like this (build files go in your Packages/User folder with a .sublime-build extension):

webapp.sublime-build
{
    "shell": true,
    "selector": [ "Guardfile" ],
    "linux":
    {
        "cmd": [ "./build.sh" ]
    },
    "windows":
    {
        "cmd": [ "./build.ps1" ]
    }
}

Windows gets a PowerShell Script and Linux gets a bash script.

build.sh
#! /bin/bash

# Generate Random Number for Cache Busting
rand=$RANDOM

# Minify CSS and JavaScript
yui-compressor --type css -o public/css/main.css public/css/main.css
yui-compressor --type js -o public/js/main.js public/js/main.js

# Bust cache for anything with foo argument
sed -i "s/?foo=[0-9]*/?foo=$rand/g" public/index.html

# Optimize Images
trimage -q -d public/images

build.ps1
# Generate Random Number for Cache Busting
$rand = Get-Random

# Minify CSS and JavaScript
java -jar ../yuicompressor-2.4.7.jar --type css -o public/css/main.css public/css/main.css
java -jar ../yuicompressor-2.4.7.jar --type js -o public/js/main.js public/js/main.js

# Bust cache for anything with foo argument
(Get-Content public/index.html) | ForEach-Object { $_ -replace "\?foo=[0-9]*", "?foo=$rand" } | Set-Content public/index.html

These scripts basically do the same thing. A random number is generated, the CSS and JavaScript files are minified, and some cache busting takes place in index.html via some regex text replacement. The only difference is the bash script also optimizes images. Once the build script is run, the application is ready for deployment on a web server.
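
What “ready for deployment” means depends on your hosting, but for a simple setup it can be as little as pushing the built public directory up to the server (a sketch; the host and path are placeholders, and rsync is just one of many ways to do it):

Pushing the Build to a Server
# Sync the built public/ directory to the web root (host and path are placeholders)
rsync -avz public/ user@example.com:/var/www/myapp/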

Easier Than It Looks

This looks like a lot of work, and setting it up the first time can be. After that it’s a piece of cake. When I want to work on an app I fire up Guard, which does my livereload and preprocessing/concatenation. When I’m ready to deploy, I hit CTRL-B in Sublime Text and it minifies, does cache invalidation, and optimizes images. Getting the iteration and build processes set up for a new project takes minutes. And the most important thing: it’s automated. For a busy dev, no matter how compelling the benefits, if you can’t automate it, it probably isn’t going to happen.