CSSpring Cleaning

I was talking to some front-end devs a couple of weeks ago about maintaining CSS on an evolving website. In particular, how to figure out which rules are no longer being applied (remnants of experiments / old designs / whatever) and remove them.

I remembered seeing something a few months ago that claimed to do this. After digging around on GitHub for a bit, I re-found deadweight. Unfortunately, after looking at it briefly, I realized that it made some pretty strong assumptions, mainly about being run in the context of a Rails app.

Time for some hacking.

My fork can be installed with RubyGems (0.0.5 is required for deflate and gzip content-encodings):

$ sudo gem install mojodna-deadweight -s http://gems.github.com/

The first step was to make it accessible as a command-line utility (ideally in the spirit of the UNIX Philosophy).

$ deadweight -s styles.css -s ie.css index.html about.html

(This will check styles.css and ie.css against index.html and about.html, outputting unused rules.)
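The core check is simple to sketch. This is a toy version of the idea, not deadweight's actual code: a rule is "dead" if its selector matches nothing in any of the pages. Only bare tag, .class, and #id selectors are handled here, and the document/selector names are made up for illustration.

```ruby
require "rexml/document"
require "rexml/xpath"

# Return the selectors that match no element in the given HTML.
# Handles only tag, .class, and #id selectors for illustration.
def unused_selectors(html, selectors)
  doc = REXML::Document.new(html)
  selectors.reject do |sel|
    case sel
    when /\A#(.+)\z/
      REXML::XPath.match(doc, "//*[@id='#{$1}']").any?
    when /\A\.(.+)\z/
      klass = $1
      REXML::XPath.match(doc, "//*").any? do |el|
        (el.attributes["class"] || "").split.include?(klass)
      end
    else
      REXML::XPath.match(doc, "//#{sel}").any?
    end
  end
end

html = <<~HTML
  <html><body>
    <div id="main" class="wide"><p>hello</p></div>
  </body></html>
HTML

p unused_selectors(html, %w[p .wide .sidebar #main #footer])
# => [".sidebar", "#footer"]
```

Deadweight does the real work (full selector support, fetching pages, accumulating matches across pages), but the reject-what-never-matches shape is the same.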

It also supports input from pipes, meaning that you can chain it together and filter orphaned rules without writing them to a file (you'll have to redirect log output, e.g. with -l, so that only the orphaned rules land on stdout).

$ cat styles.css | deadweight index.html

Deadweight now contains an experimental -L argument that causes it to use Lyndon if the lyndon executable is in your $PATH (only possible on OS X with MacRuby installed). It was a bit flaky in my testing, but I may be using an older version of MacRuby. Great idea though. Check this post for more info on what it does / how it works.

The next step was to expose it as an HTTP proxy. It listens on port 8002 by default for no particular reason. (oauth-proxy uses 8001.)

$ deadweight -l deadweight.log -s styles.css -w http://github.com/ -P

(This dumps log output to deadweight.log in $PWD and matches http://github.com/* against styles.css.)

It's probably most useful as an HTTP proxy: start it with a list of target stylesheets, configure your browser to load pages through it (sorry, it gets really slow when requesting gzip- or deflate-encoded pages, which is a lot of the time), and watch the output to see how many rules it's hit. When you're done, hit ^C to stop it; passing -o orphans.css will write the remaining unmatched CSS rules to a file for you to examine when it shuts down.

If you're lucky, the file containing orphaned CSS will be empty. If not, restart the proxy with the orphaned CSS as the target and continue browsing until you're ready to start looking for matches by hand (this last step is necessary, as deadweight won't catch classes applied by JavaScript).
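One rough heuristic for that hand-checking step (the names and strings below are hypothetical, not anything deadweight does): before deleting an orphaned rule, grep your JavaScript for the bare class or id name, since scripts often attach classes at runtime.

```ruby
# Does the JavaScript source mention the selector's bare name?
# Crude substring check; false negatives/positives are possible.
def referenced_in_js?(selector, js_source)
  name = selector.sub(/\A[.#]/, "")
  js_source.include?(name)
end

js = %(document.getElementById("spinner").className = "active";)
puts referenced_in_js?(".active", js)   # true
puts referenced_in_js?(".sidebar", js)  # false
```

A rule that survives the proxy run but shows up in a check like this is probably still in use.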

Things to Improve

WEBrick's HTTP proxy implementation leaves a lot to be desired (it's nowhere near the same level as Twisted's), but it's the only game in town. Ilya Grigorik has done some excellent work with EventMachine and proxies, but it doesn't appear that anyone has built a general-purpose HTTP proxy that easily supports header modification. It's not a particularly hard problem, and moving deadweight's proxy implementation onto such a base would make it possible to turn off compression and thus speed things up considerably.
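The header tweak such a proxy base would enable is tiny. This is a hypothetical helper, not part of deadweight: drop Accept-Encoding from the outgoing request so the origin server responds with an uncompressed body that can be parsed directly.

```ruby
# Remove Accept-Encoding from a request-header hash so the upstream
# server sends an identity (uncompressed) response body.
def strip_compression(request_headers)
  request_headers.reject { |name, _| name.downcase == "accept-encoding" }
end

headers = {
  "Host"            => "github.com",
  "Accept-Encoding" => "gzip, deflate",
}
p strip_compression(headers)
```

The hard part isn't the transformation, it's finding a proxy framework that lets you apply it to requests in flight.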

One could also imagine a world in which the proxy server also contains a web interface that displays unmatched CSS rules and updates on the fly. EventMachine would do the trick here, too.