While I had at first thought that urlnorm would be a generalized URI normalization tool, it's become apparent to me that I have no interest in dealing with specialized schemes (such as goim, or even mailto). Really all I want is a bit of software that can deconstruct HTTP and HTTPS URLs, normalize those parts, and put the whole thing back together. Furthermore, I want it to be able to handle a URL typed in by a user without a scheme prepended. Browser address bars handle this like champions, and it's important to have this functionality in the context of typing your favorite website's URL into a feed reader and getting back a feed to subscribe to.
That's urlnorm's primary purpose, but there's a secondary purpose that's as-yet unrealized: a simple way to strip out crud that's attached to URLs for marketing and tracking purposes. Feedburner is one service that attaches this stuff, as do many, many others, and it drives me bonkers to see people link to things with all of those additional tracking bits still attached!
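Stripping that crud mostly means filtering the query string against a blocklist. Here's a minimal sketch of the idea, again with `urllib.parse`; the blocklist itself is a placeholder, though the `utm_*` prefix is real (it's the Google Analytics convention that Feedburner appends to links).

```python
from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

# Placeholder blocklist: utm_* covers the Google Analytics /
# Feedburner parameters; a real tool would carry a longer list.
TRACKING_PREFIXES = ("utm_",)

def strip_tracking(url):
    """Drop query parameters whose names match a tracking prefix."""
    parts = urlsplit(url)
    kept = [(k, v)
            for k, v in parse_qsl(parts.query, keep_blank_values=True)
            if not k.lower().startswith(TRACKING_PREFIXES)]
    return urlunsplit(parts._replace(query=urlencode(kept)))

print(strip_tracking("http://example.com/post?id=7&utm_source=feedburner"))
# → http://example.com/post?id=7
```

Keeping the non-tracking parameters in their original order matters here, since reordering them would change the URL's identity for caches and dedup logic.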
So it's nice to have that figured out. I hope to have a release ready soon.