25 November 2004

Commerce Clause vs. 21st Amendment

The “commerce clause” and the 21st Amendment will face off in a case to be argued before the U.S. Supreme Court on December 7th. The commerce clause has traditionally been held to prohibit state-level restrictions on interstate commerce. The 21st Amendment, however, contains the following language:

Section 2. The transportation or importation into any state, territory, or possession of the United States for delivery or use therein of intoxicating liquors, in violation of the laws thereof, is hereby prohibited.

So can states actually regulate interstate commerce by blocking importation of alcohol in ways that may inhibit interstate commerce? We’ll find out sooner or later…

I happened across this story in the WSJ. A few press outlets have covered it, including The Detroit News. Search Google News for possibly relevant articles. There hasn’t been much coverage, though, of the ways this sort of deregulation could influence other markets and deregulation pushes elsewhere.

21 November 2004

Combined Federal Blacklist Campaign

Generally we have made our donations to charitable causes through the CFC. Federal employees in King County donated $2.8 million to the various charities in the CFC last year. In the past, we were able to donate to groups including the Cascade Land Conservancy (a group that buys land or conservation easements on property in Snohomish, King and Pierce Counties), the ACLU (a group dedicated to preserving our ability to speak, move and act free of government interference), and the EFF (a group dedicated to preserving the Internet as free and open for sharing of all data).

Things are different this year. As we perused the CFC booklet for the folks we usually donate to, we found a number of organizations missing. Try as we might, we couldn’t find the ACLU, the EFF, the Sierra Club, Amnesty International or a variety of other groups. This struck us as quite odd. These are well-respected non-profit groups, at least outside of claims by radical clerics that the ACLU caused God’s wrath on 9/11.

As it turns out, the CFC had added a new requirement: non-profits receiving funds from federal employees’ donations through the CFC had to vet their employment and volunteer rosters against a government-supplied blacklist of “terrorists”. If they found any matches, they were required to inform federal authorities and fire the employee until resolution. This is the same sort of blacklist that keeps Cat Stevens from visiting the United States. Both the ACLU and the EFF strongly believe that a person’s employment and volunteer efforts are a matter between that person and the organization they work for. That’s what freedom of association and the presumption of innocence are all about. Being a known felon convicted of a crime in a trial is one thing; being added to a list with no opportunity to contest your inclusion is quite another.

Once on the “terrorist volunteer” list, the “no-fly” list, or Sen. McCarthy’s “communist sympathizer” list, it is extraordinarily difficult to get removed. It took Sen. Ted Kennedy talking to Tom Ridge, and several weeks, to get the TSA to remove him from the “no-fly” list. I can’t presume that the list imposed on non-profits would be any easier to get off of. Especially when the list is not publicly available and there is no contact for removal.

Things such as this are uniquely anti-American. They chisel away at the freedoms and values that once set us apart from the rest of the world. Federal employees and other citizens who would like to complain to the head of the CFC about this policy can contact the Director of the CFC at:
Mara Patermaster
Combined Federal Campaign
Office of Personnel Management
1900 E Street, NW
Washington, DC 20415-1000

In the meantime, you can still donate to the ACLU, the EFF, the Sierra Club, and any of the other groups that you might normally give to under the CFC.

16 October 2004

Rising Waters

The Snoqualmie River is getting into its flood season again. The autumn rains bring the river up over its banks and flood out the low-lying roads near us and many roads and farms built on the floodplain further downstream. Flooding along the river is a fairly reliable annual event, and gives the local television news people an excuse to drive out to this neck of the woods, where not much else happens.

If you’ve ever wondered when the best time to see Snoqualmie Falls is, this is it. The huge volumes of water coming down river after a heavy rain on the west slopes of the Cascades help make up for the fact that 75% of the river water is diverted into the power generation plant. Compared to the anemic flow during the summer, the falls are quite impressive during the rainy autumn season.

15 October 2004

WordPress-Pg 1.2.1 Nearly Released

I’ve gotten the changes for WordPress 1.2.1 merged into the WordPress-Pg project’s CVS on SourceForge. Theoretically you should be able to check this out via pserver and run it now. I’m working with the project administrator to get a release tarball together and available. I’ll likely post again when that is done.

My own site here is going to have to languish a little bit because my magic caching action is not yet synced into a CVS branch on its own.

I’d also like to take this moment to point out that DOS line endings are very, very evil. The WordPress parent project’s CVS repository has a number of files that have been polluted with DOS line endings, which makes doing vendor imports and merging far more painful than it needs to be.
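For anyone fighting the same battle, here is a small sketch of the kind of cleanup script I mean; it finds files with DOS (CRLF) line endings and rewrites them in place with Unix endings. The function names are mine, not part of any tool mentioned above.

```python
def crlf_files(paths):
    """Return the subset of paths that contain DOS (CRLF) line endings."""
    dirty = []
    for path in paths:
        with open(path, "rb") as fh:          # binary mode: see the raw bytes
            if b"\r\n" in fh.read():
                dirty.append(path)
    return dirty

def to_unix(path):
    """Rewrite a file in place with Unix (LF-only) line endings."""
    with open(path, "rb") as fh:
        data = fh.read()
    with open(path, "wb") as fh:
        fh.write(data.replace(b"\r\n", b"\n"))
```

Running `to_unix` over everything `crlf_files` flags, before doing a vendor import, spares you a diff where every line appears changed.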

11 October 2004

WordPress-pg Gets Faster...

As a followup to my earlier complaints about the slowness of WordPress (the software that creates this page you see), I’ve managed a pretty good speedup with some lazy caching of filtered content. I’ll make my changes publicly available in the next day or two, once I manage to get things cleaned up a little. The basic operating theory is to take the plugin-filtered content, save it into the database, and then use that rather than refiltering the text every time the page is rendered. The filtering work is done lazily, when an article is requested that isn’t already available prefiltered.

Of course, there are some catches.

  • If you change your set of content filter plugins you have to invalidate the cached copies of the filtered content.

    • This could very easily be automated by having the wp-admin page that changes the plugins clobber the cached copies.

  • The Textile filter in particular needs some minor tweaks to keep it from gobbling the comment based “more” and “nextpage” directives for WordPress.

    • My quick solution to this was to change the normal single hyphen for the del tag in Textile to be three hyphens.

    • A better solution would be to armor the more directive somehow. Or change it to be less susceptible to being clobbered. Or fix the Textile filtering engine so it leaves comments the hell alone. I think I like the last one best.

  • Interactive plugins like the search word highlighting plugin will not work.

    • Fixing this requires the WordPress authors to separate cacheable plugins from interactive ones.

  • I need to backport the security fixes to WordPress-pg.

I should point out that this has resulted in an increase in rendering speed for this home page from 3 seconds to 0.4 seconds on some slow old hardware. Nearly an order of magnitude. And from surfing around other WordPress pages, it looks like I’m not the only one who could use this little fixup. I must admit, though, that I’m not following what is going on in active WordPress development.

29 September 2004

Research, Intellectual Property, and Innovation

There is an interesting article (“Grant Givers Turn More Demanding”) in today’s Wall Street Journal about the way that grant givers for medical research are forcing research scientists to collaborate with other grant recipients to work more efficiently toward effective treatments for targeted conditions. This apparently is a shocking change of pace for medical researchers, who are used to the rather slow process of making a discovery, confirming the discovery with further tests, writing a journal article, submitting it for peer review, revising the article to address criticisms, resubmitting it, waiting for acceptance, and waiting for it to be printed; only then does the information become available to other researchers. This obviously isn’t an ideal model for rapid development and discovery.

This ties in with an article a friend pointed out about how the bubble proved that small teams tend to be more efficient at accomplishing tasks. I think the combination of the two suggests there is certainly a way to do the small-team concept wrong. That is, with the balkanization of small teams comes a huge reduction in efficiency. While the small teams may be very productive internally, the lack of intellectual discourse on the subject can become a substantial obstacle to boosting that productivity to the next level, and a huge obstacle to innovation. You get a vast amount of duplicated work where there isn’t sufficient communication and shared effort.

The open source community tends to have an aversion to this sort of secrecy as well. There was a great deal of anguished discussion in the FreeBSD project when development was taking place in a Perforce environment that wasn’t accessible to some of the developer community. This discussion, and the fallout of having “owners” of subject areas within the project, wound up driving talented developers away to form their own projects when the “owners” weren’t receptive to outside input and work on “their” property, which was held in quasi-secrecy. It is interesting to note that by keeping their projects more open, some of these forks have wound up producing a lot more and being more innovative than their slower-moving cousins.

There is probably a middle ground somewhere which rewards the discoverers appropriately, but doesn’t derail innovation by having one discoverer try to solve the whole problem from beginning to end. It is good to see some important medical research working in this direction, though, with multiple small teams of interested parties building off each other’s work. It is probably a model that could be emulated with a great deal of success elsewhere.

28 September 2004

WordPress Filters Slowly

For a while now I’ve noticed that the front page of my site has a rather long load time. It’s visible at the bottom of the page source and tends to be just over a second. Serving one page per second is not really acceptable under any sort of load; the server would peg its CPU before managing to saturate even the measly DSL line we serve from.

After wrestling with the mess that is PHP in the FreeBSD ports system, I got a profiling tool installed to see what was going on. The results aren’t too pretty.

                  Real        User      System            secs/    cumm
 %Time      (excl/cumm) (excl/cumm) (excl/cumm)  Calls     call   s/call  Memory  Name
 100.0      0.00/1.64   0.00/1.21   0.00/0.19        1   0.0001   1.6418       0  main
  56.0      0.00/0.92   0.00/0.76   0.00/0.06       46   0.0001   0.0200       0  apply_filters
  52.7      0.00/0.87   0.00/0.71   0.00/0.05        5   0.0003   0.1732       0  the_content
  49.2      0.00/0.81   0.00/0.66   0.00/0.05        5   0.0000   0.1616       0  textile
  39.1      0.64/0.64   0.62/0.62   0.02/0.02      705   0.0009   0.0009       0  preg_replace

The Textile filter is taking a whopping 0.81 wall clock seconds for the 5 calls to it. It is also responsible for a significant number of the preg_replace calls. The solution to this is not to filter every time you display something, but to save it pre-filtered first. There is even a nice database column “post_content_filtered” for this use. Too bad it’s not used right now.

Given the nasty performance characteristics of these WordPress filters it’s a small wonder that blog sites tend to be easily “slashdotted” off the air. While one should certainly avoid premature optimization, this seems like a place where optimization is a bit overdue. Nothing like one more pet project…


  1. Save a copy of the filtered text to “post_content_filtered” on each saved change to the post

  2. Rebuild this cached filtered text if “post_content_filtered” is empty when we look at it

  3. Provide a mechanism to invalidate the cache, or in simpler terms, to empty the filtered copies so that the next time they’re looked at, they’ll be refiltered as described in the previous item.

This approach seems to be in line with suggestions and at least one other implementation. Deep in my heart I wish the trigger were post_content_filtered being “null” in the SQL sense rather than just an empty string, but that would require a schema change and probably won’t be accepted by the WordPress folks. The main integration concern, I think, is going to be properly filtering some uses of “the_content” and not others. There will probably be a “the_content_static_filter” or some such for things that should be processed and added to the cache, and then the normal late filter, like the existing one, for everything else.

Interaction with the RSS feed and comments should probably be addressed at some point, but I imagine I’ll figure that out as I go along. I already know that the Textile2 filter is not being applied to the RSS feed, so there may be some adapting in order to fix that anyway. I have to admit I’m really surprised by the lack of commentary on this issue from my cursory searching online. Either my search terms were poor, or typical bloggers don’t really care much about performance.
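The three-step plan above can be sketched against a toy posts table; the column name post_content_filtered is the real one from the WordPress schema, but the function names and the sqlite stand-in are purely illustrative.

```python
import sqlite3

def connect():
    """Toy stand-in for the WordPress posts table."""
    db = sqlite3.connect(":memory:")
    db.execute("""CREATE TABLE posts (
        id INTEGER PRIMARY KEY,
        post_content TEXT,
        post_content_filtered TEXT DEFAULT '')""")
    return db

def save_post(db, post_id, text, filter_fn):
    # Step 1: save a filtered copy alongside each saved change to the post.
    db.execute("REPLACE INTO posts VALUES (?, ?, ?)",
               (post_id, text, filter_fn(text)))

def the_content(db, post_id, filter_fn):
    # Step 2: rebuild the cached copy if it is empty when we look at it.
    raw, cached = db.execute(
        "SELECT post_content, post_content_filtered FROM posts WHERE id = ?",
        (post_id,)).fetchone()
    if cached == "":
        cached = filter_fn(raw)
        db.execute("UPDATE posts SET post_content_filtered = ? WHERE id = ?",
                   (cached, post_id))
    return cached

def invalidate_all(db):
    # Step 3: empty the filtered copies; the next view refilters via step 2.
    db.execute("UPDATE posts SET post_content_filtered = ''")
```

Note how step 3 never refilters anything itself; it just arms step 2, so invalidation stays cheap no matter how many posts there are.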

05 September 2004

Patch to add Nikon "MakerNote" Support to jhead

About a year and a half ago, I wrote to the author of jhead with a nice patch that fetched the extra EXIF information in the “MakerNote” section supplied by Nikon digital cameras. One would think that obtaining this extra information, above and beyond what is specified in the standard EXIF tags, would be useful. Especially given that Nikon has around a 15% share of the digital camera market, and a larger share of the digital SLR market. Instead I got a curt reply from the author…

My philosophy has been not to implement maker note. If I start including maker note, the program becumes MUCH bigger, because each camera has its own set of maker notes. Plus, these will contain fields that other cameras don’t even have. And that makes testing and maintaining the program harder.

The solution is not to buy cameras that include lots of proprietary tags.

I can understand not wanting feature creep in the program. As for compatibility, I had gone to some pains not to create conflicts with working files; frankly, the code needs a quick automated test set anyway. And that last part, well, that’s great advice. I shouldn’t buy a camera that includes any information above and beyond the standard. Knowing what lens was used for a given picture is not at all valuable information I’d want to have when viewing pictures. Thanks for that!

As it turns out, now 16 months later, the most recent version of jhead does have support for “MakerNote” information but only for Canon. The inner conspiracy theorist in me wants to think this is part of yet another Canon vs. Nikon squabble, but never attribute to malice what can be adequately explained by apathy.

So, here is a patch I’ve written that adds back the support I once had. It is more involved than the perfunctory Canon support that the author has written, since Nikon uses an “EXIF-in-an-EXIF” format for their MakerNote data, and I tried to reuse the EXIF parser rather than copy the code. Let me know if this code proves useful to you.

Information added for the D100:

  • Lens

  • Focus Mode

  • Auto-focus Position

  • White Balance Name

  • White Balance Bias

  • Flash Setting

  • Flash Metering Mode

  • Noise Reduction

  • JPEG Image Sharpening
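To make the “EXIF-in-an-EXIF” remark concrete, here is a sketch of locating the embedded IFD. It assumes the D100-era MakerNote layout as I understand it: a “Nikon\0” signature plus version bytes, then a normal TIFF header (byte order, magic 42, first-IFD offset) starting at byte 10, with IFD offsets relative to that embedded header. This is my own illustrative code, not the patch itself.

```python
import struct

def nikon_makernote_ifd(maker_note):
    """Find the embedded TIFF IFD inside a Nikon-signed MakerNote blob.

    Returns the byte-order character ('<' or '>') and the absolute
    offset of the first IFD within maker_note, so an ordinary EXIF
    IFD parser can be pointed at it.
    """
    if not maker_note.startswith(b"Nikon\x00"):
        raise ValueError("not a Nikon-signed MakerNote")
    tiff = 10                                   # embedded TIFF header position
    order = {b"II": "<", b"MM": ">"}[maker_note[tiff:tiff + 2]]
    magic, ifd_off = struct.unpack(order + "HI",
                                   maker_note[tiff + 2:tiff + 8])
    if magic != 42:
        raise ValueError("bad TIFF magic in MakerNote")
    return order, tiff + ifd_off                # offsets are TIFF-relative
```

Reusing the existing EXIF IFD parser then comes down to handing it this offset and byte order instead of duplicating the tag-walking code.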

Creative Commons License

This patch is licensed under a Creative Commons License.

04 September 2004

DSL Strangelove

(or how I learned to stop worrying and love the RBOC)

I’ve spent a few hours shopping for high-speed internet service for our home again. It’s not that the existing service is particularly broken; it’s just that right now we’re paying $66 per month for 512k/256k DSL with one static IP address. This class of service is available for less than half the price from any other phone company in the state. As such, I try to look around from time to time to see if anyone new happens to be offering service in our area. A better deal is bound to pop up sooner or later. Shopping for this might be easier if I didn’t want to host my own content, like this page you are looking at right now, but hosting elsewhere costs more and puts some limits on what I can do.

The folks at Comcast refuse to offer a static IP address at any price. They are also a bit reluctant to give prices for any of their services on their web site, which leads me to think they are embarrassingly high. There is a surcharge for the internet access if you don’t get cable TV service from them, and frankly they can peel our DirecTiVo from our cold, dead hands. They want $56 per month for 3M/256k; there’s also an install fee, since there isn’t Comcast coax to our house right now.

Our friendly neighborhood Bell operating company has recently upgraded to offering better bandwidth than they did previously. There is now 1.5M/256k service for $55 per month. Unfortunately, they still want an additional $20 per month for a single static IP address. The reliability of the connection has been fairly good compared to the anecdotal evidence I’ve heard from Comcast customers. However, when it does go down, they tend not to fess up to it on their support page, which is rather annoying coming from a position where I’m required to report and quantify that sort of downtime.

Isomedia, a local company, bought my parents’ dialup ISP, Connect Northwest, some time ago. They advertised pretty heavily on local radio for a good long while and have been offering service nationally of late. They’re offering 1.5M/384k for $50 per month, with a static IP for an extra $5. Or at least that is what they think they’re offering; the salesman I spoke to on the phone couldn’t get information about the supported speeds on my line. Odds are that this problem was, in fact, CenturyTel’s fault. He suggested that I call CenturyTel and find out from them what the line will support. Their reviews seem pretty favorable online.

Is there no one else?
Somehow there are only these three choices for land-based, non-dialup internet here. Of course, the lack of any significant competition goes a long way toward explaining why CenturyTel and Comcast get away with charging such exorbitant prices for their high-speed internet services. In a conversation with a sales tech at Blarg a year or two ago, he told a story about lineworkers from CenturyTel intentionally destroying competitors’ equipment at the CO. Pretty appalling stuff.

It looks like Isomedia will be a $10-per-month savings with significantly more bandwidth, at least on the download side. The downside will be the coordination required to get all my resources moved over. I’ll have to have our DNS root records updated and arrange some backup hosting for DNS and email while the switchover happens. Which makes me wonder if the hassle involved will be worth the $10…

01 September 2004

Surprise Pictures

[Photo: a staircase into the wilderness]

I’ve added some pictures to the Photo Gallery from our hike on the Surprise Creek trail in the Alpine Lakes Wilderness. There are also pictures of Mocha in her cast, as well as a variety of Seattle landmarks that I was photographing for contribution to the Wikipedia project.

The wooden stairs, along with a lot of the other work on this trail, were quite impressive. It offered a very nice change of pace from a work week trapped in a cubicle. One can never overestimate the positive power of being out of range of a cell phone tower. At the end of the trail is Surprise Lake, which is open for camping. The sun was setting and the rain was starting to fall, so we didn’t make it all the way, but it’s good to have something to work towards. The hike also served as a good test of how far I can press my repaired knee before it acts up.

26 August 2004

A New Look

Once again, this site has a slightly different look. There was no use in me reinventing the wheel to make entries like this, so I’m going to let someone else do the work.

And once again, this is going to take some time to get all the bits and pieces filled back in. Hopefully having a handy interface will make it easier to keep what content there is a bit more timely. We’ll see how it turns out…