Sunday, October 29, 2006


One of the reasons that I love science is that it gives me the power to do things. It is probably one of the things that attracted me to chemistry as a kid: the ability to make things explode, burn, gel, polymerize, change color. All of these have a visceral appeal.

I was a kid when computers first became available to individuals. There is no doubt that I fell in love with them for exactly the same reason.

What makes chemistry so hard is not that any one part is beyond the comprehension of an individual of normal intelligence, but that the subject is so vast and interrelated that it is difficult to get a handhold in the beginning. And frankly, what one learns in the earliest formal class in chemistry doesn't give one much ability to do things. The power yield gets a little better in organic chemistry, but the investment in time, and the necessity to master a daunting array of facts by internalizing a handful of powerful organizing principles, escapes many, if not most, students. But this is what is necessary to use chemistry to do things.

Computer programming, when I was a kid up through college, was similarly daunting. I learned Basic, then FORTRAN, assembly, then C, then C++. Each new project was begun anew, though I learned to save my earlier work.

Paradigms began to emerge in computer science that made it possible to preserve prior work and code. The use of code libraries meant that once something hard was figured out, that work could be included in new programs. Changing and extending a library could be problematic, but it was very much an improvement.

If libraries of code were designed well, the person using them did not need to worry about how they worked.

This continues to this day with programming models such as Object-Oriented Programming. This took some time for me to understand, but the benefits were clear: when designed well, it allowed even more code reuse and less worry about how the implementation was done. Most significantly, it allowed one to 'grandfather' in earlier code, using inheritance, to make a new variant that extended or changed the original without modifying the original directly.

There is a similar seismic shift underway in Web-based programming. We are in the early phases of this, although web applications are already available that have the functionality of desktop apps (Google's spreadsheet and word-processing applications are examples; you need a Google account to get to them, but that is free, as are the apps).

It is particularly easy to use some of the resources available on the web- Google, Yahoo, Amazon, and others are providing application programming interfaces (APIs) to their services. This allows someone with very limited programming experience to leverage the work (and servers!) of these companies to accomplish goals in their own webpages.

Both Yahoo and Google have map services and provide APIs for them. With just a few lines of code, I can (for example) bring up a map of the National Zoo in Washington DC. All the work to find and display this data, to say nothing of the expense and work that went into gathering and entering it into computers, is available to me free of charge. The force multiplication is hard to imagine.

Blogger, unfortunately, doesn't seem to allow JavaScript in entries, or I'd show some. I have begun to set up a server at home, and I will link to stuff I do there in the future.

Monday, October 16, 2006

Think Different

I recently bought a Mac. I use a PC every day at work, and haven't really fiddled with Macs much since I was a computer system manager at a commercial printing company in the early '90s. I wanted a new computer, and thought long and hard about the new Intel-based Macs. I'd like to be able to say that I made the decision rationally, but actually, I gambled that the new Macs would just be cool.

I have not been disappointed. There have been years of fighting back and forth between Mac and PC users about which is the best. I'm not going to go there, except to note a few things.

Macs are easier to use. While I have not encountered anything that I simply couldn't do on the PC that I can now do on the Mac, most of what I have done has just been easier. I have downloaded only one driver, and yet I have connected my camcorder and digital camera, and every other thing with a USB or firewire port on it, to the Mac. Worked without any fuss. I was able to do this with my PC only after installing drivers, and fussing. I think I am pretty much at the top end of computer literacy (about half of my professional work involves writing software), and am frankly amazed at how easy setting up the Mac has been.

Mac OS X is based on Unix, so automatically there is a crapload of free, high-quality software available. I use OpenOffice for basic productivity work like word processing. Free. High quality. I have set up Linux on a handful of computers over the past few years; OS X is far, far easier to manage. Not to knock Linux, because I love it. But OS X is far less fussy.

Development tools are FREE. You can get crippleware versions of Microsoft's development tools for free, but their professional-level tools are very expensive. You have to sign up for Apple's Developer Program, but this is free, and then you can download their development tools. The real stuff. I have been programming in one capacity or another for over 30 years. The process has become quite involved for modern operating systems. The Apple tools are as good as I have seen.

Overall, from the point of view of a computer user, the Mac is great, a very empowering tool. From a programmer's perspective, I am still in the early stages of finding my way around, but what I have seen so far is phenomenally good.

I like my PCs, and there are places where industry software is more available for the PC. A lot of the programming I do uses a plain old serial port, which the Mac lacks (though it could probably emulate this using the USB port). And the model Mac I have doesn't have slots that I can insert interface cards into (though, again, most of the stuff I want to do with data acquisition can be done via USB these days). On the other hand, my Intel Mac can boot Windows if I need it to.

Monday, October 02, 2006

Hey, y'all look at this...

One thing that has sort of hung over the success of Apollo 11, the mind-bending historical high-water mark for human technology, is that Neil Armstrong supposedly flubbed his lines when he stepped on the moon for the first time. It is telling that this is what people seized upon. And what it tells isn't pretty.

After all these years, it appears Neil was right when he claimed that he had gotten it right. Sophisticated audio analysis found the 'a' in his famous phrase: "That's one small step for a man, one giant leap for mankind."

Since there are loons who think we never actually went to the moon, I am certain some people will not believe this, taking some sort of bizarre pleasure in shitting on one of mankind's (and frankly, Neil Armstrong's) accomplishments. This will continue to reveal the ugly, petty side of human nature.

If it were me, I'd have been hard-pressed not to say "Lookit me, I'm on the goddamned moon!" People would still be talking about that, but there would not be any ambiguity.