Computers. I hate 'em.
Not really, of course. They are just kind of frustrating, at whatever level of expertise one might have. My dad is not a newbie, but most of the problems he calls me with are relatively straightforward for me to fix. In my own work, I can find myself chasing a subtle bug in a program, or spending days digging through documentation to figure out how to do something that seems like it ought to be easy. And for just about everyone, from rank amateur to the real hardcore programmers I know, there are times when things happen that just defy explanation.
One reason for this is that computers have become complex enough that they can retain a rather long memory of things that are done to them. Murray Gell-Mann, a physicist, was recently talking about how the universe is the product of fundamental rules plus a bunch of accidents. (You can find this talk, and plenty more to entrance you, at TED.com. If you like hearing smart people talk about the things they know about, you can waste a lot of time at the TED site.)
It made me think: computers have all sorts of 'accidents' going on all the time. Things happen that perturb the system in ways the programmers did not predict, and probably could not predict, even in principle. Some are obvious bugs: some action a person takes sends the computer to a nonsensical part of memory. Others are more subtle, as when some shortage of resources was never imagined and so never encountered in testing. This sort of lack of robustness is a type of flaw, but not really a bug, per se.
A computer system's function relies on keeping accurate track of what resources are in use. When I program, one thing I have to be careful of in some languages is the 'memory leak'. This is when I tell the computer "hey, give me a chunk of memory to store this picture of a baboon's ass" but forget, when I have processed the baboon butt, to release the memory. C and C++ make it pretty easy to do this. Java and some other languages do 'garbage collection' automatically, but to do so they have to tie up considerable resources acting as garbage collectors. There is usually a trade-off between safety of code and performance.
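To make that concrete, here is a minimal C++ sketch of the kind of leak I mean. The load_image function is a made-up stand-in, not part of any real library; it just allocates a buffer so the example compiles and runs:

```cpp
#include <cstddef>

// Stand-in for "load a big image into freshly allocated memory".
// In real code this would come from some imaging library; here it is
// just a placeholder that hands back a raw buffer the caller now owns.
unsigned char* load_image(std::size_t size) {
    return new unsigned char[size];
}

void process_image_leaky() {
    unsigned char* pixels = load_image(1024 * 1024);
    // ... process the pixels ...
    // Oops: no delete[] here. Every call quietly leaks a megabyte.
}

void process_image_fixed() {
    unsigned char* pixels = load_image(1024 * 1024);
    // ... process the pixels ...
    delete[] pixels;   // release the memory once we are done with it
}

int main() {
    process_image_leaky();
    process_image_fixed();
}
```

In a garbage-collected language like Java, the runtime would eventually reclaim that buffer on its own once nothing referred to it anymore, which is exactly the convenience, and the overhead, being traded.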
As computing power increases, it becomes more practical to use 'safer' languages, since the performance hit is not so noticeable. That makes it easier for people like me, who want to use computational power without devoting our entire professional lives to it, to still do useful things.
In the olden days (heck, up to a few years ago) it was pretty hard to use things like networking or serial ports without a lot of detailed knowledge of what the hardware and software does when it moves data around. The realization that jobs like this (and stuff like making lists, printing, etc.) were being done over and over again made some really smart people start thinking about how to reuse software efficiently. For a long time, there have been 'libraries' of code to help with hardware access and all sorts of computational and database operations. Still, many of these have been complicated, and most of them have been islands unto themselves, requiring serious investments of time to learn to use them efficiently. Learning one system might not teach you anything about any other.
Some time back, people started thinking about how they could abstract things, how they could hide details so that a programmer could use a resource without worrying about exactly how things worked at the level of individual bits, or hardware.
This has proven to be very successful. Rather than writing data to a location in memory to use a serial port, I can use a serial port 'object' that hides a lot of the crap I don't care about, at least not most of the time. As important, but more subtle, is that this kind of 'object oriented' programming provides a metaphor that can make the whole process seem much more natural. It takes discipline to do well, and I am still a novice. But the force multiplication in this method is palpable very quickly.
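As a rough illustration, here is the flavor of it. The SerialPort class below is made up, not any particular library's API, and its methods are stubs that just print what they would do; the point is that all the register-poking would live inside them, out of the caller's sight:

```cpp
#include <iostream>
#include <string>
#include <vector>

// Hypothetical wrapper class: everything about device files, baud-rate
// registers, and flow control would be handled inside these methods.
class SerialPort {
public:
    SerialPort(const std::string& device, int baud)
        : device_(device), baud_(baud) {
        std::cout << "open " << device_ << " at " << baud_ << " baud\n";
    }
    ~SerialPort() {
        std::cout << "close " << device_ << "\n";   // cleanup happens automatically
    }
    void write(const std::vector<unsigned char>& data) {
        std::cout << "write " << data.size() << " bytes to " << device_ << "\n";
    }
private:
    std::string device_;
    int baud_;
};

int main() {
    SerialPort port("/dev/ttyS0", 9600);   // no memory addresses or registers in sight
    std::string msg = "hello";
    port.write(std::vector<unsigned char>(msg.begin(), msg.end()));
}   // port goes out of scope and closes itself
```

The caller only has to think in terms of "open a port, write some bytes," which is the metaphor doing its work.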
Some hardcore programmers resist this. They scoff at people who aren't guru enough to manage their own pointers and memory allocation. They don't want any detail hidden. A real programmer should know how every last bit is twiddled.
They remind me a bit of the guilds of clockmakers who resisted the introduction of standard parts and machine tools. Beautifully crafted timepieces can be made with little more than hammers, saws, and files. But that kind of craft means that every clock has its own personality. When your objective is timekeeping, and not art, then standardization becomes important. I want artisans to keep at it, but I don't want to have to be one to know what time it is. There just isn't time to reinvent everything.
There will never be any system that can't be misunderstood, nor any detail that can't be forgotten or overlooked. And it gets frustrating, ever more so as we all become more dependent on computers for all that we do. The truth is, though, that I really love computers, and the better they get at hiding details, and presenting their resources as abstractions that I can use, the more power they yield. And I dig that.