I graduated from high school in 1982. Computers had really just begun their meteoric rise to domination of our work, leisure, and culture.
This is a photograph from my senior yearbook. The computers here are the Commodore PET, certainly a step up from the ubiquitous TRS-80 that I first programmed, but not quite as clever as the Apple II. We learned about programming from a fireplug of a guy, a retired engineer whose name I do not recall. He taught "Computer Math" and physics, but I remember him best for revealing the existence of "machine language", the real ultimate language that the computer spoke. All of our BASIC programs had to be turned into machine language to be executed by the computer. This one idea has fascinated me ever since.
In those days, obviously, the idea that you could look at content, with typesetting and pictures and links to other stuff, made on one person's computer and hosted on yet another, was the stuff of science fiction at best. To us, who were there when modern computing began, software was not the focus it is now, largely because there was so little of it. As a result, the drive to understand what was happening down at the metal and silicon dominated a lot of computer junkies.
The best analogy I can think of is neurobiology. In those days, we had the equivalent of leeches and slugs for computers, so the fun part was knowing what every neuron did. Now, I think we are more or less at the rodent or lemur stage, so we can ignore the lower levels; to some extent, it is counter-productive to think about the lower levels too much now that there are stable paradigms for hiding the details. The details have to be obsessed over by someone, of course, but the real leverage is at the cutting edge: distributed processing, web programming, grid computing, the integration of services like mobile and internet, and the creation of new services that live on the stable base built over the last 26 years.
Sometimes, I try to look at the desktop on my iMac with the same eyes that stared at the white-on-black text and block graphics of the PET all those years ago. I knew then that things would get wildly better: I could see, by analogy to other technologies like television and radio, that we were at the horse-and-buggy stage of computing. But I am still amazed at what it has become, and I realize that we have only begun to plumb the potential of the technology we already have, let alone what will come next. Living in a time of near-exponential technological expansion makes prediction difficult, but I'm willing to be optimistic and bet that it will be good.