Occasionally I remember how old I am. When people ask how I got into computer programming, I usually tell the story of how I was working in data entry and got tired of the repetitive nature of the job, so I automated a piece of it and ended up drawing the attention of the IT department as a result. (I still keep in contact with that guy, 20 years later.) Thinking about it, though, I realized that my start came a lot earlier than that. The realization hit while I was reading an article on The Law of Accelerating Returns. Something in there struck me as being … well, wrong.
The movie Back to the Future came out in 1985, and “the past” took place in 1955. In the movie, when Marty McFly (Michael J. Fox) went back to 1955, he was caught off-guard by the newness of TVs, the prices of soda, the lack of love for shrill electric guitar, and the variation in slang. It was a different world, yes—but if the movie were made today, with the past set in 1985, it could have had much more fun with much bigger differences. The character would land in a time before personal computers, the internet, or cell phones—today’s Marty McFly, a teenager born in the late 90s, would be far more out of place in 1985 than the movie’s Marty McFly was in 1955.
Now, I don’t know about you, but my first DOS-based computer was something like the PC3 “LunchBox” Portable Computer, and it came to me in 1984 or so. Around the same time, of course, we were playing with the Commodore VIC-20 (released in 1981), the Commodore 64 (1982), and the Commodore 128 (1985). So, no, going back to 1985 wouldn’t be all that shocking. Yes, it’d be annoying to have to use a card catalog to find something, rather than asking teh interwebs. It’d also be strange not to have call waiting or cell phones, but I can’t say it’d be particularly troublesome overall. Nor can I really say the world was all that much different.
I got to thinking about how long I’ve been writing software (this time around) because I’d been asked to pull together some screenshots and instructions for a database application I built back in 1998. That application is still running, 18 years later, and still the “system of record” for the company. It and a couple of other systems I’ve written are still ticking over in some form or another. (This one’s running on a virtual machine just to keep it alive, because nobody can get the software any more and nobody really knows how to replace it. I had to install an older development tool just to convert it to its 2003-era format, so that I could then convert it to the current format and have a look through things.)
In any event, I think it’s important to point out that yes, the rate of technological change is ever-increasing. On the other hand, there are bedrock systems which keep on running because nobody is willing to replace something that isn’t broken – they still do their job just fine, and there really is no need to change them. (Have a look at this PCWorld article, for instance.) In parallel with these systems, old code keeps ticking over and continues to work: just about the entire banking sector of the UK runs on COBOL, and the VA hospital system’s electronic medical records system is .NET wrapped around Java wrapped around Delphi wrapped around a file-based storage system – so your medical record is just a text file somewhere, when all’s said and done. Other, operating-system-level foundations haven’t shifted either – there are really only two operating system families in use today, *NIX and Windows NT, and those have been around for decades; everything added to them is just window-dressing.
It’s only the surface of things which has really shifted – the core remains as it was 20 or even 40 years ago. Yes, computers are much faster. Yes, computers are way smaller, and in seemingly everything. But I just don’t see the level of technological change being all that huge even now, nor do I think it’s changing as rapidly as Kurzweil thinks. Or, rather, I don’t think that the entire ecosystem changes as rapidly as all that – it’s that the outliers are arriving faster, but their adoption depends upon their incorporation into the devices and technologies we already use, which is necessarily slowed by our very humanity.
So. Take the time to look back at all the computing you’ve done, and realize how much things haven’t changed, despite the new names and different packages. Ignore the window-dressing and really think about the technology, and you may be surprised at how little has actually changed. Sure, if they implanted teh interwebs into your head you’d be hugely changed – and, yes, they’re working on that somewhere – but do we really see it happening in our lifetimes? I don’t think so, because the rate of change is not governed solely by the tech, but by the economics of the matter and by our ability to incorporate that change.