Computers are modeling systems. Good ones. So good, in fact, that it’s easy to forget the worlds they create are not worlds at all, but models.
When a computer models a typewriter, we call it a word processor. It can model a spreadsheet, a checkbook, a weather system or a city’s electrical grid. If the computer simulates the model accurately enough, then we hook it up to the pulleys, printers, and devices of our real world. That’s when the model of a typewriter crosses over into the world of things, and actually types a page. But the word processor itself - the program with which we interact - is still just a simulation.
When I was a school kid, those of us interested in computers were taught how to build models on them. We learned computer languages with names like BASIC and Pascal, and writing code in them allowed us to create whatever models we wanted. I remember spending weeks trying to model a three-car elevator bank.
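That kind of exercise translates readily into today's languages. Here is a minimal sketch of a three-car elevator bank in Python; the class names and the nearest-car dispatch rule are my own illustration, not a reconstruction of the original BASIC or Pascal program:

```python
# A toy model of a three-car elevator bank: each call is dispatched
# to whichever car is currently nearest the requested floor.

class Elevator:
    def __init__(self, name, floor=0):
        self.name = name
        self.floor = floor

    def move_to(self, floor):
        self.floor = floor


class ElevatorBank:
    def __init__(self, cars):
        self.cars = cars

    def call(self, floor):
        # Pick the car with the shortest distance to the call.
        # (Like any model, this ignores real-world complexity:
        # direction of travel, passengers already aboard, and so on.)
        car = min(self.cars, key=lambda c: abs(c.floor - floor))
        car.move_to(floor)
        return car.name


bank = ElevatorBank([Elevator("A", 0), Elevator("B", 5), Elevator("C", 9)])
print(bank.call(7))  # → B (car B, two floors away, answers the call)
```

The point of such an exercise was never the elevator; it was discovering that every rule in the simulation is a decision someone made, and could have made differently.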
As a result of this kind of thinking, many of us playing with computers came to recognize the difference between a model and reality. We understood that the models we create, in many cases, lack the complexity of the real world. From an MP3 music file to a population calculator, models can never quite capture the granularity of reality. (Incidentally, this principle accounts for why so many computer programmers became libertarians. It was not simply because they had grown rich and greedy in Silicon Valley, but because they came to realize that the models we use to organize our world - like government, legal systems, and regulations - usually lack the complexity of the things they are meant to represent.)
Today, most people no longer understand that computers model. School children are not taught how to make models on their terminals, but rather how to use the models already in existence. They don’t learn programming - they learn word processing and web-page layout. (I am sorry to inform you, parents, that HTML is not programming, but a mark-up language. It’s equivalent to the kinds of commands you’d type into an old word processing program.) Instead of discovering how to write software for themselves, they are simply learning how to use the stuff already on the shelves.
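The distinction is concrete. A tag like `<b>hello</b>` only tells a browser how to display text, exactly like a bold-face code in an old word processor; a program, however small, describes behavior. A sketch of the difference (the markup string and the interest function are my own illustrations):

```python
# Markup only labels text; it carries no behavior of its own.
markup = "<b>hello</b>"  # an instruction about appearance, nothing more

# A program, by contrast, computes. Here, compound interest on
# savings - something no amount of markup alone can express.
def balance_after(principal, rate, years):
    for _ in range(years):
        principal = principal * (1 + rate)
    return round(principal, 2)

print(balance_after(100, 0.05, 10))  # → 162.89
```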
This may not be such a terrible shame in itself. Many of us know how to drive a car without understanding how the engine works, or how to build one ourselves. But knowledge of new media technologies is different in several respects.
The more we understand about the way in which a medium is produced, the more able we are to distinguish its representations from reality. Take reading. Almost anyone who can read can also write. As a result, we understand that just because something is written down doesn’t mean it’s true. Likewise, once we become familiar with video camcorders and tape editing as home-movie producers instead of just television viewers, we also come to question the veracity of some of the stories being piped into the TV set.
By understanding our computers and networks as a set of fixed models rather than a modeling environment, we effectively cease the evolution of the medium. We can no longer see that decisions have been made about how the technology is being used. It’s like buying a VCR and never finding out that you can put in another tape, or even record with it. Or buying a book without ever getting a pencil - or even knowing that a pencil exists.
Today, most Western college students would rather learn how to make web pages or write business plans than learn actual coding. It’s hard to blame them. The model we’ve decided to build on our computers and networks is that of a market. That’s right - the World Wide Web is a model of a market. It has hooks to the real economy, but it is a model. As such, it will require more and more resources - namely, money - to maintain the illusion of its effectiveness. And because we no longer have the ability to change the models we built, we are forced to accept it as real.
It’s not as if there’s some conspiracy of programmers keeping our hands off the control knobs. One of the side-effects of a profit-driven model of technology is that software becomes proprietary. The less we know about the file formats of a program like Outlook, for example, the more irreversible our decision to use the program for storing email and appointments becomes, and the more easily we can be cajoled into senseless upgrade cycles.
In fact, the code for most Windows programs has grown so complicated and opaque that no human being understands the entirety of any single program. Now that fewer college computer science students graduate with even a working knowledge of programming languages like C++, we stand little chance of ever figuring them out. Unless an open operating system like Linux finds a way to subvert our addiction to the market model of software production, we will grow ever more distant from the source of our code.
Ironically, perhaps, the West is now depending on members of other societies to maintain our models for us. An increasing number of our programmers come from places like Bangalore, where engineers are still taught unfashionable cyber-bricklaying skills like coding. They alone will be capable of distinguishing between the real and the arbitrary, and will likely direct the evolution of our models for years to come.
Meanwhile, those of us blindly invested in the maps our computers have drawn will increasingly depend on the kindness of strangers. Let’s hope they’re nicer to us than we were to them back when we were the world’s cartographers.