As millions in America sat down to their Thanksgiving turkey dinners last week, we celebrated a shamelessly mythologized reconstruction of our continent’s history. According to the well-ingrained but now-disputed legend, the Pilgrims’ first Thanksgiving was communal in spirit, a demonstration of the debt they owed the Native Americans who taught them how to survive in a new and hostile environment.
So much time has passed since this myth was created that no one really knows what happened – only that the Native Americans were eventually slaughtered, both by the Pilgrims and by those who followed shortly afterward. In any case, there was no one around at the time to correct the “bad” information before it became accepted historical fact.
We now embark on another pilgrimage to a New World – cyberspace. Yet, rather than learning from our previous mistakes, we have already instilled in ourselves a set of myths about computers and technology that rivals the inaccuracies promulgated by our forefathers. Like the myths of the past, most of our Internet legends serve to revise history in a manner that makes the conquerors look like natives, and reality look like superstition or paranoia.
Here, then, in the spirit of truth, are five of the most commonly believed Internet myths, as well as some of the reasons why they persist. See how many you still believe, even after reading this column.
1\. The Internet was invented by the United States military in order to create a communications network that could survive a nuclear war.
Sorry, but this just ain’t true. I believed this one myself until I read journalist Katie Hafner’s excellent book, Where Wizards Stay Up Late, chronicling the real history of the Internet. As many suspected, the US military was not capable of conceiving, much less inventing, an open, interdependent network. Rather, a division of the Pentagon dedicated to research offered grants to a loose consortium of university-based computer scientists who were already developing protocols for processor sharing and information exchange.
This myth gained popularity after a report by the RAND Corporation, a U.S. think tank that often works on military scenarios, was released onto the Internet. The report – written well after the Internet came into existence – did make the observation that a decentralized communications infrastructure could potentially resist conventional attacks, but it had nothing to do with the development of the Internet itself.
2\. Free market competition has led to our greatest technological innovations.
An offshoot of the military myth (above), the libertarian scenario promoted by Silicon Valley and its advocates in the media holds that individuals and companies, competing for profits, developed the Internet as well as the software we use to navigate it. In fact, the private sector only took interest in the Internet after the creator of the Mosaic browser for the World Wide Web formed a for-profit company, Netscape, based on the same technology in the mid-1990s.
The vast majority of the protocols and software we use to communicate through the Internet are based on the work of scientists collaborating at universities. Since those golden years, there have been no genuinely new innovations for the Internet – except, perhaps, JavaScript and the Hotline server system, both reactions to the debilitating effects of a series of profit-driven, closed, and proprietary protocols.
3\. Processor speed will double every 18 months (and this makes computers cheaper).
Otherwise known as Moore’s Law, the myth is true when taken in isolation. Chip speed does tend to double every year and a half. But the corollary to this law is that every time chip speed increases, Microsoft will have developed a new, fatter, and less efficient operating system that effectively negates this acceleration in processor power.
This myth has served to help us rationalize constant expenditures on new computers and upgrades. But while a 300MHz computer costs less today than a 200MHz model did a year ago, we spend more total cash if we continually replace our machines for little or no added benefit. In a flourish of planned-obsolescence bravado, computer companies and software writers create chips that require more advanced operating systems, and operating systems that require newer chips and more memory. Buying more computers, even if each is progressively cheaper, costs more.
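To make the upgrade-treadmill arithmetic concrete, here is a minimal sketch in Python. The $2,000 sticker price, the 10 percent annual price drop, and the five-year horizon are illustrative assumptions, not figures from this column:

```python
# A minimal sketch of the upgrade treadmill described above.
# All figures are illustrative assumptions, not data from the column.

def clock_speed(base_mhz: float, months: float) -> float:
    """Project chip speed under Moore's Law: doubling every 18 months."""
    return base_mhz * 2 ** (months / 18)

base_price = 2000.0   # hypothetical price of a brand-new machine today
price_drop = 0.10     # assume each year's model is 10% cheaper
years = 5

# Replacing the machine every year: each box is cheaper than the last,
# but the outlays accumulate.
upgrade_total = sum(base_price * (1 - price_drop) ** y for y in range(years))

print(f"Projected speed after {years} years: {clock_speed(200, years * 12):.0f} MHz")
print(f"Spent replacing yearly: ${upgrade_total:,.0f}")
print(f"Spent buying once:      ${base_price:,.0f}")
```

Under these assumptions, the yearly-upgrade path spends roughly $8,200 over five years against $2,000 for a single purchase – cheaper machines, more money out the door.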
4\. Anyone who speaks this way is a conspiracy theorist or, worse, a Marxist.
According to many members of the business community, the shareware-based “gift economy” we created through the Internet is dangerously un-American, and a real threat to the health of the global economy. This is because those of us who see the value of shareware and cooperation, and who refuse to participate in the endless cycle of upgrades, challenge the emperor-has-no-clothes pyramid scheme known as the Long Boom.
It is not the Internet or even the global economy that is put at risk by clear thinking and honest reporting – it is only the profits of a few people who don’t know how to make money honestly.
5\. Interactivity means people interacting with machines.
The World Wide Web is no more interactive than a spring or a light switch. It’s the people who make a medium interactive, not the technology. No matter how many bells and whistles there are on a computer, how many cool buttons on an interface, or dazzling streaming video clips on a Web site, they are no substitute for interactions with people. The Internet is a communications technology.
This myth got started because it’s much easier to make money selling people technologies than it is selling them one another. But, for my money anyway, it’s the living, breathing human beings we find online who are this ethereal world’s greatest resource.
The myths that obscure the true history and functioning of the Internet are designed to eliminate the culture, and irrevocably alter the habitat, of its indigenous population. I wonder what holiday they’ll create to commemorate our cooperation.