This Time, It’s Personal
How digital technology alienates you from your soul

By Douglas Rushkoff. Published on Medium, 3 April 2019

For the past couple of years, I’ve been complaining rather emphatically about the way digital technology has been used to desocialize us: how platforms like Facebook and YouTube turn us against one another by emphasizing our differences and encouraging us to behave like threatened reptiles.

This is indeed lamentable, but in many ways, it’s nothing new. Our media and technologies have been undermining our social bonds for centuries. So, what’s different now? Is this digital alienation the same thing amplified, or is something else going on? Only when we understand how tech has been working all along can we begin to reckon with what’s different about the digital landscape in which we’re living.

First off, we have to accept the fact that at least since the industrial age, major technologies haven’t usually been developed to make things “better” on any fundamental, experiential, human level. They’re nearly always the expression of some underlying economic dynamic, which in our case means capitalism.

As I chronicled in my book Team Human, the industrial age brought us many mechanical innovations, but in very few cases did they actually make production more efficient. They simply made human skill less important, so that laborers could be paid less or be more easily replaced. Queen Victoria sent mechanical weaving looms to India for the sole purpose of hiring less-skilled workers and devaluing local production. Instead of requiring seasoned craftspeople, who were difficult to exploit and replace, assembly lines were designed to use unskilled labor. Workers had to be taught only a single, simple task, like nailing one tack into the sole of a shoe. Training took minutes instead of years. If the workers started to complain about wages or conditions, they could be replaced the next day.

The consumers of factory goods loved the idea that no human hands were involved in the creation of those items. They marveled at the seamless machined edges of industrial age products. There was no trace of the tiny imperfections of hand-stitched or manually crafted goods. There was no trace of humans at all. This became the new fetish. At the Great Exhibition of the Works of Industry of All Nations at London’s Hyde Park in 1851, Prince Albert established England’s preeminence in the global marketplace with a tremendous display of British mass production. Visitors were treated to the spectacle of mechanical looms and steam engines — but no human operators were manning them.

Even today, Chinese laborers “finish” smartphones by wiping off any fingerprints with a highly toxic solvent proven to shorten the workers’ lives. That’s how valuable it is for consumers to believe their devices have been assembled by magic rather than by the fingers of underpaid and poisoned children. Creating the illusion of no human involvement actually costs more human lives.

Of course, the mass production of goods requires mass marketing. While people once bought products from the people who made them, mass production separates the consumer from the producer and replaces this human relationship with the brand. So, where people used to purchase oats from the human miller down the block, now consumers are supposed to go to the store and buy a box shipped from a thousand miles away.

The brand image — in this case, a smiling Quaker — replaced the real human with a mythological one, carefully designed to appeal to us more than a living person could. Just as the goods of industry were considered better by virtue of their freedom from human hands, our relationships with the idealized brands of consumer culture were meant to surpass anything we could establish with an imperfect fellow human. Human relationships are messy and personal. Brand relationships are almost platonic ideals: clean, abstract, and impersonal.

To pull that off, producers turned again to technology: Mass production may have led to mass marketing, but then mass marketing required mass media to reach the actual masses. We may like to think that radio and TV were invented so entertainers could reach bigger audiences, but the proliferation of broadcast media was designed to enable America’s new national brands to reach consumers from coast to coast. Consumer culture was born, and media technologies became the main ways to persuade people to desire possessions over relationships and social status over social connections. The less fruitful the relationships in a person’s life, the better target they are for synthetic ones.

The surveillance capitalism that so many of us are now complaining about is a simple extension of this same effort to disconnect people from one another so we can be worked over individually by the algorithms. Get all of us trapped in our personalized news feeds and interacting with content instead of one another. By preventing human beings from establishing rapport with one another, these platforms keep us from achieving solidarity and power. This is the same old industrial-age, television-enabled desocialization we’ve always endured, only amplified by more powerful digital tools.

In this sense, Facebook is not anything truly new. Like Netflix, Instagram, or Amazon Prime, it’s just attention economy–fueled television running on a digital platform, and its psychosocial effects are the same. Where a new form of uniquely digital manipulation surfaces is in these platforms’ impact on us as individuals — as humans. Once the algorithms have grasped us by the brain stem, they’re no longer working to disconnect us from those we like but from who we are. They’re not alienating us from the value we create but from our intrinsic value as conscious, living beings. They’re disconnecting us from our souls. Or at least from our soul, in the James Brown sense.

What does that look like? It’s a digital landscape where we understand our worth through metrics: friends, likes, and retweets, yes — but also through our utility value as workers and earners, rather than our essential value and dignity as human beings. It’s what encourages us to give schoolchildren iPads and Chromebooks instead of giving their teachers more autonomy and contact hours with pupils. It’s how the chief scientist at Google came to see artificial intelligence as our evolutionary future.

It’s the disorientation so many of us are feeling right now, at this moment, when news stories and urgent issues are presented to us in ways that trigger fight-or-flight reactions but are then replaced by another and another and another before we can make any coherent sense of them. It’s the way algorithms work to iron out our unpredictability and get us to behave more consistently with our consumer profiles. It’s everything from slot-machine algorithms embedded in our news feeds to induce addictive behavior to auto-tuned pop music that quantizes and standardizes the once-human voice. That’s not just antisocial—it’s anti-human.

Yes, the easiest way back is to reconnect with others. Being human is a team sport, and we can recalibrate most easily by simply looking into someone else’s eyes, holding a hand, or breathing together.