Looking for job security in the knowledge economy? Just learn to code. At least, that’s what we’ve been telling young professionals and mid-career workers alike who want to hack it in the modern workforce—in fact, it’s advice I’ve given myself. And judging by the proliferation of coding schools and bootcamps we’ve seen over the past few years, not a few have eagerly heeded that instruction, thinking they’re shoring up their livelihoods in the process.
Unfortunately, many have already learned the hard way that even the best coding chops have their limits. More and more, “learn to code” is looking like bad advice.
CODING CAN’T SAVE YOU
Anyone competent in languages such as Python or Java, or even in web markup like HTML and CSS, is currently in high demand at businesses still gearing up for the digital marketplace. But as coding becomes more commonplace, particularly in developing nations such as India, much of that work is being parceled out piecemeal through online platforms such as Upwork to low-paid workers in digital sweatshops.
This trend is bound to accelerate. The better opportunity may be to use your coding skills to build an app or platform of your own, but that means competing against thousands of others doing the same thing, in an online marketplace ruled by much the same power dynamics as the digital music business.
Besides, learning to code is hard, particularly for adults who don’t remember their algebra and weren’t raised to think algorithmically. Learning to code well enough to be a competent programmer is even harder.
Although I certainly believe that any member of our highly digital society should be familiar with how these platforms work, universal code literacy won’t solve our employment crisis any more than the universal ability to read and write would result in a full-employment economy of book publishing.
It’s actually worse. A single computer program written by perhaps a dozen developers can wipe out hundreds of jobs. As the author and entrepreneur Andrew Keen has pointed out, digital companies employ about one-tenth as many people per dollar earned as traditional companies. Every time a company moves its computing to the cloud, it’s free to let go a few more IT employees.
Most of the technologies we’re currently developing replace or obsolesce far more employment opportunities than they create. Those that don’t—technologies that require ongoing human maintenance or participation in order to work—are not supported by venture capital for precisely this reason. They are considered unscalable because they demand more paid human employees as the business grows.
TRAINING OUR ROBO-REPLACEMENTS
Finally, there are jobs for those willing to assist with our transition to a more computerized society. As employment counselors like to point out, self-checkout stations may have cost you your job as a supermarket cashier, but they have created a new opening for someone to assist customers who have trouble scanning their items at the kiosk, swiping their debit cards, or finding the produce code for Swiss chard. It’s a slightly more skilled job and may even pay better than working as a regular cashier.
But it’s a temporary position: Soon enough, consumers will be as proficient at self-checkout as they are at getting cash from an ATM, and the self-checkout tutor will be unnecessary. By then, digital tagging technology may have advanced to the point where shoppers simply leave the store with the items they want and are billed automatically.
For the moment, we’ll need more of those specialists than we’ll be able to find: mechanics to fit our current cars with robot drivers, engineers to replace medical staff with sensors, programmers to write software for postal drones. There will be an increase in specialized jobs before there’s a precipitous drop. Already in China, the implementation of 3-D printing and other automated solutions is threatening hundreds of thousands of high-tech manufacturing jobs, many of which have existed for less than a decade.
American factories would be winning this business back were it not for a shortage of workers trained to run an automated factory. Still, this wealth of opportunity will likely be only temporary. Once the robots are in place, their continued upkeep and much of their improvement will be automated as well. Humans may simply have to learn to live with it.
HIGH-TECH UNEMPLOYMENT
This conundrum was first articulated back in the 1940s by the cybernetics pioneer Norbert Wiener, whose work led members of the Eisenhower administration to start worrying about what would come after industrialism. By 1966, the United States had convened the first and only sessions of the National Commission on Technology, Automation, and Economic Progress, which published six (mostly ignored) volumes sizing up what would later be termed the “post-industrial economy.”
Today, it’s MIT’s Erik Brynjolfsson and Andrew McAfee who appear to be leading the conversation about technology’s impact on the future of employment—what they call the “great decoupling.” Their extensive research shows, beyond reasonable doubt, that technological progress eliminates jobs and leaves average workers worse off than they were before.
“It’s the great paradox of our era,” Brynjolfsson explained to MIT Technology Review in 2013. “Productivity is at record levels, innovation has never been faster, and yet at the same time, we have a falling median income and we have fewer jobs. People are falling behind because technology is advancing so fast and our skills and organizations aren’t keeping up.”
Yet it’s hard to see this great decoupling as a mere unintended consequence of digital technology. It is not a paradox but the realization of the industrial drive to remove humans from the value equation. That’s the big news: The growth of an economy does not mean more jobs or prosperity for the people living in it.
“I would like to be wrong,” a flummoxed McAfee confided in the same article, “but when all these science-fiction technologies are deployed, what will we need all the people for?”
When technology increases productivity, a company has a new excuse to eliminate jobs and use the savings to reward its shareholders with dividends and stock buybacks. What would have gone to wages is instead returned to capital. So the middle class hollows out, and the only ones left making money are those who live off the passive returns on their investments.
It turns out that digital technology merely accelerates this process to the point where we can all see it occurring. It’s just that we haven’t all taken notice yet—we’ve been busy coding.