By most accounts, the American economy in 2015 is growing appreciably. Unemployment, while perhaps an undeserving bellwether, is down. The Dow almost daily makes last month’s dizzying heights a footstool. But there’s evidence that while the past few quarters have been good for everyone, the larger trend is fueled by digitally driven market efficiencies that create value without creating jobs.
And as good as the numbers look today, it was only a couple of years ago when it seemed we were ready to declare ourselves economically dead, or at least we were getting pretty tired of being comatose. In 2013 David Rotman wrote an article in the MIT Technology Review called “How Technology Is Destroying Jobs”. In it he discussed Erik Brynjolfsson and Andrew McAfee’s 2011 book Race Against the Machine, in which they posited that productivity and employment had been decoupled, and that, much as his title suggested, digital was beginning to impact not just robot-killed manufacturing jobs, but “knowledge-worker” jobs as well, in medicine, law and beyond.
In my book Digital is Destroying Everything (Rowman & Littlefield) I make the case that a digital onslaught is creating a “blasted heath” where nothing much looks the same after the storm has passed; and that while digital has brought wonders, it has not brought a net gain in jobs.
Today’s favorable economic climate—where it concerns the middle-class worker not employed in the digital workforce—invites the same rebuttal we offer those who argue there’s no “global warming”: one temperate season does not mean the polar melt is slowing; nor does a very cold winter change the fact that we’ve had, in the past decade, five of the warmest years in the past two hundred. Let’s not debate here whether humans have anything to do with it. These things are happening, and will have broad and deep consequences for millions—or billions.
And so it is with digital and the middle-class workforce.
Rand Schulman, a Silicon Valley pioneer and venture capital advisor, puts it this way: “Digital can be like a fire in the forest. Fire destroys a great deal, but nature requires it in order to clear brush and thrive. At the same time, we don’t want to see it burn down the neighborhood.”
If the digital fire has an accelerant, it’s probably a type of computer code known as the algorithm. Some say algorithms are what make computers “smart”.
While the manner in which algorithms are constructed may be abstruse, their effect is easy to understand. Essentially, algorithms are sets of rules and routines that can be repeated by a computer endlessly, in ever-more-rapid fashion and at ever-greater complexity. By and large they are designed to capture the power of what used to be distinctly human endeavor, and thereby empower machines to perform these tasks near-instantly and for almost no cost. Locked inside an algorithm is the accumulated sensibility of millions of years of evolution and experience, now available at the push of an on-screen “button”.
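To make that concrete, here is a toy sketch of my own—not drawn from any real imaging product, with thresholds invented purely for illustration—of the kind of rule an algorithm encodes: the judgment a darkroom technician once made by eye, deciding how much to brighten or darken a print.

```python
def exposure_correction(mean_brightness):
    """Return a brightness correction factor for an image whose
    pixels average mean_brightness on a 0-255 scale.

    A darkroom worker made this call by instinct; here it is
    reduced to one rule: scale toward a mid-gray average of 128.
    (The target value is an assumption for this example.)
    """
    target = 128
    if mean_brightness <= 0:
        return None  # a black or invalid frame; nothing to correct against
    return target / mean_brightness

# Applied to a batch of thousands of images, the rule runs in
# milliseconds -- work that once filled a service bureau's day.
corrections = [exposure_correction(b) for b in (64, 128, 200)]
```

The point is not the sophistication of the rule but the repetition: once written, it executes millions of times at essentially zero marginal cost, which is exactly where the human job used to be.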
I’ve been building and measuring web sites for twenty years; and can testify to the enormous benefits we’ve been handed by Internet-related technologies. But it would be a mistake to say there aren’t more than a few inconvenient truths about digital that very few people have yet addressed.
In some industries affected by digital, all the damage has already been done. We ought to look at how quickly the changes came, as it may prove helpful in determining how rapidly and how utterly change will come elsewhere. Take for instance the exercise of capturing images and then printing them. We have come to call this “photography”.
Do you know anyone who works today in a darkroom? New York City’s West 20th Street, as late as the early 2000s, was the main drag of what was called “the photography district”. You wouldn’t so much buy cameras there, but if you were a photographer you might have your studio nearby and, more importantly, you might patronize one of several large “service bureaus” that took your film negatives and printed them out—in the 20th century on photographic paper or transparency film, and then later by “scanning” them into digital files ready for print.
Then a team of visionaries in Silicon Valley wrote some algorithms and bundled them into a software product called “Photoshop”. Quite suddenly you could take an image and transform it on your laptop beyond anything you might have hoped for in the darkroom. Almost overnight, the service bureaus and their dedicated, talented darkroom workers shut down and went the way of the African white rhinoceros. Also, to the extent you can recall that the Kodak company used to employ tens of thousands to make film and huge profits, you are beginning to show your age. It went bankrupt several years ago and its home city, Rochester, NY, has never been the same.
I’m a big fan of Photoshop—I’ve used it extensively for both creative and commercial pursuits. But if I were a darkroom worker, I’d have focused my last enlarger-lens years ago. Maybe I’d have a different, better job. Maybe not.
The change was rapid and near-universal. Today there are a handful of photo labs in and around 20th Street but the so-called “photography district” is no more.
Who’s next? Everyone holding a job that does not involve agriculture, or the necessity of interacting personally with customers, partners, or legally protected entities like the judicial system.
And it’s become harder to ignore the suggestion that the entire point of most software today is to disintermediate people themselves. Entry-level clerical jobs, long a springboard to respectability, especially for women, are almost entirely gone. They have been replaced by email, Dropbox, Google Calendar and instant messaging.
Highly valued companies used to employ many thousands of workers and, to be sure, some still do. General Motors at its peak employed more people than any entity save the Soviet state enterprises. Today, Twitter is valued at over $30 billion, but employs fewer than 2,000. It raises the question of who will be the consumers for business if such an outlandish value-to-employment ratio were to become the norm.
The creative professions are now under attack as we have “templatized” content, and gained access to an uncountable number of images available for free or almost free. Algorithms are now writing natural language business abstracts and sports recaps. As for music—in an over-sampled world where no one pays for tunes—many say the music industry does not exist any more, or at least not in any way consistent with its history.
It is entirely possible that in the next decade or so, we may reach a job-market singularity in which the only remaining players in the economy are the software owner, the developers who refine the algorithms, and those who can use that software to make their businesses run with less “friction”. “Friction”, in this construct, means chiefly “people”: those messy, inconstant beings that sometimes get cranky or miss work and have annoying things called “rights” and “salaries”.
A History of Great Upheavals
With a job-market singularity will come massive social displacement. I don’t fear that humanity cannot survive the age of robots. But I do question how long it will take for us to make the adjustment, and how much harm will be done in the process.
Digital may be the most important technology of the past several centuries. And it may be at the root of some rapid and encompassing change. But we have been at this juncture before. Not often, but not never. I suggest we look back towards two other global shifts in human history to see what sorts of changes were wrought, and what became of us.
The human clan started with no technology much beyond a sharp stick. Naked but for animal skins, we lived and died by the spear. Then we planted seeds, built some shelter out of stuff lying around, and stayed in one place for a while. The result was what today we call “civilization”. Centuries of famine, bloodshed and halting progress suggest we are still perfecting the enterprise.
Sometime after the American Revolution and before the Civil War, we saw the beginnings of the Industrial Revolution. Whitney’s cotton gin did the work of ten laborers and sent them from the field to the factory. A century of machine-driven bad behavior got us to the cataclysm of two world wars. After that, and with America astride a broken world, we can suggest the Industrial Age at last fulfilled its potential. It provided well-paying work to almost anyone who wanted it. We had fins on our cars, backyard barbecues, and the threat of nuclear annihilation—emblems of a culture unbound and precarious.
It took thousands of years before agriculture proved equal to feeding the hungry. It’s been about two hundred years since the cotton gin, and now in Switzerland we’ve got the Large Hadron Collider, a sort of atom-gin that helps us peer into the very mysteries of existence.
We’re still here. America’s not really astride anything anymore, but we are still dominant in many important ways, and that includes the ways of digital. But digital may prove our biggest challenge yet, and we have a long way to fall.
The Upheaval Now
During the 1960s, while Timothy Leary was urging young folks to turn on, tune in and drop out, others were automating data. Only a few thousand people in the world used computers then, and in the popular imagination they dressed in lab coats. Then came the personal computer and the web and mobile and Snapchat. And here we are.
When the Internet was younger, some (including me) wanted to compare it—grandly enough—to the automobile industry. How it would change us! Much as the car would replace the buggy, we would go the same places, only faster. Now that seems small beer. Today it makes more sense to compare “cyberspace” to the very experiment of civilization itself, and to the way in which agriculture and industry took us from barefooted serfdom to spiky Manolo Blahnik knockoffs.
Curving Away From Full Employment
Past upheavals have resulted in mass-migrations, the arrangement of enormous new command-and-control structures, rivers of blood—and sliced bread. If we buy into the notion that digital is already transforming us in ways that rival these earlier paradigm-shifts, then what shouldn’t we do to prepare for the troubles that may come with it?
We built our success on the emergence of a ubiquitous middle class. Terrific armies of commuters and a ganglion of highways are testament to its dominance. While digital is creating lots of good jobs, it’s not creating as many as it is destroying. And that curve is accelerating in the wrong direction. According to Brynjolfsson and McAfee, sometime around 2000, a significant gap opened between productivity and employment—cue the Internet. It is taking many fewer workers now to get the same result. And as algorithms make commerce less sticky, make sales cycles more automated, make human interactions more remote, gone are the ladies’ departments, the perfume-sprayers and their managers, the back-office admins and, often enough, senior management as well.
In Digital is Destroying Everything, I make the claim that we are only beginning to see the effect of digital on the knowledge industries. Education is too expensive and no longer a guarantee of the “good life”, so it goes online with free courses and targeted certifications. Medicine becomes a society of caretakers and attendants while machines diagnose, prescribe and deliver. Financial services is already well beyond the trading floor and into a world of machine-driven, microsecond mega-trades.
Marc Andreessen (Netscape/Andreessen Horowitz) has said that in our lifetimes, there may no longer be an appreciable brick-and-mortar retail industry. Thousands of weed-choked strip malls augur this as well. Stephen Hawking, while he may be having second thoughts about black holes, has recently said that artificial intelligence may be the most significant threat to humanity in its long history. In 2014, Tesla CEO Elon Musk said that toying with artificial intelligence is like “summoning the demon”. And Vint Cerf, a “father of the Internet”, warns of a “digital dark age” in which all the records of our era disappear like Hillary Clinton’s email server.
In the 1950s, nuclear power was touted as a panacea—a reactor in every garage and everything for free. One Chernobyl and a Fukushima later, nuclear is in the doghouse. Too many pundits today seem to believe digital is our own cure-all. But there are rumblings underfoot that should be getting our attention.
A Middle-Class Santorini
Awash in the pearly-blue waters of the Aegean sits an idyllic island known today as Santorini and in antiquity as Thera. Some say it’s a likely site for the legendary Atlantis. That’s because the entire island is a crescent-shaped volcanic caldera. We already know it was home to a Minoan civilization that flourished and failed before the Greeks. We also know there was a volcanic cataclysm there, and that what remains is the dormant tip of a volcano and a gorgeous tourist destination.
The Minoan civilization was advanced. They had indoor plumbing and central heating. And then, if Plato’s story of Atlantis is any guide, their civilization collapsed in a shower of hot ash and was covered in lava.
Are the middle classes dancing at the edge of a caldera? How much more displacement can we stand before the algorithm, soon morphing into a human-like Artificial Intelligence, erupts and buries the employed classes in an avalanche of code? Will we see a new kind of company with only a handful of workers and a trillion dollar market cap? At what point are there no longer any customers for business? Will joblessness drive up the middle-class misery index to the point of insurrection? Or will the idled masses be content with free movies on their smartphones?
There’s every chance humans will figure a way through this challenge like we have in the past. We may enjoy lives of leisure and plenty because artificially intelligent machines will do all of the work—much as we had long hoped. But it’s just as likely we will have unemployment well beyond anything we can imagine today; and that our world will be disconnected, impoverished and in every way diminished by lack of opportunity. Today we are doing absolutely nothing to forestall the worst.
Likely, we will find out within a decade where this is headed—maybe sooner than that. And if you’re not an owner or an engineer, let this come as a friendly suggestion: learn how to code. It may mean your livelihood not long from now.