The boom in computer aided productivity
The 80’s were a crazy time. Big hair. Cocaine. David Bowie.
But taking place underneath that glamorous surface were events that would ripple forward for decades in the US economy. These ripples would eventually start tearing America apart and make the US one of the sickest, most stressed, and most violent advanced economies in the world. It’s a weird and winding story, fueled by numbers and silicon chips, but it might help the troubles of today make more sense.
Alvin Toffler, an influential writer and futurist, describes the coming age as one of “super industrialism” in his 1970 work Future Shock. In it he outlines the splintering of the public consciousness into finer and finer threads of subculture, positing that with the increasing speed of change will come an explosion of choice. He explains how permanence in goods, services, jobs, and even personal relationships will become tenuous, a condition he calls Transience.
Most importantly, Toffler defines the title of his book. He describes the effect that this increasing rate of change would start having on humans in the coming decades. He writes of inadequate reactions by those tasked with maintaining order in our society:
He fails to recognize that the faster pace of change demands — and creates — a new kind of information system in society: a loop, rather than a ladder. Information must pulse through this loop at accelerating speeds, with the output of one group becoming the input for many others, so that no group, however politically potent it may seem, can independently set goals for the whole. As the number of social components multiplies, and change jolts and destabilizes the entire system, the power of subgroups to wreak havoc on the whole is tremendously amplified. There is, in the words of W. Ross Ashby, a brilliant cyberneticist, a mathematically provable law to the effect that “when a whole system is composed of a number of subsystems, the one that tends to dominate is the one that is least stable.”
He warns that those who are left behind in this rapidly changing climate will become desperate and begin destabilizing the system. As we will see, his work proves vitally insightful almost 50 years later, in 2016.
Let’s go back to the 1980’s, where things started getting heavy.
In 1983 the famous report “A Nation at Risk: The Imperative for Educational Reform” was released, warning that:
“US students are unprepared to join an increasingly high-tech workforce, and 23 million Americans are functionally illiterate…”
The report went on to make 38 recommendations to strengthen America’s education system. Its findings and suggestions were immediately mishandled as Reagan instead called for prayer in schools, school vouchers, and the abolition of the Department of Education. He managed to erode the power of local school districts and halve the federal budget for education before his second term was up.
In the very same year, Apple’s Lisa, one of the first personal computers with a GUI, was released. Two years later, in 1985, Microsoft announced Windows. Thus began in earnest a pattern of computer use that would come to define both our political beliefs and our bank accounts.
The 1984 census data on computer usage in the US reported computer ownership in 8.2% of households. Usage was most correlated with those who were 35–44, worked in managerial or professional occupations, had children in the home, were white, and had an income of $50k (about $122,977 in 2019 dollars).
14 years later, the year was 1997. Barbie Girl was topping the charts, Amazon had its IPO at $18 a share, and we were approaching the height of the dotcom bubble. Computer market penetration was only 35% but, with the rapid rate of internet adoption, venture capitalists had seen the writing on the wall and were investing like mad. The age bracket with the heaviest computer usage was now 45–54 - likely the same core users from 1984 - with their now grad-school-aged children accounting for the second-highest usage rates.
The dotcom bubble burst in March of 2000. It would take 15 years for the Nasdaq to regain its dotcom peak, during which time the U.S. shed 5 million manufacturing jobs.
This is the shape of an economy restructuring its foundation into something new. Those who entered the new millennium better prepared for this new economy were directly connected to those who were early adopters of computer technology — as in those who had managerial or professional educations, were white, and were already in the top half of income brackets. Those who weren’t so fortunate instead exchanged a median income of $60k a year with significant benefits for jobs that paid less, had few or no benefits, and promised no guarantee of stability. This was in part because 94% of the net new jobs created between 2005 and 2015 were gig economy jobs — as in temporary, on-call, contract, or freelance jobs. The better of these jobs were technical and professional contracts; the worst were, well, driving Uber part time.
A large portion of these manufacturing workers ended up leaving the labor force altogether, which contributed to a participation rate of 63% — on par with El Salvador and the Dominican Republic. When analyzed geographically, the higher the percentage of workers displaced in a voting district, the more red it became.
This change was gradual, repainting the map as the displacement wave moved from manufacturing into local main street economies. In 2016 Trump promised to bring back all those lost jobs and, in the end, the districts that flipped to win him the presidency were ready to try anything new. After all, they were already dying.
The picture is becoming clearer — tech boomed, manufacturing busted, people lost their jobs. But why didn’t more people just go back to school and retrain? Or move into cities seeking greener pastures? That’s what our economics textbooks tell us should happen… right?
First, let’s look at employment statistics to see what people actually did.
We can see employment start rising in jobs that require higher levels of preparation in the 1980’s, which created a visible pay gap around technical and professional education.
The number of workers in occupations requiring average to above-average education, training, and experience increased from 49 million in 1980 to 83 million in 2015, or by 68%. Given that the labor force in 2015 was 160 million, this means that the percentage of the total labor market requiring increased education rose from 33.9% to 51.8%. Were these people retrained?
No. Not by the numbers.
Of course some non-negligible percentage did, but most of the people filling these jobs were newcomers to the market — in fact, the current median age for a tech worker is just 31 years old.
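The share arithmetic a few paragraphs up can be checked directly. Using the rounded counts quoted in the text (49 million, 83 million, 160 million), the results land within a rounding error of the quoted percentages:

```python
# Back-of-envelope check of the labor-share figures above.
# Inputs are the rounded millions quoted in the text, so the
# computed percentages differ slightly from the quoted ones.
higher_prep_1980 = 49_000_000   # jobs needing average+ preparation, 1980
higher_prep_2015 = 83_000_000   # same category, 2015
labor_force_2015 = 160_000_000

growth = (higher_prep_2015 - higher_prep_1980) / higher_prep_1980
share_2015 = higher_prep_2015 / labor_force_2015

print(f"growth 1980 to 2015: {growth:.0%}")            # ~69% with rounded inputs
print(f"share of 2015 labor force: {share_2015:.1%}")  # ~51.9%
```

The small gaps (69% vs. the quoted 68%, 51.9% vs. 51.8%) come from rounding in the underlying counts, not from any disagreement in the data.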
Now that we’ve seen that most people didn’t successfully retrain, let’s consider some possible reasons why much of the retraining logic of conventional economics became less effective after 1980:
First, we can look at direct attempts. The Trade Adjustment Assistance (TAA) program, a federal program for displaced manufacturing workers, found that only 37% of its participants ended up working in the fields they retrained for. Michigan’s No Worker Left Behind program found that 33% of its members remained unemployed after retraining, not far from the 40% unemployment rate of peers who did not enroll. In fact, about half of all Michigan workers who left the workforce between 2003 and 2013 went on disability rather than retrain.
Second, we can consider the thing that really fuels life-improvement projects: money. Real wages (adjusted for inflation) peaked in the early 70’s at $4.03 per hour (about $23.68 in 2019 dollars). They dropped in the 80’s and have remained relatively stagnant ever since, with the few gains made in, surprise, technical and managerial fields.
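That inflation adjustment is a straightforward CPI ratio. A minimal sketch, using approximate CPI-U index values that are my assumptions rather than figures from the text (the choice of reference month and year is why the result lands near, not exactly on, the quoted $23.68):

```python
# Sketch of a CPI inflation adjustment. The index values below are
# approximate CPI-U figures assumed for illustration.
def adjust_for_inflation(amount, cpi_then, cpi_now):
    """Convert a dollar amount between periods via the CPI ratio."""
    return amount * cpi_now / cpi_then

CPI_1973 = 42.6    # approx CPI-U, early 1973
CPI_2019 = 255.7   # approx CPI-U, 2019 annual average

peak_wage = adjust_for_inflation(4.03, CPI_1973, CPI_2019)
print(f"$4.03 in 1973 is about ${peak_wage:.2f} in 2019")  # ~$24.19
```

The same two-line function works for any of the dollar conversions in this piece, given the right pair of index values.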
Third, if we take a look at consumer spending, we can see it has risen dramatically. Consumer spending now makes up 68% of GDP despite wage stagnation. People now spend more than ever before but aren’t seeing increased earnings.
This might explain the precipitous drop in the US personal savings rate since the 1980’s, especially as much of this rise in spending has been in broken markets that don’t respond normally to market pressure: healthcare, housing, retirement, transportation, and education.
And finally, those who needed help transitioning into the new economy and applied for welfare found a nasty trap: cutoff thresholds for most programs were set at 130% of the poverty line ($17,236 in 2019). These policies assumed, and still assume, that at that point a person can be considered back on their feet with a great new job. Unfortunately, between the loss of healthcare, food assistance, and housing benefits, the distance is simply too great.
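This trap is often called a benefits cliff. A stylized example makes the mechanism concrete; the benefit amounts below are hypothetical, chosen only for illustration, and the only figure taken from the text is the 130%-of-poverty-line cutoff:

```python
# Stylized benefits-cliff illustration. The $6,000 benefits package
# is a hypothetical number, not an actual program figure.
CUTOFF = 17_236  # 130% of the poverty line, per the 2019 figure above

def effective_income(wages):
    """Wages plus benefits, where benefits vanish entirely above the cutoff."""
    benefits = 6_000 if wages <= CUTOFF else 0  # hypothetical package
    return wages + benefits

below = effective_income(17_000)  # just under the cutoff
above = effective_income(18_000)  # a $1,000 raise crosses it

print(below, above)  # 23000 18000 - the raise leaves the family $5,000 poorer
```

Real programs phase out at different thresholds and rates, but the shape is the same: near the cutoff, earning more can mean living on less.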
The most common job in the US is currently retail, with the average retail worker being a 39-year-old woman with a high school diploma earning $26k a year. 1 in 3 malls are expected to close in the next 5 years due to Amazon’s increasing share of the retail market. If she were to try to retrain, she’d have to overcome all of the impediments listed above while dealing with an average debt load of $39k - not to mention the difficulty of keeping pace with the increasing complexity of higher-paying markets. This, along with our aging population, is why the average monthly participation rate in major means-tested welfare programs rose from 18.6% in 2009 to 20.9% in 2011.
If you and yours started the 1980’s without the conditions favorable for the upcoming “super industrial” economy then you were likely to be stuck in a stagnant market with stagnant wages. You were likely to find yourself losing your savings to broken markets, scrabbling for any work to keep the lights on, or suffering punitive conditions for joining welfare programs — too old, sick, in debt, isolated and uneducated to absorb the massive explosion of information necessary to remain competitive in the modern technological economy.
THAT is why you can’t just #learntocode.
One of the attitudes towards this issue is “So what? There are winners and losers in every game, after all. It’s the free market; the system will balance itself out.”
After all, we have a record 21 trillion dollar economy, making us the richest nation in both our own history and the history of the world. Our GDP is 27.4% higher than China’s, and China has about 4.2 times as many people - 1.4 billion to our 330 million.
Yet we see a rising number of people permanently on disability. So many people have quit looking for meaningful work that the labor force participation rate is equivalent to that of third world countries. We are seeing such high numbers of people overdosing on opiates and other drugs, as well as committing suicide, that our life expectancy dropped for the first time since the Spanish flu pandemic 100 years ago. We have the highest gun violence rates, lowest maternal survival rates, and highest mental illness rates of any advanced economy in the world.
Why hasn’t any of this put a ding in our GDP? Why does the news say the economy is bustling? How can this be possible?
The answer is economic productivity.
To explain this, let’s take a look at the manufacturing sector. Manufacturing’s share of U.S. employment and GDP over the last 70 years shows that in 1953 manufacturing accounted for 35% of private sector employment. By 2016, that figure had fallen to under 10%. Manufacturing’s share of private sector GDP has experienced a parallel decline: it peaked at 33% in 1953 and by 2016 was just 13%.
However, when you adjust for inflation, manufacturing GDP has kept up with private industry GDP. Its output kept up with overall output despite employing fewer people in the sector and occupying a smaller share of the economy. How is that possible?
We know net exports have been negative since 1980 and have contributed to roughly 13.4% of overall U.S. job losses in the last decade. This means that globalization has certainly had a role, but not a majority role.
The growth in production per worker has played a much more significant role in manufacturing job losses.
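The mechanism is simple arithmetic: employment is output divided by output per worker, so when productivity grows faster than output, jobs disappear even as production rises. A stylized sketch, using made-up index values purely for illustration:

```python
# Stylized decomposition: jobs = output / productivity.
# All figures are hypothetical index values chosen to show the mechanism.
output_1980, productivity_1980 = 100.0, 1.0  # baseline indexes
output_2015, productivity_2015 = 180.0, 3.0  # output up 80%, productivity tripled

jobs_1980 = output_1980 / productivity_1980  # 100.0
jobs_2015 = output_2015 / productivity_2015  # 60.0

print(f"output grew {output_2015 / output_1980 - 1:.0%}, "
      f"yet employment fell {1 - jobs_2015 / jobs_1980:.0%}")
# output grew 80%, yet employment fell 40%
```

This is the manufacturing story in miniature: real output keeps pace with the rest of the economy while the workforce producing it shrinks.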
The kicker: when you remove growth from the computer industry, manufacturing GDP gets cut in half. But aren’t most computer chips made in China? Yes - but this has nothing to do with the manufacturing of chips in the U.S. and everything to do with the rapid advancement of computer technology, which allows for ever more streamlined processes and better infrastructure — thus increasing productivity per worker and decreasing the number of workers needed.
It’s not just automation, robots, or AI. It’s the introduction of information technology into every layer of the economy, reorganizing it more efficiently and setting its pace to that of Moore’s law.
There’s an old saying: “You manage what you measure.” The inventor of GDP, Simon Kuznets, warned that GDP should never be confused with well-being and did not like that it counted armaments and financial speculation as positive outputs. Despite this, America as a whole assumed that a high GDP would mean a better country and came up with a highly successful plan based on its own particular brand of capitalism. Unfortunately, very few saw that most humans wouldn’t be necessary for it to work.
To paraphrase mathematician and economist Eric Weinstein:
Capitalism is simply a tool, an algorithm for efficiency. It gave birth to high technology and was then eaten and replaced by its own child.
The freedom of economic choice, or how free we are to choose how we produce value, is now closely linked to how early and deeply we entered this new technological economy. There are more choices now than ever but they are not evenly distributed in quality or quantity, having drained away from rural areas and into cities as technology optimized itself into efficient nodes. Only the affluent can now afford to live in these choice-rich centers, waited on by the urban-centered poor who are themselves trapped in overpriced living situations, predatory car loans, and restrictive welfare limitations.
This is why a road trip across the country is guaranteed to take you through countless dying towns covered in faded Trump signs, evidence of desperate pleas and resentful pushback from an aging poor white working class left behind during this 40-year efficiency rat race to the top. Towns where the grocery store is also the dollar store and the newest small business is a thrift shop. Towns feeling the brunt of the opioid epidemic. For historically poor demographic groups this sort of desolation was simply how life had always been, but for these latest victims it was new and unacceptable. They had the incentive and means to destabilize the system, and so they did — and the resulting tremors may increase until the entire system shatters and we all lose.
In the end Toffler was right. Technology did give the market the ability to cater to ever finer threads of subcultures - our YouTube and Twitter feeds are living proof. We pay for it with our freedom of choice, by feeding algorithms the vital information that allows advertisers to prey on us, with those who can’t afford to opt out becoming the products, not the consumers. As a result of increasingly optimized algorithms, we fall more and more into echo chambers of similarly minded people. We become blissfully ignorant of the state of the average American.
We have missed entirely the splintering of the American consciousness into categories of economic choice — those who own the tech, those who can use the tech, and those who can do neither. This is the real issue of our era. All others stem from this disparity in economic freedom.
Our time is running out. The 2020’s will see the beginning of global refugee crises from climate change-related disasters, China asserting itself as an AI superpower, and massive economic destabilization as productivity and automation continue to drive normal people to desperation and disruption. We need to redesign our successful capital-centered economic system to one that is centered instead around human well being. We need to rethink what work is. We need to adapt to the challenges of the 21st century and uplift the average mental, emotional, and fiscal complexity of our population before it’s too late for everyone. We have to start now.
How we do that, though, is a long and winding story for another time.