Are the productivity gains from tech fading?

Chris Oestereich
Wicked Problems Collaborative
Feb 21, 2018 · 5 min read


I’ve struggled with an idea for a while: the idea that economic productivity gains from technology are stalling out. The measure had a good run leading up to the Great Recession, but the long-run average has faded ever since (as shown in this graph from the FT).

Productivity was long a key metric for wages; throughout the middle of the 20th century, the two measures were tightly correlated. But around 1973 they parted ways, as productivity continued growing while wages essentially leveled off.

With productivity slowing, the accepted explanation is that our technology sector is no longer creating the sort of economic gains it used to. That’s a sensible explanation, one shared by economists like Larry Mishel, Josh Bivens, and Marshall Steinbaum (thinkers I hold in high esteem and whom I have no business challenging, btw).

Even so, I would like to posit an alternative explanation. Maybe it’s nonsense, but if nothing else, I need to get it off my chest. And here’s the thing: they have the academic chops AND Ockham’s Razor in their favor, so I know I’m out here on thin ice swinging a sledgehammer, but what can you do? The idea’s been eating at me for a while, so I’ll share it, and if that leads to a good chuckle for others, I’ve got broad shoulders.

Here are a few job categories as reference for the rest of this post: routine cognitive, routine manual, nonroutine cognitive, and nonroutine manual.

The idea that I have stems from my experience working in the US corporate workforce. There you have a wide variety of roles, and work that’s often very unevenly distributed. Some people are overwhelmed with work, while others seem to have a lot of free time. Maybe some are more or less efficient, but even granting that, it’s highly implausible that the volume of work within large organizations could be evenly distributed, even if that were desirable. (Would you want HR filling in for a shorthanded Operations group, or Sales dealing with a sticky HR issue, just to balance out the workload? Maybe that would work out in some cases…)

On top of this, there’s always some amount of slack (more hours available for work than hours of work available) in any cognitive-based work system, at least in some roles. The only way to avoid that is to overwhelm everyone constantly, which leads to a host of problems.
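To make the idea of slack concrete, here’s a toy sketch. Every number in it is invented purely for illustration; the point is just that individual idle hours add up to system-level slack.

```python
# A toy illustration of slack: hours available minus hours of work demanded.
# All figures here are made up purely for illustration.
workers = {
    "overwhelmed analyst": {"available": 40, "demanded": 55},
    "steady engineer": {"available": 40, "demanded": 38},
    "underused manager": {"available": 40, "demanded": 20},
}

for name, hours in workers.items():
    slack = max(0, hours["available"] - hours["demanded"])
    print(f"{name}: {slack} hours of slack per week")

# System-level slack: the sum of everyone's idle hours.
total = sum(max(0, h["available"] - h["demanded"]) for h in workers.values())
print(f"system slack: {total} hours per week")
```

Even with one person badly overloaded, the system as a whole still carries slack; overload and idle time coexist.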

If you accept the premises to this point, here’s where it might get interesting. Check out the job growth along the four lines in the graph below.

Jobs Routine vs. Nonroutine

The two routine lines (routine cognitive and routine manual) have been relatively flat over the past 35 years. Meanwhile, the other two lines (nonroutine manual and nonroutine cognitive) have seen significant growth.

Back in the mid-1980s, we had about 60 million routine jobs and about 40 million nonroutine (give or take a few million jobs here or there).

Today, we have a little over 60 million routine jobs and somewhere around 80 million nonroutine jobs. So, over the course of 35 years, the mix of jobs has gone from around 60% routine at the outset to nearly 60% nonroutine today.
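If you want the back-of-the-envelope arithmetic spelled out, here’s a quick sketch using the rough job counts above (they’re approximations, not precise data):

```python
# Back-of-the-envelope job-mix arithmetic using the rough figures above.
def routine_share(routine_millions, nonroutine_millions):
    """Routine jobs as a share of all jobs."""
    return routine_millions / (routine_millions + nonroutine_millions)

mid_1980s = routine_share(60, 40)  # 0.60 -> about 60% routine
today = routine_share(60, 80)      # ~0.43 -> about 57% nonroutine

print(f"mid-1980s: {mid_1980s:.0%} routine")
print(f"today: {today:.0%} routine, {1 - today:.0%} nonroutine")
```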

Here’s the crackpot theory:

What I think might be happening is that technology is still taking on ever more of our work, but instead of making us redundant, it’s making us cling to our jobs. If this is right, people are continually ending up with less work, but then they’re doing whatever they have to do to look busy (busywork, meetings, etc.) to keep their paychecks flowing. (I’m stealing liberally from David Graeber here.)

Why do I think this more complicated explanation might have legs?

I think a few different factors make this plausible. First, unemployment benefits have been on the wane for many years, so people know that being out of work is more likely to mean challenging circumstances these days. Second, many who have been “made redundant” in recent years have had a hard time finding similar work at similar pay, so wariness of unemployment has grown over time. And third, people’s finances seem to be growing ever tighter. (One recent survey of American workers found that only 39% of respondents would be able to cover an unexpected $1,000 expense using their savings.)

So, given those circumstances, behavior that might seem “irrational” from the perspective of a system that’s expected to self-optimize is instead the outcome of individuals behaving highly rationally at the “expense” of the system.

Why wouldn’t this be self-evident?

When the workforce was 60% routine, it may have been easier to suss out labor productivity. A worker who’s cranking out physical or intellectual widgets of a known sort can probably be measured more reliably than one who’s effectively performing freestyle jazz every day. (How many 15+ minute solos did you perform today?) As the mix of work has shifted, maybe measuring labor productivity has become more complicated.

I want to be clear that I’m not trying to call the big-picture stats into question, but rather asking whether tech productivity is still increasing at a solid clip while the overall figure is being held back by slipping labor productivity. If this were correct, it would seem to suggest a serious problem, as recent productivity growth has been used as an argument against worries of massive job loss via the “4th Industrial Revolution.”
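One way to see how that could work is a toy decomposition of the headline number. The weights and growth rates below are pure assumptions for illustration, not real data; the point is only that a rising component and a slipping component can net out to a flat-looking aggregate.

```python
# A toy decomposition: aggregate productivity growth as a weighted mix
# of a "tech" component and a "labor" component. The weights and growth
# rates are assumptions invented for illustration, not real data.
tech_weight, labor_weight = 0.3, 0.7
tech_growth = 0.04    # assume tech productivity is still rising briskly
labor_growth = -0.01  # assume measured labor productivity is slipping

aggregate = tech_weight * tech_growth + labor_weight * labor_growth
print(f"aggregate productivity growth: {aggregate:.1%}")  # -> 0.5%
```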

Hopefully, I’m wrong. But the idea that tech productivity is slowing down, given the seemingly constant advances in business-related tech, has been hard for me to accept. Maybe my biases have gotten the best of me. We’ll see.

Originally published at Chris Oestereich.
