*Race Against the Machine* and TGS, a comparison

*Race Against the Machine* is compared to TGS in this forthcoming Economist article; see also this earlier piece.  A short excerpt:

Erik Brynjolfsson, an economist, and Andrew McAfee, a technology expert, argue in their new e-book, “Race Against the Machine”, that too much innovation is the bane of struggling workers. Progress in information and communication technology (ICT) may be occurring too fast for labour markets to keep up.

I agree a version of this is happening (though I wouldn’t use the phrase “too much”) and I don’t see my analysis as so different from theirs.  Possibly the three of us could agree on these propositions:

1. IT has seen rapid innovation since the 1990s, and it has led to great gains, but not so much for ordinary workers.

2. Had innovation gains been much greater in non-IT sectors, median living standards would have gone up much more.

I’m not sure if Erik and Andrew would agree with my:

3. Innovation gains in non-IT sectors have been much slower than usual (by post 1870 standards) in the post 1973 period.

I would make a few other points which I suspect they would not agree with, or would wish to reframe:

4. IT productivity was highest in 1995-1998 and those were splendid years for wages and the labor market.  That is one reason why I focus on the “glass half empty problem” (low non-IT innovation post 1973), rather than the “too much IT innovation for labor’s good” argument.  I do think the “too much innovation for labor’s good” argument explains why the productivity statistics from 2001-2004 didn’t translate into significant real wage gains and that is an important phenomenon.  But it’s at the side of my argument rather than central to it.  It’s central to their argument.

5. The S&P 500 has been flat, in real terms, for well over a decade.  I thus see a truly gloomy productivity picture post 1997-98 or so, weighing IT successes against other problems and slowdowns.  (I see this as one issue for their account, since they are asserting good times for capital in recent years.)  More generally, the productivity picture for the U.S. from 1995 to 1998 is quite good and IT-based, and 1973-1995 is poorly understood, highly mixed, but overall still quite inadequate, with the 1970s and early 1990s having been worse than most people realize.  Overall I see the bad productivity years as bad years for the workers, not vice versa.  I also see the broader technological stagnation as setting in well before the IT boom of the 1990s.

6. The biggest employment problems, namely those of late, have come when output is low and/or falling, not rising.  That is another reason why my agreement with some of their major propositions comes at the side of my argument rather than being central to it.  I understand how the TGS argument fits into the cyclical story of 2007-2011 (excess confidence and overextension, Minsky moment, AD contraction, AS problems slow down the recovery), but I don’t understand how the Race Against the Machine story interacts with the cycle, or if it is even supposed to.

I had a long chat with Erik earlier in the week and found large areas of agreement on many matters (without wishing to speak for him in any formal sense, don’t attribute any of this to him!).  We also have quite similar predictions about the future, and this blog post is solely about the past.  These “side arguments” I am referring to are already in the process of becoming more central and they will shape our future.  The story of a largely stagnant median will become progressively less important relative to the story of labor market polarization, and I suspect that over time Erik and I (I didn’t get to chat with Andrew as much) will agree increasingly.

Addendum: Arnold Kling comments.


I think we underestimate the need for partial IT laborers in the next five years. There is a growing need for employees who are not working in IT but who have some IT training and ability. These employees would not spend more than 20% of their time on IT work, but that 20% will be spent using and fixing technology to the productivity benefit of the business as a whole. Often this is called the decentralization of IT, but it has more to do with the tools people are using in the workplace.

Imagine for a second the value of a worker on the assembly line with no computer training or ability. That is what we used to have 99% of the time. Yet computers have become a greater and greater part of the process. The value to the company of these workers is decreasing or at a minimum not increasing even though their productivity is increasing.

Now imagine a new worker. He does not have a computer science degree, but has training and computer abilities above the average. Most important he has an inclination to be able to handle computers outside of a pre-scripted process. The value of this worker to the company is not only greater than the first worker, it increases as the use of computers increases.

I am not saying that education is the answer, but that IT is and will be more in the future spilling over into more labor centered areas and that those employees with or acquiring these skill sets are going to be paid better (or should be if allowed.)

This comment is one of the best observations I have read on the transforming labor market.

I work in the automation controls industry. My specialty is HMI (human machine interface) development, so I've got a lot of experience in exactly what you are referring to. And I'd say the work that the controls industry has been doing in converting over to heavy automation in the US is spilling over to other fields as well. The most obvious examples that most people see are ATMs, self-checkout kiosks at grocery stores and the like, and graphical menu displays at fast food restaurants.

It's easy to think the menu display at a restaurant is trivial, but it is a huge waste reducer and labor saver. It results in significantly fewer incorrect orders and a faster process overall.

I'd say overall you are mostly correct.

"Imagine for a second the value of a worker on the assembly line with no computer training or ability."

Very true 15 years ago, almost unheard of today. Certainly untrue on any line that's had a major upgrade in the last 15 years.

"The value to the company of these workers is decreasing or at a minimum not increasing even though their productivity is increasing."

Yes, agreed. A worker with no computer knowledge is of little value in a modern American factory. Even the tow truck drivers have computers on the tow trucks. There are literally more computers than people in almost any high value product factory. (And all of the low value product factories are gone or are on their way out of the US/Canada.)

"Now imagine a new worker. He does not have a computer science degree, but has training and computer abilities above the average. Most important he has an inclination to be able to handle computers outside of a pre-scripted process. The value of this worker to the company is not only greater than the first worker, it increases as the use of computers increases."

Yes, this is true in my experience as well.

For example, if a company builds a series of 'new' lines (a real-world example): the two lines built in the 1980s might require 14 workers per shift for x biscuits per day, two newer lines built in the 1990s might require 6 workers per shift for 1.2x biscuits per day, and the two newest lines built in the 2000s require 2 workers per shift for 1.4x biscuits per day.

So one could say that the newer lines' workers' value relative to the value of output is decreasing, but it is definitely increasing in absolute terms. Who do you think the factory operations manager is going to put on the newer line? He's absolutely going to place his better workers there, because if something goes wrong he a) loses more production and b) there are fewer people to help fix it quickly.
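The per-worker arithmetic behind this example is worth spelling out (a minimal sketch; the worker counts and the x multiples are the commenter's illustrative figures, not real plant data):

```python
# Output per worker per shift for the three generations of biscuit lines.
# "output" is in units of x (the 1980s line's daily output).
lines = {
    "1980s": {"workers": 14, "output": 1.0},
    "1990s": {"workers": 6,  "output": 1.2},
    "2000s": {"workers": 2,  "output": 1.4},
}

for era, d in lines.items():
    per_worker = d["output"] / d["workers"]
    print(f"{era}: {per_worker:.3f}x biscuits per worker per day")

# Output per worker rises from 1/14 x on the 1980s line to 0.7x on the
# 2000s line -- nearly a tenfold increase in each remaining worker's
# absolute contribution, even as total headcount falls from 14 to 2.
ratio = (1.4 / 2) / (1.0 / 14)
print(f"2000s worker produces {ratio:.1f}x what a 1980s worker did")
```

That tenfold jump is exactly why the operations manager staffs the newest line with his best people: each of the two remaining workers now carries a far larger share of the plant's output.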

"I am not saying that education is the answer, but that IT is and will be more in the future spilling over into more labor centered areas and that those employees with or acquiring these skill sets are going to be paid better (or should be if allowed.)"

I doubt 'education' is the answer, because it's superfluous. An employee who can use a smartphone effectively can use a modern industrial computer. If you can play any kind of strategy video game well, you are more than competent to run a modern line. The fact is that society is busy actively training up the population in an ongoing fashion, and more formal education is unlikely to be of much benefit.

No operator I've worked with needed to know how to write ladder logic code, though some could and were pretty good at it. If I'm doing my job half as well as I should, the HMI and PLCs (factory computers) automate the repetitious parts of the process. The operator's most valuable contribution is as a troubleshooter, and a well-designed system makes that process straightforward. That kind of troubleshooting isn't really something you learn in college. However, you learn it just fine customizing your latest iPhone.

My apologies for a ridiculously long post above. I got carried away.

Excellent contribution, thanks.

Very interesting. My father is an automation engineer, mostly working in plants like the ones you describe. I am more of an office and business automator. Currently I am working in a small service lab. For the past year I have been creating a system to handle data; for the techs out there, it is a basic PHP/MySQL web app. The system is almost done now, although they have been using it as each piece is finished.

It automates a lot of work. Much of it is work the computer already did, mostly in Excel, but the new system does it faster. Click-and-print labels for samples is the easiest example: no more mail merging, and the labels are standardized.

I also do lab work, which has enabled me to make sure the system works well, and working with those who use it all the time helps with instant feedback.

I should also mention that my work here is almost done and I am looking for a new job. My online resume is willmwade.com

My uncle worked in the computer business in the '50s, and he argues there's actually been a great stagnation in that field. He argues all the focus today is on databases, which he calls the lowest functions. Maybe the problem isn't too much technological innovation, but too little major innovation.
Also, friends of mine and I worked in IT in the mid-90s. We used to talk all the time about how we were being overpaid by older managers who thought what we were doing was sorcery and that would end when they moved on and younger managers took their places.

"but not so much for ordinary workers"

I have not yet read "the book", so I apologise if this is already discussed.

To what extent do you think the improvements to the living standards of "ordinary workers" from IT are simply not measured - or measurable - in changes to real income, or changes to GDP/capita? No price index can capture the value my kids gain from being able to Skype their grandparents, or from being able to use Google.

What if the nature of IT/internet innovation is a secular trend which necessarily results in no "newly produced final goods and services", so the econometricians are simply missing them?

"Also, friends of mine and I worked in IT in the mid-90s. We used to talk all the time about how we were being overpaid by older managers who thought what we were doing was sorcery and that would end when they moved on and younger managers took their places."

This hasn't happened yet.

Tyler, I would be interested in hearing your theory of WHY the IT sector has continued the robust technological progress that physical tech had in the decades before 1970, whereas the other areas of technology stagnated. I can argue at length and in detail against the notion that there were physical or scientific limits that were reached in the other fields, or that the low-hanging fruit had been picked. In a technological sense, that is.
For an almost tailor-made example of tech stagnation, consider the speed of airliner service, which plateaued at about 500 mph right around 1970 (after increasing exponentially since 1910). In the 70s there was plenty of understanding of how to build supersonic airliners: we did get the Concorde, and the US built the XB-70, which could have been turned into an airliner. It was an economic, not technological, roadblock. I suspect the oil shocks were a major contributor. But I'd be interested in your theories.

It's partly a quirk of physics - computers get faster as they get smaller and more transistors can be packed into a given surface area. The entire manufacturing process can therefore optimise around this (there are other considerations, of course, but by and large, smaller is better). Computers are unconcerned with the kinds of forces required to produce a causally effective 'wallop', if you like, so they can ignore the kind of fundamental physical constraints of the kind that keeps mechanical or civil engineering progressing at a much slower rate.

To put it another way, computers benefit from an ongoing exponential increase in power that hides the relatively slow growth of actual information processing and core computer science knowledge.

Two good examples: chess-playing computers, and robotics. Our understanding of chess is obviously no greater now than it was in the 1950s, but we saw very early in the development of chess-playing programs that it was simply a matter of time before machines could beat the strongest humans. We developed some better heuristics along the way, but really it was just a case of letting the required computer horsepower arrive.

In robotics, the fundamental algorithms to accomplish things like machine vision and inverse kinematics (the mathematics of deriving the required motor forces to produce the precisely desired arrangement of limbs and joints for movement) have been known for decades. Only in the last three decades could we put this to use in factories, and only in the last 10-15 years did we have enough power in a small space to produce autonomously moving bipedal robots.
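For the simplest case of the inverse-kinematics mathematics mentioned above, a planar two-link arm, there is a decades-old closed-form solution via the law of cosines. A minimal sketch (the link lengths and target point below are made-up illustrative numbers):

```python
import math

def two_link_ik(x, y, l1, l2):
    """Closed-form inverse kinematics for a planar two-link arm.

    Given a target point (x, y) and link lengths l1, l2, return the
    (shoulder, elbow) joint angles in radians, or None if the target
    is outside the arm's reach.
    """
    d2 = x * x + y * y
    # Law of cosines gives the elbow angle directly.
    c2 = (d2 - l1 * l1 - l2 * l2) / (2 * l1 * l2)
    if not -1.0 <= c2 <= 1.0:
        return None  # target outside the annulus the arm can reach
    theta2 = math.acos(c2)
    theta1 = math.atan2(y, x) - math.atan2(l2 * math.sin(theta2),
                                           l1 + l2 * math.cos(theta2))
    return theta1, theta2

# Sanity check via forward kinematics: the computed joint angles
# should place the end effector back on the target point.
t1, t2 = two_link_ik(1.2, 0.5, 1.0, 1.0)
fx = math.cos(t1) + math.cos(t1 + t2)   # l1 = l2 = 1
fy = math.sin(t1) + math.sin(t1 + t2)
print(round(fx, 6), round(fy, 6))  # recovers (1.2, 0.5)
```

The math here was known long before it was practical on a factory floor, which is the commenter's point: the bottleneck was cheap, compact compute, not the algorithms.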

So, the answer to your question is that fundamental CS knowledge advances at around the same rate as the other physical sciences; it just appears otherwise because we're on an exponential ramp of computing power that allows us to practically attack many problems that we already had the theoretical ability to solve.

Quibble: Our understanding of chess is vastly greater than it was in the 1950s, and computers have been a huge contributor to that expanded understanding.

Actually, the "smaller is faster" scaling laws work in the physical world as well. There are a lot more small, fast electric motors out there now than there were in 1960. Furthermore, this was pointed out in 1960 by Feynman, somewhat before Carver Mead and Moore made their observations about microelectronics.

One key element in the rapid rise of electronics (not just computers) over the period is a strong feedback effect from design automation software. Current-day microprocessors would be flat-out impossible to design using the methods of 1990, much less 1960. There's been an "intellectual capital formation" process that has been a major component of the exponential growth curve.

What I'm trying to get out of Tyler is an economic narrative of why the 70s happened the way they did. The front-page story of decline was the energy crisis. But as I understand economics, that should have stimulated investment in alternative energy sources. Yet the one with the highest physics upside -- nuclear, with at least as much headroom for technological advance as electronics -- got shut down instead.

To me, this tells an economic story that TGS was fundamentally a misallocation of effort and resources (e.g. into conservation instead of new energy) and cannot be explained in aggregate terms.

There are the realities of diminishing returns.

Computing has gotten cheaper, opening up new markets for development. We all could have used the NASA moon landing computing technology at the time, but could never have afforded it. A $200 tablet will be used by far more people, driving more and more innovation than the $2500 one.

In so many other endeavors the realities of energy or materials or production capability put rather hard ceilings on what can be produced economically and that will be bought. From time to time the ceiling is lifted by some development or other, but at nowhere near the pace of computing development.

I think you underrate the changes due to IT (and microelectronics generally) since the 1960s. The cheap phone calls, the just-in-time delivery systems, the world-wide supply chains, the automation in factories, the outsourcing, the cheap imports from Asia and the global financial network moving trillions a day are all enabled by that technology. Ordinary workers have seen greater choices and lower prices from all of that.

In TGS, you mention the low (even negative) productivity growth of Education, Health and Government sectors. Most TGS reviewers seem to skip this point. I think it's been deadly, killing a lot of the improvements which IT/tech has delivered. Think of the cost of an international phone call or portable music player now vs. 1960. Now compare the cost of health care (even basic primary care) or a college education now vs. 1960. Why are you surprised that overall productivity stinks with those huge negatives thrown in the pot?

American primary education is so bad that the average American worker has no real advantage over third world labor. Note how many immigrants from third world countries compete directly with American workers with no disadvantage due to their (much cheaper) educations.

In my opinion, the American work force is increasingly divided into a majority with third world skills and a small minority of tech-enabled, somewhat better educated, high productivity people. I agree with the common-sense view that there's no point in doing any job in the U.S. that can be outsourced to the third world. You just pay more here, and for what?

There's no short term fix for this. The people you want to employ have more or less finished their education and are not suddenly going to become more skilled. There's no shortage of labor in the third world. U.S. companies know how to use that labor now, and due to the internet/computers/cheap phone calls, that labor is readily accessible.

Just for the record, "Race Against the Machine" is almost like the Reader's Digest Condensed Version of the first half of our 2009 book "The Lights in the Tunnel."

You can see a September 2011 panel discussion with Tyler and the author of our book here:


and the book is here: http://www.thelightsinthetunnel.com

Also for the record, the solutions offered in "Race Against the Machine" will not be sufficient. Improving education and getting people to start businesses on eBay is not going to solve this problem. Certainly not in the long run. Most likely not in the short run either. I suspect that Tyler recognizes this, as he suggests in the video above that "tent cities will be part of the solution."

You may not agree with the solutions suggested in "The Lights in the Tunnel," but at least it recognizes the full extent of the structural changes that are occurring and does not shy away from confronting the implications of those changes.

"2. Had innovation gains been much greater in non-IT sectors, median living standards would have gone up much more."

I'd like to see this argument; it doesn't pass the sniff test.

One point for consideration in RAM v. TGS is that the innovation cited in RAM has moved so much more quickly than the ability to manage and leverage it successfully. Take ERP systems as an example - certainly, major productivity gains there, but implementation timelines, costs, optimization and support requirements over time are substantial. We're only now getting at the point where the coordination costs for a lot of enterprise IT are dropping significantly due to better back-end automation and management tools.


"too much innovation is the bane of struggling workers."

That strikes me as such an off phrase. It is certainly true, but it's true in the sense that gravity is the bane of people who fall off the Empire State Building.

j r,
It only happened because of the degree to which we were encouraged to see the validation of our skills through a third party or entity... sort of like when the church was the only way to salvation.

Well, thank God China didn't come on the world scene. Imagine if 1.3 billion low cost workers had exploded onto the world stage. Manufacturing wages would stagnate, or perhaps even fall. Chinese savings would cause a huge glut of capital, leading to a lot of financial transactions, and who knows, maybe even an asset bubble. Yes, the wages of labor might decline and the profits of the financial sector might rise...ah, but all this is speculation.

But if it did, obviously the change would be permanent. Because China's wages would never rise, and their finance people would never learn how to do IPOs or private equity deals. No indeed. The Chinese are happy to start at the bottom and stay there forever. So we can thank our stars that they haven't opened up the Chinese economy yet and that the Chinese really aren't interested in material progress at all.

These are all thought provoking comments, particularly mgoodfel's. However, one thing lost in this discussion is the importance of whether the innovation increases production or creates an entirely new product, as opposed to making it cheaper to produce an existing product or service.

If a new tech provides people with a product or service that they never had before, for example tropical fruit in the winter or a car, the tech increases demand and expands the economy. This cancels out the labor-saving effects: no, you can't get work anymore as a buggy-whip maker, but you can become a car dealer, and so on.

If the tech makes the production process cheaper and quicker, but doesn't really provide a new good or service, just the same good cheaper and maybe with more bells and whistles, there is no real expansion of demand. People bought x before; now they buy x2. It leads to a real improvement in their lifestyle, because x2 is a cheaper, better, and more versatile product, but it's not like they were buying one x and will now buy five x2s. You need the same workers to produce the x2 as you did the x; no, wait, because of the tech you need fewer workers, and they have to have different skills. So you just get labor contraction, and there is no place for the laid-off workers to go. But they can now afford to buy x2s with their unemployment checks.

I completely agree that too quick a pace of technology is the bane of labor markets in the short term.

In my own overly simplistic economic model, this should be self-correcting. When technology moves too fast for labor markets to keep pace, unemployment rises and the resulting reduction in purchasing power reduces overall demand. The pace of technological advancement slows, due to reduced investment, until labor markets shift in a way that increases purchasing power again.

I think right now we're kind of fumbling around with shifting labor markets and haven't found a meaningful place for them to settle. Money is available, but a lot of companies are not investing due to the uncertainty of the economy. Tech advancements, and adoption of recent advancements, continue, but at a slower pace. The question is how long will it take the labor market to adjust? Is there even a place for it to land? Are we still advancing too fast, trying to get the labor market to hit a moving target? Or will new tech create a landing spot that doesn't exist yet? I expect investment in new tech to decline until the labor market is well adjusted. Once that occurs, tech investment will pick up again.

So the cycle goes...

I suppose I should add "all other things being equal." There are certainly other influences on the labor market that may either amplify or deaden the impact of advances in technology, and we know these other influences won't remain stagnant.

WRT "low non-IT innovation post 1973" and "early 1990s having been worse than most people realize": it is striking to me that you feel comfortable providing a big-picture macroeconomic analysis without discussing how the price of oil is inversely correlated with these economy-wide productivity gains: http://www.wtrg.com/oil_graphs/oilprice1947.gif and many others. There is no mystery about what happened to the world economy in 1973!

Posts like this one strengthen my sense that energy-based analysis of big economic picture, as popularized by the Peak Oil crowd, challenges academic economics at its intellectual foundations.

I too wondered if TGS and Race Against the Machine are largely describing similar phenomena from different angles.

1) There has been an overall slowdown since the 1970s, and large portions of our economy are no longer involved with wealth creation.

2) Machines can increasingly do these non-wealth-creating jobs better than people. When the whole point of someone's job is to extract billable hours, insurance payments, or union concessions, it's increasingly easy for a computer to fill that person's job. The cyclical downturn just made it much more visible.

3) To incorporate Peter Thiel's argument about credentialism, look at the private industries with mixed records of wealth creation: law, finance, business consulting, academia, and medicine, just to name a few. There is rampant credentialism in these fields. Now look at the two most innovative sectors in our economy right now: arguably IT and, to a lesser extent, oil and gas. Remarkably little credentialism.

Would it be too simplistic to parse the argument this way:

TGS: the pie stopped expanding in 1970

RAtM: capital is getting a bigger slice and labor a smaller one

If so, the facts are not at issue; both stories are true (or at least compatible). The disagreement, if any, is in what to do about it.
