Scott Sumner on digital deflation

He makes many good points, but this is my favorite:

From 1995 to 2004, productivity and real GDP rose at an unusually rapid rate.  The IT cheerleaders told us that this fast productivity growth was the long delayed fruits of the IT revolution.  Now we have very slow growth, and the digiterati tell us it’s also caused by the IT revolution, which is generating lots of stuff that doesn’t get picked up in the output data, because it’s free.  While I’m impressed by an explanation that’s as flexible as a circus contortionist, I’d prefer something that isn’t consistent with any possible state of the universe.  I’m no Popperian, but I like my theories to be at least a little bit falsifiable.

In other words, we know what a boom looks like, and this ain’t it.  I would, however, assent to and indeed stress two propositions:

1. Infovores are indeed much better off from the recent digital revolution.  And since most journalists and tech leaders are infovores (many academics too), they extrapolate too readily from themselves.

2. The “rate of productivity growth in consumption” is more mis-measured than is the rate of productivity growth in production.  Facebook really is fun for a lot of people, and unpriced on the consumption side; I sometimes say that I am a happiness optimist and a revenue pessimist.  But the production side of the economy matters in its own right, and indeed that is why they call it productivity.  Debts and bills must be paid, and jobs must be created at wages people will take, whether or not you’re having fun with Angry Birds or cursing at your (least) favorite blogger.  In fact we just had a recession where the jobless probably had more fun than ever before, due mostly to the internet.  It was still an event of significance.

So don’t aggregate consumption gains (e.g., learning to enjoy your Brussels sprouts more) with productivity gains, proudly parading a single number and claiming that everything is fine.  It is better, and more accurate, to say: “We’ve now learned to really love those Brussels sprouts, good for us, but we still may be in deep doo-doo.”


How did we go from "Against photos (rant)" to "Infovores are indeed much better off" so quickly? As an infovore, I'm not convinced things haven't moved backwards in the past 8 years.

Surely the majority of the benefits of a highly connected internet go to the least educated, lowest social classes? The biggest areas of the net are things like Facebook, YouTube cat videos, porn, and gaming. Hardly the province of the "infovore".

Social media helps the dregs of society meet each other, reproduce, endure their miserable jobs and then divorce with cheap legal documents downloaded off the Internet.

I'm not sure where people are getting the idea that "productivity" is somehow in trouble:

Yes, it did grow at a higher rate between 1995 and 2004, but I'm not so sure that this had much to do with IT. This was also the period of globalization.

Automation and IT may have played a role in it, but once they're ubiquitous throughout the economy, how much more can you gain? Same with globalization. Once the low-hanging fruit has been picked, it can't be picked again. There's not a new China or a new India or Mexico opening up every decade.

Second, there obviously are some measurement errors here, since the BLS's measure of productivity excludes certain industries and activities where inputs and outputs can't be measured so well.

So a simple answer to the question would be to look at productivity within industries. Which industries "slowed down", and which, if any, increased or maintained a higher rate after 2004? The BLS ought to be able to provide an answer, and this would shed light on whether this "slowing down" occurred in the sectors affected most by IT, or not.

Third, other things started happening around 2004. Mainly, labor participation rates by different age groups changed. I.e., the "baby boomers" entered the "least productive" stage of their careers, and they started to become a larger share of the labor force (as more young people stayed in school longer). As the labor force becomes older, it also becomes less productive.

But seems to me, the aggregate productivity measure won't tell you much. You've got to at least look at industries making up that number.

PS: Of course, starting in 2007 at least, there were troubling signs of a recession, so saying that this decade has been slower than "usual" (although it doesn't seem to be so; it seems pretty standard) isn't saying much. A massive recession and the slowest recovery on record probably have much more to do with this than anything else.

"Yes it did grow at a higher rate between 1995-2004, but I’m not so sure that this had much to do with IT. This was also the period of globalisation."

I read a Fortune article 16 years back saying that 80% of the growth in US productivity in the '90s was from WalMart: globalisation AND IT! [& logistics]

Sure, globalization and IT. And labor market composition. And massive recession. And probably lots of other things.

Also, if productivity is inputs-outputs, I can think of at least one major input that tripled in price around 2004-2005.

The IT revolution of the 1990s was quite a bit different in character than the social networking and SaaS story of the present. So you're putting together two different narratives because you believe some of the narrators are the same for both.

right, I think economists are looking at "IT" in way too broad a fashion compared to how actual practitioners encounter it. There's web stuff, which is often free and probably not well captured in the statistics. But there was also a ton of Corporate IT back office automation that really did make businesses more efficient. Think databases, ERP systems, payroll automation, supply chain management... Un-sexy work but probably really did move the numbers.

A bit more on this, from my own experience. I started a job in 1995 that involved running a lot of statistical analysis. I did these on a desktop Compaq with a 33 MHz chip, and I would have to wait hours for the bigger runs to complete. When I was doing those same runs on a 90 MHz Pentium later in the '90s, it really was about three times faster. Some people thought I was really good at my job, but it was just that the clock speed was increasing my productivity. Well, once processors were in the GHz range, the computer wasn't the bottleneck.

And actually, by then I had switched into proper IT work. During that decade of 1995-2004 we got paid a ton in IT to eliminate lots of mundane jobs. All the money being invested in our services would have increased GDP, and knocking out those mundane jobs would have increased productivity (on a per-worker level). But... That work is done now. It wasn't some trend line that was going to continue forever; it was a bounty that we harvested. Is this not what theory would predict? There was a lot of investment in IT capital to reduce labor expenses, and after that has happened we see fewer jobs and the spending is lower (that investment isn't a forever spend).

Why am I supposed to be surprised? This is not a sarcastic question - I'm not an economist so I don't know why your theories expect something different than what happened.

But I look at the modern world and see it is full of things that are yet to be automated.

Once upon a time a process involved someone monitoring something with her eyeballs. Later a CCD was introduced, but the resulting images were then inspected by eyeball. Later image-analysis software was added, but it was still launched by hand. Then the process of taking and processing snaps got automated, but the results are not automatically used by the rest of the computers in the factory, and on and on it goes.

Nothing in life is"free", including what's available on your computer or smart phone. But what about all those "free" apps, the "free" internet, and the "free" services like those offered by Facebook? Manipulation takes many forms, some obvious, some subtle, the most successful forms of manipulation being the subtle kind.

"I’m no Popperian": why, in God's name, not?

One reason his "science" is dismal.

No one's a Popperian, except about other people's theories.

Various people who automate inductive reasoning cite back to Popper approvingly, and wire the concepts into their algorithms. Perhaps if it were practical they too would selectively apply rigor only to attack the theories of their rivals. However, they labor under the technical constraint that their creations (speech recognition systems, robot control systems, and other things) must operate fast and unattended, which leaves them no practical alternative but to apply these ideas with robotic impartiality.

Just because it's very human to criticize one's rivals' spelling or grammar or handwriting or arithmetic more severely than one's allies' isn't very good evidence that there's no such thing as good grammar or handwriting or spelling, or correct arithmetic.

I don't think any macroeconomist can be a true Popperian, given the near impossibility of empirically disproving any macroeconomic model.

Popper was on the right track in some important ways, but he doesn't seem to have been all that good at thinking about inference from realistically messy data and at evaluating theories based on their performance on historical (as opposed to controlled-experimental) data. That's not a limitation that can easily be waved away in economics. In particular, AFAIK Popper never seriously connected to work on information theory or even probability distributions, a lapse that doesn't seem to reflect well on Popper or academic philosophy, given that Solomonoff induction was developed well before Popper stopped working. And Solomonoff induction, and its intellectual descendants (like the catchphrase 'learning is compression', and various concepts and techniques advancing under the banner of Bayes), make it much easier (than reading Popper and then sitting under an apple tree hoping to be hit with brilliant insights about how to get from there to the real world) to evaluate historical data, systematize Occam's Razor, quantify overfitting, and cope with probabilities and increasing confidence instead of unrealistically sharp (in the context of inductive reasoning) dichotomies about perfectly known truth or perfect falsification. So I recommend being a Solomonoffian or Vapnikian or Rissanenian or something like that.

I think various grant-funded fields, including various fields of economics, definitely including macroeconomics, could straightforwardly become less characteristically grant-funded "science" by becoming a lot more Vapnikian or Solomonoffian. As they stand, it sure doesn't look to me as though they're serious about overfitting and related issues. In economics and climate modeling in particular, I really don't know how naive enthusiasm for hopefully-not-overfit hindcasting can be presented with a straight face, because both fields routinely encourage mathematical hairiness which is significantly gnarlier than the math required to express whether a theory or model is any simpler than the historical regularity it has been crafted to match. (Or the Vapnikians would likely put it in different terms, um, something like whether the data set, and the closeness of the match to it, is sufficient to pick the model out of the appropriately agnostically huge family of alternative models, maybe.)
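The overfitting complaint above can be made concrete with a toy example (my own illustration, not from the comment): fit polynomials of increasing degree to noisy linear data, then score each fit with a per-parameter penalty in the AIC/MDL spirit. The higher-degree model always wins on raw fit, but typically loses once its extra parameters are charged for. All numbers below are synthetic.

```python
# Toy illustration of "charging a model for its complexity" (MDL/AIC spirit):
# a higher-degree polynomial always fits the historical data at least as
# closely, but a penalty per parameter typically hands the win back to the
# simpler model.
import math
import random

random.seed(0)

# Synthetic "historical" data: a straight line plus noise.
n = 30
xs = [i / 10 for i in range(n)]
ys = [2.0 * x + 1.0 + random.gauss(0, 0.5) for x in xs]

def fit_poly(xs, ys, degree):
    """Least-squares polynomial fit via the normal equations (pure stdlib)."""
    k = degree + 1
    ata = [[sum(x ** (i + j) for x in xs) for j in range(k)] for i in range(k)]
    aty = [sum(y * x ** i for x, y in zip(xs, ys)) for i in range(k)]
    # Gaussian elimination with partial pivoting.
    for col in range(k):
        piv = max(range(col, k), key=lambda r: abs(ata[r][col]))
        ata[col], ata[piv] = ata[piv], ata[col]
        aty[col], aty[piv] = aty[piv], aty[col]
        for r in range(col + 1, k):
            f = ata[r][col] / ata[col][col]
            for c in range(col, k):
                ata[r][c] -= f * ata[col][c]
            aty[r] -= f * aty[col]
    coef = [0.0] * k
    for r in reversed(range(k)):
        coef[r] = (aty[r] - sum(ata[r][c] * coef[c] for c in range(r + 1, k))) / ata[r][r]
    return coef

def rss(coef, xs, ys):
    """Residual sum of squares: raw goodness of fit."""
    return sum((y - sum(c * x ** i for i, c in enumerate(coef))) ** 2
               for x, y in zip(xs, ys))

def aic(coef, xs, ys):
    """n*log(RSS/n) + 2k: a fit term plus a charge for each parameter."""
    return len(xs) * math.log(rss(coef, xs, ys) / len(xs)) + 2 * len(coef)

for deg in (1, 5):
    c = fit_poly(xs, ys, deg)
    print(f"degree {deg}: RSS={rss(c, xs, ys):.2f}  AIC={aic(c, xs, ys):.1f}")
# Degree 5 has the lower RSS; degree 1 typically has the lower (better) AIC.
```

AIC is used here only as a stand-in for the more general minimum-description-length idea; any penalty that grows with parameter count makes the same point.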

And then possibly, information can bring productivity down?! Past story of low productivity: Absence of Right Information. Present story of low productivity: Excessive, Distracting Information. Rational Actors are Social, Emotional Fools!

Thinking only in terms of the passive consumption by 'infovores' misses a lot of the (un-priced) value, for example:

1. The internet is invaluable in deciding what to buy (through reviews) and where to get a good price and fast shipping. I cringe when I think about all the time I used to waste driving around to shop. But while this is a big efficiency gain for me, it's a net negative for measured GDP.

2. The net is a god-send for do-it-yourselfers. Recently, my bike developed an annoying clunk. A little research determined the likely problem. A youtube video demonstrated the fix. A couple more searches turned up the specs for the part and where to buy it. I spent $25 total vs probably $100 at the bike shop and the time needed for the fix was less than schlepping the bike back and forth. Another efficiency gain and net GDP loss.

3. The net is similarly great for cooks. Think of the old inefficiency of collecting cookbooks and recipes from friends vs the new efficiency of googling recipes as needed. When I've nothing planned for dinner, I search for recipes based on the ingredients I happen to have on hand, and I can almost always come up with something really good. Cooking now is almost like Netflix -- no real reason to watch the same movie/make exactly the same dish twice because there are so many other possibilities. We eat better at home and dine out less often -- again, an improvement in our lives and a loss of measured GDP.

4. Travel. Another night and day difference. No need for tour-guides and companies. We now avoid hotels whenever possible in favor of cheaper and better house/condo/apartment rentals. We never get lost and are rarely unpleasantly surprised by lousy restaurants or disappointing destinations. Again, we spend less and get more.

5. Weather. Knowing that there's a 50% chance of rain this afternoon is only mildly useful. Being able to see that the blob on the radar that will get here by about 3:00 is vastly better. It's so much easier now to avoid canceling plans unnecessarily, to plan around bad weather, and to avoid getting caught out in storms.

Absolutely Right! Endorse that fully! Place to get helpful information for active use in Life! Do-it-yourself for illnesses! For verifying quality of one's service providers say doctors! Folks, Google worth far more than all of SM put together?

Absolutely agree about travel. On my trips now I spend so much more time actually enjoying things and less time searching or being lost. When going on trips with my parents as a kid, the model was that we'd drive, find the tourist information center, and then call around for a hotel and ask about local attractions. And of course you needed your maps, even with which you still ended up stopping at gas stations periodically to ask for directions. Now I book everything in advance, have a pretty good idea of the quality of the hotel beforehand, and can comparison shop. I'm rarely if ever lost and can decide how to spend my time with the use of sites like TripAdvisor and Wikitravel. Then, of course, there is less stuff to lug around: maps, travel info (e.g., confirmation numbers, contact info), and a video recorder/camera -- which, BTW, is digital, helping to ensure you get the shots you want and have superior memories, unlike the days when you had to wait to develop the film and hope it turned out OK -- are all on my phone (although we also take an actual SLR), plus a Kindle instead of books. It's just amazing, and outside of the cost savings from being able to comparison shop, I have no idea how to quantify all of that.

No "getting lost." No stopping to talk to people to ask for directions. No being unsure what you're actually going to do for fun that day. No waiting excitedly for your photos to come back from the developer. So what exactly is the fun of travel for people nowadays?

Properly used, technology is more a booster of serendipity than a hindrance. Now you can wander off track more without fear of getting truly lost and not being able to find your way back. And you can use your smart phone to make last-minute plans or change your plans midstream. As for photos -- don't look until you get home (we usually don't -- there's too much else to do).

What about the negative externalities of the net? It is easier than ever to be misled by a group website. All those antivaxers and whatever fly-by-night websites they read. "If it's on the Internet it must be true." Or does this blog not believe in externalities?

That's pretty rich coming from Scott Sumner. Market Monetarism isn't falsifiable in the Popperian sense either! There does not exist a state of the world that isn't describable by it.

Sumner would disagree

Isn't the overall purpose of an economy consumption? And investment just a way to increase future consumption at the cost of current consumption?

To somehow say increased consumption is not of value to the economy seems like it may be missing the point of the economy in general. What use is increased production if it is not tied to increased consumption? Perhaps we can make a value judgement that some consumption is better than others, but that's not the same judgement as devaluing consumption itself.

Caveat: not an economist. Just read some blogs.

When it comes to economic gains, I'm tempted to side with those who prefer instead to try to measure "prosperity": metrics that (arguably) correspond more directly to "human flourishing" than the rate at which goods are being bought and sold (i.e., economic activity).

So, for instance, life expectancy at birth. Suicide rate. Violent crime rate. Possibly even employment rate among males 25-55, only because of the impact of unemployment on emotional well-being. Possibly inflation-adjusted per capita charitable giving, since I'd expect it to correlate strongly with individuals' sense of their own material prosperity.

A spreadsheet program, to take one example, is a pretty spiffy thing. Twenty five years ago, it cost you a pretty good amount of money if you wanted one. Today, you have a choice of excellent spreadsheets at no cost. I remember in the 90s, we used to say that the computer you want is five thousand dollars, but the computer you need is two thousand...and it always will be. Twenty years later, those figures have dropped by almost an order of magnitude in real terms. I don't think we've wandered into Utopia without realizing it, but I'm not particularly impressed by an analysis that seems not to notice that lots of IT has gotten really, really cheap.

That's the problem, isn't it?

The fact that everything in IT has become incredibly cheap, and that IT is eating the world, making everything it touches extremely cheap, is not a theory open to falsification; it's an observation that at most can be refuted.

Yet Tyler is falsifying the observation with his pet theory, that GDP accurately measures wealth. Maybe he does need to read Popper again.

Tyler, I've been thinking about this problem and I think there are some very tricky things going on. For example, if we are trying to measure the GDP effects of the IT/internet revolution, it seems incorrect not to measure the enormous activities of Foxconn and the other electronics giants of Shenzhen, which employ hundreds of thousands (millions?) of people building iPhones, laptops, tablets, routers, etc. Also, the general Chinese manufacturing revolution was enabled by global IT infrastructure. In other words, the U.S. and China have commingled their manufacturing supply chains to the extent that they need to be considered together - the production statistics you are looking for are captured in the enormous Chinese growth rates of the last 20 years, which were by and large not driven by Chinese demand but by U.S. and European demand. Also, consider the software & services revolution in India - it is smaller than China's, but service outsourcing to India is still significant. In other words, many of the GDP gains from the IT revolution are showing up in the *global* economy more so than in the U.S. economy. The U.S. is paying for a lot of it and consuming it and taking a lot of consumer surplus, but not actually producing the hardware or infrastructure, which would show up in GDP.

I think the folks like buddyglass above are on the right track (human flourishing is a bigger deal than GDP per se). For bits, at least, we are entering a post-scarcity economy and this really is uncharted territory. We are held back by legacy institutions but that is an effect of social gridlock rather than a failure of the technology economy (old industries don't die or change as fast as they ought to because people's habits are slow to change).

"I am a happiness optimist and a revenue pessimist."

Happiness is barely a coherent empirical concept.

But if we get to a point where we can produce staple goods at reasonable price without making people (and animals!) suffer excruciating pain and boredom, I'll concede a truly better world.

As an infovore, Tyler consistently underplays (or ignores) the role of *entertainment* and interpersonal communication in this context. Yes, people who vacuum up info about the latest academic studies are a tiny slice of the population, but people who share photos, watch cat videos, play games, etc., are not. And even on the info side, you have to include things like all the arguments IMDB has made obsolete.

On this blog Tyler overrates the productive value of popular culture entertainment. Is Angry Birds better than a walk? He is posting constantly about books nobody will have heard of in 100 years.

Not sure what the Brussels sprout comment is supposed to be about. I enjoy mine pan fried with bacon and finished with a glaze. I've also had kimchi versions, they can be pretty good.

I’m no Popperian

Wow, no fucking kidding!!!

Excerpts from my new ebook: Macroeconomics Redefined.

It may not be inappropriate to say a few words in passing about US productivity. Since the Great Recession productivity growth has been abysmal by historical standards and it has worsened instead of improving. But there is nothing really surprising about this. It was only in October 2014 that annual Real Gross Private Domestic Investment returned to the level it had in January 2006. During that period real GDP grew 12%. Real Net Private Domestic Investment fared even worse. Its value in 2013, the latest for which figures are available, was at the level it was in 1994. During that period real GDP grew nearly 60%. If there is a puzzle that needs explaining it is the fact that productivity has continued to grow although investment per dollar of GDP has plunged. The US economy seems to have grown super-efficient.

Intellectual property violations are certainly adding to consumption, which is supposed to be measured in GDP. There isn't much intellectual property I pay for - my assets are near zero, I live in a left-wing state, and I'm using my landlord's open ISP.

That said, I am certainly better off with almost unlimited access to free games, movies, books, and MP3s than not. I'm not sure that any of this "harms" anyone, as I wouldn't pay for any of it anyway; I would just watch more Netflix.

On the other hand, my new girlfriend loves to cook, and the amount I spend eating out has fallen from like 800 a month to around 200 per month. GDP has contracted. Yet the girl is the Pablo Picasso of food, and I am certainly better off.

Tyler, do you consider yourself to be a "Popperian" (as Scott puts it)? Do you have a philosophical problem with that or some other kind of problem with that? Thanks.

In the 1980s, a video game player had to continually throw quarters into a machine for fun. Today that same player can buy a game for $30 and play it for days on end.

Flappy Bird was created for almost nothing, produced very little direct economic output (aside from a few million dollars in ad revenue) and was played billions of times.

In terms of dollars/leisure hour, video games have gotten much more productive but we're not capturing that in the statistics.
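A back-of-the-envelope version of that dollars-per-hour claim, using the prices from the comment above; the play times (3 minutes per quarter, 100 hours per modern game) are my illustrative assumptions, not measured data:

```python
# Back-of-the-envelope cost per leisure hour: 1980s arcade vs. a modern game.
# Prices are from the comment; play times are illustrative assumptions.

def cost_per_hour(total_cost, hours):
    """Dollars spent per hour of play."""
    return total_cost / hours

# Arcade: a quarter bought roughly 3 minutes of play -> 20 quarters per hour.
arcade = cost_per_hour(total_cost=0.25 * 20, hours=1.0)

# Modern game: $30 up front, played for, say, 100 hours.
modern = cost_per_hour(total_cost=30.0, hours=100.0)

print(f"arcade: ${arcade:.2f}/hr   modern: ${modern:.2f}/hr")
print(f"roughly {arcade / modern:.0f}x cheaper per hour")  # about 17x here
```

The exact multiple obviously depends on the assumed play times, but under any plausible numbers the per-hour price of game-playing has collapsed, and none of that collapse shows up as measured output.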

Sure, but there are substitution effects and adjustment effects.

Video games now often substitute for actual friendships (at least for some). And since people get used to e.g. graphics standards, the actual enjoyment per minute per dollar has not grown quite like one might expect.

(I still want to point out, even if it is unpopular, that substituting animal products with increasingly good plant products can do much more for the quality of life on this planet than video games can.)

It is hard to imagine a world without flappy bird. We'd be much worse off.

On the production side, I think the digital revolution represents a simple positive shift of the supply curve: more stuff can be had at the same price.

I also think it might cause a negative shift in the demand curve: people demand less stuff. The internet, in many ways, reduces demand. If you get enough spoilers for a mid-level movie (e.g., Jurassic World), you have less cause to buy a ticket. Yelp can keep you from trying 3 new places to eat in a week because it will help you zero in on the 1 spot worth a shot. That is great for the one spot that sees increased sales, but it comes at the cost of 2 meals for the runners-up that would have earned your money when you were ignorant.

This implies to me:

1. The net impact on the economy is whatever curve is impacted the most. One could argue early on the benefit mostly accrued to the supply curve but now it is shifting demand more.

2. The amount of constant stimulus required to maintain 'normal full demand/employment' has increased.

3. Measuring economic performance by GDP has gotten more complicated and less reliable. If I want to eat at one good place per week, I'd rather not go out 3 times, but as far as GDP is concerned, me going out 3 times is thrice as good. Ideally full employment can be maintained by the gov't adding enough stimulus to make up for the 2 meals I don't end up buying, but in the meantime some GDP-destroying apps are actually untapped potential that we are leaving on the table.
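The two curve shifts described above can be sketched with a toy linear supply/demand model; every number below is made up for illustration and none comes from the post:

```python
# Toy linear supply/demand model showing the two shifts from the comment:
# a positive supply shift (more supplied at every price) vs. a negative
# demand shift (less demanded at every price). All numbers illustrative.

def equilibrium(demand_intercept, supply_intercept,
                demand_slope=-1.0, supply_slope=1.0):
    """Solve Qd = a + b*P against Qs = c + d*P for the clearing price/quantity."""
    price = (demand_intercept - supply_intercept) / (supply_slope - demand_slope)
    quantity = supply_intercept + supply_slope * price
    return price, quantity

base = equilibrium(demand_intercept=100, supply_intercept=20)
supply_shift = equilibrium(demand_intercept=100, supply_intercept=40)
demand_shift = equilibrium(demand_intercept=80, supply_intercept=20)

print("base:", base)
print("positive supply shift:", supply_shift)  # price falls, quantity rises
print("negative demand shift:", demand_shift)  # price falls, quantity falls
```

Note that both shifts lower the equilibrium price, but only the demand shift lowers the quantity traded, which is the commenter's point: measured GDP can shrink even while consumers are better off.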

Brussels sprouts? You meant non-GMO, gluten-free kale and chia seeds...

New theory: The IT revolution did contribute tremendously to productivity growth, but it's also been responsible for tremendous losses in productivity too. So we grew from 1995 to 2004; what happened in 2004? Facebook launched.

Access to information is great. But with a master's degree I would have thought I could find plentiful work paying more than the minimum wage. I think I would rather have better work opportunities and access to a decent university library (with journal access, etc.).
