U.S. IT investment as a share of GDP

[Chart: U.S. IT investment as a share of GDP]

The pointer is from Matt Yglesias.

Addendum: Claudia Sahm refers us to this chart of declining IT prices. It can also be argued that IT spending moved into other, more general business categories.

Comments

Well, the price of Android didn't change during that time. Neither did the price of Linux, Apache, LibreOffice, Eclipse...

There IS a price for Android... it's called licensing for the proprietary prices of the OS from Google...

These people don't quite agree - http://www.cyanogenmod.org/

'it’s called licensing for the proprietary prices of the OS from Google'

I think you meant 'pieces' instead of 'prices,' but you are right - the parts of Android that are not covered by free software licenses are proprietary. However, as noted above, everything necessary to run a phone/tablet can be done without a single piece of software not covered by a free license. As a matter of fact, these people apparently offer an actual 'Android' product without a single piece of Google's proprietary software - http://oneplus.net/one

Looks like a Y2K bump would explain part of the spike. Wouldn't the shift to cloud computing move business spending from investment into operational budgets?

And for the cloud providers themselves, how are their investments accounted for? Do you call that IT investment?

For cloud providers, equipment is traditional capital expense on tangible equipment with depreciation. Routers, firewalls, servers...

For consumers of cloud services, it is a recurring operating expense like a lease. It could easily be buried in the budget of, say, accounting or HR. There's not much reason to have an IT department manage your cloud services when it is much more like a contractual agreement than a technology exercise.

Cloud computing also allows more efficient allocation of resources and greater economies of scale. Instead of each company having a server for their accounting system that is only 20% utilized all day and none all night, the cloud provider can service ten such customers, globally, with the same server.

Also, data center space has diminishing marginal costs. Building a million square foot data center offers cheaper cost per unit of computing power than a ten thousand foot one.
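
A back-of-the-envelope sketch of the consolidation arithmetic (the utilization figures are hypothetical, chosen only to illustrate the mechanism, not taken from the comment):

```python
# Hypothetical figures: ten companies, each with a dedicated server
# averaging 20% utilization, versus a cloud provider pooling the same
# workload onto shared servers run at a hotter 70% target.
import math

customers = 10
dedicated_utilization = 0.20   # each dedicated server is mostly idle
pooled_utilization = 0.70      # target utilization for shared servers

# Total workload measured in "full-server equivalents"
total_load = customers * dedicated_utilization          # 2.0 servers' worth

dedicated_servers = customers                           # one per customer
pooled_servers = math.ceil(total_load / pooled_utilization)  # 3 servers

print(f"Dedicated model: {dedicated_servers} servers")
print(f"Pooled model:    {pooled_servers} servers")
# Spreading customers across time zones (busy day here, idle night there)
# smooths the load further, which is how the one-server claim above
# could hold in the best case.
```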

The Y2K bump coincided with the first Internet boom. Good times.

Once everyone had a computer at their desk with Internet access and everyone who was going to get a company phone got a company phone, what else was there to invest in except for newer, cheaper versions of the same stuff? Also the point above about software as a service probably has played a role post-recession.

The main IT innovations in the past decade have been consumer goods, seemingly.

Only if you think of IT as automating existing processes. The more interesting challenge is changing how information moves through an organization.

Amazon is the obvious extreme example of the latter. It's not Macy's with a bunch of computers added.

The org I work in has lotsa tech but it still *works* like it's in the 18th century. Have you dealt with a mover lately? A lot of industries seem relatively untouched by IT.

You can say that we reached the limits of what could be done with off-the-shelf software. But this is just the low-hanging fruit.

I think when you say "phone" you agree with Tyler's "that IT spending moved into other, more general business categories." I presume iPhones and iPads are not "IT spending."

Big Data is the latest craze, but it can probably only improve a business marginally. Netflix is still no good, compared to a human, at guessing which movies I would like to watch.

The next big IT gain will come when software can automate the work of knowledge workers. Complex tasks which require judgment at every step, and in which unforeseen difficulties often occur, are still very hard to automate. Even Tyler's human-computer chess team analogy doesn't work yet for most knowledge worker jobs, because the computer hasn't yet learned to play "complex chess" in most cases. Perhaps some of the problem is that Silicon Valley would rather focus on consumer marketing, exciting apps, and green ideas which save the world than on much more difficult problems like automating a complex process.

It might be a mistake to split off the "suggestion" part of Netflix, and declare that a failure. I really doubt that the "big data" (in another sense) pushed around by Netflix is counted as IT spending either. Or is it?

This is a rate not a level. The rate isn't going to increase forever. It is still higher than it was at any time before the late 1990s.

My favorite is when Proggers take the US labor force participation rate from 1960-1990 and extrapolate a linear trend out to 2020. If only they continued the extrapolation to 2120, they would see the folly of a >100% labor force participation rate.

But you are arguing that it's wrong to argue that the manual labor jobs are going away and being replaced with high-tech knowledge jobs!?!

Computers are going to replace welders, machinists, plumbers, steel workers, so everyone needs to learn to be computer programmers, thus cutting the cost of education by getting rid of shop class and the dummy track trade course in high school and post-high school tech schools.

How can taxes be cut if manual labor is not eliminated by computers?

This is IT investment. Falling cost per unit of "IT" capital is sufficient to explain this result.

IT investment can also be rising while GDP rises faster. Not only do you have non-IT investment, but also personal consumption driving GDP up.
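
The arithmetic behind that point, as a sketch (the growth rates here are invented for illustration, not measured from the chart):

$$\frac{I_t}{Y_t} \;=\; \frac{I_0\,(1+g_I)^t}{Y_0\,(1+g_Y)^t} \;=\; \frac{I_0}{Y_0}\left(\frac{1+g_I}{1+g_Y}\right)^t$$

The share falls whenever $g_I < g_Y$: IT investment growing at, say, 3% a year against 5% nominal GDP growth still plots as a declining line, even though IT spending rises every year.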

Rising market share of imported chips could also offset this.

Maybe there isn't any low hanging fruit available to pick, so you just buy something cheaper next time around.

I see two different investments. One is implementations of complex systems that automate production. They work well and save money.

The other is investments that make it cheaper to do the same thing. Instead of buying 45 licenses for an expensive vertical software package, set up a virtualization scheme where 10 licenses will supply the need. Ditch MS Office for some cloud thing. Etc. Get rid of paper handling by shuffling it off to your vendors (grrr).

Actually, three. The third: implement some system to meet a regulatory edict, as cheaply as possible; it's simply a checkmark on a list.
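
A minimal sketch of the second kind of investment, the one that cuts spending: the 45-to-10 license counts come from the comment above, but the dollar figures are hypothetical.

```python
# Hypothetical prices: compare 45 dedicated licenses for a vertical
# package against a virtualization scheme where 10 concurrent licenses
# cover the same user base.
license_cost = 4_000               # assumed annual cost per license (USD)
virtualization_overhead = 20_000   # assumed annual cost of the virtualization layer

dedicated = 45 * license_cost                         # $180,000/yr
pooled = 10 * license_cost + virtualization_overhead  # $60,000/yr

print(f"Dedicated licensing:       ${dedicated:,}/yr")
print(f"Pooled via virtualization: ${pooled:,}/yr")
print(f"Annual savings:            ${dedicated - pooled:,}/yr")
# Same work gets done; measured IT spending falls by two-thirds.
```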

Paging Robert Gordon
http://www.ted.com/talks/robert_gordon_the_death_of_innovation_the_end_of_growth

This measure is so heavily dependent on hold-up-your-thumb estimates (hedonics and depreciation schedules) that I find it hard to confidently adjust any priors based upon it. No very useful takeaway for me here.

I would be interested to see what it looks like as a function of purchases, rather than depreciation.

I don't think hedonics matter for a ratio of nominal spending, and of course it wouldn't really make sense to adjust a ratio of dollars actually spent to try to represent their relative utility. Or maybe this isn't that obvious, but clearly you don't want to say someone spent 30% on X in 1984 if they actually spent 25 of every 100 dollars on X; that's just silly.

I did considerable work beginning in the mid-1990s analyzing IT spending for state and local governments. For organizations providing a broad range of services, IT spending as a percentage of revenues was generally in the 3% to 4% range. This was up from prior decades, but remained pretty stable through the dot-com crash and for at least a decade afterward, when I stopped working in that arena. I see no reason to doubt that 3.5% is more or less optimal for the economy as a whole. Deloitte benchmarks this kind of data on a frequent and detailed basis for subscribers to their consulting services.

Hey guise, here's a chart! It doesn't mean a damn thing, but here's a chart!

True, it is just a chart … but it raises a lot of issues to explore further. One example, if you want something more to read: "Is the Information Technology Revolution Over?"
http://www.federalreserve.gov/pubs/feds/2013/201336/201336pap.pdf

The conclusion of that paper:

"Is the information technology revolution over? In light of the slower pace of productivity gains since the mid-2000s, Robert Gordon has argued that the boost to productivity growth from adoption of IT largely had run its course by that point. Erik Brynjolfsson and others make the opposite case, arguing that dramatic transformations related to IT continue and will leave a significant imprint on economic activity. We bring three types of evidence to this debate, focusing on the IT capital that underlies IT-related innovations in the economy.

What does this evidence show? Our analysis indicates that the contributions of IT to labor productivity growth from 2004 to 2012 look much as they did before 1995, supporting Gordon’s side of the argument. Our baseline projection of the trend in labor productivity points to moderate growth, better than the average pace from 2004 to 2012, but still noticeably below the very long-run average rate of labor productivity growth. On the more optimistic side, we present evidence that innovation for semiconductors is continuing at a rapid pace, raising the possibility of a second wave in the IT revolution, and we see a reasonable prospect that the pace of labor productivity growth could rise to its long-run average of 21⁄4 percent or even above. Accordingly, with all the humility that must attend any projection of labor productivity, our answer to the title question of the paper is: No, the information technology revolution is not over."

This is in nominal dollars. Try real dollars as a share of real GDP and you will get a very different picture, because the deflator for IT is still falling sharply.
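
To make the deflator point concrete, a stylized decomposition (the price and quantity symbols are generic, not a specific BEA series):

$$s_t^{\text{nominal}} \;=\; \frac{P^{IT}_t\,Q^{IT}_t}{P^{Y}_t\,Q^{Y}_t}, \qquad s_t^{\text{real}} \;=\; \frac{P^{IT}_0\,Q^{IT}_t}{P^{Y}_0\,Q^{Y}_t}$$

If the IT deflator $P^{IT}_t$ falls sharply relative to the GDP deflator $P^{Y}_t$, the real quantity of IT capital $Q^{IT}_t$ can keep climbing even while the nominal share in the chart declines.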

I'd be interested to see how SaaS technologies are impacting this. Oftentimes, because they're both off-prem and OpEx, they're bought by the business.

Consider also companies who develop their own in-house proprietary software for their own use. (Don't most Fortune 500 companies develop their own software, or at least update the software of some company they've acquired?) Pretty sure that goes on the books as R&D, not IT.

Which goes back to my point above about the difficulty in automating complex processes. Most attempts to do so are done in-house. Meanwhile, the best and brightest computer scientists are sitting around at Google or Facebook. These same people would be more valuable to, say, Halliburton, but Google is able to out-bid Halliburton for their labor because the former has the luxury, due to capital and investors with a different philosophy, to invest in some projects which may take ten years to pay off instead of three.

Companies coding their own systems is an operating expense if it automates some process or services a specific need. It is capital if it is a product that can be reused pretty much as-is in many situations.

It is only r&d until you start putting it to some specific use. R&d is expensed like op ex.

Most CFOs prefer capex because it can be depreciated over time. I learned long ago to present my pet projects as capex products because funding is more readily available.
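
A minimal illustration of the capex preference, assuming straight-line depreciation and invented figures:

```python
# Hypothetical: a $500,000 project expensed immediately (opex) versus
# capitalized and depreciated straight-line over five years (capex).
cost = 500_000
years = 5

opex_charges = [cost] + [0] * (years - 1)   # entire hit lands in year 1
capex_charges = [cost / years] * years      # $100,000 against each year

for year, (opex, capex) in enumerate(zip(opex_charges, capex_charges), start=1):
    print(f"Year {year}: opex charge ${opex:>9,.0f} | depreciation ${capex:>9,.0f}")
# The cash outlay is identical; capex merely spreads the earnings impact,
# which is why funding for it is often easier to get approved.
```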

Dirk, have you ever considered that your support for control economies might be related to your arrogant assumption that you alone know what's best for the rest of us? Vicarious authoritarianism?

That seems like a non sequitur. Relevance to dirk's actual comment?

Do you mean to ask how a history of comments in support of control economies is relevant to a comment in support of a control economy?

Are you planning on starting a charity so Halliburton can afford the programmers it 'needs'?

Agreed. The chart means nothing at all:

We have exponentially more computing power at exponentially lower cost per MIPS or GFLOP or whatever. The overall investment in computing power is also widely distributed now and is almost certainly larger on an absolute USD basis as well if you include phones and embedded microcontrollers. But the magnitude of the power increase and its ubiquity is what's important.

Increased productivity in the production and use of software seems to be a major factor here. The same code can be used to meet more needs than before. Likewise investors waste less money buying/renting software that isn't needed or used.

Spencer is right. See http://research.stlouisfed.org/fred2/graph/?g=GU9

Behold, a graph of marginal utility as a function of exponentially growing processing power :)

This jibes pretty well with the notion of diminishing returns and low-hanging fruit. A PC in the 1980s that could run Excel was world-changing in a way that's hard for any new tech to ever replicate, much like, say, the internal combustion engine, or antibiotics.

I'm not sure why commenters are saying using real numbers would make a difference, unless you're using different deflators for the numerator and denominator; that is to say, ax/bx = a/b.

As someone pointed out, a big reason for the spike prior to 2000 was preparation for Y2K. And there was a great deal of overspending in this area, because IT management did not understand that just because a piece of hardware or software did something with a date, it was not necessarily prone to a Y2K problem. So a lot of hardware and software was upgraded needlessly.

And over the last 10 years, IT management has started to realize that it overspent on equipment in the past. Previously, you'd have dedicated servers and disk drives for applications; analysis of resource utilization was showing far more capacity than needed. And you'd have people being given PCs and laptops that were overkill for users primarily doing e-mail, word processing, and spreadsheets. So the trend has been toward shared servers, disk drive arrays, and inexpensive clients such as Chromebooks.

What I see here is twofold:

1) We got a lot of the low-hanging fruit, so spending can decrease now that we have a good installed base.
2) We are getting better at IT. It's easier and less costly to implement the IT solutions that most companies need, so our overall spending is lower.

Or simply that the influx of competition has led to less new entry into the market. Couldn't this explain the sharp decline just as much as the accounting theory she is proposing?
