Last week David Autor offered his latest paper on technology and inequality. It's an interesting paper, but one thing he does that I don't think is supportable is try to draw broad conclusions from the fact that private investment in IT equipment has declined as a share of GDP:
Autor takes this as a sign that American firms have slowed down their investment in high-tech goods. I'm not so sure. Analogies are dangerous, but in this case I'd say thinking about a personal budget has some merit. Ten years ago, I had an aluminum PowerBook G4 that cost $1,799. Today, that's roughly the combined price I paid for an 11-inch MacBook Air, an iPhone 5S, and a Retina iPad Mini. Of course, that doesn't include the price of the Verizon contract that subsidized the 5S. But then again, back in 2004 my Motorola RAZR cost $500 with a contract.
All of which is to say that Matt Yglesias spends a considerably smaller portion of his income on computers in 2014 than he did in 2004, but he has much more computing power at his disposal.
The price of computer equipment and software has fallen so dramatically over time that measuring IT investment in dollar terms is unlikely to tell you very much. What you probably want is some kind of measurement of "computing power per worker," which would tell you how IT-intensive firms are today versus 10 or 15 or 20 years ago. Justin Fox takes a stab at it, using the Bureau of Economic Analysis' official price index here, and finds a continually increasing trend:
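The arithmetic behind this kind of adjustment is just deflation: divide nominal spending by the price index, then by headcount. A minimal sketch of the idea, using entirely made-up numbers (the investment figures, index values, and employment counts below are illustrative placeholders, not actual BEA data):

```python
# Deflate nominal IT investment by a price index to estimate real
# computing power per worker. All figures are hypothetical, chosen
# only to show the mechanics of the calculation -- not BEA data.

# year -> (nominal IT investment in $bn, price index (2004 = 100), workers in millions)
data = {
    2004: (300.0, 100.0, 131.0),
    2009: (290.0, 55.0, 130.0),
    2014: (280.0, 30.0, 138.0),
}

for year, (nominal, price_index, workers) in sorted(data.items()):
    # Real investment in constant 2004 dollars
    real = nominal / (price_index / 100.0)
    # Dollars of 2004-equivalent computing per worker
    per_worker = real * 1e9 / (workers * 1e6)
    print(f"{year}: nominal ${nominal:.0f}bn -> real ${real:.0f}bn (2004$), "
          f"${per_worker:,.0f} per worker")
```

Even with nominal spending drifting slightly downward, as in these toy numbers, the real quantity of computing per worker rises several-fold once prices are deflated, which is the point: the nominal share of GDP can fall while IT intensity climbs.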
My view is that this official price index is almost certainly too conservative. How good was the Microsoft Office suite you could purchase in 1999 compared to the free Google Documents you can use today? How many tweets could you send from your clamshell phone ten years ago? Any kind of real-world index of the sheer amount of computing power deployed in a modern workplace is going to show a vast increase.