The Friedman Magic

November 28, 2006 at 7:19 am in Economics

One of my favorite Friedman papers is "The Effects of Full-Employment Policy on Economic Stability: A Formal Analysis" which you can find in Essays in Positive Economics.

Friedman sets up a very simple model, Z(t)=X(t)+Y(t) where Z(t) is income at time t, X(t) is what income would be if there were no counter-cyclical government policy and Y(t) is the amount added to or subtracted from X(t) by the history of government policy.

You wouldn’t think that much could come out of such a simple model, but Friedman takes the model, notes that the variance of a sum of two random variables is V(Z) = V(X) + V(Y) + 2 r(X,Y) Sd(X) Sd(Y) (where V is the variance, r the correlation, and Sd the standard deviation), and proceeds to show that:

In order to cut the variance of income fluctuations in half (which would cut the standard deviation by a little less than a third), r(X,Y) must exceed .7 in absolute value (and be negative, since Y must offset X).
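Friedman's bound is easy to check numerically. The sketch below is my own illustration, not Friedman's (all variable names are invented): it builds a policy series Y with a chosen correlation r to X, sizes it optimally (minimizing V(Z) over Sd(Y) gives Sd(Y) = |r| Sd(X)), and confirms that the best achievable variance ratio is V(Z)/V(X) = 1 - r², so halving the variance requires |r| > 1/sqrt(2), roughly .707.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# X: the hypothetical no-policy income series (any distribution will do;
# only variances and correlations matter for the algebra here).
x = rng.normal(0.0, 1.0, n)
noise = rng.normal(0.0, 1.0, n)

for r in (-0.5, -0.7071, -0.9):
    # Construct a policy series Y with correlation r to X, sized optimally:
    # Sd(Y) = |r| * Sd(X), which yields V(Z) = V(X) * (1 - r**2).
    sd_y = abs(r) * x.std()
    y = sd_y * (r * x / x.std() + np.sqrt(1 - r**2) * noise)
    z = x + y
    print(f"r = {r:+.4f}   V(Z)/V(X) = {z.var() / x.var():.3f}   theory: {1 - r**2:.3f}")
```

Only at |r| near .707 does the simulated variance ratio drop to one half, which is the knife-edge Friedman's result turns on.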

The result is powerful because once you start thinking about the correlation coefficient, r, it’s hard to see how it could be as high as .7.  Very few government actions taken at time t have an effect at time t: there are lags between recognizing a problem, deciding what to do about it, and implementing a policy, and once the policy is implemented there are further lags before it takes effect.  All of these lags are of uncertain and changing length, so actions taken at t-5, t-4, t-3, t-2, and t-1 may all influence Y(t), making a high correlation between X and Y unlikely.  Moreover, Friedman’s bound is an upper bound, requiring optimally sized interventions.  When we recognize that an intervention might be too small or too large, and that either error shrinks the reduction in variance, we have a strong case for skepticism about the efficacy of counter-cyclical policy.

But was Friedman right?  In the thirty or so years after he wrote, when counter-cyclical policy was in vogue, the variance of the US economy was much lower than in the pre-World War I years.  Reality, it appeared, refuted Milton Friedman.

Friedman, however, lived to see his simple model proved correct (Essays in Positive Economics!).  In a series of papers beginning in 1986, Christina Romer showed that the pre-WWI volatility was an artifact of the way the data was collected.  Once the pre-WWI and post-WWII data were collected consistently, using the same methods, the post-WWII economy showed no big drop in volatility.

Almost nothing in, a surprising and powerful result out, and an implicit prediction proven correct after thirty years.  That’s the Friedman magic. 

1 dsquared November 28, 2006 at 7:52 am

Hmmmm I think that’s a pretty tendentious reading of Christina Romer’s work. Brad DeLong has a couple of good bits on this from way back.

The important point to note is that “the standard deviation of incomes” is not an interesting policy variable. When people are worried about instability of incomes, they’re not worried about standard deviation, they’re worried about recessions. The purpose of counter-cyclical policy isn’t to reduce the standard deviation; it’s to alleviate downturns, and there is no symmetrical policy goal of reducing booms.

Christina Romer did find that since the second world war, recessions have been shorter and less deep. That’s the visible benefit of stabilisation policy. So I’d classify this particular piece of Friedman as a conclusion that he was going to assert anyway, a model that doesn’t really provide any support to it, and a prediction that wasn’t borne out by the evidence.

2 caveatBettor November 28, 2006 at 9:23 am

dsquared, isn’t standard deviation of incomes a fair proxy for instability of incomes? Perhaps you might propose alternative measures (that can be practically measured). That would give your rebuttal more teeth. Thanks in advance.

3 jn November 28, 2006 at 10:24 am

The purpose of counter-cyclical policy isn’t to reduce the standard deviation; it’s to alleviate downturns, and there is no symmetrical policy goal of reducing booms.

Correct me if I’m wrong, but if I recall correctly, according to Keynes, a government should raise taxes and cut spending during growth periods. This would constitute a brake on the economy, reducing the upper end of the standard deviation.

4 Rich Berger November 28, 2006 at 12:58 pm

Dsquared –

I think that’s a pretty tendentious reading of Christina Romer’s work.

From the introduction to the paper: “The bottom line of this analysis is that economic fluctuations have changed somewhat over time, but neither as much nor in the way envisioned by Burns. Major real macroeconomic indicators have not become dramatically more stable between the pre-World War I and post-World War II eras, and recessions have become only slightly less severe on average. Recessions have, however, become less frequent and more uniform over time.”

Also: “At the same time, however, there have been a series of episodes in the postwar era when monetary policy has sought to create a moderately sized recession to reduce inflation. It is this rise of the policy-induced recession that explains why the economy has remained volatile in the postwar era.”

Finally, Brad DeLong provided helpful comments and suggestions to CR’s paper.

5 Jason November 28, 2006 at 1:55 pm

Hmmm. Friedman’s model seems dubious to me. Friedman defines X(t) = the income curve assuming laissez-faire policy; and Z(t) = the income curve assuming some other policy. So far, so good. X and Z are like alternate universes. Then Friedman defines Y(t) = Z(t) – X(t) and claims that Y(t) reflects government policy in the Z universe. Which is sort of true, but it’s not going to be a *simple* function of government policy. Suppose the only policy difference between Z and X is that in Z, there’s this government-sponsored experiment where a butterfly flaps its wings (or if you prefer, the government confiscates a horseshoe nail). By chaos theory, this has tremendous consequences. Y(t) will be some kind of random walk. I think that by ascribing a meaning to Y(t), implying that it effectively captures policy choices and nothing else, Friedman (or Alex) goes astray.

I don’t think it takes chaos theory to poke holes in this model. To think of government policy as applying additively to the laissez-faire baseline X strikes me as preposterous. To us in universe Z, X isn’t even knowable. Any sensible model of the effects of government policy will have dZ/dt (or something) determined by some function of the history of policy. X will be nowhere in sight.

Let me be clear–I imagine Friedman’s math is flawless; the magnitude of r(X, Y) would indeed have to be pretty high. But he has cause and effect backwards. Government doesn’t need to predict X over time in order to achieve this. Rather, if the government does a good job of damping economic fluctuations, a correlation of magnitude above 0.7 will just fall out of the math. It’s easy to see why: by definition Y(t) = Z(t) – X(t); so if Z(t) is stable, then of course the curve of Y will mirror the curve of X. Duh.

(Disclaimer: I am not an economist. I’m a programmer who knows a little about electronic control systems.)

6 srp November 28, 2006 at 7:09 pm

Jason’s critique is off. It’s a perfectly operational thought experiment to set Y(t) to zero–no attempt at countercyclical stabilization. The key is that Y(.) represents intentional, not unintended, macropolicy. The question is whether we can reduce the variance of Z if we consciously try to. All non-stabilization aspects of government policy (e.g., fighting wars and enforcing contracts) would be part of X. X is not laissez-faire–it’s no discretionary countercyclical macropolicy.

7 Brenda Rosser November 29, 2006 at 1:42 am

With Friedman you have to forget the maths and concentrate on his extremely flawed logic and lack of wisdom.

Friedman uses an economic model that says that all economic processes occur in a thermodynamically CLOSED system in which the factors of production and finished goods and services cycle endlessly between firms and households. Energy from land (nature) does not have a central role in the theory.

His economics does not have a theory to determine when natural resource shortages and surpluses will occur.

The laws of energy and matter and ecology are not integrated into his economic theories. The impacts on the human spirit, health, and ecosystems are brushed aside. Forces of nature are treated as property, and the contribution of nature to economic processes is treated as a free gift.

Matter and energy don’t enter or exit his mathematical system, and land, labour, and capital are seen as independent, mutually interchangeable entities.

(Refer: The Decline of the Age of Oil by Brian J. Fleay, pages 10-11)

No amount of convincing mathematics will make up for irrational and dangerous logic.

8 Jason November 29, 2006 at 10:58 am

srp writes: “The key is that Y(.) represents intentional, not unintended, macropolicy.”

I see your point, but that distinction isn’t central to my argument. Let me try again.

Suppose I come up with a macroeconomic policy that would reduce the variance of Z close to zero. (Just bear with me for a moment.) The graph of Y(t) would look like a mirror image of X(t). Gee, where did that copy of X(t) come from? I must be a genius at figuring out what X(t) would have been, and taking exactly the right policy actions to counteract it. Or so goes Alex’s reasoning.

But it turns out any number of dumb policies can produce this outcome. Suppose you simply tax all income above $20,000 at a marginal rate of 100%. (I don’t know if median personal income is the variable meant by “income”, but pick any variable and you can formulate a similar dumb policy.) Z(t) will be pinned to $20,000; the variance of Z will be zero. Clearly the mirror image of X(t) in the graph of Y(t) did not come from my policy.

Conclusion: The ability to predict X(t) is **not** a prerequisite of formulating policy that results in a high r(X, Y). Likewise, understanding the relationship between policy actions and Y(t) is **not** a prerequisite to success here.

It seems to me Alex’s argument falls apart once you realize this.
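Jason's point can be verified with a toy simulation (my own sketch, not from the thread; the numbers and names are invented): a mindless policy that simply clamps income to a narrow band, forecasting nothing about X, still produces a large negative correlation between X and the residually defined Y.

```python
import numpy as np

rng = np.random.default_rng(1)

# X: a volatile no-policy income path (a random walk, purely illustrative).
x = 100 + np.cumsum(rng.normal(0.0, 5.0, 500))

# A deliberately dumb "policy": clamp income to a narrow band around the
# average. It forecasts nothing and knows nothing about X's dynamics.
z = np.clip(x, x.mean() - 2.0, x.mean() + 2.0)
y = z - x  # Y is defined residually, as in Friedman's identity Z = X + Y

r = np.corrcoef(x, y)[0, 1]
print(f"corr(X, Y) = {r:.3f}")  # strongly negative, even with no forecasting
```

The high correlation here is a consequence of the stabilized outcome, not evidence that the policymaker predicted X, which is exactly the direction-of-causality objection.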

9 Barbar November 29, 2006 at 5:52 pm

Agree with Jason. The intuition here is that X and Y can’t possibly be highly correlated — but why not? It’s not like X and Y are independently generated random processes that can only align through magic.

To use a simple example, maybe X usually goes up 5% a year, but runaway investor speculation leads it to oscillate wildly (+ or -50% every 10 years). Say a simple government policy can eliminate this kind of “bad” investor behavior, so that instead Z goes up 5% every year. Magic! X and Y are amazingly correlated — whenever X shoots up surprisingly, Y GOES DOWN BY NEARLY THE SAME AMOUNT, AND VICE VERSA!!!! How could the government possibly do that?

This is simply an appeal to poor mathematical intuition, in order to reinforce an already held belief. I would expect better.

10 Jason November 30, 2006 at 12:09 pm

Alex, thank you for the enlightening explanation. Now I see. I should have just bought the book.

And I apologize for, you know, attaching your name to my own misinterpretation of your post.

11 Rich Berger December 1, 2006 at 1:04 pm

After finishing CR’s paper, I think Alex’s original post was correct. CR’s conclusion was that volatility (and total lost output) was not significantly different in the era of government counter-cyclical actions than before. Friedman’s conjecture was that the counter-cyclical policy would have to be unrealistically adroit to make a significant difference, and CR’s findings are consistent with his conclusion.

I think dsquared’s assertion that “I’d classify this particular piece of Friedman as a conclusion that he was going to assert anyway, a model that doesn’t really provide any support to it, and a prediction that wasn’t borne out by the evidence” is a textbook example of tendentiousness. He reached a pre-arranged conclusion and cherry-picked sentences from the paper in its support.
