Hansen’s work is the most technical and the most difficult to explain to a layperson. The brief version is that in 1982 Hansen developed the Generalized Method of Moments, a new and elegant way to estimate many economic models that requires fewer assumptions and is often more powerful than other methods.

Here is the basic idea in a nutshell. The method of moments is an old and intuitive technique for estimating the parameters of a data generating process. A moment is an expectation of the form E(X^r), where r is a positive integer. For example, if r=1 then the first moment, E(X), is just the mean (you may also know that together the first moment and the second moment, E(X^2), define the variance). If the true mean in the data generating process is M then we can write a moment condition, E(X)-M=0. Now the method of moments says that to estimate M we should solve that condition by replacing E(X) with the sample mean. In other words, a good estimator for the unknown population mean is the sample mean, i.e. the mean in the data that you have. Pretty obvious so far.
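To make the plain method of moments concrete, here is a minimal sketch in Python (the simulated data and variable names are my own, purely for illustration): we recover a mean and a variance by replacing population moments with their sample analogues.

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=5.0, scale=2.0, size=10_000)  # simulated data: true mean 5, true variance 4

# Method of moments: replace each population moment with its sample analogue.
# Conditions: E(X) - mu = 0  and  E(X^2) - (sigma^2 + mu^2) = 0.
mu_hat = x.mean()                        # solves the first condition
sigma2_hat = (x**2).mean() - mu_hat**2   # solves the second condition

print(mu_hat, sigma2_hat)  # both land near the true values of 5 and 4
```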

Now let’s start to generalize. First, there are many moments other than the mean and variance. Indeed, economic theory often provides moment conditions that may be written E(f(X,M))=0, where M now stands for a parameter, not necessarily the mean, and f can be a non-linear function. For example, rational expectations models often provide conditions of the form E(f(X))-M=0, i.e. that forecasts should equal true values on average, and macro models imply that various differences, such as changes in consumption, should not be correlated with past information, and so forth. Indeed, finance and macroeconomic theory provided a *surplus* of moment conditions, and many of these different conditions imply something about the same parameter. Now, and this is key, when we have more moment conditions than parameters we can’t choose the parameters to make all the moment conditions true, i.e. we can’t make all those moment conditions equal to zero. So what to do?
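A tiny simulated illustration of the overidentification problem (the exponential example and names are my choice, not from any particular model): one parameter, two moment conditions, and in a finite sample no single parameter value drives both sample conditions to zero.

```python
import numpy as np

rng = np.random.default_rng(1)
lam = 2.0
x = rng.exponential(scale=1/lam, size=5_000)  # exponential data with rate lambda = 2

# Two moment conditions, one parameter lambda:
#   g1 = E(X)   - 1/lambda     = 0
#   g2 = E(X^2) - 2/lambda^2   = 0
lam_from_m1 = 1 / x.mean()                # solves g1 exactly in-sample
lam_from_m2 = np.sqrt(2 / (x**2).mean())  # solves g2 exactly in-sample

# In a finite sample the two answers disagree slightly, so no single
# lambda can set both sample moment conditions to zero at once.
print(lam_from_m1, lam_from_m2)  # both near 2, but not equal
```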

What Hansen did with the generalized method of moments is show that when we have more moment conditions than parameters we can best estimate those parameters by giving more weight to the conditions that we have better information about. In other words, if we have two conditions and we can’t force both of them to zero by a choice of parameter, then choose the parameter such that the moment condition we know the most about (least variance) is closer to zero than the one we know less about. Again, the idea is intuitive, but Hansen showed how to make these choices, and then he proved that when the parameters are chosen in this way they have good statistical properties such as consistency (they get closer to the true values as the sample size increases). Importantly, estimating a model using these moment conditions does not require untenable assumptions on the entire distribution of returns. Hansen then also showed, as in Hansen and Hodrick (1980) and Hansen and Singleton (1982, 1983), how these methods could be applied to a large class of macro and finance models, including asset pricing, the latter of which links Hansen with the work of Fama and Shiller, as does the important bound discovered by Hansen and Jagannathan (1991).
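Continuing the exponential toy example, here is a minimal sketch of the weighting idea (my own simplified version: full GMM uses the inverse covariance matrix of the moment functions and a two-step procedure; the diagonal weights and grid search below are only illustrative). The estimator minimizes a weighted distance of the sample moment conditions from zero, with the better-measured condition counting for more.

```python
import numpy as np

rng = np.random.default_rng(2)
x = rng.exponential(scale=0.5, size=5_000)  # true rate lambda = 2

def gbar(lam):
    # Sample averages of the two moment functions for an exponential model:
    # E(X) = 1/lambda and E(X^2) = 2/lambda^2.
    return np.array([x.mean() - 1/lam, (x**2).mean() - 2/lam**2])

# Weight each condition by the inverse of its sampling variance, so the
# condition we know the most about gets more weight.  (Full GMM uses the
# inverse covariance matrix of the moments; a diagonal is used here.)
W = np.diag([1 / x.var(), 1 / (x**2).var()])

def objective(lam):
    g = gbar(lam)
    return g @ W @ g  # weighted distance of the moment conditions from zero

grid = np.linspace(0.5, 5.0, 2000)
lam_hat = grid[np.argmin([objective(l) for l in grid])]
print(lam_hat)  # lands near the true rate of 2
```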

This guy is a right wing hack whose slavish devotion to efficient markets…wait, sorry, wrong post. [/satire]

Did you mean to write “Fama” instead of Fame at the end?

This may be too much work, but it would be helpful to my understanding of the contribution to provide an example or two of the different outcomes from the original method of moments versus GMM, specifically where GMM was able to provide more precise parameters in some specific research. What does the world look like without GMM?

Another minor copyediting note: in the parenthetical in the next-to-last sentence I think this is supposed to be “*as* the sample size increases.” Or otherwise I’m even more confused than I thought I was.

You are not confused! Corrected.

Tim, this is not a neat example, but an alternative method is maximum likelihood estimation. In that case, you specify the full distribution of the data process of interest (say a log-normal distribution of income) and then estimate the parameters (mean, covariance, etc.). You make one wrong assumption about the distribution and the whole thing is “garbage,” or at least the results are not even right on average (not even unbiased). The beauty of GMM is that you make fewer assumptions and the weights in the estimator focus in on the “good” assumptions. You can make some wrong assumptions, the estimator partially ignores those, and the results will still tend toward the correct values (consistency) as the sample gets big. I surely did some injustice to the details here, but I’m trying to be illustrative.
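To illustrate that point with a toy simulation of my own (not from the thread): suppose the data are log-normal but a researcher wrongly assumes an exponential distribution. The misspecified MLE gets a tail probability systematically wrong even in a huge sample, while a direct moment condition on that same tail probability stays on target.

```python
import numpy as np

rng = np.random.default_rng(3)
x = rng.lognormal(mean=0.0, sigma=1.0, size=50_000)  # true process: log-normal

t = 3.0  # we want P(X > t); the true value is about 0.136

# Misspecified MLE: wrongly assume X is exponential.  The exponential MLE
# of the mean is the sample mean, and the implied tail probability is
# exp(-t/mean) -- systematically too high here, no matter the sample size.
p_mle_wrong = np.exp(-t / x.mean())

# Moment-condition estimate: E[1{X > t}] - p = 0, solved by the sample
# frequency, which is consistent regardless of the true distribution.
p_moment = (x > t).mean()

print(p_mle_wrong, p_moment)  # the MLE overshoots; the moment estimate does not
```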

So now to the soapbox … applied economists, like myself, should take some blame for someone like Hansen being inscrutable. Economists push their empirical results in abstracts, blogs, and tweets but spend less time telling people about the assumptions that got them there. And with powerful tools like GMM that require fewer assumptions, we should spend even more time debating the validity of those assumptions.

In GMM, when you use f(X,M), how do you get that without knowing the underlying distribution?

There are a lot of nice statistical properties of MLEs: asymptotically unbiased, efficient, etc. Do moment estimators have any such properties? I never thought of a moment estimator as anything more than a sometimes-useful trick to get an initial guess for a likelihood maximization algorithm.

Without GMM, we would have to call the method non-linear three-stage least squares (NL3SLS). And we would have to cite a paper from the 1970s instead of the 1980s. And we would have to give the Nobel to Takeshi Amemiya. Other than that, it’s pretty hard to think of how the world would be different.

One reason I like GMM is that it illustrates how silly the structural vs. reduced form fight is. GMM allows us to estimate individual equations from a “structural” model in a way that is quite similar to least squares (actually they’re identical under some circumstances). Vocal reduced form people have somehow come to the conclusion that estimating a structural model requires a bunch of heroic assumptions while estimating simple reduced form linear relationships is self-evidently reasonable.

Ultimately, the fight is just about functional forms. Is a linear functional form more realistic than a functional form chosen because it can be derived from a structural model? I don’t think so. Both approaches require pretty heroic assumptions about functional forms, and each approach is useful for various research questions. The broader point is that tools like GMM make it just as easy to estimate relationships arising from theoretical models as it is to estimate atheoretic (or treatment effect theoretic) linear relationships.

Who still believes that? Didn’t Sims himself establish that you need exactly-as-heroic orthogonalization assumptions with reduced-form shocks?

Just a technical remark: the optimal weighting of the different moments nicely described by Alex Tabarrok is not essential for consistency, only for efficiency.
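This remark can be checked in a small simulation (the setup and numbers are mine, for illustration only): with two noisy measurements of the same mean, both an equal-weights estimator and an inverse-variance-weighted estimator are centered on the truth (consistency), but only the latter exploits the precise measurements efficiently.

```python
import numpy as np

rng = np.random.default_rng(4)
mu = 1.0
reps, n = 2_000, 200
est_identity, est_optimal = [], []
for _ in range(reps):
    y1 = rng.normal(mu, 1.0, n)   # precise measurements of mu
    y2 = rng.normal(mu, 10.0, n)  # very noisy measurements of mu
    # Two moment conditions: E(Y1) - mu = 0 and E(Y2) - mu = 0.
    # With diagonal weights the GMM estimator is just a weighted average.
    est_identity.append((y1.mean() + y2.mean()) / 2)  # identity (equal) weights
    w1, w2 = 1 / y1.var(), 1 / y2.var()               # inverse-variance weights
    est_optimal.append((w1 * y1.mean() + w2 * y2.mean()) / (w1 + w2))

est_identity, est_optimal = np.array(est_identity), np.array(est_optimal)
# Both estimators are centered on mu, but the optimally weighted one
# has a far smaller spread across replications.
print(est_identity.mean(), est_optimal.mean())
print(est_identity.std(), est_optimal.std())
```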

I’ve always had a fondness for GMM. One of the great contributions from economics. Suck it, statisticians.

As a probabilist, while it’s not the “method of moments” that I’m most familiar with (I think of proving convergence in distribution), I fully support this prize. It is an elegant applied method.

The original idea of GMM comes from engineering.

Don’t tell the Nobel committee or they’ll have to revoke the VAR prize too.

prix viagra pharmacie,

Actually, that’s why it is called “Generalized” method of moments…you don’t need to plan ahead and such


Viagra spam gets posted but legitimate comments are blocked? Interesting.

Too many comments apparently, as my posts are being blocked too by the anti-spam filter. I posted three times belittling this year’s E-con Nobelians and showing how they plagiarized work done elsewhere (e.g. for Hansen, in statistics, maximum-likelihood estimation (MLE) is a method of estimating the parameters of a statistical model) and I get blocked.

The Viagra spammer is probably spamming on a large scale and earning income from his activities, and can thus, thanks to economies of scale, finance the R&D needed to not get blocked by the spam filter.

On the other hand, you have very little incentive to do so, so you probably don’t do it, and your posts are thus more likely to be blocked.

I like the write-up! I took a crack at going even more simplified; it was more challenging than I’d thought to explain to a general audience:

http://simplystatistics.org/2013/10/14/why-did-lars-peter-hansen-win-the-nobel-prize-generalized-method-of-moments-explained/

Which makes your version even more impressive.

@ Jeff L. — I like your explanation better. Having taken a few engineering and stat courses it seems that GMoM is a sort of torque equation, or in stats the maximum likelihood technique, as you say (Wikipedia: “In statistics, maximum-likelihood estimation (MLE) is a method of estimating the parameters of a statistical model”).

So essentially Hansen plagiarized for economists a well known statistical technique, MLE. That would be the anti-patent, anti-IP way of putting it. The more pro-patent, pro-IP way of putting it is that Hansen legitimized MLE for economic models, and as such he made an important advance.

Economists understand MLE and are not giving a Nobel for well-known ideas. GMM is an advance on MLE in that it requires fewer assumptions. Indeed, if anything, MLE is a subset of GMM.

The idea that you can have things like this plagiarised in econometrics is what I’d call “not even wrong,” but let’s pretend for a moment it’s just plain old “wrong.”

In that case, you’ve misunderstood the Wikipedia page. When it says “MLE is a method of estimating the parameters of a statistical model” it is not being exclusive. GMM and OLS are also “methods of estimating the parameters of a statistical model.”

In fact for many distributions MLE is actually a special case of GMM. Consider estimation procedures like OLS and MLE subsets of GMM, if that helps.
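One way to see the “MLE is a special case of GMM” point (a standard textbook observation, sketched here with my own simulated data): the MLE first-order conditions say the expected score is zero, and those are themselves moment conditions, with the score playing the role of f.

```python
import numpy as np

rng = np.random.default_rng(5)
x = rng.normal(3.0, 2.0, 10_000)  # simulated data: mean 3, variance 4

# For the normal density, the MLE first-order (score) conditions are
#   E[(X - mu) / sigma^2]               = 0
#   E[((X - mu)^2 - sigma^2) / sigma^3] = 0   (up to scale)
# These are just moment conditions, so normal MLE is GMM with the score as f.
mu_hat = x.mean()                        # solves the first score condition
sigma2_hat = ((x - mu_hat)**2).mean()    # solves the second score condition

print(mu_hat, sigma2_hat)  # near the true values of 3 and 4
```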

“So essentially Hansen plagiarized for economists a well known statistical technique, MLE.”

As Ben says, this isn’t even wrong.

But to try to correct a couple of huge misconceptions: economists had been using MLE for decades, and indeed continue to do so, it’s a nice statistical tool.

What Hansen came up with is different from MLE. As the name implies, it harkens back to the Method of Moments (which I’d always associated with Karl Pearson, but according to Wikipedia he came up with the idea of moments but not the Method of Moments). GMM is different from MLE, and is a very ingenious idea, fully Nobel worthy. (Though I’m curious about the connection to engineering that another commenter claimed. It is true that a lot of the best ideas in econometrics came from other fields.)

Also, Alex’s write-up on GMM was excellent.

Ray, stick to something you understand, like third-grade arithmetic.

There is no such thing as a “Nobel Prize in Economics”. There is a “Sveriges Riksbank Prize in Economic Sciences in Memory of Alfred Nobel.” Alfred Nobel set up exactly five prizes: physics, chemistry, physiology or medicine, literature and for peace. They’ve been awarded since 1901.

The Nobel Foundation official website always calls it the “Economics Prize”. You won’t find the phrase “Nobel Prize in Economics” anywhere. It’s not correct to call it the Nobel Prize in Economics. Call it what it is: the Riksbank Prize or the Economics Prize.

Among the most vocal critics of the Prize in Economics is the Swedish human rights lawyer Peter Nobel, a great-grandnephew of Alfred Nobel. Swedish economist Gunnar Myrdal and former Swedish minister of finance Kjell-Olof Feldt have also advocated that the Prize in Economics should be abolished. If he had been asked about the establishment of the Prize before receiving it, Friedrich Hayek stated that he would “have decidedly advised against it.”

Why is the distinction important? Apparently the Nobel Foundation doesn’t mind, or else it would not have associated itself with the later addition.

In any case, what’s the nature of the criticism? I bet there’s people critical of the peace prize and how it is awarded too. So what?

In addition to the fact that the Nobel Foundation associates the Economics Prize with its other prizes on its website, the Nobel Foundation explicitly forbade the creation of any further new prizes, thus implicitly acknowledging what had been done for economics. Moreover, while the Economics Memorial Prize is not literally the same as the originals, the Peace Prize has also deviated from the letter of the original prizes, inasmuch as the award is now given to organizations and to multiple individuals, instead of being limited to at most three people, as is still the rule for the science, literature, and econ Nobels. But obviously, some people who either dislike economics, or want to signal their concern that economics is not a science, like to harp on this distinction every single time the Prize is mentioned. Yet I doubt it would have stilled the trolls’ unhappiness had this been called the Nobel Prize in Economics without the word “science” attached. Tough for you.

“What Hansen did with the generalized method of moments is show that when we have more moment conditions than parameters we can best estimate those parameters by giving more weight to the conditions that we have better information about.”

If I read you correctly, he’s solved why Republicons become more convinced of their delusions the more facts you give them, rather than less, but nobody’s noticed that yet.

Congrats to all the economists who won the Nobel Prize in 2013 for forecasting asset values, or the stock market in the longer term. But most of the theory applies mathematical models and statistics rather than what the general public easily understands and could apply while dealing in the stock market. Many factors are responsible for stock prices, which is why no one has found a proper solution yet. Fibonacci was never proven, so aren’t the theorems of these economists bound to fail too? Examples should come from actual markets while crashing or rising, like the current Dow Jones price rise. It seems artificial; it will fall, and then people will lose what they gained from the recent rise from the 6k to 15k index. So please, more examples of how to apply this in daily practical life, rather than math.
