Score one for the signaling model of education

In the new AER there is a paper by Melvin Stephens Jr. and Dou-Yan Yang; the abstract is this:

Causal estimates of the benefits of increased schooling using US state schooling laws as instruments typically rely on specifications which assume common trends across states in the factors affecting different birth cohorts. Differential changes across states during this period, such as relative school quality improvements, suggest that this assumption may fail to hold. Across a number of outcomes including wages, unemployment, and divorce, we find that statistically significant causal estimates become insignificant and, in many instances, wrong-signed when allowing year of birth effects to vary across regions.

In other words, those semi-natural experiments for the return to education, when some regions move with extra doses of compulsory schooling before others and we estimate differential wage effects, maybe don’t show as much as we used to think.  As I’ve remarked to Bryan Caplan, if there is a criticism of a famous or politically correct result (or better yet both) getting published in the AER, you can up your Bayesian priors on that criticism being on the mark.

There are ungated copies of the paper here.

Comments

"Differential changes across states during this period, such as relative school quality improvements, suggest that this assumption may fail to hold."

So the paper says that after controlling for differences in schools, it turns out that schools don't matter?

Exactly. This sort of econ paper reminds me of an economists' paper from way back (it must have been the late 1980s) that I once read, where they stated that modern electricity-driven refrigerators don't matter, since after controlling for differences with old-fashioned ice boxes, it turns out ice boxes were OK after all. Balderdash. But provocative.

No, the quoted paragraph says that after controlling for differences in the trends of school quality, the amount of schooling does not matter. Note that “amount” has a specific and limited interpretation here: the result is only valid at the margin affected by the relevant compulsory schooling laws, likely an extra year of schooling at age 15 or something like that. I haven’t read it in full, but I would assume that the original work found additional schooling (i.e., the presence of compulsory schooling laws) to have a positive effect, and this paper found that this is instead explained by a previously unobserved improvement in the quality of the schools in the state(s) with compulsory schooling laws relative to the control state(s). Which is not balderdash at all.
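The mechanism described above can be sketched with a toy simulation. This is a hypothetical setup, not the paper's actual data or specification: a "law" instrument raises years of schooling, but law-adopting states also happen to improve school quality, so a simple Wald/IV estimate of the return to schooling picks up the quality effect even when the true effect of extra years is zero.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20_000

# Hypothetical setup: a compulsory-schooling "law" (the instrument) raises
# years of schooling, but states adopting the law also improve quality.
law = rng.integers(0, 2, n)                    # instrument: law in effect or not
quality = 0.5 * law + rng.normal(0, 1, n)      # quality improves where law adopted
years = 10 + 1.0 * law + rng.normal(0, 1, n)   # law adds ~1 year of schooling

# True model: wages depend on quality, NOT on years of schooling.
log_wage = 0.0 * years + 0.3 * quality + rng.normal(0, 1, n)

# Wald/IV estimate of the "return to schooling": cov(wage, law) / cov(years, law).
# It attributes the quality channel to schooling (here, roughly 0.3 * 0.5 = 0.15).
iv_estimate = np.cov(log_wage, law)[0, 1] / np.cov(years, law)[0, 1]
print(iv_estimate)
```

With the instrument correlated with the omitted quality trend, the exclusion restriction fails and the IV estimate is biased away from the true value of zero, which is the thrust of the Stephens–Yang critique.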

Whatsthat makes a good point below. Not all studies on this topic use this identification strategy, and the better ones (such as those using an RDD, or quarter/month of birth instruments) do not have this problem.

Thanks, but my balderdash comment was reserved for iceboxes.

Comments for this post are closed

When you control for the quality of schools, it turns out that _schools_ matter, not simply _going to school._

In other, oversimplified words: the old studies estimated Outcome = B1(years of school) and found a positive B1. The new study estimates Outcome = B1(years of school) + B2(quality of school) and finds that B1 = 0 while B2 is positive.
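The two specifications above can be compared on simulated data. This is an illustrative sketch with made-up parameters, not the paper's data: quality drives the outcome, years of schooling is correlated with quality, and the years coefficient is positive only when quality is omitted.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10_000

# Hypothetical data: quality drives outcomes, and years of schooling is
# correlated with quality (places with more schooling also improved quality).
quality = rng.normal(0, 1, n)
years = 12 + 0.8 * quality + rng.normal(0, 1, n)
outcome = 0.5 * quality + rng.normal(0, 1, n)   # true years-coefficient is 0

X_old = np.column_stack([np.ones(n), years])             # omits quality
X_new = np.column_stack([np.ones(n), years, quality])    # controls for quality

b_old, *_ = np.linalg.lstsq(X_old, outcome, rcond=None)
b_new, *_ = np.linalg.lstsq(X_new, outcome, rcond=None)

print(b_old[1])  # positive: omitted-variable bias from the quality channel
print(b_new[1])  # near zero once quality is controlled for
```

This is textbook omitted-variable bias: the "old" years coefficient equals the quality effect times the years-quality covariance share, and it vanishes once the omitted regressor enters the specification.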

Surely it's the benefits that would be "causal" not the estimates?
"this assumption may fail to hold": oh for heaven's sake. Whatever happened to English?

Got it! Writing clumsy, pompous English is signalling.

"probably"

This criticism is only valid for studies that use that type of instrument.

There are other ways of getting causality, and I have seen at least one paper that used an RDD.

Looks like overselling.
