
Many of us think this diagram shows there has been some kind of structural break in the labor market, and/or that recovery is proceeding slowly.  Paul Krugman has very recently suggested that structural factors play little role, because the measured unemployment rate is now below five percent.

But in fact labor market indicators are quite mixed, and furthermore the best and latest research out of MIT indicates the structural story does indeed carry real weight.  See also Alan Krueger’s work, or recent research from the AER.  And there are plenty of markers of a more persistent shift in economic activity, as reflected in CBO markdowns of expected productivity growth, based partly on trends which preceded the recession.  That all might be wrong, but the mere citation of the current 4.9 percent unemployment rate doesn’t persuade me otherwise.

Let’s not forget what Krugman wrote in 2012:

My current favorite gauge of the jobs picture is the employment-population ratio for prime-age adults (25-54). EP ratio instead of unemployment rate, because U may be distorted by workers dropping out…Everything else is just noise.

At least as of yesterday, the preferred labor market indicator was once again the unemployment rate, with no mention of 2012.  That was then, this is now, I suppose.

The rest of Krugman’s history on recovery is curious.  Very early on he predicted a rapid recovery (if not right away), then for several years he predicted a long-standing secular stagnation, and now he seems to be citing “a recovery of demand.”  I don’t see anything wrong with such a change in emphasis, as the facts change, and Krugman himself makes this meta-point fairly frequently.  Still it is odd for him to be criticizing the predictive record of others on these issues.  He’s been through what appear to be three distinct positions on recovery, and two distinct positions on which labor market indicators really matter, and we are still not sure exactly which views are correct.

Bryan Caplan is pleased that he has won his bet with me about whether unemployment would fall under five percent.  I readily admit a mistake in stressing unemployment figures at the expense of other labor market indicators; in essence I didn’t listen enough to the Krugman of 2012.  This shows there were features of the problem I did not understand and indeed still do not understand.  I am surprised that we have such an unusual mix of recovery in some labor market variables but not others.  The Benthamite side of me will pay Bryan gladly, as I don’t think I’ve ever had a ten dollar expenditure of mine produce such a boost in the utility of another person.

That said, I think this episode is a good example of what is wrong with betting on ideas.  Betting tends to lock people into positions, gets them rooting for one outcome over another, makes the denouement of the bet about the relative status of the people in question, and produces a celebratory mindset in the victor.  That lowers the quality of dialogue and also of introspection, just as political campaigns lower the quality of various ideas — too much emphasis on the candidates and the competition.  Bryan, in his post, reaffirms his core intuition that labor markets usually return to normal pretty quickly, at least in the United States.  But if you scrutinize the above diagram, as well as the lackluster wage data, that is exactly the premise he should be questioning.

As I’m the only one in this exchange fessing up to what I got wrong, and what I still don’t understand, and what the complexities are, in a funny way…I feel I’m the one who won the bet.

Addendum: Here is the graph of the ratio for prime-age workers only; it too shows partial but by no means complete recovery.  And note this: the more optimistic you are about interpreting the labor market side, the more pessimistic you ought to be about the productivity picture, a conclusion which is anathema to Caplan at least.  Given recent configurations of data, it really is hard to avoid carving out room for structural factors as a significant part of the story.

Tobias J. Moskowitz has a recent paper on this question, the results are illuminating:

I use sports betting markets as a laboratory to test behavioral theories of cross-sectional asset pricing anomalies. Two unique features of these markets provide a distinguishing test of behavioral theories: 1) the bets are completely idiosyncratic and therefore not confounded by rational theories; 2) the contracts have a known and short termination date where uncertainty is resolved that allows any mispricing to be detected. Analyzing more than a hundred thousand contracts spanning two decades across four major professional sports (NBA, NFL, MLB, and NHL), I find momentum and value effects that move betting prices from the open to the close of betting, that are then completely reversed by the game outcome. These findings are consistent with delayed overreaction theories of asset pricing. In addition, a novel implication of overreaction uncovered in sports betting markets is shown to also predict momentum and value returns in financial markets. Finally, momentum and value effects in betting markets appear smaller than in financial markets and are not large enough to overcome trading costs, limiting the ability to arbitrage them away.

SSRN and video versions of the paper are here.  The underlying idea here is neat.  The marginal utility of consumption is unlikely to be correlated with the outcomes of sporting events, so we can test some propositions of finance theory without having to worry much about those risk factors.  Lo and behold, a version of the momentum result still holds up.  And if you would like an exposition of that approach, do see my earlier dialogue with Cliff Asness.  And here is Cliff on Fama on momentum.

The break in the prison population’s unremitting growth offers an overdue reprieve and a cause for hope for sustained reversal of the nearly four-decade growth pattern. But any optimism needs to be tempered by the very modest rate of decline, 1.8 percent in the past year. At this rate, it will take until 2101 — 88 years — for the prison population to return to its 1980 level.

And this:

Other developments should also curb our enthusiasm. The population in federal prisons has yet to decline. And even among the states, the trend is not uniformly or unreservedly positive. Most states that trimmed their prison populations in 2012 did so by small amounts — eight registered declines of less than 1 percent. Further, over half of the 2012 prison count reduction comes from the 10 percent decline in California’s prison population, required by a Supreme Court mandate. But even that state’s achievement is partly illusory, as it has been accompanied by increasing county jail admissions.

Three states stand out for making significant cuts in their prison populations in the past decade: New York (19 percent), California (17 percent), and New Jersey (17 percent). The reductions in New York and New Jersey have been in part a function of reduced crime levels, but also changes in policy and practice designed to reduce the number of lower-level drug offenders and parole violators in prison. But the pace of reductions in most other states has been quite modest. Moreover, 22 states still subscribed to an outdated model of prisoner expansion in 2012.

There is more here from Marc Mauer and Nazgol Ghandnoosh.

The authors are Christopher H. Achen and Larry M. Bartels, and the subtitle is Why Elections Do Not Produce Representative Government.  This book is brutally depressing, not to mention very well presented, though I cannot say the core message is surprising at this point.  Voters choose on the basis of partisan loyalties, and these days party voting has a much bigger influence on state and local elections than it used to.  So where is the accountability?  Some voters engage in “retrospective voting,” but on the basis of super-short time horizons, and often the voters hold politicians accountable for matters those politicians cannot control, even storms and other natural disasters.  The authors really do demonstrate these points with lots of rigorous analysis.

OK, now a segue.  Given all this, should the natural and appropriate policy response be to a) expand the responsibilities of democratic government, or b) consider limiting the responsibilities of democratic government?

You are allowed only two guesses…

The book is due out in April.

In Launching the Innovation Renaissance I argued that students were not graduating with the degrees that pay (see also my piece in the Chronicle of Higher Education).

In 2009 the U.S. graduated 37,994 students with bachelor’s degrees in computer and information science. This is not bad, but we graduated more students with computer science degrees 25 years ago! The story is the same in other technology fields such as chemical engineering and math and statistics.

If students aren’t studying science, technology, engineering and math, what are they studying?

In 2009 the U.S. graduated 89,140 students in the visual and performing arts, more than in computer science, math and chemical engineering combined and more than double the number of visual and performing arts graduates in 1985.

So what has happened since 2009? The good news is that enrollment in STEM fields has increased dramatically. The number of graduates with computer science degrees, for example, has increased by 34%, chemical engineering degrees are up by a whopping 49.5% and math and statistics degrees have increased by 32%.

The bad news is that we are still graduating more students in the visual and performing arts than in computer science, math and chemical engineering combined. As I said in Launching, there is nothing wrong with the visual and performing arts, but those are degrees that are unlikely to generate spillovers to society.

We are also graduating more students in communications and journalism than in computer science, math and chemical engineering combined, and likewise more students in psychology than in those three fields combined. Here’s what I said about psychology:

In 2009 we graduated 94,271 students with psychology degrees at a time when there were just 98,330 jobs in clinical, counseling and school psychology in the entire nation. The latter figure isn’t new jobs — it’s total jobs!

Despite these problems, the number of psychology degrees conferred annually has increased since 2008-2009 by an astounding 21.4%! Visual and performing arts degrees have increased by 9.7% and communication and journalism degrees are up 8.1%. Do you think that jobs in these fields have gone up by equal percentages?

Stated differently, in 2012-2013 we graduated 20,418 more students in computer science, chemical engineering and math and statistics than we did in 2008-2009 but we also graduated 20,179 more students in psychology alone! We have a long way to go.
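
As a quick consistency check on the psychology figures, here is the arithmetic in a few lines of Python; this is my own back-of-the-envelope calculation, using only the numbers quoted above.

```python
# Back-of-the-envelope check on the psychology figures quoted above (my own arithmetic).
psych_2009 = 94_271      # psychology degrees conferred in 2008-2009, from the text
growth = 0.214           # reported increase in psychology degrees through 2012-2013

psych_2013 = psych_2009 * (1 + growth)
extra = psych_2013 - psych_2009

print(f"Implied 2012-2013 psychology degrees: {psych_2013:,.0f}")  # ~114,445
print(f"Implied increase over 2008-2009:      {extra:,.0f}")       # ~20,174, in line with the 20,179 cited
```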

Here is the data:

[Figure: EducationData]

*Unequal Gains*


That is the forthcoming book by Peter H. Lindert and Jeffrey G. Williamson, and the subtitle is American Growth and Inequality since 1700.  The sections on recent America, while unobjectionable, are ordinary, but the early coverage of American history is very interesting indeed.  Here is one excerpt:

Why the Old South reversal of fortune?  A benign part of the story seems to have been that the colonial South was still a labor-scarce frontier region with high returns to coastal land producing export crops, like indigo, rice, and tobacco. Its decline after 1774 was echoed in two other frontier cases many decades later.  One was the dramatic relative decline of the West South Central income per capita between 1840 and 1860 — from 60 percent above the U.S. average to just 9.5 percent above it…The other was the loss of the Pacific region’s gold-discovery-generated super-incomes after the 1850s and early 1860s (the Pacific states were 213.3 percent above the US average in 1860, and the mountain states were 30.5 percent above).

I hope to report on other interesting sections of the book soon; it is due out in April.  Again, most business cycles in history have been real business cycles.

A majority of the top dividend-paying stocks on the Straits Times Index are government-linked…

This Andy Mukherjee piece makes numerous good points about the current problems faced by Singapore, going beyond the usual concerns and focusing on internal factors, including the difficulty of maintaining an adequate level of competition in the economy.

As I’ve said before, when you examine flows — such as government spending as a percentage of GDP — Singapore looks, and indeed is, quite free market.  When you consider stocks and wealth…well, that is a very different story.  Singapore has a strong market-oriented component, but it is not a free market miracle.

Every year since 2006 more democracies have experienced erosion in political rights and civil liberties than have registered gains, as we find in our annual Freedom in the World report. In all, 110 countries, more than half the world’s total, have suffered some loss in freedom during the past 10 years.

That is from Mark P. Lagon and Arch Puddington at the WSJ.  I would like to see a good theory of how liberty, democracy, and liberalism — or however we wish to characterize that bundle — comove across the globe, in both positive and negative times.

We match individual-level survey data with information on the historical lifeways of ancestors, focusing on Africa, where the transition away from such modes of production began only recently. Within enumeration areas and occupational groups, we find that individuals from ethnicities that derived a larger share of subsistence from agriculture in the pre-colonial era are today more educated and wealthy. A tentative exploration of channels suggests that differences in attitudes and beliefs as well as differential treatment by others, including less political power, may contribute to these divergent outcomes.

That is from a recent paper by Michalopoulos, Putterman, and Weil.  Here are video and ungated versions.

Although Stockfish and Komodo have differences in their evaluation scales—happily less pronounced than they were 1 and 2 years ago—they agree that the world’s elite made six times more large errors when on the lower side of equality.

We don’t know how general this phenomenon is, but interestingly it seems to hold much more strongly for top players than for weak players.  That is from chess, of course.

Here is much more detail from Ken Regan, along with some suggested hypotheses and resolutions.

Mexican non-oil exports to USA in December (y/y): -4.5%. Excluding autos: -8.7%.

That is from Genevieve Signoret, via this source.

It’s funny how these numbers seem to indicate someone is starting to enter a recession.  Who might that be?  Maybe it’s just noise; I don’t see any other mediocre economic reports wandering around these parts…  Or maybe it’s Mexico that’s the problem.

The excellent Susan Athey addresses that question on Quora, here is one excerpt:

Machine learning is a broad term; I’m going to use it fairly narrowly here.  Within machine learning, there are two branches, supervised and unsupervised machine learning.  Supervised machine learning typically entails using a set of “features” or “covariates” (x’s) to predict an outcome (y).  There are a variety of ML methods, such as LASSO (see Victor Chernozhukov (MIT) and coauthors who have brought this into economics), random forest, regression trees, support vector machines, etc.  One common feature of many ML methods is that they use cross-validation to select model complexity; that is, they repeatedly estimate a model on part of the data and then test it on another part, and they find the “complexity penalty term” that fits the data best in terms of mean-squared error of the prediction (the squared difference between the model prediction and the actual outcome).  In much of cross-sectional econometrics, the tradition has been that the researcher specifies one model and then checks “robustness” by looking at 2 or 3 alternatives.  I believe that regularization and systematic model selection will become a standard part of empirical practice in economics as we more frequently encounter datasets with many covariates, and also as we see the advantages of being systematic about model selection.
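
To make the cross-validation step concrete, here is a minimal sketch in Python using scikit-learn’s LassoCV on simulated data; the setup and variable names are my own and purely illustrative, not anything from Athey’s answer.

```python
# Minimal sketch: choosing LASSO's complexity penalty by cross-validation.
# The data are simulated; only a handful of the many covariates truly matter.
import numpy as np
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)
n, p = 500, 50
X = rng.normal(size=(n, p))
beta = np.zeros(p)
beta[:5] = [2.0, -1.5, 1.0, 0.5, -0.5]       # sparse "true" model
y = X @ beta + rng.normal(size=n)

# LassoCV repeatedly fits on part of the data and scores on the held-out part,
# then selects the penalty (alpha) with the best mean-squared prediction error.
model = LassoCV(cv=5).fit(X, y)
print("chosen penalty:", model.alpha_)
print("covariates kept:", int(np.sum(model.coef_ != 0)), "of", p)
```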

…in general ML prediction models are built on a premise that is fundamentally at odds with a lot of social science work on causal inference. The foundation of supervised ML methods is that model selection (cross-validation) is carried out to optimize goodness of fit on a test sample. A model is good if and only if it predicts well. Yet, a cornerstone of introductory econometrics is that prediction is not causal inference, and indeed a classic economic example is that in many economic datasets, price and quantity are positively correlated.  Firms set prices higher in high-income cities where consumers buy more; they raise prices in anticipation of times of peak demand. A large body of econometric research seeks to REDUCE the goodness of fit of a model in order to estimate the causal effect of, say, changing prices. If prices and quantities are positively correlated in the data, any model that estimates the true causal effect (quantity goes down if you change price) will not do as good a job fitting the data. The place where the econometric model with a causal estimate would do better is at fitting what happens if the firm actually changes prices at a given point in time—at doing counterfactual predictions when the world changes. Techniques like instrumental variables seek to use only some of the information that is in the data – the “clean” or “exogenous” or “experiment-like” variation in price—sacrificing predictive accuracy in the current environment to learn about a more fundamental relationship that will help make decisions about changing price. This type of model has not received almost any attention in ML.
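
To see the price-quantity point in miniature, here is a toy simulation (my own construction, not from Athey’s answer): demand shocks push price and quantity up together, so a plain regression of quantity on price fits the observed data but gets the causal effect badly wrong, while an instrument that shifts only supply recovers it.

```python
# Toy simulation of the simultaneity problem described above (illustrative only).
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
demand_shock = rng.normal(size=n)    # e.g. local income, unobserved by the analyst
cost_shifter = rng.normal(size=n)    # supply-side instrument, e.g. input costs

# Firms charge more where demand is strong and where costs are high.
price = demand_shock + cost_shifter + rng.normal(size=n)
# True causal effect of price on quantity is -1, but strong demand raises both.
quantity = -1.0 * price + 4.0 * demand_shock + rng.normal(size=n)

# OLS: fits the data, but the price coefficient comes out positive.
ols_slope = np.polyfit(price, quantity, 1)[0]

# IV (simple ratio / Wald estimator with one instrument): use only the "clean"
# variation in price that comes from the cost shifter.
iv_slope = np.cov(cost_shifter, quantity)[0, 1] / np.cov(cost_shifter, price)[0, 1]

print(f"OLS estimate: {ols_slope:+.2f}   (wrong sign)")
print(f"IV estimate:  {iv_slope:+.2f}   (close to the true effect of -1)")
```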

The answer is interesting, though difficult, throughout.  Here are various Susan Athey writings, on machine learning.  Here are other Susan Athey answers on Quora, recommended.  Here is her answer on whether machine learning is “just prediction.”

They have a new and excellent summary paper (pdf), and that is Gordon Hanson, not Robin Hanson:

China’s emergence as a great economic power has induced an epochal shift in patterns of world trade. Simultaneously, it has challenged much of the received empirical wisdom about how labor markets adjust to trade shocks. Alongside the heralded consumer benefits of expanded trade are substantial adjustment costs and distributional consequences. These impacts are most visible in the local labor markets in which the industries exposed to foreign competition are concentrated. Adjustment in local labor markets is remarkably slow, with wages and labor-force participation rates remaining depressed and unemployment rates remaining elevated for at least a full decade after the China trade shock commences. Exposed workers experience greater job churning and reduced lifetime income. At the national level, employment has fallen in U.S. industries more exposed to import competition, as expected, but offsetting employment gains in other industries have yet to materialize. Better understanding when and where trade is costly, and how and why it may be beneficial, are key items on the research agenda for trade and labor economists.

This is some of the most important work done by economists in the last twenty years.

Facts about business travel


More populous countries have more business travel in both directions, but the volume is less than proportional to their population: a country with 100% more population than another has only about 70% more business travel. This suggests that there are economies of scale in running businesses that favor large countries.

By contrast, a country with a per capita income that is 100% higher than another receives 130% more business travelers and sends 170% more people abroad. This means that business travel tends to grow more than proportionally with the level of development.

While businesspeople travel in order to trade or invest, more than half of international business travel seems to be related to the management of foreign subsidiaries. The global economy is increasingly characterized by global firms, which need to deploy their know-how to their different locations around the world. The data show that there is almost twice the amount of travel from headquarters to subsidiaries as there is in the opposite direction. Exporters also travel twice as much as importers.

That is from Ricardo Hausmann, with further interesting points.
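
Reading those figures as constant-elasticity (log-log) relationships, which the excerpt’s wording suggests but does not state explicitly, the implied exponents are roughly:

$$
2^{\beta_{\text{pop}}} = 1.7 \;\Rightarrow\; \beta_{\text{pop}} = \frac{\ln 1.7}{\ln 2} \approx 0.77, \qquad
2^{\beta_{\text{inbound}}} = 2.3 \;\Rightarrow\; \beta_{\text{inbound}} \approx 1.2, \qquad
2^{\beta_{\text{outbound}}} = 2.7 \;\Rightarrow\; \beta_{\text{outbound}} \approx 1.43.
$$

So doubling a country’s population raises its business travel by less than double, while doubling its per capita income more than doubles it, which is just Hausmann’s point restated in elasticity form.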

From Ryan Avent:

Anthony Randazzo of the Reason Foundation, a libertarian think-tank, and Jonathan Haidt of New York University recently asked a group of academic economists both moral questions (is it fairer to divide resources equally, or according to effort?) and questions about economics. They found a high correlation between the economists’ views on ethics and on economics. The correlation was not limited to matters of debate—how much governments should intervene to reduce inequality, say—but also encompassed more empirical questions, such as how fiscal austerity affects economies on the ropes. Another study found that, in supposedly empirical research, right-leaning economists discerned more economically damaging effects from increases in taxes than left-leaning ones.

There is considerably more at the link.  The Randazzo and Haidt study is from Econ Journal Watch.