
That has been the received wisdom, but it is now challenged by a new paper (pdf) by Christina and David Romer:

This paper revisits the aftermath of financial crises in advanced countries in the decades before the Great Recession. We construct a new series on financial distress in 24 OECD countries for the period 1967-2007. The series is based on narrative assessments of the health of countries’ financial systems that were made in real time; and it classifies financial distress on a relatively fine scale, rather than treating it as a 0-1 variable. We find little support for the conventional wisdom that the output declines following financial crises are uniformly large and long-lasting. Rather, the declines are highly variable, on average only moderate, and often temporary. One important driver of the variation in outcomes across crises appears to be the severity and persistence of the financial distress itself: when distress is particularly extreme or continues for an extended period, the aftermath of a crisis is worse.

There is Justin Lahart coverage here, including a contrast with Reinhart and Rogoff.

I remember this question being debated extensively circa 2009-2011, and those who said there was a (limited) role for mismatch unemployment were mocked pretty mercilessly.  Well, Sahin, Song, Topa, and Violante have a piece in the new American Economic Review entitled “Mismatch Unemployment.”  (You can find various versions here.)  It’s pretty thorough and state of the art.  Their conclusion: “…mismatch, across industries and three-digit occupations, explains at most one-third of the total observed increase in the unemployment rate.”  The people thrown out of work could not be matched to jobs as readily as the unemployed workers of the past.

Much of the matching problem was for skilled workers, college graduates, and in the Western part of the country.  Geographical mismatch unemployment did not appear to be significant.  Now, “at most one-third” is not the main problem, but it is not small beans either.  That’s a lot of people out of work because of matching problems.
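To see what a mismatch index of this kind measures, here is a stylized sketch (my own construction with invented numbers, not the authors’ estimator, which among other things allows for sector-specific matching efficiencies): hires come out of a matching function, and the index is the fraction of hires lost relative to a planner who reallocates job seekers across sectors.

```python
import numpy as np

# Stylized mismatch index in the spirit of Sahin, Song, Topa, and Violante.
# All numbers are invented for illustration.

alpha = 0.5  # matching-function elasticity on vacancies (assumed)

u = np.array([400.0, 300.0, 200.0, 100.0])  # unemployed by sector (thousands)
v = np.array([50.0, 150.0, 200.0, 100.0])   # vacancies by sector (thousands)

def total_hires(u, v, alpha):
    # Cobb-Douglas matching function with identical efficiency across sectors
    return np.sum(v**alpha * u**(1 - alpha))

# Planner's allocation: with identical efficiencies it is optimal to equalize
# tightness v_i/u_i, i.e. to allocate job seekers in proportion to vacancies.
u_star = u.sum() * v / v.sum()

mismatch_index = 1 - total_hires(u, v, alpha) / total_hires(u_star, v, alpha)
print(f"Fraction of hires lost to mismatch: {mismatch_index:.1%}")  # ~7.6% here
```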

Again, the Great Recession arose from a confluence of supply and demand problems.

There is a new paper (pdf) by Nicola Gennaioli and Hans-Joachim Voth, forthcoming in The Review of Economic Studies:

Powerful, centralized states controlling a large share of national income only begin to appear in Europe after 1500. We build a model that explains their emergence in response to the increasing importance of money for military success. When fiscal resources are not crucial for winning wars, the threat of external conflict stifles state building. As finance becomes critical, internally cohesive states invest in state capacity while divided states rationally drop out of the competition, causing divergence. We emphasize the role of the “Military Revolution”, a sequence of technological innovations that transformed armed conflict. Using data from 374 battles, we investigate empirically both the importance of money for military success and patterns of state building in early modern Europe. The evidence is consistent with the predictions of our model.

The pointer is from Mark Koyama.

It is by Eva Vivalt and is called “How Much Can We Generalize from Impact Evaluations?” (pdf).  The abstract is here:

Impact evaluations aim to predict the future, but they are rooted in particular contexts and results may not generalize across settings. I founded an organization to systematically collect and synthesize impact evaluation results on a wide variety of interventions in development. These data allow me to answer this and other questions across a wide variety of interventions. I examine whether results predict each other and whether variance in results can be explained by program characteristics, such as who is implementing them, where they are being implemented, the scale of the program, and what methods are used.  I find that when regressing an estimate on the hierarchical Bayesian meta-analysis result formed from all other studies on the same intervention-outcome combination, the result is significant with a coefficient of 0.6-0.7, though the R-squared is very low.  The program implementer is the main source of heterogeneity in results, with government-implemented programs faring worse than and being poorly predicted by the smaller studies typically implemented by academic/NGO research teams, even controlling for sample size.  I then turn to examine specification searching and publication bias, issues which could affect generalizability and are also important for research credibility.  I demonstrate that these biases are quite small; nevertheless, to address them, I discuss a mathematical correction that could be applied before showing that randomized control trials (RCTs) are less prone to this type of bias and exploiting them as a robustness check.

Eva is on the job market from Berkeley this year; her home page is here.  Here is her paper “Peacekeepers Help, Governments Hinder” (pdf).  Here is her extended bio.
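To get intuition for that 0.6-0.7 coefficient and the low R-squared, here is a toy simulation of the leave-one-out exercise (all numbers invented, and a simple leave-one-out average stands in for her hierarchical Bayesian meta-analysis estimate):

```python
import numpy as np

# Toy version of "regress each study's estimate on the pooled result from
# all other studies of the same intervention-outcome." Simulated data only.

rng = np.random.default_rng(0)

n_topics, studies_per_topic = 200, 6
tau = 1.0    # spread of true effects across intervention-outcomes (assumed)
sigma = 1.5  # sampling noise within a topic (assumed)

true_effects = rng.normal(0, tau, n_topics)
estimates = true_effects[:, None] + rng.normal(0, sigma, (n_topics, studies_per_topic))

# Leave-one-out mean of the other studies on the same topic
loo_mean = (estimates.sum(axis=1, keepdims=True) - estimates) / (studies_per_topic - 1)

x, y = loo_mean.ravel(), estimates.ravel()
slope = np.cov(x, y)[0, 1] / np.var(x)
r2 = np.corrcoef(x, y)[0, 1] ** 2
print(f"slope = {slope:.2f}, R^2 = {r2:.2f}")
# With noisy individual studies the slope falls below 1 and the R^2 is far
# from 1 -- qualitatively the pattern the abstract reports.
```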

The drunk utilitarian

by on October 28, 2014 at 2:36 am in Data Source, Food and Drink, Philosophy | Permalink

Here is a new paper by Aaron A. Duke and Laurent Bègue:

The hypothetical moral dilemma known as the trolley problem has become a methodological cornerstone in the psychological study of moral reasoning and yet, there remains considerable debate as to the meaning of utilitarian responding in these scenarios. It is unclear whether utilitarian responding results primarily from increased deliberative reasoning capacity or from decreased aversion to harming others. In order to clarify this question, we conducted two field studies to examine the effects of alcohol intoxication on utilitarian responding. Alcohol holds promise in clarifying the above debate because it impairs both social cognition (i.e., empathy) and higher-order executive functioning. Hence, the direction of the association between alcohol and utilitarian vs. non-utilitarian responding should inform the relative importance of both deliberative and social processing systems in influencing utilitarian preference. In two field studies with a combined sample of 103 men and women recruited at two bars in Grenoble, France, participants were presented with a moral dilemma assessing their willingness to sacrifice one life to save five others. Participants’ blood alcohol concentrations were found to positively correlate with utilitarian preferences [emphasis added] (r = .31, p < .001), suggesting a stronger role for impaired social cognition than intact deliberative reasoning in predicting utilitarian responses in the trolley dilemma. Implications for Greene’s dual-process model of moral reasoning are discussed.

The gated version is here.  The original pointer is from SteveStuartWilliams.

I’ve long wanted to read a paper on this topic and I just ran across a 2011 essay in the American Sociological Review, by Delhey, Newton, and Welzel.  Most papers on trust work with general questionnaire responses, but those queries often conflate whether you trust the people you know, or the people who surround you, with whether you trust your government and other larger social institutions.  You can imagine for instance that a country could have strong interpersonal trust at the micro level but also lots of cynicism about its establishment power structures.

The innovation of this paper is to compare micro trust measures with macro trust measures and see where there are big differences.  Not surprisingly, the most trusting countries, such as Sweden, Norway, and Switzerland, score high on both the micro and macro measures of trust.

The countries where asking the macro question makes the biggest difference in overall trust rank are South Korea (falls 18 places when macro considerations are considered explicitly), Thailand (falls 17 places), and China and Romania.  Argentina, Poland, and Slovenia gain the most in their relative trust rankings when the radius of trust is brought into play.  In general, when we account explicitly for the macro governance dimension, Asian countries decline in the trust rankings and Latin countries go up in the trust rankings by some modest amount.

Sentences to ponder

by on October 26, 2014 at 3:54 pm in Data Source, Religion, Uncategorized | Permalink

In “A More Perfect Union,” Mr. DuBois downloaded 19 million profiles from 21 online dating sites. He then wrote software to sort them by ZIP code, and determine the words most frequently used in each location. In the resulting maps, the top-ranked words replace city names. New York is “Now.” Atlanta is “God.”

That is from Steve Lohr at The New York Times.

Loren Adler and Adam Rosenberg report:

…the disproportionate role played by prescription drug spending (or Part D) has seemingly escaped notice. Despite constituting barely more than 10 percent of Medicare spending, our analysis shows that Part D has accounted for over 60 percent of the slowdown in Medicare benefits since 2011 (beyond the sequestration contained in the 2011 Budget Control Act).

Through April of this year, the last time the Congressional Budget Office (CBO) released detailed estimates of Medicare spending, CBO has lowered its projections of total spending on Medicare benefits from 2012 through 2021 by $370 billion, excluding sequestration savings. The $225 billion of that decline accounted for by Part D represents an astounding 24 percent of Part D spending. (By starting in 2011, this analysis excludes the direct impact of various spending reductions in the Affordable Care Act (ACA), although it could still reflect some ACA savings to the extent that the Medicare reforms have controlled costs better than originally anticipated.) Additionally, sequestration is responsible for $75 billion of reduced spending, and increased recoveries of improper payments amount to $85 billion, bringing the total ten-year Medicare savings to $530 billion.

The full piece is here, via Arnold Kling.
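A quick check of the quoted arithmetic, using only the figures given above:

```python
# Figures from the quoted passage, in $ billions
part_d_decline = 225
projection_decline = 370  # reduction in projected 2012-2021 benefits, ex-sequestration
sequestration = 75
improper_payment_recoveries = 85

print(f"Part D share of the slowdown: {part_d_decline / projection_decline:.0%}")  # ~61%, i.e. "over 60 percent"
print(f"Total ten-year savings: ${projection_decline + sequestration + improper_payment_recoveries} billion")  # $530 billion
```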

There is a new paper from Andrew M. Francis and Hugo M. Mialon:

In this paper, we evaluate the association between wedding spending and marriage duration using data from a survey of over 3,000 ever-married persons in the United States. Controlling for a number of demographic and relationship characteristics, we find evidence that marriage duration is inversely associated with spending on the engagement ring and wedding ceremony.

What is the mechanism?  Are marriages that require costly signals and large financial commitments more likely to be fragile?  Or, to put forward a politically incorrect interpretation, do the high expenditures indicate the wife has too much bargaining power in the relationship?  That hardly seems like a plausible explanation.  By the way, marriages with a large number of wedding attendees are likely to last longer, as are marriages followed by honeymoons.  Those correlations are easier to understand.

This piece is, by a factor of more than five, the most frequently downloaded SSRN paper over the last two months.

There is a new NBER paper by Campbell R. Harvey, Yan Liu, and Heqing Zhu, and it is a startler though perhaps not a surprise:

Hundreds of papers and hundreds of factors attempt to explain the cross-section of expected returns. Given this extensive data mining, it does not make any economic or statistical sense to use the usual significance criteria for a newly discovered factor, e.g., a t-ratio greater than 2.0. However, what hurdle should be used for current research? Our paper introduces a multiple testing framework and provides a time series of historical significance cutoffs from the first empirical tests in 1967 to today. Our new method allows for correlation among the tests as well as missing data. We also project forward 20 years assuming the rate of factor production remains similar to the experience of the last few years. The estimation of our model suggests that a newly discovered factor needs to clear a much higher hurdle, with a t-ratio greater than 3.0. Echoing a recent disturbing conclusion in the medical literature, we argue that most claimed research findings in financial economics are likely false.

The emphasis is added by me.  There are ungated versions of the paper here.
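A quick simulation shows why a t-ratio of 2.0 is too low a bar once hundreds of factors have been tried. This is a sketch of the bare multiple-testing point only, not the authors’ framework, which also handles correlated tests and missing data:

```python
import numpy as np

# Mine 300 pure-noise "factors" over 40 years of monthly returns and ask how
# often the best one clears the conventional t > 2 hurdle by luck alone.

rng = np.random.default_rng(42)
n_months, n_factors, n_sims = 480, 300, 500

exceed_2 = 0
for _ in range(n_sims):
    returns = rng.normal(0.0, 1.0, (n_factors, n_months))  # true mean return is zero
    t_stats = returns.mean(axis=1) / (returns.std(axis=1, ddof=1) / np.sqrt(n_months))
    exceed_2 += np.abs(t_stats).max() > 2.0

print(f"Simulations where some useless factor has |t| > 2: {exceed_2 / n_sims:.0%}")
# Essentially 100% -- hence the case for a hurdle closer to t = 3, and for
# explicit multiple-testing corrections.
```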

For the pointer I thank John Eckstein.

This is from Larry Summers and Lant Pritchett:

…knowing the current growth rate only modestly improves the prediction of future growth rates over just guessing it will be the (future realized) world average.  The R-squared of decade-ahead predictions of decade growth varies from 0.056 (for the most recent decade) to 0.13.  Past growth is just not that informative about future growth and its predictive ability is generally lower over longer horizons.

The main point of this paper is to argue that Chinese growth rates will become much lower, perhaps in the near future, here is a summary of that point from Quartz:

Summers and Pritchett’s calculations, using global historical trends, suggest China will grow an average of only 3.9% a year for the next two decades. And though it’s certainly possible China will defy historical trends, they argue that looming changes to its authoritarian system increase the likelihood of an even sharper slowdown.

The piece, “Asiaphoria Meets Regression Toward the Mean,” is one of the best and most important economics papers I have seen all year.  There is an ungated version here (pdf).  I liked this sentence from the piece:

Table 5 shows that whether or not China and India will maintain their current growth or be subject to regression to the global mean growth rate is a $42 trillion dollar question.
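The order of magnitude is easy to see with back-of-the-envelope compounding (my illustrative numbers, not the paper’s Table 5):

```python
# Two decades of continued rapid growth vs. regression to the world mean.
# Baseline and growth rates below are rough assumptions for illustration.

base_gdp = 13.0  # combined China + India GDP, $ trillions, circa 2014 (approximate)
years = 20

boom = base_gdp * 1.08**years            # growth continues near recent rates (~8%, assumed)
mean_reversion = base_gdp * 1.02**years  # reversion to a ~2% world-average rate (assumed)

print(f"Continued boom:  ${boom:.0f} trillion")
print(f"Mean reversion:  ${mean_reversion:.0f} trillion")
print(f"Gap:             ${boom - mean_reversion:.0f} trillion")
# On assumptions like these the gap lands in the tens of trillions, the
# order of magnitude behind the quoted sentence.
```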

And don’t forget this:

…nearly every country that experienced a large democratic transition after a period of above-average growth…experienced a sharp deceleration in growth in the 10 years following the democratizing transition.

As Arnold Kling would say, have a nice day.

Germany fact of the day

by on October 18, 2014 at 3:03 am in Data Source, History, Uncategorized | Permalink

From 1973 to 1985, German inflation was above two percent a year most of the time, sometimes well above that.  In 1973 it hit eight percent, and in the early eighties it exceeded six percent a year.  Source here (pdf), see p.6.

From 1951 to 1973, the Germans seemed happy with roughly the same inflation rate as Americans had.  Source here (pdf), see p.9, and also p.13, passim.  In the early 1970s, the rate averaged almost seven percent a year for a few years (p.15).  It is fine to note the role of oil shocks here, and of Bretton Woods in the earlier period, but Germans still tolerated the higher inflation rates.  They expected the alternatives would be worse, and they were probably right.

The claim that the current German dislike of inflation dates back to unique memories of Weimar hyperinflation is dubious.  Rightly or wrongly, today’s Germans associate high rates of inflation with wealth transfers away from Germany and toward other nations.  More broadly, Germany is a more flexible country than outsiders often think, not always for the better, of course.

I have supported the various QEs from the beginning, while seeing them as limited in their efficacy.  At the time, and still, I feared deflationary pressures more than high inflation.  Still, recently the question has arisen whether those QEs boosted the risk of high inflation.  Ashok Rao looks at options data to pull out the best answer I have seen so far:

…did the risk of high inflation increase after the Fed engaged in QE2? (Note this establishes a correlation, not causation)…

And you see two very interesting trends: the probability of high inflation (that above 6%, which is the largest traded strike) sharply increased over the latter half of 2010 and early 2011, the time period over which the effects of QE2 were priced in. This is a general trend across all maturities. While the 3-, 5-, and 10-year options follow a similar path afterwards, the 1-year cap is much more volatile (largely because immediate sentiments are more acute). Still, you see the probability of high inflation pick up through 2012, as QE3 is expanded.

The takeaway message from this is hard to parse. This market didn’t exist in the United States before 2008, and wasn’t liquid till a bit after that, so it’s tough to compare this with normal times. While the sharp increase in the probability of high inflation would seem to corroborate the Hoover Institution letter, that wouldn’t mean much if it simply implied a return to normalcy. That’s just a question we’ll have to leave for a later day.

What about the probability of deflation? Well, the interesting point is that for the three longer-maturity options, the probability of high inflation and the probability of deflation were increasing at the same time. This was a time of relatively anchored 5-year implied inflation, but the underlying dynamics were much more explosive, as can be seen in the above charts.

There are some very useful pictures in the post, and do note the variety of caveats which Ashok wisely (and characteristically) offers.  He notes also that for the United States deflationary risk was never seen as very likely, but the QEs lowered that risk even further.
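For readers wondering how option prices become an inflation probability: the risk-neutral chance that inflation exceeds a strike is approximately the (negative) slope of cap prices in the strike, i.e. the price of a digital option. A minimal sketch with made-up prices, not Ashok’s data, and ignoring discounting and risk premia:

```python
# Hypothetical prices of 1-year zero-coupon inflation caps, per $1 of notional
cap_prices = {0.04: 0.0060, 0.05: 0.0025, 0.06: 0.0010}  # strike -> price

# Digital approximation: P(inflation > K) ~= -dC/dK, estimated from the
# price difference of caps at adjacent strikes.
k1, k2 = 0.05, 0.06
prob_above = (cap_prices[k1] - cap_prices[k2]) / (k2 - k1)
print(f"Approximate risk-neutral P(inflation > ~{(k1 + k2) / 2:.1%}): {prob_above:.0%}")  # 15% on these numbers
```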

Inequality and parenting style

by on October 15, 2014 at 1:50 pm in Data Source, Economics | Permalink

Greg Mankiw refers us to this graph (there is further explanation here), which of course can be interpreted in a variety of ways, with causation running either way or perhaps not at all:

[Graph: inequality and parenting style]

He has a new paper (pdf) on this topic, with Jorda and Schularick, based on data from seventeen advanced economies since 1870.  In an email he summarizes the main results as follows:

1. Mortgage lending was about 1/3 of bank balance sheets 100 years ago, but in the postwar era it has risen to 2/3, and rapidly so in recent decades.

2. Credit buildup is predictive of financial crisis events, but in the postwar era it is mortgage lending that is the strongest predictor of this outcome.

3. Credit buildup in expansions is predictive of deeper recessions, but in the postwar era it is mortgage lending that is the strongest predictor of this outcome as well.

Here is VoxEU coverage of the work.  On a related topic, here is a new paper by Rognlie, Shleifer, and Simsek (pdf), on the hangover theory of investment, part of which is applied to real estate.  It has some Austrian overtones but the main argument is combined with the zero lower bound idea as well.