Month: October 2011

A simple approach to macroeconomic theorizing

I can’t say it is guaranteed to work, but I give it a high “p”:

1. Take the macroeconomic theory you hold and stick it into a box.

2. Take the major competing macroeconomic theory, the one you dislike, taking care that you have selected an approach endorsed by high-IQ researchers.  If you dislike those researchers too, that does not disqualify the theory, quite the contrary.

3. Stick theory #2 into the same box.

4. Average the two theories.

5. Pull the average out of the box, and call it your new theory.

How many times should you apply this method?  At least once, I say.
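One toy way to read the procedure literally, and this is my reading rather than anything in the post, is as simple forecast averaging: treat each theory as a model that produces a prediction and split the difference. A minimal sketch, with made-up numbers and an arbitrary 50/50 weighting:

```python
# Toy sketch of steps 1-5: average two competing "theories," here reduced to
# two forecasting functions with made-up outputs (weights are an assumption).

def theory_you_hold(data):
    return 2.0   # e.g., predicted inflation under your preferred theory

def theory_you_dislike(data):
    return 4.0   # the competing theory's prediction

def averaged_theory(data, weight=0.5):
    # Step 4: average the two theories inside the "box."
    return weight * theory_you_hold(data) + (1 - weight) * theory_you_dislike(data)

print(averaged_theory(data=None))  # 3.0, the prediction of your "new theory"
```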

I am indebted to Hal Varian for a useful conversation on related topics.

A model of political corruption

Lessig takes on the model of lobbying as “legislative subsidy” developed by political scientist Richard Hall and economist Alan Deardorff as an alternative to the naive lobbying-as-bribe model. Legislators come to Washington passionate about several issues. Quickly, though, they come to depend on the economy of influence for help in advancing an agenda. They need the policy expertise, connections, public-relations machine, and all the rest that lobbyists can offer. Since this legislative subsidy is not uniformly available, the people’s representatives find themselves devoting more of their time to those aspects of their agenda that moneyed interests also support. No one is bribed, but the political process is corrupted.

That is from Matt Yglesias.

Matt Rognlie does not like the LM curve (nor does David Romer)

Read the whole post, here is one excerpt:

…many countries now operate under an inflation targeting framework, in which responding to inflation is the key feature of the policy rule. In this environment, depicting policy as a relationship between “Y” and “i” misses what’s really going on—better to abandon the upward-sloping LM curve altogether and use a simple horizontal line to depict the current policy rate. I’m not alone in this sentiment.

David Romer wrote an entire piece for the JEP in 2000 called Keynesian Macroeconomics without the LM Curve. (As the title suggests, he shares my feelings on the matter.) Tyler Cowen puts this at #6 on his list of grievances. It’s a pretty obvious point—yet, for reasons I don’t entirely understand, we still print thousands of undergraduate textbooks a year with LM front and center.

Matt is an economics Ph.D. student at MIT and an expert in macroeconomics.

Here is a good quotation from the above-cited David Romer piece (Romer, by the way, is not a member of my tribe; in fact he is a tenured professor at Berkeley):

In short, recent developments work to the disadvantage of IS-LM. This observation suggests that it is time to revisit the question of whether IS-LM is the best choice as the basic model of short-run fluctuations we teach our undergraduates and use as a starting point for policy analysis. The thesis of this paper is that it is not.

A simple theory of regulations, new and old

Q = min(#laws, #regulators)

The number of laws grows rapidly, yet the number of regulators grows relatively slowly.  There are always more laws than there are regulators to enforce them, and thus the number of regulators is the binding constraint on Q, the quantity of regulation actually enforced.

The regulators face pressure to enforce the most recently issued directives, if only to avoid being fired or to limit bad publicity.  On any given day, that is what they are told to do.  Issuing new regulations therefore displaces the enforcement of old ones.

If the best or most fundamental regulations are the ones issued first, over time the average quality of regulation will decline.
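A minimal sketch of that dynamic, under two assumptions of my own making, namely that a law's quality falls with its issue order and that regulators always enforce the newest laws first:

```python
# Toy model: Q = min(#laws, #regulators), and enforcement effort goes to the
# most recently issued laws. Quality 1/(i+1) encodes the assumption that the
# earliest, most fundamental laws are also the best ones.

def average_enforced_quality(n_laws, n_regulators):
    qualities = [1.0 / (i + 1) for i in range(n_laws)]
    enforced = min(n_laws, n_regulators)        # the binding constraint
    newest_enforced = qualities[-enforced:]     # pressure favors the newest directives
    return sum(newest_enforced) / enforced

for n_laws in (50, 200, 1000):                  # laws accumulate; regulators do not
    print(n_laws, round(average_enforced_quality(n_laws, n_regulators=50), 4))
# Average enforced quality falls as the stock of laws outgrows the regulators.
```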

Critics from both sides will claim, at the same time, that “regulation is too high,” and “regulation is too low.”  They will both be describing aspects of the same proverbial elephant.

The ability of a new regulation to pass a (partial equilibrium) cost-benefit test does not mean said regulation is a good idea, at least not without adjusting for the “crowding out” effect.

Hiring more regulators will address the dilemma only temporarily (assuming that the number of regulators cannot keep up with the number of laws).  At first more regulations will be enforced, but as time passes the quality gap between the enforced regulations and the most important regulations will again open up.

If you think you have just passed some especially important regulations, slow down the pace of future regulations.

If you are pro-deregulation, slowing down the pace of forthcoming regulations won't much help your cause.  It will, however, shift attention and labor back to slightly more important regulations.  It is possible that the net regulatory impact goes up.

Sunset provisions may induce regulators to treat, enforce, and monitor the old regulations more like the new ones, and vice versa.  What other institutions might contribute toward this end?

Complaints Choir of Singapore

They sing complaints about their city-state, here is one excerpt:

Stray cats get into noisy affairs

At night my neighbor makes weird animal sounds

People put on fake accents to sound posh

And queue up 3 hours for donuts

Will I ever live till eighty five to collect my CPF?

It sounds like a terrible place:

Old National Library was replaced by an ugly tunnel

Singaporean men can’t take independent women

People blow their nose into the swimming pool

And fall asleep on my shoulder in the train

Full lyrics and explanation are here.  Yet foreigners are now legally banned from singing the complaints.  Here is a video of the Choir, definitely recommended; it is the best video I've seen this year, and do watch it through to the end.

For the pointer I thank Chug Roberts.

Scott Sumner on IS-LM

I favor an ad hoc approach to models–use the simplest model that gets at the issues you are interested in.  Start with a simple economy with money and goods, no bonds.  The supply and demand for money determines the price level and/or NGDP.  That’s most of human history.  Add wage price stickiness and you get demand-side business cycles.  Add interest rates and you get . . . well it’s not clear what you get.  Interest rates almost certainly have an influence on the demand for money.  Do they play a major role in the transmission mechanism between money and aggregate demand?  Hard to say.  Short term Treasury yields probably don’t have much impact.  Other asset prices might, but then there is generally no zero bound for other asset prices.  On the other hand monetary policy often operates through purchase of short term T-securities.  Bottom line, it’s complicated.

There are many excellent parts; read the whole thing, as I won't excerpt the best part.  And there is also this:

Friedman thought it was more useful to take a partial equilibrium approach to macro.  By doing so he was able to avoid the mistakes of those who looked at the Depression from an IS-LM perspective.  He was interested in how monetary policy determined NGDP, and then used a separate Phillips Curve approach with a natural rate to explain output fluctuations, to partition NGDP into RGDP and P.  He viewed interest rate movements as a sort of epiphenomenon.  Monetary policy affected rates in a complex way, which made interest rates an unreliable indicator of the stance of monetary policy.
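One compact way to express "the supply and demand for money determines the price level and/or NGDP" (my gloss, not Sumner's or Friedman's) is the equation of exchange, MV = PY, where M is the money stock, V its velocity (the flip side of money demand), P the price level, and Y real output, so that PY is nominal GDP.  Add sticky wages and a fall in MV shows up partly in Y rather than only in P, which is the demand-side business cycle Sumner describes, all without interest rates entering the story.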

Silvestre Pantaleón, English trailer

That is a forthcoming Jonathan Amith documentary on Nahua culture in the Rio Balsas region of Mexico.  The trailer video is here; it is set in San Agustin Oapan, where I did the field work for my book Markets and Cultural Voices.  Recently I saw the film at National Geographic and loved it; admittedly, it is not for all tastes.  I'll let you all know when a DVD becomes available.

A brief description of the film is here.

The baby-sitting co-op story, examined in light of the original source

From Matthew Klein, here is a lengthy post about the real story behind the baby-sitting co-op example which has become so popular; think of it as analogous to Coase’s take on the lighthouse in economics.  It is not easy to excerpt, so I recommend that you read the whole thing.  “Money” does still matter, but it shows how many unconsidered aspects of the story there have been, ranging from why the shortage of exchange media developed in the first place (“contractionary fiscal policy”) to why the experts’ inflationary “solution” didn’t work out very well.  For one thing, the system ended up with too much scrip:

The price of baby sitting is constitutionally pegged at one unit of scrip for every one-half hour of baby sitting. Hence, this system of price controls means the inflationary pressure does not drive up the scrip-price of baby sitting, inflation is suppressed, and shortages are found.

[…]

Now there is great difficulty rounding up sitters for all those who want to go out. This is a classic sort of inflationary pressure—too much money (scrip) chasing too few goods (sitters).
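In other words (my formalization, not Klein's or the original document's): with the scrip price pegged at one unit per half hour, extra scrip raises desired nights out and lowers the willingness to sit, so demand for sitting exceeds supply at the pegged price, D(p) > S(p), and the gap shows up as a queue for sitters rather than as a higher scrip price.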

Here is a link to the original document on the history of the co-op.

Personalized Medicine

Patient X was rushed to the hospital for emergency surgery. As she entered the hospital she said to the anesthesiologist, “You may not want to use suxamethonium on me.”

“Have you had a previous reaction?” inquired the anesthesiologist.

“No.”

“Ah, a family member must have had a reaction.”

“No.”

“Why then are you concerned about this drug?”

“I’ve had a good portion of my genome sequenced,” the patient replied, “and I found that I have a genetic variation in the enzyme that breaks down suxamethonium and am part of the 5% of patients who respond unusually to this drug. I thought you should be aware of this information.”

The flabbergasted anesthesiologist wondered how long it would be before more of her patients came prepared with their own genetic code.

I made up the details of the conversation above, but otherwise the story is true. The patient was a customer of 23andme, a service that for around $200 will give you information on about half a million sites on your genome, how you differ from other people at those sites, and which of your variations are associated with various diseases, behaviors and capabilities.

The costs of sequencing are falling so rapidly that it will soon make sense for everyone to carry their entire genetic code with them on a USB drive (23andme only identifies part of the code). In 2001 it cost Craig Venter $100,000,000 to sequence the first human genome (his own). Today, it costs just $16,000; in a few years, it will cost less than $1,000, a 100,000-fold decrease in costs in less than two decades!

That’s me from a piece called The FDA and Personalized Medicine, written to help launch a new blog from the Manhattan Institute, Medical Progress Today. I go on to argue that if we are to take advantage of the new possibilities for personalization, “we must move the FDA away from pre-market gatekeeping and towards post-market surveillance and information provision.”

David Henderson, Rita Numerof and Paul Howard all comment.

Putting the IS-LM debate in context

Here is a response from Paul Krugman on the topic.  Stephen Williamson’s post is too polemical for my tastes, but I find he nonetheless places this debate in useful perspective:

Generations of textbook writers found IS-LM a very convenient model to use in getting basic Keynesian ideas across to undergraduate students. However, frontier macroeconomic researchers did not take IS-LM seriously after the early 1970s. By about 1980, IS-LM had essentially disappeared from the top economics journals and from the top PhD programs in economics. But one could still find some version of IS-LM in undergraduate textbooks.

How is IS-LM used today? You do not see it in published macroeconomic research, as a framework for discussion among policymakers, or in PhD programs in economics. It is certainly not necessary to use it in teaching Keynesian economics to undergraduates. In the third edition of my intermediate macro textbook, you will not find an IS-LM model. I have found what I think are more straightforward and instructive ways to get Keynesian economics across, and to get it across in line with what modern Keynesian researchers actually do. For example, I do a version of a Keynesian coordination failure model that looks like what Roger Farmer did in the early 1990s, and an undergraduate version of a Woodford sticky-price model.

Williamson is correct.  There is more at the link, including some of the more polemical parts of the post.  An anonymous commentator adds:

The New Keynesian intermediate texts like Chad Jones’s have dispensed with IS-LM and replaced it with the 3-equation IS-PC-MR (monetary rule) model.
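For reference, the three-equation system the commentator has in mind looks roughly like this; the notation and lag conventions below are a stylized composite of the textbook treatments, not a quotation from any of them:

```latex
\begin{aligned}
\text{IS:} \quad & \tilde{Y}_t = \bar{a} - \bar{b}\,(R_t - \bar{r})
  && \text{output gap falls when the real rate is above neutral} \\
\text{PC:} \quad & \pi_t = \pi_{t-1} + \bar{\nu}\,\tilde{Y}_t
  && \text{inflation rises when output is above potential} \\
\text{MR:} \quad & R_t - \bar{r} = \bar{m}\,(\pi_t - \bar{\pi})
  && \text{the policy rule that replaces the LM curve}
\end{aligned}
```

The monetary rule, with the central bank setting the real rate in response to inflation, is exactly the piece that stands in for the old upward-sloping LM curve.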

Addendum: By the time the 1980s had rolled around, even Sir John Hicks had pretty much repudiated the IS-LM model; some partial detail is here.  Here is Brad DeLong on IS-LM in 2005; whatever the model is making predictions about, it is not the contemporary economy.