A Few Favorite Books from 2013

by Alex Tabarrok on December 22, 2013 at 8:33 am in Books, Economics, History, Law | Permalink

Tom Jackson asked me for a couple of best books for his year-end column. I don’t read as many books as Tyler, so consider these a few favorite social science books that I read in 2013.

In The Undercover Economist Strikes Back, Tim Harford brings his genius for storytelling and for explaining complex ideas to macroeconomics. Most popular economics books, like The Armchair Economist, Freakonomics, Predictably Irrational and Harford’s earlier The Undercover Economist, focus on microeconomics: markets, incentives, consumer and firm choices and so forth. Strikes Back is that much rarer beast, a popular guide to understanding inflation, unemployment, growth and economic crises, and it succeeds brilliantly. Mixing wonderful stories of economists with exciting lives (yes, there have been a few!) and very clear explanations of theories and policies, Strikes Back is both entertaining and enlightening.

Stuart Banner’s American Property is a book about property law, which sounds like an awfully dull topic. In the hands of Banner, however, it is a fascinating history of what we can own, how we can own it and why we can own it. Answers to these questions have changed as judges and lawmakers have grappled with new technologies and ways of life. Who owns fame? Was there a right to own one’s own image? Benjamin Franklin, whose face was used to hawk many products, would have scoffed at the idea, but after the invention of photography and the onset of what would later be called the paparazzi, thoughts began to change. In the early 1990s, Vanna White was awarded $403,000 because a letter-turning robot pictured in a Samsung advertisement was reminiscent of her image on Wheel of Fortune. American Property is a great read by a deep scholar who writes with flair and without jargon.

On June 3, 1980, shortly after the Soviet Union’s invasion of Afghanistan, the U.S. president’s national security adviser was woken at 2:30 am and told that Soviet submarines had launched 220 missiles at the United States. Shortly thereafter he was called again and told that 2,200 land-based missiles had also been launched. Bomber crews ran to their planes and started their engines, missile crews opened their safes, and the Pacific airborne command post took off to coordinate a counter-attack. Only when radar failed to reveal an imminent attack was it realized that this was a false alarm. Astoundingly, the message NORAD used to test its systems was a warning of a missile attack with only the numbers of missiles set to zero. A faulty computer chip had inserted 2s instead of zeroes. We were nearly brought to Armageddon by a glitch.

If that were the only revelation in Eric Schlosser’s frightening Command and Control it would be of vital importance, but in fact that story of near disaster occupies just one page of this 632-page book. The truth is that there have been hundreds of near disasters and glitches in nuclear command. Indeed, there have been so many covered-up accidents that it’s clear the US government has come much closer to detonating a nuclear weapon and killing US civilians than the Russians ever did. Thankfully, we have reduced our stockpile of nuclear weapons in recent years, but, as in so many other areas, we are also more subject to computers and their vulnerabilities as we make decisions at a faster, sometimes superhuman, pace. Command and control, Schlosser warns us, is an illusion. We are one black swan away from a great disaster, and if this is true of the US handling of nuclear weapons, how much more fearful should we be of the nuclear weapons held by North Korea, Pakistan or India?
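The faulty-chip story can be sketched in a few lines. This is a hedged, hypothetical illustration (the message format below is invented, not the real NORAD format) of how substituting a couple of digits turns a zeros-only test warning into something indistinguishable from a real attack report downstream:

```python
# Hypothetical test message: a genuine attack warning with all missile
# counts set to zero. (Invented format for illustration only.)
test_message = "WARNING SLBM COUNT 000 ICBM COUNT 0000"

def corrupt(msg: str, position: int, bad_char: str) -> str:
    """Simulate a faulty chip substituting one character in a message."""
    return msg[:position] + bad_char + msg[position + 1:]

# A faulty chip that occasionally emits "2" in place of "0" changes the
# counts while leaving the rest of the warning perfectly well-formed.
glitched = test_message
for pos in (19, 20, 34, 35):
    glitched = corrupt(glitched, pos, "2")

print(glitched)  # WARNING SLBM COUNT 220 ICBM COUNT 2200
```

The unsettling point is that the glitched message is syntactically valid: nothing about it signals "hardware fault" rather than "attack," which is why only an independent check (radar) caught it.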

celestus December 22, 2013 at 9:31 am

“…there have been so many covered-up accidents that it’s clear that the US government has come much closer to detonating a nuclear weapon and killing US civilians than the Russians ever did.”

Or that the Russians have been better at covering up accidents, which I suppose would be even more concerning.

Finch December 23, 2013 at 9:58 am

It’s pretty clear that we don’t know what happened in Russia. We have only a little information about what happened in the US. And, accidents aside, we have no idea what almost transpired intentionally.

Alex’s statement is, unusually for him, puffery.

Rahul December 22, 2013 at 10:23 am

“if this is true about the US handling of nuclear weapons how much more fearful should we be of the nuclear weapons held by North Korea, Pakistan or India? “

In the India-Pak situation I don’t think either nation considers a nuke a feasible strategic device. Our nuclear program was more like a victory arch to impress the home crowds & stir up some pre-election fervor. That, and a bureaucratic scientific establishment itching to grandstand in the absence of much other real achievement.

Doug December 22, 2013 at 11:02 am

I don’t see why computer glitches from 35 years ago should inspire fear today. Computers, both in hardware and software, have kind of gotten a little bit better since then. By any metric, modern computer systems are orders of magnitude more stable and reliable than their pre-1990 predecessors. Even disregarding the technological advances, the standards and practices of modern software development are aeons ahead of even fifteen years ago.

This is the equivalent of being scared to fly a Boeing 777 because you read about how often the Wright brothers crashed their prototypes.

Bill December 22, 2013 at 12:37 pm

And these systems are as safe from hackers and malware …

Willitts December 22, 2013 at 10:15 pm

Target’s payments system is designed to be accessed externally from thousands of retail stores.

NORAD’s was not. A closed system is always more secure than an open one. A closed system requires physical intrusion. The only vulnerability in that system was the president’s football, the most closely guarded one-time pad in the galaxy. And I doubt that even a stolen football and a water-boarded president could launch an attack. There have to be redundancies and checks.

Rahul December 23, 2013 at 7:01 am

Fair point. On the other side of the scale is testing. Open, widely used systems get bugs discovered & ironed out faster. Complex, proprietary systems must employ rigorous internal testing, & they do, yet as size & complexity scale up the battle against bugs gets increasingly harder to win.

The actual trigger logic is easy and unlikely to be buggy. The more insidious part is the weapons / launch logic & all its interactions.

Finch December 23, 2013 at 9:53 am

This is the “Normal Accidents” theory of complex systems. My impression is that, despite a brief period of popularity 30 years ago, modern safety thinkers do not generally believe this is how things work. Making the technology more capable generally makes things safer. Modern compiled software is easier to debug than assembler, not harder. That’s part of why we can make more interesting things with it.

At one point, on the heels of TMI and Challenger, people thought we were going to enter an era of more frequent technological accidents based on the theory you describe. But that didn’t really happen.

Axa December 23, 2013 at 1:53 am

This is not about computer glitches; it’s about nuclear weapons and people making decisions under stress.

Philo December 22, 2013 at 11:46 am

“I don’t read as many books as Tyler . . . .” This is not news!

AndrewL December 22, 2013 at 12:00 pm

The fact that there were so many glitches and yet no missiles were launched means that the system worked, the system being the integration of computer systems and human controllers. It’s not so frightening that the computer system has glitches, because the computers cannot launch the missiles. It’s more frightening that the nuclear command has gotten rather sloppy: http://www.lowellsun.com/todaysheadlines/ci_24771254/warnings-rot-u-s-nuclear-missiles-system

FC December 22, 2013 at 1:52 pm

Amazon shows that “Command and Control” was blurbed by no scholars of nuclear policy but was blurbed by Jonathan Franzen. This does nothing to inspire confidence.

Ray Lopez December 22, 2013 at 7:34 pm

Anybody worth their salt in nuclear policy, including SALT negotiators, is under an NDA, so they cannot tell you the really good stuff. This happens in my field too. If you only knew the stuff I know, it would blow your mind. For example, they are even closer to making humans immortal than you think. Some news of this leaked out a while ago; I sent the link to TC but he has not yet published it.

JSIS December 22, 2013 at 9:56 pm

“Pakistan or India”

This is a convenient western fiction. At the top levels, there is a much better understanding of each other’s thresholds. Their weapons are not mated with delivery systems. The close distance makes keeping track of each other’s moves much easier than between America and the Soviet Union. If there is a nuclear war between these two, it will be due to deliberate choice, not accident.

P.S.: for anyone thinking modern hardware is more stable, see “The Slow Winter.”

Steve Sailer December 22, 2013 at 9:57 pm

There are lots and lots of amazing stories from the Cold War.

Willitts December 22, 2013 at 10:09 pm

Nothing better confirms the concept of Mutually Assured Destruction saving the day. Given information of a massive attack, key personnel kept their cool and verified the attack before launching. They understood the consequences of a glitch.

The fact that the USSR likely had similar glitches and restrained themselves is also remarkable.


Comments on this entry are closed.
