Kris on Twitter asks that question. I have a few hypotheses, none confirmed by any hard data, other than my “lyin’ eyes”:
1. Twitter exists as a kind of parallel truth/falsehood mechanism, and it is encroaching on traditional academic processes, for better or worse.
2. Hypotheses blaming people or institutions for failures and misdeeds will be more popular on Twitter than in academia, but over time they are spreading in academia too, in part because of their popularity on Twitter. Blame makes for a more popular tweet.
3. The distribution of Twitter followers often follows a power law, and thus Twitter raises the influence of very well-known contributors. Twitter also raises the influence of the relatively busy, compared to say the 2009 world where blogs held more of that influence. Writing blog posts required more time than does issuing tweets.
4. I believe Twitter raises the relative influence of women. For one thing, women can coordinate with each other on Twitter more easily than they can in academic life across different universities.
5. Twitter can damage the career prospects of some of the more impulsive tweeting white males.
6. On Twitter it is easier to judge people by their (supposed) intentions than in academia, so many more people will be accused of acting and writing in bad faith.
7. On Twitter more people do in fact act in bad faith.
8. Hardly anyone looks better on Twitter, so that contributes to the polarization of many professions, especially economics and those professions linked to political issues. Top economists don’t seem so glamorous any more, not even in their areas of specialization.
9. Academic fields related to current events will rise in status and attention, and those topics will garner the power-law retweets. Right now that means political science most of all, but of course this will vary over time.
10. Twitter lowers the power of institutions more broadly, as institutions typically are bad at Twitter.
There is now a transcript and audio from the Holberg debate in Bergen, Norway, courtesy of the CWTeam. Here is their summary of the event:
This bonus episode features audio from the Holberg Debate in Bergen, Norway between Tyler and Slavoj Žižek held on December 7, 2019. They discuss the reasons Slavoj (still) considers himself a Communist, why he considers The Handmaid’s Tale “nostalgia for the present,” what he likes about Greta Thunberg, what Marx got right about the commodification of beliefs, his concerns about ecology and surveillance in communist states like China today, the reasons academia should maintain its ‘useless character,’ his beginnings as a Heideggerian, why he is distrustful of liberal optimism, the “Fukuyama dilemma” we face, the importance of “empty manners,” and more.
COWEN: You know the old joke, what’s the difference between a Communist and a Nazi? Tenure.
ŽIŽEK: You mean university tenure?
COWEN: Yes. It’s a joke, but the point is you don’t need Communism. You are much smarter than Communism.
I would describe the proceedings as “rollicking,” including the segment about “smoking the prick.”
I will be doing a Conversation with him, yes the Jason Furman. So what should I ask?
The Arthashastra, the science of wealth and politics, is one of the world’s oldest treatises on political economy. Written by Kautilya, legendary advisor to the Indian King Chandragupta Maurya (reign: 321–298 BCE), the Arthashastra has often been compared to Machiavelli’s The Prince and has been a touchstone in Indian political economy for well over a thousand years.
Vijay Kelkar and Ajay Shah, two long-time advisors to the Indian government, have written the new Arthashastra, In Service of the Republic: The Art and Science of Economic Policy. In Service doesn’t go into great detail on current policies in India (Joshi’s Long Road is the best recent overview); instead it distills timeless wisdom on the making of political economy.
When faced with a potential government intervention, it is useful to ask three key questions. Is there a market failure? Does the proposed intervention address the identified market failure? Do we have the ability to implement the proposed intervention?
Public policy failures are born of: (1) The information constraint; (2) The knowledge constraint; (3) The resource constraint; (4) The administrative constraint; and (5) The voter rationality constraint. These five problems interact, and jointly generate government failure of both kinds: pursuing the wrong objectives and failing on the objectives that have been established.
A government organization that is riven with corruption is not one which was unlucky to get a lot of corrupt people. It is one where the rules of the game facilitate corruption.
The competitive market process should force the exit of low-productivity firms. This does not happen when the low-productivity firms violate laws–e.g. a low productivity firm may emit pollution, while the high-productivity firm incurs the higher costs associated with the pollution control required in law….When enforcement capabilities, of laws or of taxes, are improved…production will shift from low-productivity firms to high-productivity firms. This reallocation will yield GDP growth, in and of itself.
There are two pillars of intervention in banking in India. On one hand, the state regulates banking. In addition, the Indian state produces banking services through the ownership of banks….There are conflicts between these two [pillars]. Regulation by the state may be indulgent towards its own entities….this calls for strong separation between the two pillars.
Kelkar and Shah are especially concerned with policy making in the Indian context of low state-capacity:
A policy pathway that is very successful in (say) Australia may not work in India as it is being placed in a very different setting. Envisioning how a given policy initiative will work in India requires deep knowledge of the local context.
If the fine for driving through a red light is Rs 10,000, there will be pervasive corruption. Jobs in the highway police will be sought after; large bribes will be paid to obtain these jobs. There will be an institutional collapse of the highway police. It is better to first start with a fine of Rs 100, and build state capacity.
(On that theme see also my paper with Rajagopalan, Premature Imitation.)
In Service of the Republic is the book that every policy maker and future policy maker should be given while being told, “before you do anything, read this!”
Addendum: I will be in India next week and after a visit to Agra and Hampi, I will be giving some talks at Ramaiah University in Bangalore and later in the month at the Indian School of Public Policy.
As Tyler argued last week, one of the most common analytical inaccuracies on Twitter is to blame the Fed for being too conservative with monetary policy over the last few years. I see this problem on both the left and the right. One version of the argument goes as follows:
This month’s unemployment rate is lower than last month’s unemployment rate. Thus, we could not have been at full employment last month.
Monetary policy should be less conservative. If only we had been more aggressive earlier, we could have reached where we are sooner and made millions of people better off.
All of this is wrong. To begin, full employment does not mean the lowest possible unemployment rate. We are at full employment when we are at the natural rate of unemployment, and as Milton Friedman wrote:
The ‘natural rate of unemployment’….is the level that would be ground out by the Walrasian system of general equilibrium equations, provided there is imbedded in them the actual structural characteristics of the labor and commodity markets, including market imperfections, stochastic variability in demands and supplies, the cost of gathering information about job vacancies and labor availabilities, the costs of mobility, and so on.
The natural rate can change over time, even in a sustained direction, as the structural characteristics of the economy change, as demand, supply, demographics, information and so forth change. Change does not mean disequilibrium. When the production of apples is bigger this year than last year we don’t jump to the conclusion that last year the apple market was out of equilibrium. Similarly, the fact that unemployment was lower this year than last year does not mean that we weren’t at full employment last year.
The point of Friedman’s 1968 piece was that monetary policy can’t do much to influence the natural or full employment rate. Thus, the second half of the argument also doesn’t follow. In other words, it doesn’t follow from the fact that unemployment is declining that monetary policy last year could have achieved this year’s unemployment rate last year. My children are taller this year than last year but that doesn’t mean I could have accelerated their growth by feeding them more last year.
Monetary policy can make a big difference in arresting a negative spiral of declining spending leading to declining income leading to declining spending….Keynes was right. Scott Sumner was also right to call for more aggressive monetary policy in 2008-2010. But that was a disequilibrium event, now long over. When children are starving, you can get them to grow faster by feeding them more, but don’t try using that rule in normal times. Today we are in normal times. The economy has been growing steadily for over a decade. We are not in a downward spiral and wages and prices are not stuck at 2008 levels. In fact, since the end of the recession a large majority of workers are in new jobs! Indeed, a good chunk of the labor force has retired since 2008 to be replaced by entirely new workers. Nothing sticky there.
Standard macro models do not imply that monetary policy can always lower unemployment. (I can’t believe I have to write that in 2020 but the great forgetting is well upon us). Indeed, the standard models, as Tyler discussed, are all about testing and deepening our understanding of the Friedman list, most notably “the cost of gathering information about job vacancies and labor availabilities.” Bottom line is that nobody ever said that we had to like the Walrasian equilibrium but like it or not, monetary policy can’t do much to change it.
John Cochrane, in a series of interesting observations on State Capacity Libertarianism, notes:
I don’t see just why nuclear power needs “state support,” rather than a clear workable set of safety regulations that are not excuses for anyone to stop any project.
Apart from the fact that our government created nuclear power at great expense and hurry, I would most of all cite the Price-Anderson Nuclear Indemnities Act of 1957. Here is Wikipedia:
The Act establishes a no fault insurance-type system in which the first approximately $12.6 billion (as of 2011) is industry-funded as described in the Act. Any claims above the $12.6 billion would be covered by a Congressional mandate to retroactively increase nuclear utility liability or would be covered by the federal government. At the time of the Act’s passing, it was considered necessary as an incentive for the private production of nuclear power — this was because electric utilities viewed the available liability coverage (only $60 million) as inadequate.
I am less clear on where the insurance industry stands on this matter today, but in general American society has become far more litigious, and it is much harder to build things, and risk-aversion and infrastructure-aversion have risen dramatically. Furthermore:
- Jurisdiction is automatically transferred to federal courts no matter where the accident occurred.
- All claims from the same incident are consolidated into one Federal court, which is responsible for prioritizing payouts and sharing funds equitably should there be a shortfall.
- Companies are expressly forbidden to defend any action for damages on the grounds that an incident was not their fault.
- An open-ended time limit is applied, which allows claimants three years to file a claim starting from the time they discover damage.
- Individuals are not allowed to claim punitive damages against companies.
So the odds are that without a Price-Anderson Act America’s nuclear industry would have shut down some time ago, with no real chance of a return.
More generally, I am not sure which level or kind of liability should be associated with “the free market,” especially when the risks in question are small, arguably ambiguous, but in the negative scenarios involve very very high costs. Which is then “the market formula”? That question does not make much sense to me, so it seems to me that, details of the Price-Anderson Act aside, all scenarios are by definition somewhat governmental.
Germany’s closing of nuclear power stations after Fukushima cost billions of dollars and killed thousands of people due to more air pollution. Here’s Stephen Jarvis, Olivier Deschenes and Akshaya Jha on The Private and External Costs of Germany’s Nuclear Phase-Out:
Following the Fukushima disaster in 2011, German authorities made the unprecedented decision to: (1) immediately shut down almost half of the country’s nuclear power plants and (2) shut down all of the remaining nuclear power plants by 2022. We quantify the full extent of the economic and environmental costs of this decision. Our analysis indicates that the phase-out of nuclear power comes with an annual cost to Germany of roughly $12 billion per year. Over 70% of this cost is due to the 1,100 excess deaths per year resulting from the local air pollution emitted by the coal-fired power plants operating in place of the shutdown nuclear plants. Our estimated costs of the nuclear phase-out far exceed the right-tail estimates of the benefits from the phase-out due to reductions in nuclear accident risk and waste disposal costs.
Moreover, we find that the phase-out resulted in substantial increases in the electricity prices paid by consumers. One might thus expect German citizens to strongly oppose the phase-out policy both because of the air pollution costs and increases in electricity prices imposed upon them as a result of the policy. On the contrary, the nuclear phase-out still has widespread support, with more than 81% in favor of it in a 2015 survey.
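As a rough sanity check on the quoted figures: if over 70% of the roughly $12 billion annual cost reflects the 1,100 excess deaths per year, the study is implicitly valuing each death at around $7-8 million, which is in the range of standard value-of-statistical-life figures used by U.S. regulatory agencies. This framing of the per-death figure is my own arithmetic, not the authors':

```python
# Back-of-the-envelope check using only the figures quoted in the abstract.
# Interpreting the per-death figure as an implied value of a statistical
# life is my framing, not the authors'.
total_annual_cost = 12e9   # ~$12 billion per year (quoted)
mortality_share = 0.70     # "over 70%" attributed to excess deaths (quoted)
excess_deaths = 1100       # per year (quoted)

implied_cost_per_death = mortality_share * total_annual_cost / excess_deaths
print(f"Implied cost per excess death: ${implied_cost_per_death / 1e6:.1f} million")
```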
If even the Germans are against nuclear and are also turning against wind power, the options for dealing with climate change are shrinking.
Hat tip: Erik Brynjolfsson.
Slow labor market recovery does not have to mean the core fix is or was nominal in nature, even if the original negative shock was nominal:
Recent critiques have demonstrated that existing attempts to account for the unemployment volatility puzzle of search models are inconsistent with the procyclicality of the opportunity cost of employment, the cyclicality of wages, and the volatility of risk-free rates. We propose a model that is immune to these critiques and solves this puzzle by allowing for preferences that generate time-varying risk over the cycle, and so account for observed asset pricing fluctuations, and for human capital accumulation on the job, consistent with existing estimates of returns to labor market experience. Our model reproduces the observed fluctuations in unemployment because hiring a worker is a risky investment with long-duration surplus flows. Intuitively, since the price of risk in our model sharply increases in recessions as observed in the data, the benefit from creating new matches greatly drops, leading to a large decline in job vacancies and an increase in unemployment of the same magnitude as in the data.
That is from a new NBER working paper by Patrick J. Kehoe, Pierlauro Lopez, Virgiliu Midrigan, and Elena Pastorino. Essentially it is a story of real stickiness, institutional failure yes but not primarily nominal in nature.
Perhaps more explicitly yet, from the new AER Macro journal, by Sylvain Leduc and Zheng Liu:
We show that cyclical fluctuations in search and recruiting intensity are quantitatively important for explaining the weak job recovery from the Great Recession. We demonstrate this result using an estimated labor search model that features endogenous search and recruiting intensity. Since the textbook model with free entry implies constant recruiting intensity, we introduce a cost of vacancy creation, so that firms respond to aggregate shocks by adjusting both vacancies and recruiting intensity. Fluctuations in search and recruiting intensity driven by shocks to productivity and the discount factor help bridge the gap between the actual and model-predicted job-filling rate.
Again, a form of real stickiness more than nominal stickiness. The claim here is not that the market is doing a perfect job, or that the Great Depression was all about a big holiday, or something about video games that you might see mocked on Twitter. There is a very real and non-Pareto optimal coordination problem. Still, this model does not suggest that “lower interest rates” or a higher price inflation target from the Fed, say circa 2015, would have led to a quicker labor market recovery.
That holds even though a huge negative blow to NGDP was a major part of the original shock (and it could have been countered more effectively by the Fed at the time).
I am not sure there is any analytical inaccuracy I see on Twitter more often than this one, namely to blame the Fed for being too conservative with monetary policy over the last few years.
And please note these pieces are not weird innovations, they are at the core of modern labor and macro and they are using fully standard methods. Yet the implications of such search models are hardly ever explored on social media, not even on Facebook or Instagram! You have a better chance finding them analyzed on Match.com.
Rather than fading away, solitary imprisonment, a form of torture in my view, has become more common:
Criminal Justice Policy Review: Solitary confinement is a harsh form of custody involving isolation from the general prison population and highly restricted access to visitation and programs. Using detailed prison records covering three decades of confinement practices in Kansas, we find solitary confinement is a normal event during imprisonment. Long stays in solitary confinement were rare in the late 1980s with no detectable racial disparities, but a sharp increase in capacity after a new prison opening began an era of long-term isolation most heavily affecting Black young adults. A decomposition analysis indicates that increases in the length of stay in solitary confinement almost entirely explain growth in the proportion of people held in solitary confinement. Our results provide new evidence of increasingly harsh prison conditions and disparities that unfolded during the prison boom.
Hat tip: Kevin Lewis.
M.B. Malabu, travel grant to come to the D.C. area to help set up a market-oriented think tank in Nigeria.
Nolan Gray, urban planner from NYC, to be in residence at Mercatus and write a book on YIMBY, Against Zoning.
One other, not yet ready to be announced. But a good one.
Here are previous MR posts on Emergent Ventures.
In Why Online Education Works I wrote:
The future of online education is adaptive assessment, not for testing, but for learning. Incorrect answers are not random but betray specific assumptions and patterns of thought. Analysis of answers, therefore, can be used to guide students to exactly that lecture that needs to be reviewed and understood to achieve mastery of the material. Computer-adaptive testing will thus become computer-adaptive learning.
Computer-adaptive learning will be as if every student has their own professor on demand—much more personalized than one professor teaching 500 students or even 50 students. In his novel The Diamond Age, science fiction author Neal Stephenson describes a Young Lady’s Illustrated Primer, an interactive book that can answer a learner’s questions with specific information and also teach young children with allegories tuned to the child’s environment and experience. In short, something like an iPad combining Siri, Watson, and the gaming technology behind an online world like Skyrim. Surprisingly, the computer will make learning less standardized and robotic.
In other words, the adaptive textbook will read you as you read it. The NYTimes has a good piece discussing recent advances in this area, including Bakpax, which reads student handwriting and grades answers. Furthermore:
Today, learning algorithms uncover patterns in large pools of data about how students have performed on material in the past and optimize teaching strategies accordingly. They adapt to the student’s performance as the student interacts with the system.
…Studies show that these systems can raise student performance well beyond the level of conventional classes and even beyond the level achieved by students who receive instruction from human tutors. A.I. tutors perform better, in part, because a computer is more patient and often more insightful.
…Still more transformational applications are being developed that could revolutionize education altogether. Acuitus, a Silicon Valley start-up, has drawn on lessons learned over the past 50 years in education — cognitive psychology, social psychology, computer science, linguistics and artificial intelligence — to create a digital tutor that it claims can train experts in months rather than years.
Acuitus’s system was originally funded by the Defense Department’s Defense Advanced Research Projects Agency for training Navy information technology specialists. John Newkirk, the company’s co-founder and chief executive, said Acuitus focused on teaching concepts and understanding.
The company has taught nearly 1,000 students with its course on information technology and is in the prototype stage for a system that will teach algebra. Dr. Newkirk said the underlying A.I. technology was content-agnostic and could be used to teach the full range of STEM subjects.
Dr. Newkirk likens A.I.-powered education today to the Wright brothers’ early exhibition flights — proof that it can be done, but far from what it will be a decade or two from now.
Deborah Lucas has studied this question, and here is the core of her results:
This review develops a theoretical framework that highlights the principles governing economically meaningful estimates of the cost of bailouts. Drawing selectively on existing cost estimates and augmenting them with new calculations consistent with this framework, I conclude that the total direct cost of the 2008 crisis-related bailouts in the United States was on the order of $500 billion, or 3.5% of GDP in 2009. The largest direct beneficiaries of the bailouts were the unsecured creditors of financial institutions. The estimated cost stands in sharp contrast to popular accounts that claim there was no cost because the money was repaid, and with claims of costs in the trillions of dollars. The cost is large enough to suggest the importance of revisiting whether there might have been less expensive ways to intervene to stabilize markets. At the same time, it is small enough to call into question whether the benefits of ending bailouts permanently exceed the regulatory burden of policies aimed at achieving that goal.
You will note that 3/4 of that sum comes from the bailouts of the government mortgage agencies. I am myself uncertain how to think about this problem. First, is it useful to think of the additional bailout expenditure as being monetized, if only indirectly through the mix of Fed/Treasury policy? If yes (debatable), and the monetization itself limits a harmful further deflation, can it be said that this monetization is not a transfer away from citizens in the usual sense that an inflation in Zimbabwe might be? But rather a net gain for citizens or at least a much smaller loss? Is the interest paid on those monetized reserves the actual cost?
In any case, where exactly does the “3.5% of gdp” loss “come from”?
I do not know!
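For scale, the headline numbers in the abstract are at least internally consistent: $500 billion against 2009 U.S. nominal GDP of roughly $14.4 trillion (my approximate figure, not taken from the paper) works out to about 3.5 percent. A quick check:

```python
# Consistency check of the quoted headline figures.
# The 2009 GDP number is my approximation, not taken from the paper.
bailout_cost = 500e9   # ~$500 billion direct cost (quoted)
gdp_2009 = 14.4e12     # ~$14.4 trillion, approximate 2009 US nominal GDP

share = bailout_cost / gdp_2009 * 100
print(f"Bailout cost as share of 2009 GDP: {share:.1f}%")
```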
This year I want to discuss mostly science and technology. First, some thoughts on China’s technology efforts. Then I’ll present a few reflections on science fiction, with a focus on Philip K. Dick and Liu Cixin. Next I’ll discuss books I read on American industrial history. I save personal reflections for the end.
Dan now lives in Beijing. He left out music, however…
In the decades between 1850 and 1950, the United States decisively transformed its place in the world economic order. In 1850, the US was primarily a supplier of slave-produced cotton to industrializing Europe. American economic growth thus remained embedded in established patterns of Atlantic commerce. One hundred years later, the same country had become the world’s undisputed industrial leader and hegemonic provider of capital. Emerging victorious from the Second World War, the US had displaced Britain as the power most prominently situated — even more so than its Cold War competitor — to impress its vision of a global political economy upon the world. If Britain’s industrial revolution in the late eighteenth century marked the beginning of a ‘Great Divergence’ (Pomeranz) of ‘the West’ from other regions around the world, American ascendance in the decades straddling the turn of the twentieth century marked a veritable ‘second great divergence’ (Beckert) that established the US as the world’s leading industrial and imperial power.
That is an excerpt from a new essay in Past and Present by Stefan Link and Noam Maggor. (You’ll find the best summary of the actual thesis in the last few pages of the piece, not in the beginning.) It is one of the more interesting economic history pieces I have read in some time. The pointer is from Pseudoerasmus, who also has been doing some running commentary on the article in his afore-linked Twitter feed.
Having tracked the libertarian “movement” for much of my life, I believe it is now pretty much hollowed out, at least in terms of flow. One branch split off into Ron Paul-ism and less savory alt right directions, and another, more establishment branch remains out there in force but not really commanding new adherents. For one thing, it doesn’t seem that old-style libertarianism can solve or even very well address a number of major problems, most significantly climate change. For another, smart people are on the internet, and the internet seems to encourage synthetic and eclectic views, at least among the smart and curious. Unlike the mass culture of the 1970s, it does not tend to breed “capital L Libertarianism.” On top of all that, the out-migration from narrowly libertarian views has been severe, most of all from educated women.
There is also the word “classical liberal,” but what is “classical” supposed to mean that is not question-begging? The classical liberalism of its time focused on 19th century problems — appropriate for the 19th century of course — but from WWII onwards it has been a very different ballgame.
Along the way, I believe the smart classical liberals and libertarians have, as if guided by an invisible hand, evolved into a view that I dub with the entirely non-sticky name of State Capacity Libertarianism. I define State Capacity Libertarianism in terms of a number of propositions:
1. Markets and capitalism are very powerful, give them their due.
2. Earlier in history, a strong state was necessary to back the formation of capitalism and also to protect individual rights (do read Koyama and Johnson on state capacity). Strong states remain necessary to maintain and extend capitalism and markets. This includes keeping China at bay abroad and keeping elections free from foreign interference, as well as developing effective laws and regulations for intangible capital, intellectual property, and the new world of the internet. (If you’ve read my other works, you will know this is not a call for massive regulation of Big Tech.)
3. A strong state is distinct from a very large or tyrannical state. A good strong state should see the maintenance and extension of capitalism as one of its primary duties, in many cases its #1 duty.
4. Rapid increases in state capacity can be very dangerous (earlier Japan, Germany), but high levels of state capacity are not inherently tyrannical. Denmark should in fact have a smaller government, but it is still one of the freer and more secure places in the world, at least for Danish citizens albeit not for everybody.
5. Many of the failures of today’s America are failures of excess regulation, but many others are failures of state capacity. Our governments cannot address climate change, much improve K-12 education, fix traffic congestion, or improve the quality of their discretionary spending. Much of our physical infrastructure is stagnant or declining in quality. I favor much more immigration, nonetheless I think our government needs clear standards for who cannot get in, who will be forced to leave, and a workable court system to back all that up; today we do not have that either.
Those problems require state capacity — albeit to boost markets — in a way that classical libertarianism is poorly suited to deal with. Furthermore, libertarianism is parasitic upon State Capacity Libertarianism to some degree. For instance, even if you favor education privatization, in the shorter run we still need to make the current system much better. That would even make privatization easier, if that is your goal.
6. I will cite again the philosophical framework of my book Stubborn Attachments: A Vision for a Society of Free, Prosperous, and Responsible Individuals.
7. The fundamental growth experience of recent decades has been the rise of capitalism, markets, and high living standards in East Asia, and State Capacity Libertarianism has no problem or embarrassment in endorsing those developments. It remains the case that such progress (or better) could have been made with more markets and less government. Still, state capacity had to grow in those countries and indeed it did. Public health improvements are another major success story of our time, and those have relied heavily on state capacity — let’s just admit it.
8. The major problem areas of our time have been Africa and South Asia. They are both lacking in markets and also in state capacity.
9. State Capacity Libertarians are more likely to have positive views of infrastructure, science subsidies, nuclear power (requires state support!), and space programs than are mainstream libertarians or modern Democrats. Modern Democrats often claim to favor those items, and sincerely in my view, but de facto they are very willing to sacrifice them for redistribution, egalitarian and fairness concerns, mood affiliation, and serving traditional Democratic interest groups. For instance, modern Democrats have run New York for some time now, and they’ve done a terrible job building and fixing things. Nor are Democrats doing much to boost nuclear power as a partial solution to climate change, if anything the contrary.
10. State Capacity Libertarianism has no problem endorsing higher quality government and governance, whereas traditional libertarianism is more likely to embrace or at least be wishy-washy toward small, corrupt regimes, due to some of the residual liberties they leave behind.
11. State Capacity Libertarianism is not non-interventionist in foreign policy, as it believes in strong alliances with other relatively free nations, when feasible. That said, the usual libertarian “problems of intervention because government makes a lot of mistakes” bar still should be applied to specific military actions. But the alliances can be hugely beneficial, as illustrated by much of 20th century foreign policy and today much of Asia — which still relies on Pax Americana.
It is interesting to contrast State Capacity Libertarianism to liberaltarianism, another offshoot of libertarianism. On most substantive issues, the liberaltarians might be very close to State Capacity Libertarians. But emphasis and focus really matter, and I would offer this (partial) list of differences:
a. The liberaltarian starts by assuring “the left” that they favor lots of government transfer programs. The State Capacity Libertarian recognizes that demands of mercy are never ending, that economic growth can benefit people more than transfers, and, within the governmental sphere, it is willing to emphasize an analytical, “cold-hearted” comparison between government discretionary spending and transfer spending. Discretionary spending might well win out at many margins.
b. The “polarizing Left” is explicitly opposed to a lot of capitalism, and de facto standing in opposition to state capacity, due to the polarization, which tends to thwart problem-solving. The polarizing Left is thus a bigger villain for State Capacity Libertarianism than it is for liberaltarianism. For the liberaltarians, temporary alliances with the polarizing Left are possible because both oppose Trump and other bad elements of the right wing. It is easy — maybe too easy — to market liberaltarianism to the Left as a critique and revision of libertarians and conservatives.
c. Liberaltarian Will Wilkinson made the mistake of expressing enthusiasm for Elizabeth Warren. It is hard to imagine a State Capacity Libertarian making this same mistake, since so much of Warren’s energy is directed toward tearing down American business. Ban fracking? Really? Send money to Russia, Saudi Arabia, lose American jobs, and make climate change worse, all at the same time? Nope.
d. State Capacity Libertarianism is more likely to make the mistake of, say, endorsing high-speed rail from LA to SF (if indeed that is a mistake), and decrying the ability of U.S. governments to get such a thing done. “Which mistakes they are most likely to commit” is an underrated way of assessing political philosophies.
You will note the influence of Peter Thiel on State Capacity Libertarianism, though I have never heard him frame the issues in this way.
Furthermore, “which ideas survive well in internet debate” has been an important filter on the evolution of the doctrine. That point is under-discussed, for all sorts of issues, and it may get a blog post of its own.
Here is my earlier essay on the paradox of libertarianism, relevant for background.
Happy New Year everyone!