Hawtrey came from a family long associated with Eton, where he was educated himself, before coming up to Trinity in 1898. In 1901 he was 19th Wrangler; in 1903 he briefly entered the Admiralty, before going to the Treasury, where he found his vocation as an economist and remained for forty-one years. He was a very faithful Apostle, attending every annual dinner until 1954, when he was prevented from going by ill health. He was devoted to Moore, whose impassioned singing of Die Beiden Grenadiere made him realize how horrible war was for the soldiers who actually did the fighting: this constituted an epiphany for Hawtrey, and reinforced his life-long Liberalism. Moore was so much the most important influence on the life and career of Sir Ralph Hawtrey that he spent his last years working on a systematic philosophical treatise (inspired also by Robin Mayor), which was to have been a summa of his twenty-odd books and the hundreds of letters he published in The Times. He was married to the famous pianist Titi d’Aranyi.
That is from Paul Levy’s book Moore: G.E. Moore and the Cambridge Apostles. Here is more on Titi, also known as Hortense, who studied with Bartok and received numerous letters from him. And here is Scott Sumner on Hawtrey, one of the great monetary economists.
This paper studies the heterogeneous impacts of the US-China trade war through linkages in global value chains. By building a two-stage, multi-country, multi-sector general equilibrium model, this paper discusses how import tariffs affect domestic producers through internal linkages within an industry and external linkages across industries. The model validates that import tariffs on Chinese upstream intermediate goods negatively affect US downstream exports, output and employment. Effects are strongest in US industries that rely heavily on the targeted Chinese intermediate goods. In addition, this paper differentiates the impacts of the two rounds of the trade war by comparing tariffs on intermediate goods and consumption goods. This paper estimates that the trade war increases US CPI by 0.09% in the first round and 0.22% in the second round. Finally, this paper studies the welfare effects of the trade war. This paper estimates that the trade war costs China $35.2 billion, or 0.29% of GDP, costs the US $15.6 billion, or 0.08% of GDP, and benefits Vietnam by $402.8 million, or 0.18% of GDP.
That is by Yang Zhou of the University of Minnesota, via the excellent Kevin Lewis. Those numbers should not come as a surprise: they indicate that both countries are worse off, but they also show that a lot of the bargaining power does in fact reside with the United States.
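The dollar figures and the GDP shares in the abstract can be checked against each other with a bit of arithmetic. This quick sketch (the dollar amounts and shares are from the abstract; the implied GDP levels are my back-of-the-envelope output, not the paper's) recovers roughly the 2018 GDP levels of the three countries:

```python
# Welfare effects from the abstract: (dollar change, share of GDP).
# Dividing one by the other should recover a plausible GDP level;
# the GDP figures printed below are implied, not taken from the paper.
effects = {
    "China":   (-35.2e9, -0.0029),   # costs $35.2 billion, 0.29% of GDP
    "US":      (-15.6e9, -0.0008),   # costs $15.6 billion, 0.08% of GDP
    "Vietnam": (402.8e6, 0.0018),    # gains $402.8 million, 0.18% of GDP
}

for country, (dollars, share) in effects.items():
    implied_gdp = dollars / share
    print(f"{country}: implied GDP of about ${implied_gdp / 1e12:.1f} trillion")
```

The implied levels (about $12.1 trillion for China, $19.5 trillion for the US, $0.2 trillion for Vietnam) are in the right ballpark for 2018, so the dollar and percentage figures are at least mutually consistent.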
For a short time the Brazilian city of Manaus, in the heart of the Amazon rainforest, offered a glimmer of hope in the search for herd immunity from Covid-19.
After a devastating wave in May killed about 3,400 people and infected many more, the prevalence of the virus subsided rapidly, leading some scientists to theorise that the city of 2m had reached a form of collective immunity.
That hypothesis is now in doubt as a resurgence in cases in Manaus poses fresh challenges to the authorities and difficult questions for the scientists and policymakers worldwide who have been edging towards herd immunity policies as an alternative to harsh lockdowns.
“How do you explain the number of [daily] deaths being in the 30s yesterday and the 50s today?” said Arthur Virgilio, the mayor of Manaus. “What has caused the death rate in Manaus to increase?”
Here is more from the Financial Times.
3. In case you had forgotten this ongoing story: “By Thursday evening’s fourth round the 29-year-old from Oslo had extended his world record unbeaten streak to 125 games, with his last defeat coming in July 2018.”
4. Which 21st century works will merit a close reading or rereading in 2050? I tend to think virtually everything will be superseded, but I mean that as praise for what is to come, not pessimism about current work.
6. Further results on falling mortality rates and diminishing viral load. The broad upshot is that diminishing viral load seems to be more important than we had thought, and a variety of other factors less important.
Reason has released the first of a four part series on the hi-tech Hayekians and the cypherpunk movement. As Tyler already mentioned, Don Lavoie at GMU played an early role in bringing economists and computer scientists together. I was at a few of the first seminars with Lavoie and people like Mark Miller, although it took decades for me to realize how far Lavoie was ahead of his time.
I do not feel qualified to have an opinion here, but this piece, by Benjamin Y. Hayden and Yael Niv, seems of some interest:
Much of traditional neuroeconomics proceeds from the hypothesis that value is reified in the brain, that is, that there are neurons or brain regions whose responses serve the discrete purpose of encoding value. This hypothesis is supported by the finding that the activity of many neurons and brain regions covaries with subjective value as estimated in specific tasks. Here we consider an alternative: that value is not represented in the brain. This idea is motivated by close consideration of the economic concept of value, which places important epistemic constraints on our ability to identify its neural basis. It is also motivated by the behavioral economics literature, especially work on heuristics. Finally, it is buoyed by recent neural and behavioral findings regarding how animals and humans learn to choose between options. In light of our hypothesis, we critically reevaluate putative neural evidence for the representation of value, and explore an alternative: that brains directly learn action policies. We delineate how this alternative can provide a robust account of behavior that concords with existing empirical data.
Via Benjamin Lyons.
This study evaluates evidence pertaining to popular narratives explaining the American public’s support for Donald J. Trump in the 2016 presidential election. First, using unique representative probability samples of the American public, tracking the same individuals from 2012 to 2016, I examine the “left behind” thesis (that is, the theory that those who lost jobs or experienced stagnant wages due to the loss of manufacturing jobs punished the incumbent party for their economic misfortunes). Second, I consider the possibility that status threat felt by the dwindling proportion of traditionally high-status Americans (i.e., whites, Christians, and men) as well as by those who perceive America’s global dominance as threatened combined to increase support for the candidate who emphasized reestablishing status hierarchies of the past. Results do not support an interpretation of the election based on pocketbook economic concerns. Instead, the shorter relative distance of people’s own views from the Republican candidate on trade and China corresponded to greater mass support for Trump in 2016 relative to Mitt Romney in 2012. Candidate preferences in 2016 reflected increasing anxiety among high-status groups rather than complaints about past treatment among low-status groups. Both growing domestic racial diversity and globalization contributed to a sense that white Americans are under siege by these engines of change.
Here is the article, by Diana C. Mutz, via someone on Twitter whom I have forgotten!
Nancy Pelosi warned that a Covid-19 vaccine should not be authorised for use in the US based on data from British trials, amid fears that the Trump administration is planning to rush out an inoculation before election day.
The Democratic speaker of the House of Representatives on Friday cast doubt on the British system for testing and approving medicines, further politicising the race to develop a vaccine for Covid-19.
“We need to be very careful about what happens in the UK. We have very stringent rules in terms of the Food and Drug Administration here, about the number of clinical trials, the timing, the number of people and all the rest,” Ms Pelosi told reporters in Washington.
1. Video of good GeoWizard guessing, a little slow to start.
6. The Monty Hall problem with many doors (goats).
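The many-door variant is what makes the switching intuition click: with n doors, the host opens every goat door except one, so switching wins exactly when your first pick was wrong, i.e. with probability (n-1)/n. A quick simulation (my sketch, not from the linked piece) bears this out:

```python
import random

def monty_hall_trial(n_doors, switch):
    """One round of Monty Hall with n_doors: one car, the rest goats.
    The host opens every door except the player's pick and one other."""
    car = random.randrange(n_doors)
    pick = random.randrange(n_doors)
    if switch:
        # The host leaves closed only the player's pick and one other
        # door -- the car if the pick was wrong, a goat door otherwise.
        # So switching wins exactly when the first pick was wrong.
        return pick != car
    return pick == car

def win_rate(n_doors, switch, trials=100_000):
    wins = sum(monty_hall_trial(n_doors, switch) for _ in range(trials))
    return wins / trials

# With 100 doors, switching should win about 99% of the time,
# staying only about 1%; with the classic 3 doors, about 2/3 vs 1/3.
print(round(win_rate(100, switch=True), 2))   # about 0.99
print(round(win_rate(100, switch=False), 2))  # about 0.01
```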
That is the new book by Lt. General David Barno and Nora Bensahel, here is one excerpt:
This emphasis on decentralized, independent battlefield actions, long a part of German military thinking, once again became a central tenet of German army doctrine in the modest force of the post-Versailles period. Mission orders were regularly emphasized and practiced during peacetime training exercises. Moreover, the German army relentlessly critiqued the performance of its leaders and units in exercises and war games. Commanders and staff officers at all levels were expected to do so candidly and objectively, without regard to personal embarrassment or potential career damage. This candor extended to critiquing the performance of senior officers and higher headquarters as well. These principles made German doctrine inherently adaptable in the face of battle.
And then a few pages later:
In stark contrast to the Germans, in the French army there was “no large-scale examination of the lessons of the last war by a significant portion of the officer corps.” Partly as a result, the lessons that the French army drew from World War I led to a warfighting doctrine that was nearly the polar opposite of that developed by the Germans. The French army assumed that the next war in Europe would largely resemble the last. The staggering number of French casualties during World War I led French leaders to conclude that an offensive doctrine would prove both indecisive and prohibitively costly. They reasoned that a defensive doctrine would best preserve their fighting power and prevent the enemy from winning another major war through an offensive strike. As a result, nearly all French interwar thinking focused on leveraging defensive operations to prevail in any future war.
Overall, it is striking to me just how much substance there is in this book per page — a rarity to be treasured! You can order it here.
I will be doing a Conversation with him. So what should I ask?
Here is his um…Wikipedia page, presumably it is fairly accurate!
This started in the late 1980s, and was led by GMU economist Don Lavoie, who earlier had been a computer programmer. Here is one bit from Don’s extensive essay, co-authored with Howard Baetjer and William Tulloh:
The market for scholarly ideas is now badly compartmentalized, due to the nature of our institutions for dispersing information. One important aspect of the limitations on information dispersal is the one-way nature of references in scholarly literature. Suppose Professor Mistaken writes a persuasive but deeply flawed article. Suppose few see the flaws, while so many are persuaded that a large supportive literature results. Anyone encountering a part of this literature will see references to Mistaken’s original article. References thus go upstream towards original articles. But it may be that Mistaken’s article also provokes a devastating refutation by Professor Clearsighted. This refutation may be of great interest to those who read Mistaken’s original article, but with our present technology of publishing ideas on paper, there is no way for Mistaken’s readers to be alerted to the debunking provided by Clearsighted. The supportive literature following Mistaken will cite Mistaken but either ignore Professor Clearsighted or minimize her refutations.
In a hypertext system such as that being developed at Xanadu, original work may be linked downstream to subsequent articles and comments. In our example, for instance, Professor Clearsighted can link her comments directly to Mistaken’s original article, so that readers of Mistaken’s article may learn of the existence of the refutation, and be able, at the touch of a button, to see it or an abstract of it. The refutation by Clearsighted may similarly and easily be linked to Mistaken’s rejoinder, and indeed to the whole literature consequent on his original article. Scholars investigating this area of thought in a hypertext system would in the first place know that a controversy exists, and in the second place be able to see both (or more) sides of it with ease. The improved cross-referencing of, and access to, all sides of an issue should foster an improved evolution of knowledge.
A potential problem with this system of multidirectional linking is that the user may get buried underneath worthless “refutations” by crackpots. The Xanadu system will include provisions for filtering systems whereby users may choose their own criteria for the kinds of cross-references to be brought to their attention. These devices would seem to overcome the possible problem of having charlatans clutter the system with nonsense. In the first place, one would have to pay a fee for each item published on the system. In the second place, most users would choose to filter out comments that others had adjudged valueless and comments by individuals with poor reputations. In other words, though anyone could publish at will on a hypertext system, if one develops a bad reputation, very few will ever see his work.
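The one-way-reference problem Lavoie and his co-authors describe is easy to make concrete: a paper literature only records whom an article cites, while the hypertext scheme also stores the reverse edges, so readers of an article can discover later refutations. A minimal sketch (the class and names are mine, purely illustrative of the idea, not of Xanadu's actual design):

```python
from collections import defaultdict

class Library:
    """Citation graph with both forward and reverse ("downstream") links."""

    def __init__(self):
        self.cites = defaultdict(set)     # article -> articles it cites
        self.cited_by = defaultdict(set)  # article -> later responses to it

    def link(self, new_article, target):
        """Publish new_article with a reference to target; the reverse
        edge is what the essay says paper journals lack."""
        self.cites[new_article].add(target)
        self.cited_by[target].add(new_article)

lib = Library()
lib.link("Mistaken 1990", "earlier work")
lib.link("Clearsighted 1991 (refutation)", "Mistaken 1990")

# A reader of Mistaken's article can now find the refutation downstream:
print(lib.cited_by["Mistaken 1990"])
# → {'Clearsighted 1991 (refutation)'}
```

The filtering the essay goes on to describe would then be a matter of ranking or pruning the `cited_by` sets by reputation, which is essentially what citation databases and link aggregators do today.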
Miller and Drexler envision the evolution of what they call agoric open systems–extensive networks of computer resources interacting according to market signals. Within vast computational networks, the complexity of resource allocation problems would grow without limit. Not only would a price system be indispensable to the efficient allocation of resources within such networks, but it would also facilitate the discovery of new knowledge and the development of new resources. Such open systems, free of the encumbrances of central planners, would most likely evolve swiftly and in unexpected ways. Given secure property rights and price information to indicate profit opportunities, entrepreneurs could be expected to develop and market new software and information services quite rapidly.
Secure property rights are essential. Owners of computational resources, such as agents containing algorithms, need to be able to sell the services of their agents without having the algorithm itself be copyable. The challenge here is to develop secure operating systems. Suppose, for example, that a researcher at George Mason University wanted to purchase the use of a proprietary data set from Alpha Data Corporation and massage that data with proprietary algorithms marketed by Beta Statistical Services, on a superfast computer owned by Gamma Processing Services. The operating system needs to assure that Alpha cannot steal Beta’s algorithms, that Beta cannot steal Alpha’s data set, and that neither Gamma nor the George Mason researcher can steal either. These firms would thus under-produce their services if they feared that their products could be easily copied by any who used them.
In their articles, Miller and Drexler propose a number of ways in which this problem might be overcome. In independent work, part of the problem apparently has already been overcome. Norm Hardy, senior scientist of Key Logic Corporation, whom we met at Xanadu, has developed an operating system called KeyKOS which accomplishes what many suspected to be impossible: it assures by some technical means (itself an important patented invention) the integrity of computational resources in an open, interconnected system. To return to the above example, the system in effect would create a virtual black box in Gamma’s computer, in which Alpha’s data and Beta’s algorithms are combined. The box is inaccessible to anyone, and it self-destructs once the desired results have been forwarded to the George Mason researcher.
There is really quite a bit more at the link, noting that at the time Don had assembled a group of about ten people working on these ideas. As for the hyperlinks, I recall thinking at the time something like: “People don’t value reading so much, so making reading better with hyperlinks won’t have a huge marginal value!”
It is not mainly about NBA politics:
- US Open (golf) final round: down 56%
- US Open (tennis) was down 45% and the French open is down 57% so far
- Kentucky Derby: down 43%
- Indy 500: down 32%
- Through four weeks, NFL viewership is down approximately 10%
- NHL playoffs were down 39% (the pre-Stanley Cup playoff rounds were down 28%, while the Stanley Cup Final was down 61%)
- NBA Finals are down 45% (so far). Conference finals were down 35%, while the first round was down 27%. Matching the viewership decline, activity on the NBA reddit fan community is also down 50% from the NBA Finals last year.
That is from Daniel Frank, here are a few of his hypotheses:
- Sports are very social. People love talking about sports with their peers, and without interacting with as many people, they have fewer opportunities to talk about sports with others. This makes fans feel less engaged and more casual fans less likely to start watching, creating a cascading effect on engagement.
- Watching sports is a great way for people to tune out, relax and distract themselves from normal life. With so many people working from home, having a less defined break from work to non-work, and potentially working less hard, watching sports feels like less of an escape than it used to.
- People have started consuming politics like they do sports and their interest in sports has been cannibalized by political fanaticism.
- Lots of people are experiencing mental health challenges and struggling and don’t have the same interest in things they used to enjoy like sports.
My intuitions are quite close to Daniel’s — what do you all think?
2. New Italian results on monoclonal antibodies. And “Being previously infected with coronaviruses that cause the “common cold” may decrease the severity of severe acute respiratory syndrome coronavirus (SARS-CoV-2) infections…”
Gender wage gaps appear even in markets where workplace discrimination is impossible or unlikely. Uber drivers, for example, are assigned trips by a gender-blind algorithm and earn according to a known formula based on the time and distance of the trip. Yet a small but persistent gender gap of about 7% exists, which appears to be due mostly to the fact that male drivers drive a little faster, choose to work in more congested areas, and have a bit more experience. Litman et al. (2020) show that the same kinds of differences also show up in earnings on Mechanical Turk:
In this study we examined the gender pay gap on an anonymous online platform across an 18-month period, during which close to five million tasks were completed by over 20,000 unique workers. Due to factors that are unique to the Mechanical Turk online marketplace–such as anonymity, self-selection into tasks, relative homogeneity of the tasks performed, and flexible work scheduling–we did not expect earnings to differ by gender on this platform. However, contrary to our expectations, a robust and persistent gender pay gap was observed.
The average estimated actual pay on MTurk over the course of the examined time period was $5.70 per hour, with the gender pay differential being 10.5%.
In this case, however, neither experience nor task choice nor demographics appears to explain the difference. One interesting finding is that women are more likely to choose tasks with lower advertised pay–perhaps men are just a bit lazier. Who knows? People are different.
N.B. The authors go out of their way to plead that they are not in fact politically incorrect.