
The 1991 Project

In 1991, on the verge of bankruptcy, India abandoned the License-Raj and freed its economy from many socialist shackles. Prime Minister Narasimha Rao announced to the nation:

We believe that a bulk of government regulations and controls on economic activity have outlived their utility. They are stifling the creativity and innovativeness of our people. Excessive controls have also bred corruption. Indeed, they have come in the way of achieving our objectives of expanding employment opportunities, reducing rural-urban disparities and ensuring greater social justice.

And he was serious–in the plan, tariffs and controls were lifted, thousands of licenses eliminated, entire departments undone. A No Confidence motion was mounted in parliament, but the opponents made a tactical error and walked out, leaving just enough votes for Rao’s government to survive and the plan to pass. The result was an economic revolution. Economic growth increased and millions were lifted out of poverty. Yet the 1991 Project was incomplete, and many young Indians today have little appreciation of the gains that have been made or why they happened.

The 1991 Project is about understanding the history of economic liberalization in order to better chart the future. It begins with a superb essay by Shruti Rajagopalan on living under India’s socialist system. Did you know that under the License-Raj you needed a government permit to own a bicycle in some parts of the country?

Demand for bicycles rose as urban populations grew. Steel was government controlled and, given the heavy demand from the construction industry, only limited allotments were made to bicycle manufacturers. To increase their allotment of steel and meet the growing demand for bicycles, manufacturers needed an expansion permit, which the government rarely approved given the shortage of steel.

The license and permit system for steel also created a shortage of bicycles, which was followed by the inevitable price controls. To ensure that demand was legitimate and all available bicycles were used, owning and riding a bicycle required a government-issued token in some parts of the country. Inspectors thrived on the bribes paid when they caught anyone riding without the requisite permit.

The middle class didn’t escape the problem, either. Through a collaboration with Vespa, Bajaj manufactured scooters in India, and they became popular with the middle class. Bajaj was denied permission to expand to meet the rising demand, and by the late 1970s the waitlist for a Bajaj scooter was ten years.

Even though dowry is not just illegal but a crime in India, the entrenched dowry culture in the arranged marriage system enables grooms to make outrageous demands of the bride’s family. A Bajaj scooter became a top dowry ask. Given the decade-long waiting period, parents took to purchasing them on the black market, and by the late 1970s the price of a secondhand Bajaj scooter available immediately was much higher than that of a brand-new vehicle with a 5- to 10-year waiting period.

It got so bad that when a girl child was born, well-wishers would – only half in jest – suggest to the parents that they should immediately book a scooter so it would arrive in time for the wedding. This was reminiscent of the old Soviet Union joke about a man paying for an automobile. The clerk tells him it will be delivered in ten years. The man asks, “Morning or afternoon?” “What difference does it make?” responds the clerk. “Well, the plumber is coming in the morning.”
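How can a used scooter cost more than a new one? A back-of-the-envelope sketch makes the economics clear; all numbers below are hypothetical assumptions for illustration, not figures from the essay.

    # Back-of-the-envelope: why a used scooter available today could cost
    # more than a brand-new one with a ten-year waitlist.  All numbers are
    # hypothetical assumptions for illustration, not data from the essay.

    controlled_price = 10_000   # controlled price of a new scooter (assumed)
    annual_use_value = 2_000    # yearly value of having a scooter now (assumed)
    r = 0.10                    # annual discount rate (assumed)
    wait_years = 10

    # The premium for delivery today is the discounted stream of use value
    # the buyer would otherwise forgo during the waiting decade.
    premium = sum(annual_use_value / (1 + r) ** t for t in range(wait_years))
    print(f"premium over the controlled price: ~{premium:,.0f}")  # about 13,500

Under these assumptions the premium alone exceeds the controlled price, so the black-market price of an immediately available scooter can easily run to more than double the official price of a new one.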

Check out The 1991 Project and Rajagopalan’s essay.

Photo Credit: Manmohan Singh with PM Narasimha Rao in 1994. Photo: Sanjay Sharma/Hindustan Times

$1000 submission fee to the AER?

I saw that circulating as an April Fool’s joke, but is it such a crazy idea?  Here are a few likely effects:

1. Submissions would decline, thus liberating some time for editors and referees.  This is valuable in its own right, and furthermore remaining decisions might be made with greater care.  And presumably the remaining submissions would be those with a higher chance of acceptance.

2. To some extent departments would pick up the submission fee.  This would favor researchers in wealthier departments, though whether this is good or bad I am not sure.  And even the most flush departments would find this pretty steep and I don’t think would offer carte blanche reimbursement.

3. It would favor senior and wealthier colleagues over junior colleagues.  That sounds bad to most people, but is it?  Favoring the wealthier senior colleagues might help limit the arms race for “here is my 90-page paper that has performed every possible cross-check of the results.”  It also might lower the return to technique, as younger researchers tend to be more up on the latest math but they are also less broad and by definition less experienced.

4. Graduate students in particular would be less likely to submit, especially from lower-tier departments.  It would be harder for job candidates from the non-top schools to prove themselves by publishing in the AER.

5. Papers would be “shopped around” more to seminars before being submitted.

6. Papers would become longer, which is probably a bad thing.

7. It might select for overconfident economists from wealthier families.

8. The AER would no longer “get all the best papers,” at least as such things are perceived.  That could very well be good!  Why should one journal have such a lock?

Would the AEA take in more revenue with this plan?

What else? What is in fact the optimal submission fee for a journal where publications can be worth tens of thousands of dollars (or sometimes much more)?  Why should the authors/submitters be charged so little?
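As a back-of-the-envelope way to see the screening margin: a risk-neutral author submits only when the acceptance probability times the value of publication exceeds the fee. A minimal sketch, with hypothetical numbers:

    # Screening sketch: a risk-neutral author submits only when
    # P(accept) * (value of an AER publication) > submission fee.
    # The dollar figures are hypothetical assumptions for illustration.

    def min_acceptance_prob(fee: float, pub_value: float) -> float:
        """Lowest acceptance probability at which submitting still pays."""
        return fee / pub_value

    for fee in (100, 1_000, 5_000):
        p = min_acceptance_prob(fee, pub_value=50_000)
        print(f"fee ${fee:>5,}: submit only if P(accept) > {p:.1%}")
    # fee $  100: submit only if P(accept) > 0.2%
    # fee $1,000: submit only if P(accept) > 2.0%
    # fee $5,000: submit only if P(accept) > 10.0%

On these numbers even a $1000 fee deters only authors who think their acceptance chances are below two percent, which is consistent with point 1 above: submissions fall at the margin, but the fee is still small relative to the prize.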

Personnel economics working in the supermarket

Yesterday I outlined my supermarket job from ages 16-18 in suburban New Jersey.  I did know plenty of economics at that time, including Adam Smith and Paul Heyne and most of classical economics, and here are some of the observations I made.  Please note this is all n = 1 or n = 2; these observations may or may not generalize.  Here goes:

1. Mockery was the relevant incentive at the margin, and the “enforcer of first resort.”  If you did something wrong you were mocked, sometimes mercilessly.  My first night on the job I put too many fruits and vegetables in the refrigerator, relative to expectations, and so I heard about that multiple times on my next appearance.  The jokes at my expense were funny.

2. There was strong competition to win overtime hours. Working Sunday 12-5 was a prime slot, and not hard work either because customer demand was slow that day.  Saturday 1-9:30 had extra payoffs as well.  These labor supply curves definitely sloped upwards.  And allocation of overtime hours served to keep the better workers around.

3. My sense was that the demand for labor was pretty inelastic in the following sense: once you were soundly established as someone who would show up, complete your list of tasks, and not steal too much, they really were not looking to fire you.  You were “a keeper,” and in principle they would pay you more in response to a minimum wage hike, rather than firing you.

4. My sense was that the demand for labor was quite elastic in the following sense: the lower-tier workers were given a lot of luxury hours.  For one thing, if you didn’t get an average of at least fifteen hours a week, you might leave for another job.  Second, and more importantly, a lot of the night hours were optional.  Did they really need you back there that Tuesday night after 6 or 7 p.m.?  Well, maybe yes, maybe no.  There was a sense that if customers came by with questions, it was useful to have someone around to help them.  But if the produce department was not making a lot of money, they would cut back on these hours quite readily.  In slow times I didn’t get the 5-10 p.m. slot a whole lot.

5. Department managers, including in produce, were paid an “efficiency wage time profile” of returns, à la Eddie Lazear (a simple two-period sketch follows this list).  That is, in early years they would pay you below marginal product, but pay you above marginal product in the later, outer years.  That schedule would keep you in line, because you needed to avoid getting fired to reap the high later returns.  That said, in the outer years you would end up getting canned, because the prescription is not entirely time consistent.  Why keep someone around who is getting paid above marginal product?  It was called “getting busted.”  At that point you would typically start all over again with another supermarket.  (I did understand this all at the time, though I hadn’t yet read Lazear and a lot of the work hadn’t yet been published.)  A minority of department managers ended up promoted to store manager, but that was hard to pull off, especially without a college degree.

6. There was plenty of employee theft, though never from me.  Things disappeared off the back of trucks, and at that time there was no CCTV.  At a smaller scale, to be caught eating or taking food without a receipt was considered a fireable offense, though if you were a good worker and kept it to brief snacking within limits they did not try too hard to catch you.  They didn’t want to have to fire you, yet they did want to keep the rule in place.  Collusion between male line workers and female cashiers sometimes was a problem, as it meant some people would just take foodstuffs home.

6b. Shoplifting was rampant, though much more in the meat department than in produce.  Overall, the customers and workers were less honest than the bosses.

7. Correctly or not, the line workers typically were cynical about the union.  You paid dues to it, and you were told it gave you higher wages, but otherwise it had no presence in your life.  People saw the dues that left their paycheck, but were not convinced they were getting comparably high wages because of the union.

8. Due to gas prices and commuting costs (you had to keep your car in OK shape, which took competence as well as money), there was a modest degree of monopsony.  Still, everyone understood that a higher cost of labor meant fewer hours and in the longer run fewer hires.  No one thought that allowing vastly more shoplifting would lead the company to hire more labor, which is in fact what the more radical monopsony models imply.  Nope, it wasn’t monopsony of that sort.

9. The store manager, and in turn the department manager, would be terrified when the regional boss would do a store walk-through, and typically that happened by surprise.  That was when they really wanted you to scurry and have everything looking spic and span.

10. Workers had various personality types, and within a given type only so much motivation was possible, no matter what the rewards.  All rewards were seen as temporary, and to be followed by an eventual firing or demotion.  Slackers were slackers, and you had to accept that and work around them accordingly.
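Here is the promised two-period sketch of the Lazear profile from point 5 (my simplification, not his full model): constant marginal product m, discount rate r, and wages set below m early and above m late.

    % Two-period gloss on Lazear's deferred-compensation profile.
    % Assumptions (mine): constant marginal product m, discount rate r,
    % and wages w_1 < m < w_2 chosen so lifetime pay matches lifetime product:
    \[
      w_1 + \frac{w_2}{1+r} \;=\; m + \frac{m}{1+r}, \qquad w_1 < m < w_2 .
    \]
    % The worker's stake in keeping the job is the deferred premium
    % (w_2 - m)/(1+r) > 0, which disciplines behavior early on.  But in the
    % second period the firm pays w_2 > m, so it is tempted to fire the
    % worker -- "getting busted" -- which is the time-consistency problem.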

Paul Milgrom, Nobel Laureate

Most of all this is a game theory prize and an economics of information prize.  Much of the work has had applications to auctions and finance.  Basically Milgrom was the most important theorist of the 1980s, during the high point of economic theory and its influence.

Here is Milgrom’s (very useful and detailed) Wikipedia page.  For most of his career he has been associated with Stanford University, with one stint at Yale for a few years.  Here is Milgrom on scholar.google.com.  A very good choice and widely anticipated, in the best sense of that term.  Here is his YouTube presence.  Here is his home page.

Milgrom, working with Nancy Stokey, developed what is called the “no trade” theorem, namely the conditions under which market participants will not wish to trade with each other.  Obviously if someone wants to trade with you, you have to wonder — what does he/she know that I do not?  Under most reasonable assumptions, it is hard to generate a high level of trading volume, and that has remained a puzzle in theories of finance and asset pricing.  People are still working on this problem, and of course it relates to work by Nobel Laureate Robert Aumann on when people should rationally disagree with each other.
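For the curious, the logic can be stated informally (my gloss and notation, not Milgrom and Stokey’s): suppose trade is purely speculative (zero-sum) and both sides share a common prior over the asset value v. If agreement to trade at price p is common knowledge on some event A, then taking expectations conditional on A,

    \[
      \mathbb{E}[v \mid A] \;\ge\; p \quad (\text{the buyer expects to gain}),
      \qquad
      \mathbb{E}[v \mid A] \;\le\; p \quad (\text{the seller expects to gain}),
    \]

so E[v | A] = p and neither side strictly gains: no trade occurs.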

Building on this no-trade result, Milgrom wrote a seminal piece with Lawrence Glosten on the bid-ask spread.  What determines the bid-ask spread in securities markets?  It is the risk that the person you are trading with might know more than you do.  You will trade with them only when the price is somewhat more advantageous to you, so markets with higher degrees of asymmetric information will have higher bid-ask spreads.  This is Milgrom’s most widely cited paper, and it is personally my favorite piece of his; it had a real impact on me when I read it.  You can see that the themes of common knowledge and asymmetric information, so important for the auctions work, already are rampant.
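A minimal numerical version of the idea (a standard textbook simplification, not the full 1985 model): the asset is worth 0 or 1 with equal odds, a fraction mu of traders know the value, the rest buy or sell at random, and the market maker quotes prices at which she breaks even.

    # Glosten-Milgrom style calculation (textbook simplification, not the
    # full 1985 model).  Asset value V is 0 or 1 with prior 1/2; a fraction
    # mu of traders know V, the rest trade at random.  The market maker
    # quotes the prices at which she breaks even against the order flow.

    def quotes(mu: float) -> tuple[float, float]:
        p = 0.5                                  # prior P(V = 1)
        p_buy_given_1 = mu + (1 - mu) / 2        # informed + noise buys
        p_buy = p * p_buy_given_1 + (1 - p) * (1 - mu) / 2
        ask = p * p_buy_given_1 / p_buy          # E[V | buy order]
        bid = p * (1 - p_buy_given_1) / (1 - p_buy)   # E[V | sell order]
        return bid, ask

    for mu in (0.1, 0.3, 0.5):
        bid, ask = quotes(mu)
        print(f"mu={mu}: bid={bid:.2f}, ask={ask:.2f}, spread={ask - bid:.2f}")

In this stripped-down version the spread exactly equals mu, the share of informed traders: more asymmetric information, wider spread.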

Alex will tell you more about auctions, but Milgrom, working with Wilson, has designed some significant auctions; see Wikipedia:

Milgrom and his thesis advisor Robert B. Wilson designed the auction protocol the FCC uses to determine which phone company gets what cellular frequencies. Milgrom also led the team that designed the 2016-17 incentive auction, which was a two-sided auction to reallocate radio frequencies from TV broadcast to wireless broadband uses.

Here is Milgrom’s 277-page book on putting auction theory to practical use.  Here is his highly readable JEP survey article on auctions and bidding; for an introduction to Milgrom’s prize, maybe start there?

Here is Milgrom’s main theoretical piece on auctions, dating from Econometrica 1982 and co-authored with Robert J. Weber.  It compared the revenue properties of different auctions and showed that under risk-neutrality a second-price auction would yield the highest price.  Also returning to the theme of imperfect information and the bid-ask spread, it showed that an expert appraisal would make bidders more eager to bid and thus raise the expected price.  I think of Milgrom’s work as having very consistent strands.

With Bengt Holmstrom, also a Nobel winner, Milgrom wrote on principal-agent theory with multiple tasks, basically trying to explain why explicit workplace incentives and bonuses are not used more widely.  Simple linear incentives can be optimal because they do not distort the allocation of effort across tasks so much, and it turned out that the multi-task principal-agent problem was quite different from the single-task problem.
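The flavor of the result compresses into one line (my simplification of the textbook version, not the paper’s full model): with one measured task and one valuable but unmeasured task competing for effort, strong incentives on the measured task crowd out the other entirely.

    % Multitask gloss (my simplification).  The agent splits effort between
    % task 1 (measured, paid piece rate beta) and task 2 (valuable to the
    % principal but unmeasured), with a convex cost of total effort:
    \[
      \max_{t_1,\,t_2 \ge 0}\;\; \alpha + \beta t_1 - C(t_1 + t_2)
      \;\;\Longrightarrow\;\; t_2 = 0 \text{ whenever } \beta > 0 ,
    \]
    % so when task 2 matters, the principal optimally weakens beta,
    % sometimes all the way to zero: simple, low-powered incentives.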

People used to think that John Roberts would be a co-winner, based on the famous Milgrom and Roberts paper on entry deterrence.  Basically incumbent monopolists can signal their cost advantage by making costly choices and thereby scare away potential entrants.  And the incumbent wishes to be tough with early entrants to signal to later entrants that they had better stay away.  In essence, this paper was viewed as a major rebuttal to the Chicago School economists, who had argued that predatory behavior from incumbents typically was costly, irrational, and would not persist.

The absence of Roberts’s name on this award indicates a bit of a nudge in the direction of auction design and away from game theory — the Nobel Committee just loves mechanism design!

That said, it is worth noting that the work of Milgrom and co-authors intellectually dominated the 1980s and can be identified with the peak of influence of game theory during that period.  (Since then empirical economics has become more prominent in relative terms.)

Milgrom and Roberts also published a once-famous paper on supermodular games in 1990.  I’ve never read it, but I think it has something to do with the possible bounding of strategies in complex settings, based on general principles.  This was in turn an attempt to make game theory more general.  I am not sure it succeeded.

Milgrom and Roberts also produced a well-known paper finding the possible equilibria in a signaling model of advertising.

Milgrom and Roberts also wrote a series of papers on rent-seeking and “influence activities” within firms.  It always seemed to me this was his most underrated work, and it deserved more attention.  Among other things, this work shows how hard it is to limit internal rent-seeking by financial incentives (which in fact can make the problem worse), and you will see this relates to Milgrom’s broader work on multi-task principal-agent problems.

Milgrom also has a famous paper with Kreps, Wilson, and Roberts, so maybe Kreps isn’t going to win either.  They show how a multi-period prisoner’s dilemma might sustain cooperation rather than “finking” if there is asymmetric information about types and behavior.  This paper increased estimates of the stability of tit-for-tat strategies, if only because with uncertainty you might end up in a highly rewarding loop of ongoing cooperation.  This combination of authors is referred to as the “Gang of Four,” given their common interests at the time and some common ties to Stanford.  You will note it is really Milgrom (and co-authors) who put Stanford economics on the map, following on the Kenneth Arrow era (when Stanford was not quite yet a truly top department).

Not what he is famous for, but here is Milgrom’s paper with Roberts trying to rationalize some of the key features of modern manufacturing.  If nothing else, this shows the breadth of his interests and how he tries to apply game theory generally.  One question they consider is why modern manufacturing has moved so strongly in the direction of greater flexibility.

Milgrom also has a 1990 piece with North and Weingast on the medieval merchant guilds and the economics of reputation, showing his more applied side.  In essence the Law Merchant served as a multilateral reputation mechanism and enforced cooperation.  Here is a 1994 follow-up.  This work paved the way for later work by Avner Greif on related themes.

Another undervalued Milgrom piece is with Sharon Oster (mother of Emily Oster), or try this link for it.  Here is the abstract:

The Invisibility Hypothesis holds that the job skills of disadvantaged workers are not easily discovered by potential new employers, but that promotion enhances visibility and alleviates this problem. Then, at a competitive labor market equilibrium, firms profit by hiding talented disadvantaged workers in low-level jobs. Consequently, those workers are paid less on average and promoted less often than others with the same education and ability. As a result of the inefficient and discriminatory wage and promotion policies, disadvantaged workers experience lower returns to investments in human capital than other workers.

With multiple prestigious co-authors he has written in favor of prediction markets.

He was the doctoral advisor of Susan Athey, and in Alex’s post you can read about his auction advising and the companies he has started.

His wife, Eva Meyersson Milgrom, is herself a renowned sociologist, and he met her in 1996 while seated next to her at a Nobel Prize dinner in Stockholm.  Here is one of his papers with her (and Ravi Singh), on whether firms should share control with outsiders.  Here is the story of their courtship.

How is Defunding the Police Going in Minneapolis?

Not well.

MPR News: The meeting was slated as a Minneapolis City Council study session on police reform.

But for much of the two-hour meeting, council members told police Chief Medaria Arradondo that their constituents are seeing and hearing street racing which sometimes results in crashes, brazen daylight carjackings, robberies, assaults and shootings. And they asked Arradondo what the department is doing about it.

…Just months after leading an effort that would have defunded the police department, City Council members at Tuesday’s work session pushed chief Medaria Arradondo to tell them how the department is responding to the violence…More people have been killed in the city in the first nine months of 2020 than were slain in all of last year. Property crimes, like burglaries and auto thefts, are also up. Incidents of arson have increased 55 percent over the total at this point in 2019.

Bear in mind this is coming after just a few months of reduced policing, due in part to extra demands and difficulties, and probably in part to police pulling back out of fear or reluctance (“blue flu”), as also happened in Baltimore after the Freddie Gray killing and the consequent protests and riots.

A few true believers still remain:

Cunningham also criticized some of his colleagues for seeming to waver on the promises they made earlier this year to transform the city’s public safety system.

“What I am sort of flabbergasted by right now is colleagues, who a very short time ago were calling for abolition, are now suggesting we should be putting more resources and funding into MPD,” Cunningham said.

I’m a supporter of unbundling the police and improving policing but the idea that we can defund the police and crime will just melt away is a fantasy. As with bail reform, the defunders risk a backlash. Let’s start by decriminalizing more victimless crimes, as we have done in many states with marijuana laws. Let’s work on creating bureaus of road safety. But one of the reasons we do these things is so that we can increase the number of police on the street. The United States is underpoliced and the consequences of underpolicing, as well as overpolicing, fall on minority communities. As I have argued before, we need better policing so that we can all be comfortable with more policing. Getting there, however, will take time.

Should hiring schools coordinate on delaying their interviews?

The AEA emails me this (web version here):

The AEA suggests that employers wait to extend interview invitations until Monday, December 7, 2020 or later.

Rationale: the AEA will deliver signals from job candidates to employers on December 2. We suggest that employers wait and review those signals and incorporate them into their decision-making, before extending interview invitations.

…The AEA suggests that employers conduct initial interviews starting on Wednesday, January 6, 2021, and that all interviews take place virtually; i.e. either by phone or online (e.g. by Zoom). We also ask that all employers indicate on EconTrack when they have extended interview invitations (https://www.aeaweb.org/econtrack).

Rationale: In the past, interviews were conducted at the AEA/ASSA meetings. This promoted thickness of the market, because most candidates and employers were present at the in-person meetings, but had the disadvantage of precluding both job candidates and interviewers from fully participating in AEA/ASSA sessions. Since the 2021 AEA/ASSA meetings (which will take place Jan 3-5, 2021) will be entirely virtual, we suggest that interviews NOT take place during the AEA/ASSA meetings to allow job candidates and interviewers to participate in the conference.

Perhaps not surprisingly, they don’t offer much economic analysis of this recommendation.  I have a few remarks, none of which are beyond the analytical acumen of the AEA itself:

1. This proposal could well be a tax on the more conscientious departments, which will abide by the stricture while the more rogue departments jump the gun, giving them a relative advantage in finding job candidates.

2. It is common practice for the very top departments to make phone calls to advisors early, well before Christmas, and in essence tie up their future hires before the rest of the market clears (even if the ink on the contract is not dry until later on).  Whatever you might think of this practice, have any of those departments vowed to stop doing this?  If not, is the new recommendation simply an exhortation that other departments ought not to copy them, thus giving them exclusive use of this practice?  And did the AEA — which essentially is run by people from those top schools — ever complain about this practice?

3. In the more liquid market, as this proposal is designed to create, the better job candidates are likely to end up going to the more highly rated schools.  That is the opposite of how the NBA draft works — this year the Minnesota Timberwolves (a very bad team) pick first.  So maybe the more liquid market is best for the most highly rated schools — is that obviously a good thing?

4. Many job candidates don’t get any early offers at all, and this is likely to be all the more true with Covid-19 and tight state budgets.  Aren’t they better off if the market clears sooner rather than later?  Then they can either move on to other job searches, take jobs with community colleges, look for postdocs, or whatever.  Why postpone those adjustments?  Is their welfare being counted in this analysis?  Aren’t some of them the very neediest and also most stressed people in the economics job market?

5. Let’s say instead the market is done sequentially, where first you “auction off” the candidates in highest demand, ensuring that say a department rated #17 does not tie up an offer (fruitlessly, at that) to one of the very top candidates.  Won’t that #17 school then bid harder for the candidates one tier lower, thus making that part of the market more liquid?  I know it doesn’t have to work out that way, but surely that is one plausible scenario?

6. In finance, there are results suggesting that you get less “racing” behavior with batched rather than continuous auctions.  Again, that doesn’t have to be true, but surely it is no accident that many high-frequency traders oppose the idea of periodic rather than continuous securities auctions?  What exactly are the relevant conditions here?

7. Would many economists recommend that say the top tech firms not make any offers before a certain date, so as to keep that labor market “more liquid”?  What exactly is the difference here?

8. Might it be possible that a permanent shift to non-coordinated interview dates, and less temporally coordinated Zoom interviews and fly-outs, would permanently lower the status and import of said AEA?

I do not wish to pretend those are the only relevant factors.  But here is a simple question: does anyone connected with the AEA have the stones to actually write a cogent economic or game-theoretic analysis of this proposal?  Or does the AEA not do economics any more?

Unbundling the Police in Kentucky

In Why Are the Police in Charge of Road Safety? I argued for unbundling the police:

Don’t use a hammer if you don’t need to pound a nail…the police have no expertise in dealing with the mentally ill or with the homeless–jobs like that should be farmed out to other agencies. Notice that we have lots of other safety issues that are not handled by the police. Restaurant inspectors, for example, conduct over a million restaurant inspections annually but they don’t investigate murder or drug charges and they are not armed. Perhaps not coincidentally, restaurant inspectors are not often accused of inspector brutality, “Your honor, I swear I thought he was reaching for a knife….”

A small experiment was started several years ago in Alexandria, Kentucky.

Faced with a tight budget and rising demands on its 17-officer police department, the City of Alexandria in Campbell County tried something different. Instead of hiring an additional officer and taking on the added expenses of equipping that officer, the police chief at the time hired a social worker to respond in tandem with officers.

Anecdotally the results appear good:

“It was close to a $45,000 to $50,000 annual savings from hiring a police officer the first time to hiring a social worker,” [former Alexandria Police Department chief] Ward said. “They (police social workers) started solving problems for people in our community and for our agency that we’ve never been able to solve before.”

Ward believes the results in Alexandria, a city of less than 10,000, could be replicated in larger cities like Louisville, where officers respond to calls involving mental health, domestic disturbances, and homelessness an average of once every 10 minutes.

“Louisville is very big with services,” Pompilio said. “They have lots of things to offer families. It’s just a matter of a social worker connecting.”

Alexandria doubled down on its commitment and now employs two full-time social workers to work and respond with its 17 officers.

Hat tip: NextDraft.

Signaling vs. certification at Harvard

Harvard will be teaching solely online this fall (with some students in residence), yet charging full tuition rates.  Many commentators are thus suggesting this supplies evidence for the signaling theory of education.

But not exactly.  The signaling theory, taken quite literally, is that education is a very difficult set of hurdles to surmount, and if you can get through Harvard you must be really really smart and hard-working.  Caltech maybe, but Harvard, like Stanford and many other top schools, makes it pretty easy to get through with OK enough grades.

The hard part about Harvard is getting in.  By selecting you, Harvard certifies you (as long as you are not part of “the 43 percent,” legacy, athletes, etc…but wait, that counts too!).

Why isn’t there a service that just certifies you directly?  Surely you could run a clone of the Harvard admissions department pretty cheaply.

Perhaps the logical conclusion is that both the “social connections/dating” services of Harvard and the certification services of Harvard are strong complements.  If you are certified by Harvard, but live on a desert island, or carry a contagious disease, that certification is worth much less.  So it is hard to unbundle the services and sell the certification on its own, without the associated social networks.  Nor is it so worthwhile to sell the social connections on their own.  Harvard grads are socially connected to their dry cleaning workers as it stands, but that does not do those workers much good.

It takes a good deal more work to get signaling to enter this story.  In the signaling story, you can’t tell who is high quality without actually running the tournament, and that is more or less the opposite of the certification story.

Keep also in mind that the restricted Harvard services are probably only for one year (or less), so most students will still get three years or more of “the real Harvard,” if that is what they value.  And they can use intertemporal substitution to do more networking in the remaining three years.  It’s like being told you don’t get to watch the first quarter of a really great NBA game.  That is a value diminution to be sure, but there will still be enough people willing to buy the fancy seats.  Most viewers in the arena don’t watch more than three quarters of the game to begin with.

When Police Kill

When Police Kill is the 2017 book by criminologist Franklin Zimring. Here are some insights from the book.

Official data dramatically undercount the number of people killed by the police. Both the Bureau of Justice Statistics’ Arrest-Related Deaths and the FBI’s Supplemental Homicide Reports estimated around 400-500 police killings a year, circa 2010. But the two series have shockingly low overlap–homicides counted in one series are not counted in the other and vice-versa. A statistical estimate based on the lack of overlap suggests a true rate of around 1000 police killings per year.
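The overlap estimate is a capture-recapture calculation in the Lincoln-Petersen spirit. Here is a sketch of the logic with hypothetical counts (Zimring’s actual figures are not reproduced here):

    # Capture-recapture (Lincoln-Petersen) sketch of the undercount logic.
    # The counts below are hypothetical assumptions for illustration;
    # Zimring's actual figures differ.

    bjs_count = 450    # deaths in the BJS Arrest-Related Deaths list (assumed)
    fbi_count = 480    # deaths in the FBI Supplemental Homicide Reports (assumed)
    in_both   = 216    # deaths appearing in BOTH lists (assumed, low overlap)

    # If the two lists capture deaths roughly independently, the BJS list's
    # coverage rate is about in_both / fbi_count, so the true total is about:
    estimated_total = bjs_count * fbi_count / in_both
    print(f"estimated true total: ~{estimated_total:.0f} per year")  # ~1000

The lower the overlap between the two lists, the larger the implied number of deaths that neither list captured.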

The best data come from newspaper reports, which also show around 1000-1300 police killings a year (Zimring focuses his analysis on The Guardian’s database). Fixing the data problem should be a high priority. But the FBI cannot be trusted to do the job:

Unfortunately, the FBI’s legacy of passive acceptance of incomplete statistical data on police killings, its promotion of the self-interested factual accounts from departments, and its failure to collect significant details about the nature of the provocation and the nature of the force used by police suggest that nothing short of massive change in its orientation, in its legal authority to collect data and its attitude toward auditing and research would make the FBI an agency worthy of public trust and statistical reliability in regard to the subject of this book.

The FBI’s bias is even seen in its nomenclature for police killings–“justifiable homicides”–which some of them certainly are not.

The state kills people in two ways, executions and police killings. Executions require trials, appeals, long waiting periods and great deliberation and expense. Police killings are not extensively monitored, analyzed or deliberated upon and, until very recently, even much discussed. Yet every year, police kill 25 to 50 times as many people as are executed. Why have police killings been ignored?

When an execution takes place in Texas, everybody knows that Texas is conducting the killing and is accountable for its consequences. When Officer Smith kills Citizen Jones on a city street in Dallas, it is Officer Smith rather than any larger governmental organization…[who] becomes the primary repository of credit or blame.

We used to do the same thing with airplane crashes and medical mistakes–that is, look for pilot or physician error. Safety didn’t improve much until we started to apply systems thinking. We need a systems-thinking approach to police shootings.

Police kill males (95%) far more than females, a much larger ratio than for felonies. Police kill more whites than blacks, which is often forgotten, although it is not surprising because whites are a larger share of the population. Based on the Guardian data shown in Zimring’s Figure 3.1, whites and Hispanics are killed approximately in proportion to population. Blacks are killed at about twice their proportion to population. Asians are killed less than in proportion to their population.

A surprising finding:

Crime is a young man’s game in the United States but being killed by a police officer is not.

The main reason for this appears to be that a disproportionate share of police killings come from disturbance calls, with domestic and non-domestic calls about equally represented. A majority of the killings arising from disturbance calls are of people aged forty or more.

The tendency of both police and observers to assume that attacks against police and police use of force is closely associated with violent crime and criminal justice should be modified in significant ways to accord for the disturbance, domestic conflicts, and emotional disruptions that frequently become the caseload of police officers.

A slight majority (56%) of the people who are killed by the police are armed with a gun and another 3.7% seemed to have a gun. Police have reason to fear guns: 92% of killings of police are by guns. But 40% of the people killed by police don’t have guns and other weapons are much less dangerous to police. In many years, hundreds of people brandishing knives are killed by the police while no police are killed by people brandishing knives. The police seem to be too quick to use deadly force against people significantly less well-armed than the police. (Yes, Lucas critique. See below on policing in a democratic society.)

Police kill more people than people kill police–a ratio of about 15 to 1–and the ratio has been increasing over time. Policing has become safer over the past 40 years with a 75% drop in police killed on the job since 1976–the fall is greater than for crime more generally and is probably due to Kevlar vests. Kevlar vests are an interesting technology because they make police safer without imposing more risk on citizens. We need more win-win technologies. Although policing has become safer over time, the number of police killings has not decreased in proportion which is why the “kill ratio” has increased.

A major factor in the number of deaths caused by police shootings is the number of wounds received by the victim. In Chicago, 20% of victims with one wound died, 34% with two wounds and 74% with five or more wounds. Obvious. But it suggests a reevaluation of police training to empty the magazine. Zimring suggests that if the first shot fired was due to reasonable fear, the tenth might not be. A single, aggregational analysis:

…simplifies the task of police investigator or district attorney, but it creates no disincentive to police use of additional deadly force that may not be necessary by the time it happens–whether with the third shot or the seventh or the tenth.

It would be hard to implement this ex post but I agree that emptying the magazine isn’t always reasonable, especially when the police are not under fire. Is it more dangerous to fire one or two shots and reevaluate than to fire ten? Of course, but given the number of errors police make this is not an unreasonable risk to ask police to take in a democratic society.
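A crude calculation using the Chicago fatality rates quoted above, assuming the counterfactual victim receives two wounds rather than five or more:

    # Marginal lethality of additional wounds, using the Chicago fatality
    # rates quoted above (1 wound: 20%, 2 wounds: 34%, 5+ wounds: 74%).
    # Crude comparison: assumes the alternative to five-plus wounds is two.

    death_rate = {1: 0.20, 2: 0.34, 5: 0.74}

    extra_deaths_per_100 = 100 * (death_rate[5] - death_rate[2])
    print(f"~{extra_deaths_per_100:.0f} extra deaths per 100 victims "
          "from continuing to fire past the second wound")  # ~40

The marginal shots do a large share of the killing, which is the sense in which the later rounds, and not just the first, deserve separate scrutiny.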

The successful prosecution of even a small number of extremely excessive force police killings would reduce the predominant perception among both citizens and rank-and-file police officers that police have what amounts to immunity from criminal liability for killing citizens in the line of duty.

Prosecutors, however, rely on the police to do their job and in the long run won’t bite the hand that feeds them. Clear and cautious rules of engagement that establish bright lines would be more helpful. One problem is that police are protected because police brutality is common (somewhat similar to my analysis of riots).

The more killings a city experiences, the less likely it will be that a particular cop and a specific killing can lead to a charge and a conviction. In the worst of such settings, wrongful killings are not deviant officer behavior.

…clear and cautious rules of engagement will…make officers who ignore or misapply departmental standards look more blameworthy to police, to prosecutors, and to juries in the criminal process.

Police kill many more people in the United States than in other developed countries, even adjusting for crime rates (where the U.S. is less of an outlier than most people imagine). The obvious reason is that there are a lot of guns in the United States. As a result, the United States is not going to get its police killing rate down to Germany’s which is at least 40 times lower. Nevertheless:

[Police killings]…are a serious problem we can fix. Clear administrative restrictions on when police can shoot can eliminate 50 to 80 percent of killings by police without causing substantial risk to the lives of police officers or major changes in how police do their jobs. A thousand killings a year are not the unavoidable result of community conditions or of the nature of policing in the United States.

Thursday assorted links

1. Tips for slowing livestock growth due to plant closures.

2. “The Arizona Department of Health Services told a team of university experts working on COVID-19 modeling to ‘pause’ its work, an email from a department leader shows.”

3. Florian Schneider has passed away.

4. Source code for the Imperial College model.  And Sue Denim is very upset about the quality of that source code.  Another reader with a strong technical background wrote me equally critical remarks.  Are there further opinions on this?

5. Sujatha Gidla on her experience with Covid-19 (NYT), and here is my earlier CWT with her, one of my favorite episodes.

6. A new real-time journal COVID Economics.

7. Tankersley interviews Hassett and covers the brouhaha (NYT).

8. Effective Altruist forum ranks Fast Grants as one of their top two projects.

10. Jerry Seinfeld on success.

11. “A county in Washington State dealing with a coronavirus outbreak has identified a confounding new source of spread: ‘Covid-19 parties’ organized so that people can deliberately mingle with an infected person in the hope of getting their own illness out of the way.”  (NYT link)  I wonder what they play for the music.

12. How are the social sciences evolving?  Less rational choice, for one thing.

13. Why are meatpacking plants hit so hard?  Holds true for numerous countries — is it the deliberate circulation of cool air?

14. Emily Oster and Galit Alter have a new Covid public health information site.

Thursday assorted links

1. Why is the Eastern European response better? (WSJ)  And how are Swedish hospitals doing?

2. Further doubts on the LA and Santa Clara serology stories; it now seems they really do not establish any particular results.

3. Why a vaccine will be tough, a depressing thread.  And are China’s early patients shedding coronavirus?

4. Ezra Klein on why we can’t build things.

5. Why two decades of pandemic planning failed.

6. New Joshua Gans book, Economics in the Age of Covid-19, MIT Press.

7. Hollis Robbins on why some saw this coming before others.

8. New Becker-Friedman Center podcast on Pandemic Economics.

9. Various forms of presenting state-level data.  What exactly is going on with Ohio?

10. Are we prepping for vaccine state capacity?

11. Department of Why Not?: “Former Labradoodle breeder tapped to lead U.S. pandemic task force.”

12. The couple that meets on the German-Danish border during lockdown (NYT).

13. The Fed and saving cities (NYT).

Monday assorted links

1. Looming condom shortage?

2. Kotlikoff argues for group testing.

3. “The Trump administration is leaving untapped reinforcements and supplies from the U.S. Department of Veterans Affairs, even as many hospitals are struggling with a crush of coronavirus patients.”

4. Why America took so long to test, and yes the FDA is largely to blame (NYT).

5. “Those shown to have developed immunity could be given a ‘kind of vaccination passport that allows them, for example, to be exempted from curbs on their activities’, Gérard Krause, a leading immunologist co-ordinating the study, told Der Spiegel magazine.”  (The Times)

6. Why Singaporean health care workers have remained relatively safe.

7. This Week in Virology podcast.  I have not heard it, but it comes recommended.

8. The Gottlieb/Rivers/McClellan/Silvis/Watson AEI policy paper.

9. Audrey Moore RIP, Fairfax County environmentalist, she influenced my life a great deal, both good and bad.  Fairfax County now has 427 parks, in part because of her.

10. Robin Hanson argues for variolation.

11. The gender gap in housing returns.  (Do women care more about the non-pecuniary factors?)

12. The Ebola scare helped Republicans.

13. New SEIR infectious disease model from NBER.  And a new James Stock paper with a model.

14. Summary of where John Cochrane is at.

15. MIT The Elevate Prizes, up to $5 million.

16. Viral load as a source of heterogeneity?

17. James Altucher interviews me about the coronavirus economy, podcast.

Artificial Intelligence Applied to Education

In Why Online Education Works I wrote:

The future of online education is adaptive assessment, not for testing, but for learning. Incorrect answers are not random but betray specific assumptions and patterns of thought. Analysis of answers, therefore, can be used to guide students to exactly that lecture that needs to be reviewed and understood to achieve mastery of the material. Computer-adaptive testing will thus become computer-adaptive learning.

Computer-adaptive learning will be as if every student has their own professor on demand—much more personalized than one professor teaching 500 students or even 50 students. In his novel The Diamond Age, science fiction author Neal Stephenson describes a Young Lady’s Illustrated Primer, an interactive book that can answer a learner’s questions with specific information and also teach young children with allegories tuned to the child’s environment and experience. In short, something like an iPad combining Siri, Watson, and the gaming technology behind an online world like Skyrim. Surprisingly, the computer will make learning less standardized and robotic.
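As a minimal sketch of what such adaptive routing might look like (the question, distractors, and lecture names below are hypothetical):

    # Minimal sketch of computer-adaptive learning routing.  The question,
    # distractors, and lecture names are hypothetical; the point is that a
    # wrong answer is a diagnosis, not just a miss.

    QUESTION = "A price ceiling below the market price causes:"

    # Each distractor is tagged with the misconception it reveals and the
    # material that addresses it; the correct answer carries no remediation.
    CHOICES = {
        "A": ("a shortage", None),  # correct
        "B": ("a surplus", "review: supply and demand, lecture 4"),
        "C": ("no change", "review: binding vs. non-binding controls, lecture 5"),
        "D": ("higher-quality goods", "review: non-price rationing, lecture 6"),
    }

    def route(answer: str) -> str:
        text, remediation = CHOICES[answer]
        if remediation is None:
            return "Correct: advance to the next topic."
        return f"Answering '{text}' suggests a specific confusion. {remediation}"

    print(route("B"))  # routes the student to the supply and demand lecture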

In other words, the adaptive textbook will read you as you read it. The NYTimes has a good piece discussing recent advances in this area, including Bakpax, which reads student handwriting and grades answers. Furthermore:

Today, learning algorithms uncover patterns in large pools of data about how students have performed on material in the past and optimize teaching strategies accordingly. They adapt to the student’s performance as the student interacts with the system.

Studies show that these systems can raise student performance well beyond the level of conventional classes and even beyond the level achieved by students who receive instruction from human tutors. A.I. tutors perform better, in part, because a computer is more patient and often more insightful.

…Still more transformational applications are being developed that could revolutionize education altogether. Acuitus, a Silicon Valley start-up, has drawn on lessons learned over the past 50 years in education — cognitive psychology, social psychology, computer science, linguistics and artificial intelligence — to create a digital tutor that it claims can train experts in months rather than years.

Acuitus’s system was originally funded by the Defense Department’s Defense Advanced Research Projects Agency for training Navy information technology specialists. John Newkirk, the company’s co-founder and chief executive, said Acuitus focused on teaching concepts and understanding.

The company has taught nearly 1,000 students with its course on information technology and is in the prototype stage for a system that will teach algebra. Dr. Newkirk said the underlying A.I. technology was content-agnostic and could be used to teach the full range of STEM subjects.

Dr. Newkirk likens A.I.-powered education today to the Wright brothers’ early exhibition flights — proof that it can be done, but far from what it will be a decade or two from now.

See also my piece with Tyler, The Industrial Organization of Online Education, and, of course, check out our textbook Modern Principles of Economics, which isn’t using AI yet, but whose course management system combines excellent videos with flexible computerized assessment and grading.

Inequality, Stereotypes and Black Public Opinion: The Role of Distancing

There is less support for redistribution and race-targeted aid among blacks in the U.S. today than in the 1970s, despite persistent and enduring racial and economic disparities. Why? I argue that anti-black stereotypes suggesting blacks are lazy and reliant on government assistance have not only had consequences for political attitudes of whites but blacks as well. I note that as stigmas persist, they can have durable effects on the groups they directly stigmatize. To combat being personally stereotyped, some members of stigmatized groups will practice “defensive othering,” where one accepts a negative stereotype of one’s own group and simultaneously distances oneself from that stereotype. I illustrate the ways in which defensive othering plays a role in black attitudes toward redistribution using individual and aggregate level survey data, as well as qualitative interviews.

That is from a new paper by Emily M. Wager, via Matt Grossman.  And here are some of Emily’s other papers, many of them focused on why Americans do not feel compelled to respond to higher income inequality with bigger government.  Although still a graduate student, she is a future and indeed current star.  (She is on the job market by the way and also would be a great hire for economics departments.)  Here is her master’s thesis on who has enough influence to correct false perceptions from fake news.

Who favors unbreakable commercial encryption?

Governments may be the main threat to big tech companies’ current approach to encryption, but there is another, more surprising threat: their own business interests. The techno-libertarians’ absolutist rejection of lawful access has never been tenable in a commercial context. Barr lambasted Silicon Valley for claiming that government access to consumer devices was never acceptable, even for a purpose as critical as stopping terror attacks, while insisting that its companies had to have access to all their customers’ devices for the purpose of sending them security updates (and, in Apple’s case, promotional copies of unwanted U2 albums). What’s more, Big Tech’s best customers—that is, businesses—don’t want unbreakable end-to-end communications direct to the end user. That encrypted pipe makes it impossible to find and stop malware as it comes in and stolen intellectual property as it goes out. It also thwarts a host of regulatory compliance mandates. So, pace the absolutists, tech companies have found ways to ensure that their business customers can compromise end-to-end security.

And there is this:

…I believe the tech companies are slowly losing the battle over encryption. They’ve been able to bottle up legislation in the United States, where the tech lobby represents a domestic industry producing millions of jobs and trillions in personal wealth. But they have not been strong enough to stop the Justice Department from campaigning for lawful access. And now the department is unabashedly encouraging other countries to keep circling the tech industry, biting off more and more in the form of law enforcement mandates. That’s a lot easier in countries where Silicon Valley is seen as an alien and often hostile force, casually destroying domestic industries and mores.

The Justice Department has learned from its time on the receiving end of such an indirect approach to tech regulation. It has struggled for 30 years against a European campaign to use privacy regulation to prevent tech companies from giving the U.S. government easy access to personal data. But as the tide of opinion turned against U.S. tech companies around the world, the EU was able to impose billions in fines on them in the name of privacy. Soon it really didn’t matter that these companies’ data practices weren’t regulated at home. They had to comply with Europe’s General Data Protection Regulation. And once they accepted that, their will to lobby against similar legislation in the United States was broken. That’s why California—and perhaps the federal government—is inching closer to enacting a privacy law that resembles Europe’s.

Here is the full Stewart Baker post, interesting throughout.