I am indebted to Bryan Caplan for developing and popularizing the idea of “firing aversion.” The core notion is that employers often do not wish to fire people for one of the same reasons they do not wish to cut nominal wages — it can demoralize their broader workforce. Furthermore, some bosses simply may feel squeamish about the idea of firing people they know and like.
In the old days, bosses might have enjoyed “busting heads” to keep all the workers in line, but in this softer millennial age firing aversion is the order of the day, if only to ease future recruitment and boost intangible capital and institutional continuity.
Now imagine a macroeconomy where firing aversion is present. At time period zero, a boss hires one hundred workers, who at the time are perceived as being of roughly equal quality and thus are offered the same wage. After a few years on the job, however, some are “keepers,” while others are being paid more than their marginal products.
Because of firing aversion, they are not fired. Because of sticky nominal wages, they also do not take a pay cut. If the economy is imperfectly competitive, and times are good, this nonetheless can be a stable equilibrium.
Now let’s say a negative shock comes along: demand, supply, maybe a bit of both, as is usually the case. At some margin the overpaid workers can no longer be carried; the boss’s firing aversion is overcome, and they lose their jobs. Then, a few points:
1. They’re not getting those jobs back.
2. They’re not worth a comparable wage elsewhere in many cases.
3. Per-hour productivity likely will rise, even adjusting for ex ante measures of changes in worker composition.
4. Companies won’t want to pay higher wages to lure these workers out of leisure; rather, they are branded as less productive than average, and properly so.
5. These workers will have to lower their wage expectations for the next job by an above-average amount. That is one reason why their reemployment may be slow. And they won’t re-enter the labor market at anything like their old wages.
6. As the economy returns to full employment, you won’t observe rising average wages in the traditional pro-cyclical sense, because these re-entering workers are pulling in relatively low wages.
7. The more that Charles Murray is right in his Coming Apart, the stronger some of these effects will be. Yet none of it requires a “sudden attack of laziness.”
8. The more the employer can tell apart the quality of different workers, the slower the recovery will be and the less pro-cyclical wages will be. Arguably we have been seeing this difference at work since the G.H.W. Bush recession.
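The composition effects behind points 3 and 6 can be sketched in a toy simulation. All numbers here are hypothetical (the common wage, the spread of productivities, the size of the shock are my assumptions, not from the post): 100 workers are hired at a common wage, their true productivities are revealed over time, and a shock forces out anyone paid more than their marginal product. Measured per-hour productivity rises among the survivors, while the fired workers re-enter only at wages matching their lower productivity, holding down the economy-wide average wage even at renewed full employment.

```python
import random

random.seed(0)

COMMON_WAGE = 20.0  # hypothetical wage all 100 hires receive at time zero

# Productivities revealed after a few years on the job
# (a uniform spread is an assumption for illustration)
productivity = [random.uniform(12.0, 28.0) for _ in range(100)]

def mean(xs):
    return sum(xs) / len(xs)

# Good times: firing aversion carries everyone at the common wage
avg_prod_before = mean(productivity)

# Negative shock: workers paid more than their marginal product
# can no longer be carried
keepers = [p for p in productivity if p >= COMMON_WAGE]
fired = [p for p in productivity if p < COMMON_WAGE]

# Point 3: measured per-hour productivity rises after the shakeout
avg_prod_after = mean(keepers)

# Points 5-6: fired workers re-enter only at wages matching their
# (lower) revealed productivity, so the average wage stays below its
# old level even once everyone is employed again
wages_after_recovery = [COMMON_WAGE] * len(keepers) + list(fired)
avg_wage_recovery = mean(wages_after_recovery)

print(f"avg productivity before shock:  {avg_prod_before:.1f}")
print(f"avg productivity after firings: {avg_prod_after:.1f}")
print(f"avg wage at full re-employment: {avg_wage_recovery:.1f} "
      f"(vs {COMMON_WAGE:.1f} before)")
```

No sudden attack of laziness anywhere in the code: the same workers with the same productivities simply get re-priced once the boss is forced to look.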
OK, now maybe you don’t buy firing aversion, fair enough. Just sub back in the traditional assumption that bosses study and scrutinize worker quality more in tough times, when revenue is tight, and you get to essentially the same place and the same conclusions as listed above. Firing aversion is simply one way of stylizing a pretty simple incentive effect, namely that the weaker workers have a better chance of keeping their jobs when cash is flush.
Addendum: How many blog posts have I read asserting “Since wages are not rising, etc., therefore various conclusions including lack of full employment, etc.”? Hundreds, I believe, mostly from the Keynesian bloggers. But in the data, real wages were never very cyclical in the first place. And in theory we should not expect much if any real wage cyclicality either. Most of all, the more employers can measure worker quality, the less cyclical real wages will be. And yes, I know real wages just rose a lot (see the next blog post to come); if anything, that is a sign the recovery has been here for some time, not that it is finally arriving.