Minimum wage hikes and real net wages

Richard McKenzie reports:

…past experience has confirmed the nonmonetary impact of a minimum-wage hike on workers, not only in reduced fringe benefits but in increased work demands and decreased job training. For example:

  • When the minimum wage was increased in 1967, economist Masanori Hashimoto found that workers gained 32 cents in money income but lost 41 cents per hour in training — a net loss of 9 cents an hour in full-income compensation.
  • Similarly, Linda Leighton and Jacob Mincer in one study, and Belton Fleisher in another, concluded that increases in the minimum wage reduce on-the-job training and, as a result, dampen long-run growth in the real incomes of covered workers.
  • Additionally, North Carolina State University economist Walter Wessels determined that a wage increase caused New York retailers to increase work demands. In most stores, fewer workers were given fewer hours to do the same work as before.
  • More recently, Mindy Marks found that the $0.90 per hour increase in the federal minimum-wage rate in 1990 reduced the probability of workers receiving employer-provided health insurance from 66.2 percent to 63.1 percent, and increased the likelihood that covered workers would be reduced to part-time work by 26 percent.

Wessels also found that for every 10 percent increase in the minimum wage, workers lose 2 percent of nonmonetary compensation per hour. Extrapolating from Wessels’ estimates, an increase in the federal minimum wage from $7.25 to only $9.00 an hour would make covered workers worse off by 35 cents an hour.
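The 35-cent figure can be reproduced with quick arithmetic. A sketch, assuming (my reading, not stated explicitly in the excerpt) that "2 percent of nonmonetary compensation" is taken as 2 percent of the old hourly wage per 10 percent minimum-wage increase:

```python
# Back-of-the-envelope check of the Wessels extrapolation quoted above.
# Assumption: the 2%-per-10%-hike loss is measured against the old wage.
old_wage, new_wage = 7.25, 9.00
pct_hike = (new_wage - old_wage) / old_wage    # ~24.1% increase
loss_share = 0.02 * (pct_hike / 0.10)          # ~4.83% of the old wage
loss_per_hour = loss_share * old_wage
print(f"${loss_per_hour:.2f}")                 # -> $0.35, matching the post
```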

And if the minimum wage were raised to $10.10 an hour, for example, the estimated 16.5 million workers earning between $7.25 and $10.10 could lose nonmonetary compensation more valuable than the $31 billion in additional wages they are expected to receive.

I would be skeptical or agnostic about some of those particular estimates, but surely the general point holds, and is hardly ever mentioned by advocates of hiking the minimum wage.


The reduction in worker training by employers is ironic, as the return from better-trained workers would be to the employers' benefit.

Shhh - CEOs are superstars, and they know how to run a company better than anyone else could ever possibly imagine.

Anyone familiar with the acronym CFIT? It applies perfectly.

That really depends. If it's a general skill that could be ported to a new employer, the returns would largely go to the employee in the form of higher pay (that they are already getting because of the minimum wage hike). If forced to pay for that skill regardless, employers will tend to be more demanding in their hiring.

An employer-specific skill would benefit the employer.

Also, these reductions were noted in 1967 and 1981. I'd think modern minimum wage recipients don't have much "training compensation" to lose. Wessels makes the most coherent theoretical point, but I think the liberal argument would be that there's little non-monetary compensation left to cut and thus an increase would help the recipients.

Of course, who will end up paying for the increase? The conservative response is that the costs will be passed on by corporations to customers, who won't all be rich.

These low wage jobs are also characterized by turnover. If you are smart, you move on to something better; if you aren't, you can't keep the job. Additional training doesn't gain the employer a whole lot, but the employee gains substantially, since the training stays with them and improves their prospects in the future. It isn't the specific skills but the confidence and learning how to learn that are valuable.

Jeez, why didn't the employers think of that! They should be taking business advice from blog comments rather than their decades of experience on what actually brings money into the business.

I take most of these with a grain of salt.

I'd be interested to know how much paid training minimum wage workers receive these days and how one estimates its value. Minimum wage jobs are pretty typically very low skill, and I'd be surprised if there is much training in transferable skills that directly leads to increased wages.

If employers are rational actors, wouldn't they already be employing the fewest number of people to do the most work as possible? Companies already cut a lot of the fat in the last recession. Is there much more to cut?

Employer-sponsored insurance has been going up, if you believe the latest RAND survey. Employer mandate aside, many people think it is less than ideal to link insurance coverage to employment at a particular place. Job lock, etc.

"If employers are rational actors, wouldn’t they already be employing the fewest number of people to do the most work as possible?"

Of course employers want employees to be as productive as possible. But they also decide how much work to do, based on the costs and benefits. Raising the wages of employees changes all of these cost-benefit calculations. Suppose a fast food restaurant will bring in an additional $X per day by having an extra cashier, increasing the number of customers that can be served. It is worth hiring a new employee only if the cost of the employee is less than that amount. If the cost of labor goes up, it may be more beneficial to serve fewer people at lower cost.
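The cost-benefit flip described here can be made concrete with illustrative numbers. A minimal sketch; the daily revenue and hours figures are made up for the example, not from the post:

```python
# Hypothetical marginal-hire decision for the extra cashier described above.
extra_revenue_per_day = 65.00   # the "$X" in the comment (illustrative)
hours_per_day = 8

def worth_hiring(wage):
    """Hire only if the day's extra revenue exceeds the day's labor cost."""
    return extra_revenue_per_day > hours_per_day * wage

print(worth_hiring(7.25))   # True:  $58/day cost < $65/day extra revenue
print(worth_hiring(9.00))   # False: $72/day cost > $65/day extra revenue
```

The same employee who was worth hiring at the old wage is no longer worth hiring at the new one, even though nothing about the job changed.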

An extension of this is that some businesses are no longer viable when wages reach a certain point, so the optimal amount of work to do becomes 0.

I can understand your argument, but I don't think it aligns with the specific findings of the NC study: "Wessels determined that a wage increase caused New York retailers to increase work demands. In most stores, fewer workers were given fewer hours to do the same work as before."

This sounds to me like the employer says, "Bill, you used to fold 25 tacos an hour, and that was fine, but now you must fold 35." They aren't saying that employers cut back their output in response to the wage hike.

I am skeptical, too, that there is much capacity to just increase workload to make up for having fewer workers.

It's not really an either/or situation, though, as the two effects are closely related. In the example I gave above, if a fast food restaurant cuts 20% of its cashiers, it might serve 10% fewer customers. The remaining cashiers would be doing more work. One can imagine that the extra cashiers really only increased capacity at peak meal times, when everyone was working non-stop. The rest of the day, there are now fewer workers to handle the same amount of business. This is not just a contrived example; in most service industry jobs, when you add more employees, you do more business, but it is a declining return the more employees you add, and wages dictate how many employees it is worth having.

In my opinion, this is a big part of what is going on, and increased work demands are just the side of it that the employee experiences.

Another effect, though, is that if the minimum wage makes jobs scarcer, the employer has more leverage and can successfully demand more out of employees.

I don't know how many minimum-wage jobs you have worked, but it would seem that our views of these types of workplaces are a bit different. First, at most minimum-wage jobs, capital isn't at the location beating the drums and motivating the employees. There is a series of intermediaries between capital and labor, each subsequent layer identifying less with capital and more with labor. No one (that I've met) works at 100% effort for the entire workday. Efficiency gains can be muscled out of employees, but this is unpleasant business for the direct supervisors. It is entirely possible that new and higher productivity standards issued by capital can be squeezed out of labor if given adequate urgency.

For instance, you are posting during what are *likely* your work hours, just like many of the commenters here. Evidently, more direct supervision could steer your (and/or others') efforts from internet blogs to company business. At current price levels, however, your employers have decided not to employ retired drill sergeants.

It can be true in individual cases. But can the entire system just "get better," so that productivity stays the same? I am skeptical. Overall, I think this leads to a new equilibrium where output per worker is higher but total output has gone down.

41 cents an hour in training? What could he possibly mean?

That back in 1967, just before employers were forced to pay a federal minimum wage of one dollar an hour, American corporations had also been paying more than half of their workers' previous wage of 68 cents in training costs.

At least by one reasonable interpretation of 'workers gained 32 cents in money income but lost 41 cents per hour in training — a net loss of 9 cents an hour in full-income compensation.'

Well, reasonable in the sense of trying to illustrate the utter absurdity of what was written without apparently even a hint of satire.

The US minimum wage was first introduced by FDR in 1938.

It was $0.25 per hour.

It clashes with the idea that higher wages = lower churn = higher investment in each employee. When you have to pay somebody more it seems odd to reduce productivity-enhancing training at the same time.

Then again at the extreme, we see in the NFL the shift to highly compensated top rookies and a perceived need for them to begin contributing immediately--even when, especially for QBs, there's some evidence that cutting out their "training period" behind a veteran has been bad for a lot of high-profile rookies. As opposed to Aaron Rodgers and Tom Brady, who had time to train in their offensive systems before starting. Just throwing that out there.

'I would be skeptical or agnostic about some of those particular estimates'

Obviously - as a crime fighter par excellence, nothing was said about how raising the minimum wage increases crime, which you are apparently neither skeptical nor agnostic about.

"workers lose 2 percent of nonmonetary compensation per hour"

What non-monetary compensation does a fry cook receive? It's not like they get retirement benefits or a paid gym membership. If all you get from an employer is a paycheck, I'm trying to figure out what they would take away.

The last bullet point: "More recently, Mindy Marks found that the $0.90 per hour increase in the federal minimum-wage rate in 1990 reduced the probability of workers receiving employer-provided health insurance from 66.2 percent to 63.1 percent, and increased the likelihood that covered workers would be reduced to part-time work by 26 percent."

Fry cooks don't get health care coverage, even if they are working full time. No minimum wage fast food worker is getting health care coverage. Maybe in manufacturing, but the idea that your Subway sandwich artist is getting health care coverage is laughable. It's hard for me to believe that even 63% of minimum wage workers are receiving health care coverage. It's not even 5% in the food service industry (which represents a pretty large share of minimum wage workers). Also, it makes no difference if an employee is full time or part time in fast food, other than the difference in hours worked. It's not like there are any full-time employee benefits or retirement accounts or anything.

I think everyone can agree that with a minimum wage increase you get some combination of fewer hours worked (as owners try to cut labor costs back to their previous share of gross output) and decreased profits going to the owners of the capital (as there simply isn't that much fat to cut). If the establishment is already operating pretty efficiently, there is little room to cut back hours (without sacrificing quality in some way, be it service speed or store cleanliness), and what's lost is all profit accruing to the owners. A larger percentage of the gross goes to labor costs, end of story, and there is nothing the owners can do about it. So they all hate it; why wouldn't they?

However, from the workers' perspective, even if hours are cut back by 10%, employees are better off. Assuming that employees are making 10 bucks per hour instead of 7 bucks per hour, they come out ahead by 80 bucks per week even while working fewer hours (starting from a 40-hour week). Not too shabby. Since they weren't receiving health care coverage or any other benefits in the first place, who cares?
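That $80 figure checks out under the comment's own assumptions (a 40-hour week, a 10% hours cut, and the $7 and $10 wages it posits):

```python
# Checking the comment's arithmetic: 10% fewer hours at $10/hr
# versus a full 40-hour week at $7/hr.
before = 40 * 7.00           # $280/week at the old wage
after = 40 * 0.90 * 10.00    # $360/week at the new wage, 10% fewer hours
print(after - before)        # -> 80.0, the "$80 per week" in the comment
```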

All in all, in fast food, it's most likely that workers will be better off and the owners of the capital will be worse off. The idea that the reduction in training benefits is so large as to make the overall gains negative is laughable. What kind of training are they assuming? Learning to make a sandwich or a milkshake doesn't require all that much training. A couple of hours, maybe.

For all of this text you literally add nothing. I'll summarize:

1. You don't care what any source says, you will refuse to believe that any fast food workers receive health insurance.
2. You prefer a larger share of gross expenditures going to labor, without qualification (as in absolute size of labor expenditures).
3. You presume that minimum wage increases are always beneficial to workers because of your simplistic model.
4. You laugh off anything that you don't agree with.

You could have just said "NOPE!!!!!11!" and spared us all the text that you clearly should be dedicating to your dissertation in oppression studies and/or sociology.

Subway sandwich artists are offered health insurance, dental and vision coverage, paid vacation and 401k plans. The same goes for other fast food restaurants and retailers. Such benefits were available to me as a near-minimum-wage worker at Staples.

It is true that many workers don't receive these benefits, because they tend to be part-time, students, not the primary earner in their household, etc. But they are available. You simply have no idea what you are talking about.

To the extent that the decline in training is true, raising the minimum wage might increase income inequality over time. I can't figure out if this would be an unintended consequence.

From my own time as an actual minimum-wage-drawing employee in the fast food (Taco Bell) and retail (Foley's) sectors, call me...skeptical. I received very, very little training for either of these jobs, in part because they had clearly been modified over the years to require as little training as possible. My equipment in both places, for example, had been carefully designed for simplicity of operation with minimal instruction.

This study may hold true for the historical study period, but I'm unconvinced that it tells us much about the present.

This should be a disclaimer on this website, personal anecdotes don't get you very far and offer little evidence.

The most remarkable claim is that hikes are literally free of cost.

If you have 20 employees working for minimum wage, and the minimum wage is increased 10%, what happens is that the 20 people get 10% fewer hours, or 10% less in other benefits of some kind, keeping employment costs the same.

That matches what I've seen. Send half the crew home 1/2 hour earlier after the lunch rush.

Given that there are much better anti-poverty programs and that the evidence on the effect of the minimum wage is mixed at best (and that's being generous), why does anyone support it again?

In most aspects, symbolism and signalling. The EITC is too abstract to hang an expansion of social welfare on it and either drawing attention to it or associating it too closely with the poor jeopardizes its existence.

And of course a UBI is anathema because welfare queens.

If most low-wage employee training is for new employees, and if minimum wage laws result in less employee turnover, then training expenditures would decrease.

Yeah, this seems like a problem. Out of all the claims made, I am most skeptical of the training ones.

In 1967 the minimum wage rose from $1.25 to $1.40.

$0.41 would be one-third of the $1.25 wage. I find it impossible to believe that firms were giving training worth one-third of minimum wage employees' salary.

Exactly what kind of return did the employers expect to get from that investment?

At least provide some documentation on how that $0.41 estimate was derived.
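The commenter's one-third sanity check is simple division on the figures quoted in the post:

```python
# Hashimoto's claimed training loss against the pre-1967 minimum wage.
training_loss = 0.41     # claimed training value lost per hour
wage_1967 = 1.25         # federal minimum wage before the 1967 increase
print(round(training_loss / wage_1967, 3))   # -> 0.328, about one-third
```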

Very good post. Of course, this is just common sense. Like you, I'm unsure if the reported magnitude is accurate, but the unwillingness of the left to even consider that this is how the big mean ole world works is interesting.
