The Interpersonal Sunk-Cost Effect

Christopher Olivola
Psychological Science, forthcoming


The sunk-cost fallacy — pursuing an inferior alternative merely because we have previously invested significant, but nonrecoverable, resources in it — represents a striking violation of rational decision making. Whereas theoretical accounts and empirical examinations of the sunk-cost effect have generally been based on the assumption that it is a purely intrapersonal phenomenon (i.e., solely driven by one’s own past investments), the present research demonstrates that it is also an interpersonal effect (i.e., people will alter their choices in response to other people’s past investments). Across eight experiments (N = 6,076) covering diverse scenarios, I documented sunk-cost effects when the costs are borne by someone other than the decision maker. Moreover, the interpersonal sunk-cost effect is not moderated by social closeness or whether other people observe their sunk costs being “honored.” These findings uncover a previously undocumented bias, reveal that the sunk-cost effect is a much broader phenomenon than previously thought, and pose interesting challenges for existing accounts of this fascinating human tendency.

Via the excellent Kevin Lewis.


I always wonder whether some portion of the sunk cost fallacy also relates to uncertainty.

For example, suppose I have a machine in my business that I paid $10K for, and it returns $1K per month. A salesman comes in and says, "I have a new machine that will cost you $20K, and it will return $3K per month." My payback on that new machine is 10 months (an increase of $2K per month), but maybe I prefer to keep my old machine.

So maybe someone would say "sunk cost fallacy - you should have junked your old machine and taken the new one."

And a cynic might say that a salesman's promises aren't worth anything, and better the devil you know. The machine I have is definitely returning $1K per month; the new machine is promised to return $3K per month, but maybe it doesn't, and maybe when I try to install it I have to retrain my staff, and maybe there's a special job I do that the new machine wouldn't do as well, and maybe lots of other things. It feels to me like an innate conservatism or skepticism isn't unhealthy, and there's no particular reason that wouldn't be baked into our natural state - people who always jump to a new thing might on average not end up better off.
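The cynic's point can be made concrete with a quick expected-value comparison. This is a minimal sketch: the $10K/$20K/$1K/$3K figures come from the comment, but the 70% chance that the salesman's promise holds, the disappointing $1.5K/month fallback, and the 12-month horizon are illustrative assumptions.

```python
# Old vs. new machine: the old one returns a certain $1K/month; the new
# one costs $20K up front and is *promised* to return $3K/month.
# Assumed (not from the comment): a 70% chance the promise holds, a
# $1.5K/month fallback if it doesn't, and a 12-month decision horizon.

def old_machine_value(months):
    """Certain cumulative return of the machine already owned."""
    return 1_000 * months

def new_machine_expected_value(months, p_promise_holds=0.7):
    """Expected net return of the new machine over the horizon."""
    expected_monthly = p_promise_holds * 3_000 + (1 - p_promise_holds) * 1_500
    return expected_monthly * months - 20_000

horizon = 12
print(old_machine_value(horizon))                       # 12000
print(round(new_machine_expected_value(horizon)))       # 10600: old machine wins
print(round(new_machine_expected_value(horizon, 1.0)))  # 16000: certainty flips it
```

Under these assumptions the skeptic is right to keep the old machine over a one-year horizon; only if the salesman's promise were certain would switching dominate. So what looks like a sunk-cost bias can be an implicit discount on unverified claims.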

Unless the machine is less than a year old, people probably wouldn't apply the term "sunk cost" to such a situation. I have seen a building project start with an estimate of about $2,000,000, and after $250,000 was spent it became apparent that at least $4,000,000 more would have to be spent to finish the project. If the original estimate had been $4,000,000, the building probably wouldn't have been built, but the $250,000 sunk cost influenced the decision to continue.

But I believe a similar concept applies here. Any alternate use of the money may also involve similar cost overruns. In a situation of uncertainty, the project you have already invested a bunch of money in has an extra data point in its favor.

One alternative use is simply not borrowing the money for the project, and so not losing it.

Unless your original decision to build a $2m building included an assumption there'd probably be a cost overrun. So your actual decision might have been "We'll build for $2m but we know building estimates are always wrong, so it might be as much as $4m". And now you're far enough in to go "yup, it's $4m, but it's no $8m." I'm not saying there's no sunk cost fallacy, I'm just saying that perhaps some of what researchers categorise as sunk cost actually includes an element of unwinding of uncertainty.
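The building example reduces to a marginal comparison: continue only if the remaining cost to finish is less than the finished building's value, with the money already spent playing no role. A minimal sketch - the $250K and $4M figures come from the thread, but the two candidate building values are assumed for illustration:

```python
# Figures from the comment: $250K already spent (sunk), at least $4M more
# needed to finish. The finished building's value is an assumed parameter.

def should_continue(remaining_cost, finished_value):
    """Continue iff finishing is worth more than what it still costs.
    Note that the sunk amount does not appear in the decision at all."""
    return finished_value > remaining_cost

SUNK = 250_000          # already spent; irrelevant going forward
REMAINING = 4_000_000   # still required to finish

print(should_continue(REMAINING, 3_500_000))  # False: stop, despite "wasting" $250K
print(should_continue(REMAINING, 5_000_000))  # True: continue, for the same reason
                                              # you'd have started at a $4M estimate
```

The uncertainty point in the thread is that the realized $4M figure may itself carry information (all projects overrun), which changes `finished_value` or `remaining_cost` estimates for alternatives too - but it still enters through the marginal terms, not through the sunk $250K.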

I see the "sunk-cost fallacy" as just another example of social science being unable or unwilling to deal with the real-world complexity of human decision making. But then again, I am profoundly ignorant about what the experts in the area actually claim. What I DO know is that changing your mind is not solely intrapersonal - it involves social costs - nor is it only an issue about quantitative aspects of human behavior. I think few leaders can do what Trump does and change their mind many times a day. Followers typically want leaders who exhibit stability in their "vision". There is a 'cost' to changing your mind. It's not just about ego; it is also about respect and appearance. It would seem to me obvious that it is rationally mostly about interpersonal dynamics (although, as Kahneman's fast-and-slow paradigm suggests, there IS an intrapersonal cost when a well-established process is replaced with novelty, with its corollaries and side effects). The latter requires a lot more mental effort (to get to the same state of comfort).

...but is it rational to consider the cost of being rational when making decisions?

Excellent points. Experiments are designed to make the effect under study "clean." But in the real world, sunk costs are embedded in complex relationships where it is not always clearcut that they are pure sunk costs. Someone behaving in a narrowly rational way may be mistaken or else signal that they are not a normal human being but either a sociopath or an extreme nerd. And hence less reliable to outsiders. Many of the survival traits of humans -- the ability to love, to be loyal, to be empathetic, to cooperate, to be patriotic -- require an ability to suspend narrow rationality. Those without such traits or those who need to calculate cost & benefit to simulate those traits are not trustworthy.

People behaving contrary to our expectations = irrational.

Getting married. Having children.

People assess others by 'putting themselves in their shoes'.

So what is the novel finding of this 'paper'?

In other news [see next post] people are happier with more money.

And I used to think economics was the most useless social science.

On an interpersonal level, the biggest examples of sunk costs are building relationships.

I note conservatives use the sunk cost fallacy to argue liberals have destroyed America, society, people, by supporting divorce, abortion, and rejecting faith.

On the other hand, they dismiss long term employment relationships and argue workers are commodities, ignoring ultimately higher costs of constantly replacing workers.

Are you sure you're not arguing against a straw man? I don't know a lot of conservatives who argue against long term employment relationships, but I know plenty who argue against arrangements that force you into long term employment relationships - i.e., most conservative businesspeople I know are very keen on keeping good employees for the long term. They just also want the ability to get rid of bad employees, rather than some external party deciding that the bad employees should be long term as well.

It's clearly a strawman argument. He's just trying to shoehorn the current topic into his never-ending diatribe about how horrible conservatives are.

"Interpersonal Sunk-Cost Effect" is quite a fancy way to describe tradition.

Take one of Tabarrok's posts on agricultural best practices in China. Researchers try to convince farmers to change their traditional ways in order to improve productivity.

This situation can be framed as an interpersonal sunk-cost effect because people pursue "an inferior alternative merely because they have previously invested significant resources in it".

The correlation between traditional agriculture and low yields is perfect, but are sunk costs the cause?

In a global economy, there are more choices, and more choices with decidedly different costs and potential profits. When a company decides to abandon a plant and relocate to the South or to the Far East, it's not only the empty plant left behind but an empty community. To the company, it's the rational choice. But is it? This is a blog dedicated to the notion that disruption is the path to prosperity, and order and stability the path to stagnation. But economists don't include in their calculations the human costs of disruption - not only the lives directly affected, but the social cost to a community, a region, or an entire nation. I was shocked the first time I visited the industrial Midwest in the late 1980s and saw all those abandoned plants, many still filled with heavy machinery, the only remaining employee left with the task of keeping the machines oiled so they wouldn't rust, the ground and water in the area forever contaminated by industrial waste. The enormous scale of the costs of disruption to the community, the region, the nation, and the planet was ignored, except for the legal liability for the contamination. Sunk costs, indeed.

I agree with the other commenters that there are all sorts of reasons why other people caring about something in prior periods may make it rational to assume that doing so has a positive value going forward.

Yup, I was going to make a comment but the other commenters have already covered things well.

I'll add that many of those additional sorts of reasons are due to imperfect information, and thus can be viewed as a type of signaling or reading of signals. And all too often some of the commenters here -- and some of the GMU economists -- assume that signaling is inherently bad or inefficient.

But in a world with imperfect information the best that we can hope for is a signaling equilibrium that is more efficient than the feasible alternatives, even if it is less efficient than a mythical perfect information equilibrium.

I.e. to repeat what many of the commenters have said here, sometimes paying attention to sunk costs is the right thing to do.

I should add that learning about the Sunk Cost Fallacy is one of many valuable economic principles that students learn (or at least should learn) when they take an economics course. So I'm not claiming that it's okay to always keep looking at sunk costs.

Here is an unusual application of the sunk-cost fallacy, to what Facebook paid for WhatsApp (the founders of WhatsApp have left Facebook because Facebook (i.e., Zuckerberg) broke the promise not to advertise on WhatsApp in order to monetize the investment in WhatsApp):

Here's the pre-registered research design:

Would you spend that last $1M to finish a (presumably) much larger R&D project after you have reason to believe it's going to fail? And does it matter if you were the CEO responsible for initiating the project vs having taken over for someone else?

There are all kinds of potential considerations here other than just plain sunk costs. Was the previous CEO an adversary? Do you have the job now because of your opposition to this kind of R&D spending? If so, you might want to cancel the project if the proof of its deficiencies is completely clear. But if not, you might want to let it continue and fail on its own merits -- lest you be accused of killing your enemy's project out of spite and hurting the company. But maybe the previous CEO was your mentor and you fully supported the project, in which case your reputation is on the line almost as much as if you'd signed off on it in the first place. In that case, if you kill the project, are you going to be fired now? If so, you may as well roll the dice, keep funding the project, and hope something goes wrong with your competitor's plane -- like the de Havilland Comet.

In any case, the considerations seem much more complex than simply whether or not the subject understands the sunk cost fallacy.

You will also usually negatively impact team morale by cancelling a project like this so close to completion.

There are additional values to consider, including ones with real financial benefit to the company, that cannot be tracked simply by sales projections.

If the 'sunk cost bias' manifests even when the person biased was not involved in the original decision, that seems like extremely strong evidence for the alternative accounts of sunk-cost behavior like option value, value of information, coordination costs, and so on.

Curing humanity of the sunk-cost bias would likely spike the divorce rate to the high 90s.

The sunk cost fallacy, as it relates to preferences, is itself a fallacy. Preferences change with consumption; it isn't irrational to finish reading a book I disliked halfway through, since my preferences aren't the same as when I started reading it. I dislike leaving books unfinished. Unless there's an objective goal, such as a firm maximizing profits, the sunk cost fallacy doesn't apply.

The comments today seem to be a clear indication of just how many people can't even really say what the sunk cost fallacy is, much less come up with an example of it.

I'm curious how the discussion would be different if instead of that phrase, we just talked about "throwing good money after bad".
