The limits of good vs. evil thinking

Good vs. evil thinking causes us to lower the value we place on a person's opinion, or dismiss it altogether, if we find out that person has behaved badly.  We no longer wish to affiliate with such people, and furthermore we feel epistemically justified in dismissing them.

Sometimes this tendency will lead us to intellectual mistakes.

Take Climategate.  One response is: 1. "These people behaved dishonorably.  I will lower my trust in their opinions."

Another response, not entirely out of the ballpark, is: 2. "These people behaved dishonorably.  They must have thought this issue was really important, worth risking their scientific reputations for.  I will revise upward my estimate of the seriousness of the problem."

I am not saying that #2 is correct, I am only saying that #2 deserves more than p = 0.  Yet I have not seen anyone raise the possibility of #2.  It very much goes against the grain of good vs. evil thinking:  Who thinks in terms of "They are evil, therefore they are more likely to be right"?

(Which views or goals of yours would you behave dishonorably for?  Are they all your least correct views or least important goals?  With what probability?  Might it include the survival of your children?)

I do understand that this line of reasoning can be abused: "The Nazis went to a lot of trouble, etc."  The Bayesian point stands.

Another example of misleading good vs. evil thinking stems from the budget.  Many people believe:

3. "If the Republicans win, they will irresponsibly cut taxes and do nothing real to control spending."  You may have even seen this view in the blogosphere.

One response to this is 4. "We should ensure that the Republicans do not win and criticize them every chance possible."

An alternative response is 5. "Sooner or later the Republicans will in fact win and I cannot prevent that.  Right now the Democrats should spend less money, given the truth of #3.  In this regard the Republicans, although evil, are in fact correct in asking the Democrats to spend less money, if only to counterbalance their own depravity."

I do not see many people entertaining #5.  #5 implies that a group judged as dishonest should be granted some probability of speaking the truth on an important issue.  (Nor will pro-Republicans be attracted to a claim which portrays their group as dishonest.)  Note also that by accepting #5 you are admitting and partially accepting the ability of the Republicans to "out-game" the Democrats.  That makes #5 even harder to accept.

Again, I am not asking you to buy #2 and #5 outright.  I am simply suggesting they have a higher "p" than many people are willing to grant them.  And that is because we are accustomed to judging the truth of a claim by the moral status of the group making the claim.

Comments

Which views or goals of yours would you behave dishonorably for?

None, I should hope.

You?

2. "These people behaved dishonorably. They must have thought this issue was really important, worth risking their scientific reputations for.

I strongly suspect this is correct. Unfortunately, I am also well aware that once you get to that stage, it's almost impossible to be able to continue to look at the data objectively.

Thus, for me,

I will revise upward my estimate of the seriousness of the problem."

is not true.

I'm fairly sure their concern drove them to pad the data. The fear is that if lots of scientists made the same choice, basing it on the overwhelming evidence provided by colleagues who were also padding their data, we end up with a lot more uncertainty in our equation.

The pragmatics of getting elected make #5 unlikely. You do not get elected and rewarded for cutting spending. You get rewarded for cutting taxes. At some point the debt will need to be paid off, and cutting enough spending to do that will lose the following election.

Steve

2. "These people behaved dishonorably. They must have thought this issue was really important, worth risking their scientific reputations for.

But why did these scientists (and Nazis) think it was really important?

The way to stop seeing people "on the other side" as evil is to have friends "on the other side". My life is much richer because my friends don't agree with me about everything.

A question is: where do you stop with the whole evil thing? At what point is someone "good enough" or "not too evil" to associate with or treat with civility or take seriously? How big is that circle beyond your own circle of belief?

There is a great quote from William Safire in the current issue of "The Green Bag" (http://www.greenbag.org) that is somewhat germane:

"The magic moment of maturity in every voter's life comes when he or she accepts the fact that no candidate is right on every issue. At that moment, one becomes either a chooser or a loser....

I will not say here how I will vote, partly because it's against my newspaper's rules, but mainly because millions of undecideds would follow like lemmings and make me responsible for the result and unable to creatively backbite or pettily bicker."

-- William Safire, "The Clothespin Vote," The New York Times, Nov. 6, 2000

There are real reasons why Climategate is special. I don't know if p>0 for #2 or not, but the scientists shouldn't be personally interested. They are, of course, because funding is zero-sum. This is no different from other areas of science, except that they are promoting an agenda beyond obtaining funding. Science shouldn't be politicized, but it is. But that doesn't change the fact that it shouldn't be.

All scientists think their area is under-respected (else they'd pursue another area?) and overly competitive so they advocate for their angle. However, the politics is based entirely on their reputation as being disinterested men of science. Consensus and conspiracy are separated by this thin line. Peer review is always vulnerable to corrupt gate-keeping, it should be less so in science. I have no doubt that people always think they are well meaning. My opinions haven't changed, they've just been reinforced. But, my opinions have always focused on the system and not the individuals.

Although your point that many people are irresponsibly viewing Climategate in good vs. evil terms is certainly accurate, such a viewpoint is not necessary in order to condemn the controversial statements that sparked Climategate or to believe that the CRU scientists have seriously damaged their credibility.
As I don't know any of the implicated scientists I can hardly judge their moral character. However, I can say that as scientists they made serious mistakes. Scientists aren't allowed to make choices that "risk their reputation" simply because they think some issue is really important. As scientists, their reputation, based upon a set of principles, is all they have. So when it is shown that some scientists have broken those principles by manipulating data-sets, destroying data-sets, and stifling the publication of new data-sets, their reputations, again as scientists, should fall.

This is not a moral position, just a scientific one. Personally I assume that these scientists are good moral people who believe strongly that climate change is occurring and are trying their best to mobilize people to think as they do. Unfortunately for them, that is the job of a politician, not a scientist. I can assume that they were well-intentioned and at the same time contend that their actions and words have seriously damaged the credibility of their message.

"These people behaved dishonorably. They must have thought this issue was really important, worth risking their scientific reputations for. I will revise upward my estimate of the seriousness of the problem."

Are you kidding me? These people have done more than risked their scientific reputation. They have caused many, including me, to doubt all of the "science" about global warming. How can that simple result ever be excused? Who cares whether they thought that lying for their cause was worth the risk to their scientific reputation? They have tarnished the work of honest scientists.

This is only true if we ignore the specific nature of the actions in question, simply categorizing them as dishonorable and reassessing our level of trust based on that.

Taking candy from small children or cheating at golf are also dishonorable acts. If we had found out that the climate researchers were engaging in such acts we could justifiably be shocked. However, Tyler's logic regarding the trustworthiness of their climate research would hold, since these acts have nothing to do with the climate change debate.

But this is clearly not the case here; actively colluding to withhold or distort climate data and suppress opposing views or conflicting evidence is not just dishonorable, it bears directly on the trustworthiness of the supposed scientific consensus.

In fact, whether or not the conduct revealed in the leaked emails is honorable or not isn't particularly relevant at all. Even if we grant for the sake of argument that these were perfectly honorable acts, by their very nature (and the stated intention of the parties in question) they have almost certainly had a biased effect on the public understanding of the science.

It is on this basis that the leaks should cause us to re-evaluate our trust downward.

I'm a bit confused, Mr. Cowen. Are you saying you believe the probability that someone will think, "These people behaved dishonorably. They must have thought this issue was really important, worth risking their scientific reputations for." is greater than zero? Or, are you saying that the opinion has a probability of being true that is greater than zero. Are you saying both?

The fact that they were unable to actually make their case scientifically is the reason they had to try to do it unscientifically and attempt to present it as something other than what it was.

The problem is that what you need to make your case scientifically is *not* what you need to make your case politically.

To use an analogy. In 1950, it was pretty much scientifically determined that cigarettes caused lung cancer. It took another *14* years before the Surgeon General (i.e. the Government) could make the same declaration due to cultural factors and more importantly, the efforts of those who would lose financially.

One could argue that the scientists chose to preserve their reputation and the scientific method by sacrificing a few hundred thousand American lives.

Now you're looking at 100 million lives (mostly poor) and the data doesn't make (and never will make) an "open and shut" case. How long do you wait? 20 years? 30 years?

Are those lives really worth that much less than your reputation?

(I still remember one posting: "We only have to delay this until most of them (those flooded out) are dead." I think the poster was joking, but I bet he reflected the fears of the AGW scientists)

So the greater the belief, the more likely the believers are to be right? Then God must exist, and I've been on the wrong team. Ohhhhh crap.

Interesting topic!

I would say instead of, or in addition to, #5, a #6 is applicable:

Republicans will win again on a rhetoric of cutting spending, there is nothing I can do about that. However, once in office they will cut taxes and redistribute spending to Republican causes and make the country's fiscal situation even worse. This in turn will bring Democrats back into office, willing to offset spending with tax increases. And the cycle begins anew.

Soren Kierkegaard addresses this issue in "Fear and Trembling", particularly in the chapter: The Teleological Suspension of the Ethical.

SK makes the point that the entirety of this dynamic occurs internally in the individual and is effectively inexpressible. In other words, one can witness another's actions (Abraham going to sacrifice Isaac) but cannot thereby deduce motives or intentions or honorability. Morality is by faith, and faith without action is dead, but action does not imply morality to the external viewer.

Climategate is not a good example because all we see is scientists "behaving dishonorably" for dishonorable reasons.

I'm more willing to entertain these notions if they are coming from groups that historically bargain in good faith (i.e. usually avoid "ends justifies the means" type of strategy)

It is a slippery slope, but the scientists get a pass in my book (and so, by the way, does the Freakonomics team).

Republicans, however, do not. I've heard too many lies, witnessed too many hypocrisies, met too many blind faithful to countenance this type of appeasement.

You forgot point number 6: As a professor, I just can't accept the fact that other professors lied through their teeth in an effort to impose their religion on the world, so I'll try to justify their behavior by saying they must have thought the problem was really, really serious, in a blog post ostensibly about some other subject. Don't light a match around all that straw.

"that is because we are accustomed to judging the truth of a claim by the moral status of the group making the claim"

I don't think I agree with this - could you define "moral status"? I assign high moral status to my pastor, for example, but I'd probably take advice from the CFO of a casino operator over his on an economic issue.

The Bayesian point stands.

The Bayesian point only stands if P(ClimateGate | AGW) > P(ClimateGate | ~AGW). That is the only way you can revise your prior upwards in light of ClimateGate.

I don't see how anyone could argue that this is true. Even if there is a slight effect for #2, it must be dwarfed by #1.
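The commenter's condition can be made concrete with Bayes' rule. Here is a minimal sketch; the prior and the likelihood values are purely hypothetical, chosen only to show the direction of the update under each assumption:

```python
# Bayes' rule: P(A | E) = P(E | A) * P(A) / P(E)
# A = AGW is a serious problem; E = the Climategate emails occur.
# All numbers below are illustrative assumptions, not real probabilities.

def posterior(prior, p_e_given_a, p_e_given_not_a):
    """Update a prior on A after observing evidence E."""
    p_e = p_e_given_a * prior + p_e_given_not_a * (1 - prior)
    return p_e_given_a * prior / p_e

prior = 0.8

# If dishonorable behavior is judged more likely when AGW is real
# ("they thought it worth the risk"), the estimate rises:
up = posterior(prior, p_e_given_a=0.3, p_e_given_not_a=0.2)

# If it is judged more likely when the case is weak, the estimate falls:
down = posterior(prior, p_e_given_a=0.2, p_e_given_not_a=0.3)

assert up > prior > down  # the direction depends entirely on the likelihoods
```

The sign of the update hinges on which likelihood is larger, which is exactly the commenter's point: observing the event alone settles nothing until you commit to those conditional probabilities.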

Last few posters get it -- it always boils down to incentives.

Regarding number 2, "really important" means "cha-ching, more funding and a large paycheck." There's no reason to suspect that they may be correct other than coincidence.

Now look at #5 -- the Republicans are certainly evil (albeit no more evil than the Democrats). In both cases, the incentive is "win re-election." Regardless whether "spend less money" is correct, this is a line that plays well with a certain net tax-paying segment of the population. Draw your own conclusion, if you can (I can't).

Regarding the Nazi example, the incentive was to consolidate power through, inter alia, fear. So they did go to a lot of trouble, but that doesn't mean they were right -- just power hungry.

If we really had such difficulty believing that bad people say true things, then prosecutors, who are often forced to rely on the testimony, even the paid testimony, of criminals, would have a great deal more trouble obtaining convictions than they do.

Oh, sweet irony. The whole rationalization for stonewalling Steve McIntyre in the first place was that he was such an agent of evil, obviously acting in bad faith.

"These people behaved dishonorably. They must have thought this issue was really important, worth risking their scientific reputations for."

In scientific and any other logical pursuits, if I *ever* find myself thinking this way -- that lying and cheating on the evidence is OK because my cause is just -- I will know that I have let my emotions cloud my logical thinking, and that therefore I really need to step back and re-evaluate the issues, my evidence, and my thinking.

Justin Martyr after 30 comments finally hits on the right answer.

If X happens, it doesn't matter whether there is merely some interpretation of X that would lead one to increase one's estimate of P(AGW).

Rather it must be the case that on balance one's probability-weighted summative judgement of all possible interpretations of X leads one to conclude that overall, one should increase his/her estimate of P(AGW).

Tyler does not explicitly make this mistake, but to my mind omitting the notion of summative judgement leads to a misleading post.

In fact, Tyler's point is still valid -- do not revise your P(AGW) quite as far down because you are probably irrationally discounting the likelihood of #2 -- but to me common sense (and summative thinking) strongly suggest that P(AGW) ought to be revised in the downward direction by some amount.
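The "summative judgement" this commenter describes can be sketched as a probability-weighted average over interpretations of the event. All the weights and shifts below are hypothetical, chosen only to illustrate the mechanics:

```python
# Hypothetical sketch of a summative judgement: the overall revision to
# one's estimate of P(AGW) is the weighted sum of the revisions implied
# by each interpretation of the event. Weights and deltas are made up.

interpretations = [
    # (weight = plausibility of this reading, delta = implied shift in P(AGW))
    (0.85, -0.20),  # #1: dishonorable conduct -> trust the science less
    (0.15, +0.05),  # #2: "they must have thought it important" -> slight rise
]

total_weight = sum(w for w, _ in interpretations)
assert abs(total_weight - 1.0) < 1e-9  # interpretations should be exhaustive

net_shift = sum(w * d for w, d in interpretations)
print(net_shift)  # negative: on balance, revise P(AGW) downward
```

With these assumed numbers, the small positive weight on #2 softens the downward revision but cannot reverse it, which matches the commenter's conclusion.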

"I find it interesting that those who do not believe in climate change almost universally ascribe reputation and funding as the motives for the ClimateGate authors' actions"

As an AGW skeptic, I agree that this is tiresome, and the attribution is probably incorrect. I think the CRU scientists are genuinely concerned for the environment and (arguably) humanity, but their emails demonstrate that when the evidence conflicted with what they thought was the truth, they assumed the evidence must be wrong. That isn't science - it's religion. AGW believers frequently (and justifiably) accuse skeptics of some sort of profit motive, without recognizing that religious belief is an exponentially more powerful motivator.

"I do know one person who insists that it is essential for the health of the planet that we released more carbon into the atmosphere, but I've not seen anyone make this argument online or in print"

There seems to be an assumption at the root of your argument that we pollute environment for spite. The fact is that we derive enormous benefit from, to take two examples, the use of fossil fuels and the use of agricultural chemicals; benefit that wildly outweighs the cost of damage to the environment.

Having a probability higher than zero is not the same thing as having a probability high enough to be worth considering. Everything has a probability higher than zero.

While it's possible to be a bad person and right, it's not possible to be a bad scientist and a reliable source of information. To call a willingness to risk a scientific reputation a moral failing - "dishonourable behaviour" - misdirects us from seeing that it is also a professional failing: bad science. If people really did behave as bad scientists, making the evidence fit prior assumptions and refusing to let others attempt to disprove a claim, then whatever their patchwork of moral and immoral actions, this cannot lead us to assert #2. They must have thought this was important, but as a result they muddied the means by which they - or we - could tell if they were right.

"I find it interesting that those who do not believe in climate change almost universally ascribe reputation and funding as the motives for the ClimateGate authors' actions"

Agreed. But put it in context. It is a response to the argument that skeptics are motivated by oil company money. What is good for the goose is good for the gander - that is how they are reasoning.

The correct interpretation is from Caplan and Hanson. People do many things with their beliefs. They use them to affirm their moral goodness and worth. They signal group membership. They can be used to gain social status and dominance. We can do all these things with our beliefs but the cost is a loss of the truth. Hanson calls this the One Ring to Rule Them All of cognitive biases. People who suffer from these self-serving biases will seek out confirming evidence and ignore disconfirming evidence. They will poison the well by claiming that others cannot be trusted. They won't critically analyze their own arguments.

The upshot is that both skeptics and warmists are motivated. They both see themselves as doing good for the world and fighting evil. And that takes us back to ClimateGate. A useful heuristic is that we should discount those who (1) aggressively punish out-group members, and (2) make it difficult to produce disconfirming evidence. On that view, we should revise our prior for AGW downwards.

Enough of this ridiculous myth that pro-AGW people are more "religious" about it than anti-AGW people. Anti-AGW people don't actually have scientific evidence to support them. What do they have? A deep and abiding faith in the evil of government intervention in the economy, 95% of the time in my experience.

Anti-AGW belief is necessary to block out the potential that government can intervene in a huge absence of voluntary collective action and make things better through market regulation.

Note that the defence of the denial believers about the lack of evidence to back them up is usually:
(a) to cite short-term temperature trends, like "2008 is colder than 1998", which is the kind of nonsense that Bjorn Lomborg slams in his book, or more usually "IT'S REALLY COLD OUTSIDE TODAY AMIRITE!?!"; or
(b) to cite a cartel of scientists who stop the publication of the tons of papers that prove AGW is a myth. This puts denial religion up there with the 9/11 truthers.

Justin Martyr wins the thread again.

The upshot is that both skeptics and warmists are motivated. They both see themselves as doing good for the world and fighting evil. And that takes us back to ClimateGate. A useful heuristic is that we should discount those who (1) aggressively punish out-group members, and (2) make it difficult to produce disconfirming evidence. On that view, we should revise our prior for AGW downwards.

I have to agree. I've revised my personal P(AGW) from 0.95 to 0.75 based on ClimateGate. (Clear physical evidence (sea-ice loss) makes it pretty much a no-brainer for at least GW.)

Not only is Tyler correct that option 2 should be considered, it is, in fact, the position taken by a lot of the supporters of the science of CRU and other AGW proponents. Note the numerous excuses for withholding the data from "critics that will only use it to lie about climate change": this is a form of option #2.

All in all, I consider p to be essentially 0 for option 2 for the simple reason that all the evidence suggests that the scientists in question are guilty of assimilation and confirmation bias. This should have been evident to anyone who actually follows this debate with a half-way open mind, and should be crystal clear, now, to everyone with any integrity, from the content of these e-mails and other files. The AGW crowd is going to have to start from scratch to rebuild trust in their methods and data, and they have to do it starting ASAP. Openness is the only solution, but I predict they still won't be able to follow through.

Given the datum that people are willing to behave badly in the name of hypothesis A, in the absence of other data the conclusion that seems most likely is that people are also willing to behave badly in the name of hypothesis not-A. It does not sway my estimate of relative probabilities of A and not-A.

The intensity of feeling on a subject says something about the nature of the subject matter, but little as to the correctness of views held.

I think Eliezer Yudkowsky has thoroughly refuted the point in #2, that dishonorable actions are favorable evidence of the importance of the cause. See The ends don't justify the means (among humans). In short: you run on self-serving hardware. Your cognitive systems automatically try to make self-serving actions seem like the right thing to do -- like the rightness of bending the rules is "just the way things are".

Deontological ethics ("the ends don't justify the means") are a way to override this "fault of perception" by giving hard-and-fast rules that limit your ability to do damage while thinking you're helping others. When you start believing that you're justified in breaking the rules, that's a sign you've tricked yourself into seeing self-serving, anti-social behavior as a righteous cause.

More importantly here, folks, you need to keep in mind that events can go into *different reference classes*, an issue recently raised in this discussion. So even if you discount a theory because of this bad behavior, it's still possible for *other* evidence to more than make up for that discount (not that there's such evidence here, though). Bayesian reasoning is non-monotonic. Deal with it.

If I wanted to give them more incentive to lie, then yes, I'd increase p. If I wanted to reduce their incentive, I'd announce that as a consequence of their action, I now believe p=0.

I was going to make a point but then mk made it better.

Tyler, your initial post on Climategate was so poor and stupid that you are now quite obviously bending over backwards to rationalize the behavior of some scientists. You are clearly far too up your own a$$ about being a contrarian that you have become a sort of cartoon character.

Jordan,

What you are missing is that it is not actually most of the scientists claiming that there is no more room for scientific questions, it is the politicians and activists.

The VERY FIRST paper I looked at from one of the implicated states "Improved representation in models of coupled atmosphere/sea-ice dynamics will be critical for forecasting Antarctic temperature change."

Steig, E.J., Schneider, D.P., Rutherford, S.D., Mann, M.E., Comiso, J.C., Shindell, D.T., Warming of the Antarctic ice sheet surface since the 1957 International Geophysical Year, Nature 457, 459-463, 2009.

The scientific dance is always "I can give you the answer, with just a little more money and more time." And the best scientists run this for their whole career.

#5 is not merely an opening to "out-game". It relegates the Democratic party to becoming the Responsible Parent figure to the Republican Good Time Boys. Not only would it be a politically suicidal thing to do (in fact, becoming a Republican enabler), but also it would be an abdication of the generally accepted platform of the Democratic party in the first place.

Another way to look at it is to model it game theoretically. You're trapped in a marriage with a profligate wastrel, with no divorce. Or: you're on a desert island with a selfish glutton, and no bullet for your gun. What is the optimal game to play?

PJ:

If they are correct and competent, then this will ultimately foster acceptance of their results.

As I said in a previous post, "eventually" may not be acceptable in this case. It was 14 years for official acceptance of the risks of smoking from the essentially irrefutable proof. Those who believe in AGW don't believe we have 14 years before we're looking at untold damage, especially to poorer nations.

And 14 years is optimistic. Given the scientific "proof" will always be somewhat more problematic, I would suggest 20-30 years (assuming the science does come out) would be more likely. So, from the point of view of the pro-AGW climate scientists, do you fail mankind for the sake of pure science? (Of course, if you get caught, you're in trouble, but if you behave like proper scientists, your chance of policy success against entrenched interests is zero.)

Hell, why don't we just call it an even billion people will die if the earth warms 0.4 degrees or whatever the hell it's supposed to be this week.

--"These people behaved dishonorably. They must have thought this issue was really important, worth risking their scientific reputations for. I will revise upward my estimate of the seriousness of the problem."

Another way to express what's wrong:
to restate, you said: these people behaved dishonorably; they must have thought their THEORY was really correct, worth risking their reputations for. I will revise upward my estimate of the accuracy of their THEORY.

if you had said: "these people behaved dishonorably. They must have thought their DATA was really correct, worth risking their scientific reputations for. Therefore, I will revise upward my estimate of the accuracy of their THEORY"

then it might have changed my p. But scientists shouldn't hold the validity of their theories more tightly than the validity of their data.

Sorry Mesa. my misinterpretation.

Shouldn't these be:
1. I don't trust their opinions. This is evidence why.
2. These people behaved dishonorably. They must have less evidence than even I thought.
3. If the Republicans win, they will irresponsibly cut taxes and do nothing real to control spending.
4. We should ensure that the Republicans do not win and criticize them every chance possible.
5. Sooner or later the Republicans will in fact win and I cannot prevent that. Right now the Democrats should spend more money, given the truth of #3, so they won't have so much to spend.

What a bizarre set of "givens" to base discussion on. This is surreal.

"My side was on the lying side of Climategate, so when does lying indicate virtue?"

"My side is doing deficit spending at previously unimaginable, truly breathtaking rates ... so how can we turn this into a discussion of the supposed fiscal irresponsibility of the other side?"

This isn't just a castle in the air; it's a whole kingdom.

I think others have touched on it, but to be explicit: Tyler is wrong about #2.

He is correct that the "p" for deciding global warming is serious because 'they must have really believed in it' is > 0. But not that this is a revision upward.

It is clearly a revision downward. I already had a fairly high "p" that global warming was a serious problem, that researchers such as these are right. That "p" has now been reduced to a small (but, yes, nonzero) number.

In the context of science, at least, 'they must have really believed it' tends to revise "p"s downward. 'Really believing' something does not give others comfort that one's scientific findings are disinterested and uninfluenced by 'belief'. Whereas before, I only suspected that these guys 'really believed' in their theory, now I'm fairly sure of it. Hence "p" goes down.

SonicCharmer:

I don't think that's quite right.

On the one hand, being very certain of some claim surely makes it harder for you (at best objective-science software running on legacy don't-get-eaten-by-lions hardware) to recognize problems with your claim. That should cause us to become more skeptical of your claims when you're especially certain. (I suspect the whole controversy about it makes it even harder to see contradictory data.)

On the other hand, there can be a lot of evidence which is visible and convincing to an expert, but which doesn't reach the point of being publishable, or is noisy enough to allow significant debate. An expert's intuition can be worth a lot, and it's evidence (in the Bayesian sense Jaynes would have used), even if it's not proof. If (as appears to be the case) the great majority of people working in this area are pretty convinced that the earth is, long-term, warming because of human CO2 emissions, it's quite possible they're all wrong, but it doesn't seem like the way to bet.

I don't know which of these effects is bigger, or even how I'd try to figure out which is bigger.

It's true what Patrick said: "Blindly believing in bad evidence, because we want it to be true, is the norm."
If you _want_ it to, malicious arrogance and academic pettiness can become proof of "manipulating... destroying... and stifling the publication of new data-sets".

So how about options 1a and 2a for the introspection-challenged: twist the evidence to support one's prior beliefs, with no reflection whatsoever.
1a) Yell "Here's the clear evidence of evildoing I've been praying for! Time to post in ALL CAPS on the internets!"
2a) Place hands firmly over eyes and ears, and shout "scientists are heroes, especially Al Gore! This is all Exxon's fault somehow... Save teh polar bears!"

There is a great difference between good old dishonorable behaviour (something equally divided among us earthlings) and lying scientists. If a scientist disputing funds with a rival hires a hitman to kill the rival, this is dishonorable behaviour. If a scientist disputing funds with a rival lies about the data of his research in order to prove that he's ahead of his opponent, this will affect the research. The morals don't play any role in it.

albatross, On the other hand, there can be a lot of evidence which is visible and convincing to an expert, but which doesn't reach the point of being publishable, or is noisy enough to allow significant debate. An expert's intuition can be worth a lot

Indeed. And it was worth a lot, to me. Like I said, my prior "p" for them being right was pretty high, because 'experts know stuff I don't' is my default stance, precisely for the reason you state.

But to learn definitively (however much I may have suspected) that they were playing politics and messing with data - that reduces my "p". If, as Tyler says, I am to interpret their behavior as motivated by 'deep belief', this only reinforces my interpretation of their behavior as motivated by ideology (or worse) - i.e., flawed science. In other words, I abandon my default stance toward 'experts' in light of evidence that they are not behaving in disinterested, scientific ways.

As I should.

If (as appears to be the case) the great majority of people working in this area are pretty convinced that the earth is, long-term, warming because of human CO2 emissions, it's quite possible they're all wrong, but it doesn't seem like the way to bet.

Agreed. But that was before we knew that an apparently central repository of all their data had people cooking the books and playing politics against dissenters. It now seems like a safer way to bet than it did before. "p" revised downward.

Tyler Cowen's topic is "The limits of good vs. evil thinking", and he gives ClimateGate as an example of it. The implication is that people who are upset about the behavior of the alleged scientists in question are viewing this through "good vs. evil" lenses. Now I realize that the subject here is supposedly "the limits of good vs. evil thinking" and not a backhanded attack on the thinking of those who are sceptical about the CO2-induced climatic disaster hypothesis. And maybe it really is. (I'm uncertain in general that I really understand what Tyler is up to and what his assumptions and deeply held beliefs actually are.)

Whatever. The point I'm trying to make is that this is a bad example if we want to explore "the limits of good vs. evil thinking."

Speaking for myself, I first moved in the direction of climate-alarmist skepticism not because I believed the hypothesis of imminent CO2-induced climatic disaster was wrong, but because I detected process errors. For instance: a scientific paper suggests something disturbing and dramatic, but it turns out the author in question isn't allowing anyone inclined to question it to see his data, or to explain how he processed the data to arrive at that conclusion. Or another paper has some interesting result, but then one discovers that there were 200 datapoints the author was aware of that it would have seemed natural to include, that surprisingly only 39 actually were included, and further that the full 200 tell a different story.

These disturbing moments do not invalidate the CO2-induced climate hypothesis. They may just be individual examples of misbehavior. But then when one discovers that others in the field, whom one has no specific reason to doubt, are not upset by these behaviors - that in fact they treat such papers as valid - ah, then you know, then I know for sure, that something is wrong. That something is systematically wrong.

Is this "good vs. evil thinking"? I don't think so.

Am I unique? Hardly. Again and again, I read comments from others that hint at similar starting points. The evidence has been out there for decades. ClimateGate is new, but it's been shaping things for a long time.

We are imperfect beings. Every individual has flaws. It's amazing in a sense that we ever achieve anything. Science is a method for filtering out some of the flaws. It doesn't guarantee truth. It's just a whole lot more likely to produce truth than ignoring that process.

Getting back to Tyler's point, he's asserting that these misbehaviors are motivated by belief, and that by the seriousness of the misbehavior we should assign a higher probability that the beliefs are true.

That may be a valid proposition, and it may not. I can come up with alternative hypotheses to explain misbehavior other than the possible truth of the beliefs of the misbehaver.

What I'm quite certain of, though, is that the odds of science coming up with the truth are a whole lot better than this sort of thing. If our goal is to discover an external truth, then the one process is clearly superior to the other.

Obviously, I'm making a value judgement. Obviously, I'm saying that the scientific method is good, and that letting strongly held personal beliefs motivate the pretense of practicing the scientific method is bad.

I'm unapologetic about making that value judgement. I don't think Tyler is simply criticizing value judgements.

Patrick wrote: 'No, it doesn't. Why would it EVER stand after such an obvious rebuttal.'

The rebuttal is 'p != 1', whereas the claim it's 'rebutting' is that 'p != 0'. There are an infinite number of possible values remaining, and in fact p = 1 is far from what is being tacitly suggested. Try p = 0.1 or p = 0.001, perhaps. The rebuttal is a straw man, included chiefly because it's a very predictable one.
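The "p != 0" vs. "p != 1" exchange can be made concrete with a toy Bayes'-rule calculation (all numbers below are purely illustrative, not taken from anyone in this thread). Whether learning of dishonorable behavior moves p up or down depends entirely on which hypothesis makes that behavior more likely, which is exactly where the commenters disagree:

```python
def bayes_update(prior, p_evidence_if_true, p_evidence_if_false):
    """Posterior P(H | E) for a binary hypothesis H via Bayes' rule."""
    numerator = p_evidence_if_true * prior
    return numerator / (numerator + p_evidence_if_false * (1 - prior))

# Hypothetical prior: p = 0.8 that the experts' conclusion is right.

# Response #1: dishonorable behavior is twice as likely if they are wrong.
print(round(bayes_update(0.8, 0.3, 0.6), 3))  # -> 0.667: p falls, but not to 0

# Tyler's #2: risking one's reputation is twice as likely if the problem
# really is serious.  Same prior, likelihoods swapped.
print(round(bayes_update(0.8, 0.6, 0.3), 3))  # -> 0.889: p rises, but not to 1
```

Either way, the posterior lands strictly between 0 and 1; taking #2 seriously only requires admitting the second likelihood assignment is possible, not that it is correct.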

I haven't slogged through the comments to see if this has already been suggested, but... is #2 counterbalanced by asking the same of the miscreants or possible criminals who hacked the Hadley e-mails?
