From the comments

by Tyler Cowen on December 2, 2009 at 7:35 am in Education

People are not always eager to lay down good vs. evil thinking.  I don't mean to pick on any single commentator but here is one example:

…E T Jaynes is spinning in his grave that you used Bayes to justify an increase[d] belief in AGW based on scientists' personal beliefs when they lacked the [data] to support their own conclusion.

They believed something so strongly they faked data? A scientist should only believe something so strongly because they have the data to support their belief!

This was perhaps the most misunderstood blog post (including by other bloggers) I've written, yet the original text is quite literally clear, though perhaps it confuses people by not offering up the emotional valence they are expecting.  I did not try to justify any absolute level of belief in AGW, or government spending for that matter.  I'll repeat my main point about our broader Bayesian calculations:

I am only saying that #2 [scientists behaving badly because they think the future of the world is at stake] deserves more than p = 0.

Nor is my point that p is large, but rather if you don't consider this p at all your reasoning is incomplete.  People simply do not wish to hear that sometimes they should pay heed — incomplete heed at that — to the opinions of evil others.  It's remarkable how many people responded to this blog post by attacking either the scientists or, in some cases, me.  
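
For readers who want that point in concrete form, here is a minimal sketch of the kind of weighted estimate I have in mind; every number in it is invented purely for illustration, and nothing hangs on the particular values:

    # A sketch of the weighted estimate, with every number invented for illustration.
    # Two candidate explanations for the bad behavior:
    #   1. careerism/dishonesty that has nothing to do with how serious AGW is
    #   2. a sincere belief that the future of the world is at stake
    p2 = 0.05                    # hypothetical; the only claim is that this is not exactly 0
    p1 = 1 - p2

    # How seriously each explanation, if it were the true one, would suggest
    # we take the underlying problem (arbitrary 0-to-1 scores):
    seriousness_if_1 = 0.30
    seriousness_if_2 = 0.60

    ignoring_2 = seriousness_if_1                                # what you get with p2 = 0
    weighing_2 = p1 * seriousness_if_1 + p2 * seriousness_if_2   # 0.315
    print(ignoring_2, round(weighing_2, 3))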

Millian December 2, 2009 at 8:02 am

To repeat: What is the social usefulness of revising my p=0 belief, called X, up to p=0.0014?

If this is such a useful change, how do we figure out exactly what p=0.0014 is?

If any rough estimate will do, why not p=0, and save time on working it out to four decimal places? It’s an interesting game, to be sure, but in real life we don’t shun people from the table for failing to quantify precisely each of their beliefs.

Lonely Libertarian December 2, 2009 at 8:57 am

Tyler,

I think I clearly understood your point – but stand by my original post in believing that the “bad” behavior actually REDUCES my p [BTW, is there a difference between bad and dishonorable? I think so, and the difference is important]. If AGW was built on sand, then the lack of evidence to support their argument makes me less inclined to believe in AGW – I still have a p greater than 0 – but not as great as I had before these folks were exposed.

I have also been thinking that this has a temporal aspect. Ten to twenty years ago the short term data and some other data could be interpreted as supporting AGW as a very strong hypothesis worthy of research. As time passed two things happened.

1. The trends in temperature did not cooperate – I believe that I am correct in saying that the average global temperature has not increased in the past 10 years [but I am not suggesting this to be any sort of “proof” that AGW theory is wrong – it just needs to be considered].

2. Better climate models have been built, which in turn led to two specific findings: A. the recognition that the pre-2000 models were pretty bad, and B. a recognition that building a really good climate model was probably still beyond our reach.

So the AGW supporters found themselves in a weaker place in 2008 than they were in 1998. I believe that today they do not fall into a single group – but three segments.

1. The careerists – those who, like Al Gore, have found AGW very profitable and do not want to lose their research grants or university positions – at their best they demand a higher standard of proof to move away from supporting AGW – at their worst they are simply greedy and willing to act dishonorably.

2. Those who genuinely believe in AGW and that future data will prove them right

3. Those who have doubts about AGW [the data from the last 10 years has lowered their personal p] but they also believe that the risks of AGW are so high that any p greater than 0 is unacceptable [and they find it hard to understand why this logic escapes the rest of us].

You characterized your post as one of the most misunderstood – and that may be true – but it was also one of your best at stimulating some of us to think about this topic a bit more openly.

Thanks and keep up the good work!

Justin Martyr December 2, 2009 at 9:07 am

P.S. Is anyone else having problems posting comments? I keep getting weird errors.

Andrew December 2, 2009 at 9:23 am

Isn’t the bottom line that they withheld the data so that we couldn’t do our own complete analysis?

anon December 2, 2009 at 9:31 am

Re: #2: In other words, the scientists at the CRU believe the world is facing catastrophic effects from AGW, but the data doesn’t support it…so they delete, hide, and “correct” the data and then fake the graphs for our own good.

Yes, p is > 0. They *could* be right, and AGW could be catastrophic. But it would be a classic example of getting the right answer by accident.

Lord December 2, 2009 at 9:58 am

Some advice for the anti-warming crowd. If you have to base your case on your opponents’ data, you probably don’t have one. If you have to blame your opponents for your lack of incentive or laziness in collecting your own, that is just pathetic. If you have to blame bias for your failure and play for sympathy as victims, I suggest you grow some and get off your a##.

tom December 2, 2009 at 10:07 am

What Josh said above.

And when the product has been sold to people around the world based on trust in science generally rather than the actual AGW science, the effect of a loss of trust is necessarily larger than Tyler seems to get.

I think that Tyler and lots of the other quasi-iconoclastic economists I read have been victims of a ‘science’ bias, and that they are just beginning to work through its consequences. By ‘science bias’ I mean too much deference to the conclusions of the big AGW scientists, when those conclusions were being used to support the largest intentional restructuring of economies in history.

Over the last ten years, how many smart econ bloggers have written something like “I haven’t reviewed the science myself, but it is clear that the scientific community really does agree that global warming is a real and enormous problem and that it is likely caused by humans”? That deference may be warranted on smaller issues, but it was never right on something as central to our futures as the AGW debate.

Joshua December 2, 2009 at 10:24 am

I’m still waiting for the justification that P(X believes A) increases P(A is true). AGW aside, you seem to be saying that if you were rolling dice with a believer in the gambler’s fallacy and you had a long run without a 6 coming up then as he places higher and higher bets on 6 “because it’s due” a good Bayesian ought to be revising his own estimate of the probability that the next roll will be a 6 upward as well (instead of downward as observations strengthen the hypothesis the die isn’t fair).
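
As an aside, Joshua's parenthetical can be checked with explicit numbers. Here is a minimal Bayesian sketch of the dice example; the priors and the hypothesized biases are invented purely for illustration:

    # Three hypotheses about the die, updated on ten observed rolls with no six.
    priors = {"anti-six": 1/3, "fair": 1/3, "pro-six": 1/3}    # hypothetical priors
    p_six  = {"anti-six": 1/12, "fair": 1/6, "pro-six": 1/3}   # P(next roll is a 6 | hypothesis)

    rolls_without_a_six = 10
    likelihood = {h: (1 - p_six[h]) ** rolls_without_a_six for h in priors}

    evidence   = sum(priors[h] * likelihood[h] for h in priors)
    posteriors = {h: priors[h] * likelihood[h] / evidence for h in priors}

    # Probability that the next roll is a six, before and after seeing the run:
    before = sum(priors[h] * p_six[h] for h in priors)       # ~0.194
    after  = sum(posteriors[h] * p_six[h] for h in priors)   # ~0.113, i.e. revised downward
    print(round(before, 3), round(after, 3))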

mk December 2, 2009 at 10:39 am

Once again Justin has nailed it, and further comments are unnecessary.

Nylund December 2, 2009 at 11:05 am

I do not think the number of attacks against scientists or you is remarkable at all. Global warming is a major litmus test for the non-Left (be it libertarianism, conservatism, republicanism, evangelicalism, etc.). It comes just after abortion and gay marriage, and maybe even before the right to own guns (the belief that tax cuts are always the right answer, no matter the question, should be in there somewhere as well). For that side of the political spectrum, the only correct answer to any question about global warming is a complete and absolute denouncement of the entire hypothesis. In short, any answer that is not “P=0” makes you the enemy of all that is righteous, or something like that. I really don’t get why being so against the idea of global warming is so important to a certain half of the political spectrum, but it is.

Russell L. Carter December 2, 2009 at 11:18 am

Probably I’m not the only one observing a little mordantly that these two posts amount to a chickens come home to roost situation. Once again, a rational, decent, conservative-leaning thinker gets beaned on the head by the crazies[1]. And guess what? There’s millions of them. Somewhere around 30% of the country, apparently. And since the Republican Party has achieved purity and now represents just that ideological slice, and the structure of the government is such that a unified opposition party can more or less obstruct any action, you can’t ignore them any more.

But you (and Alex) did exactly nothing to discourage this situation over the previous eight years. Thanks for that.
(At first, it seemed to me a bit bizarre that you’re so earnestly trying to reason with them, but if I navel-gaze for a minute I can recall not so many years ago when I did the same)

[1] i.e., classic anti-intellectual reactionaries with binary thought processes.

tom December 2, 2009 at 11:26 am

Tyler, Seth Roberts is seeing a naked emperor and is celebrating the exposure of the weakness of ‘consensus’ science, while you are asking people not to adjust their ps too much.

Why are you and he so opposed here? Is it his iconoclasm (or, more accurately, his lack of any consideration for icons)? When is it appropriate for a thinking person to defer to the consensus?

Jim December 2, 2009 at 11:56 am

Wow, you are missing the point by a very wide margin.

If the opinion of these people had any merit, they would not have had to commit brazen fraud to manufacture supporting evidence.

You are refusing to see this, and instead insist that we are rejecting the opinions because the people are “evil.”

Can you seriously not see the difference? People are going to react to the complete collapse of AGW in different ways, but this is just strange.

Lord December 2, 2009 at 12:06 pm

Scientists welcome questions that lead to learning or raise new thoughts or insights. They don’t welcome repetitive questions that show the listener never bothered to learn enough about the subject in the first place, doesn’t learn anything from the answers provided, and asks not to advance knowledge but only to confirm their own preconceptions. So this is likely much more mundane than behaving badly. It is far more likely to be a case where, as with cranks, further conversation serves no purpose.

BKarn December 2, 2009 at 12:14 pm

“For that side of the political spectrum, the only correct answer to any question about global warming is a complete and absolute denouncement of the entire hypothesis.”

I find it profoundly depressing that someone can seriously hold this sort of opinion — and post it — all the while being on the side of the argument that _also_ portrays its opponents in absolute terms. Meaning: anyone who expresses doubts about the degree of man-made contribution to global warming, or about the nature of global warming in any way, is portrayed as a “denier” who “doesn’t even believe in science,” and who is now also conveniently probably a Republican who loves guns and wants to cut taxes and harm minorities. That people cannot see what sort of rhetoric they are engaging in and the mindset it has created is profoundly disturbing, and I say that without a hint of exaggeration.

Andrew December 2, 2009 at 12:29 pm

Okay, last try:

It is clear to me that it is possible that #2 would actually be worse than #1 for the end product we actually care about, the validity of the science, BECAUSE of the effect of #2 within the enclave of these scientists.

Seth December 2, 2009 at 1:32 pm

‘…worth risking their scientific reputations for’

I think this is the critical assumption required to raise your estimate of seriousness and I think it’s wrong. I think they either didn’t give much thought to the impact their behavior would have on their reputation or they simply became intoxicated with the positive reinforcement they were receiving from their positions.

Replace your assumption with “they were stupid, much like Ken Lay, mortgage lenders and borrowers” and what does that do to your p?

This is an academic argument. Why should I care? AGW is either happening or it’s not. There’s either a God or there’s not. Should someone’s strong belief in a God raise my p for there being a God? Maybe. But does it change whether there really is one or not?

Mesa December 2, 2009 at 2:04 pm

It depends on what the meaning of “p” is.

Here’s how it was described in the post:

“2. “These people behaved dishonorably. They must have thought this issue was really important, worth risking their scientific reputations for. I will revise upward my estimate of the seriousness of the problem.”
“I am only saying that #2 [scientists behaving badly because they think the future of the world is at stake] deserves more than p = 0.”

I think many people were discussing another ‘p’: probability that AGW is “true” or “real” or “a big problem”.

Even if the probability that the motivation for the bad behavior is that the scientists are really worried has a p>0 (Tyler’s p), the consensus of most commenters seemed to be that the probability of other motivations (1-p) was much greater ( (1-p) >> 50% ).

Even if you believe Tyler’s p is small but > 0, the correct response is to adjust one’s estimate of the seriousness of the problem downward, wherever it might currently be, since P(Climategate|AGW) < P(Climategate|~AGW).

This is how the “on the one hand…” or “thinking like your enemies” stuff gets you into trouble, if you are not rigorous about it!!
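
For anyone who wants to see the direction of the adjustment Mesa describes worked through in numbers, here is a minimal Bayes-rule sketch; all of the probabilities in it are invented purely for illustration:

    # Whether the posterior moves up or down depends only on which likelihood is larger.
    prior_agw          = 0.8    # hypothetical prior P(AGW)
    p_cg_given_agw     = 0.10   # hypothetical P(Climategate | AGW)
    p_cg_given_not_agw = 0.20   # hypothetical P(Climategate | ~AGW)

    evidence      = prior_agw * p_cg_given_agw + (1 - prior_agw) * p_cg_given_not_agw
    posterior_agw = prior_agw * p_cg_given_agw / evidence
    print(round(posterior_agw, 3))   # 0.667 < 0.8: the estimate falls
    # Swap the two likelihoods and the same arithmetic pushes the estimate up instead.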

Mesa December 2, 2009 at 2:50 pm

To put it in a more provocative way, one of the limits of bullet point thinking is that it tends to imbue each bullet point with an equal probability in the mind of the list creator, whether or not that was the intention!

Patrick December 2, 2009 at 3:08 pm

I am only saying that #2 [scientists behaving badly because they think the future of the world is at stake] deserves more than p = 0.

This would imply that some number of instances of “people behave badly enough to lie about x, therefore x must be true” will make x beat the odds it would have without the claim attached to it.

Example:
I have a coin that is fair, with a 50/50 chance of heads and tails, and it appears this way after a short test. A person informs me that he has a spreadsheet of coin flips with this coin, and that it is actually 75/25. An initial examination of his data shows this to be true.
Later I discover that this data was manipulated, because he REALLY wanted it to be.

A man comes to me after this fact and says, “It’s clear your friend wanted the coin to be heads; he was willing to risk a lot on it. I will offer you a wager. We flip this coin any number of times you want, and we will pay out at the end. One hundred dollars per flip, and if it’s heads, you get $199.99.”

You’re telling me you would take that wager? Even if your friend wasn’t right about 75%, surely he must have had SOME reason to believe the coin wasn’t fair, and that a million flips would find it out.

Let’s crank it up more then. Everyone in France believes their coins land heads up more than tails, even though the coins are otherwise fair. Everyone. They believe in their coins THAT much that they would lie to you about it, and you later discover this was a lie, but otherwise your data says they are fair coins.

Let’s say, again, the man offers a wager. Surely that friend was just a jerk, but how can 50 million Frenchmen be wrong? You can flip all fifty million coins, for $100 each flip, and begin narrowing down to a specific coin to flip as much as you want. You get $99.98 for a tail and $100.01 for a head.
Surely this deal must be a huge bargain! One of these Frenchmen’s coins has to be more right than average, even if just by a small amount! When I find that person’s coin, you can flip it and make up your losses, and then make unlimited money.

Tyler, 5 billion people on this planet believe something really stupid. They believe it, even though there is no evidence of it. Some of them are willing to strap dynamite to themselves and kill in the name of this belief.
Fervency is not a basis of a belief. Nor is the huge size of its impact. A lot of people believe in conspiracy theories, UFOs, or ghosts. Each of these things would be SUPER incredible if it were true. They’re willing to risk their jobs, their reputations, their free time, and even their lives in pursuit of these false beliefs.

Wanting something to be true doesn’t make it so. Extraordinary claims require extraordinary evidence.
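
The arithmetic behind Patrick's first wager is easy to check. In the sketch below the payoff figures are his; the residual credence q given to the friend's 75% claim is a hypothetical free parameter:

    # Expected value per flip: pay $100, receive $199.99 on heads, nothing on tails.
    def ev_per_flip(p_heads, payout=199.99, stake=100.0):
        return p_heads * payout - stake

    print(round(ev_per_flip(0.50), 4))   # -0.005  if the coin really is fair
    print(round(ev_per_flip(0.75), 4))   # 49.9925 if the friend's claim were true

    # With residual credence q in the friend's claim, believed P(heads) = 0.5 + 0.25*q,
    # and the wager breaks even around q = 0.0001; the dispute is over how large q should be.
    q = 0.0001
    print(round(ev_per_flip(0.5 + 0.25 * q), 6))   # essentially 0: the break-even point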

Gabe December 2, 2009 at 3:32 pm

I believe many of the skeptics here would relate to what Harsanyi talked about at Reason:

“Now, I do not, on any level, possess the expertise to argue about the science of anthropological global warming. Nor do you, most likely. This certainly doesn’t mean an average citizen has the duty to do the lock step.

Yes, you apostates will be tagged “denialists”—because skepticism is synonymous with the Holocaust denial, don’t you know—or some other equally unfriendly moniker.

Don’t worry; you won’t be alone. Gallup recently found that 41 percent of Americans now believe global warming news reports are exaggerated—the highest number in more than a decade despite the fact that this time frame has coincided with concentrated and highly funded scaremongering. That number is sure to rise as soon as word of this scandal spreads.

The uglier the names get, the more anger you see, the more that science-challenged politicians push invasive legislation, the more skeptics will join you. True believers will question your intelligence, your sanity and your intentions.

But as ClimateGate proves, a bit of skepticism rarely steers you wrong. In fact, it’s one of the key elements of rational thinking. ”

Most people reading Marginal Revolution are not “good” versus “evil” thinkers or dogmatically religious to the point of using faith as an override for logic for MOST of their beliefs.

However, when it is pointed out that many of the big players in the pro-CO2-tax crowd are full of shit (even if it is out of good intentions; I’ll even grant p=.99 on #2, but lots of smart people supported Trotsky out of good intentions too)…then Tyler chooses to smear the CO2-tax skeptics as operating from some sort of faith-based religious mindset?! That is straight up logic FAIL.

albatross December 2, 2009 at 4:16 pm

Silas:

Right, we care about

P(CG|AGW) / P(CG|~AGW)

Evaluating that requires thinking through the different explanations for Climategate(CG), right? For example, Tyler’s #2 gives an explanation of why P(CG) may go up when P(AGW) goes up: Researchers whose long experience and study has convinced them of AGW (increasing P(AGW)) are more willing to play games with ambiguous data (CG). On the other hand, the whole idea of researchers who were more concerned with their positions and grant money than the truth gives an explanation of why P(CG) may go up as P(AGW) goes down. And the whole confirmation bias argument several people brought up gives an explanation of why CG should decrease P(AGW) (because the previous relatively high probability was conditioned on a higher level of willingness to examine contradictory data in the climate modeling community than is indicated by CG).

My intuition is that P(CG|AGW) < P(CG|~AGW), so that CG should decrease our confidence in AGW, but I’m not 100% sure. A lot depends on how much CG reflects on the larger climate-modeling community. Is this isolated unethical behavior by a couple researchers, or indicative of the whole community’s ethics?
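
albatross's likelihood ratio plugs directly into the odds form of Bayes' rule. The sketch below uses invented values for the prior and for the ratio itself, just to show that the direction of the update follows from whether the ratio is above or below 1:

    # Posterior odds = prior odds * likelihood ratio, where the ratio is
    # P(CG | AGW) / P(CG | ~AGW).
    def update(prior_p, likelihood_ratio):
        prior_odds = prior_p / (1 - prior_p)
        post_odds  = prior_odds * likelihood_ratio
        return post_odds / (1 + post_odds)

    prior = 0.8                            # hypothetical prior P(AGW)
    print(round(update(prior, 0.5), 3))    # ratio < 1: 0.667, confidence falls
    print(round(update(prior, 2.0), 3))    # ratio > 1: 0.889, confidence rises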

Justin Martyr December 2, 2009 at 4:29 pm

albatross, you wrote the post that Tyler should have written. Well-reasoned and charitable to a fault!

I do not think it is a live option that P(CG|AGW) > P(CG|~AGW). Perhaps I have an outsider’s guilt about how the sausage is made, but I do think that the true science is of higher quality than the false science. However, I will channel my inner Cowen and claim that P(CG|AGW) could reasonably be high enough that people need not adjust their priors for AGW downwards by a lot.

dwinds December 2, 2009 at 6:25 pm

Stick to the cutey pie stuff.

Anthony December 3, 2009 at 12:03 am

Tyler:

I think you deserve the criticism. Your principle should justify eugenics. But you don’t believe in eugenics, do you?

You know what this means.

Allison December 3, 2009 at 2:25 am

As the commentator you’ve pointed out as so egregiously misunderstanding, let me dig myself in a little deeper.

Your argument was about how to assess the probability of AGW being true (because #2 said: “they ruined their reputations -> their belief must be really strong to bother -> I should revise my p up”; if p wasn’t p(AGW), the rest of this made no sense. Sorry, but I’m sticking with that). I made no comment about objective values of P(AGW), only about the relative increase, so that straw man was unnecessary.

But here is my calculation: before, we lived in a world where scientists to varying degrees may have been honest or dishonest about AGW.
That is, oldP(AGW) = P(AGW|CG)P(CG) + P(AGW|’CG)P(‘CG)

AGW|CG hasn’t changed. AGW|’CG hasn’t changed.
Since CG is now known to be true,
newP(AGW) = P(AGW|CG). Your claim is P(AGW|CG) is higher than P(AGW|CG)P(CG) + P(AGW|’CG)P(‘CG).

If your prior for academic dishonesty was low, then your claim is that P(AGW|CG) was always higher than P(AGW|’CG), but the priors dwarfed that contribution for us. That means scientific consensus and evidence are garbage compared to the one noble liar. It’s nuts.

If your prior for academic fraud is very high, you’re still suggesting that P(AGW|CG) is higher than P(AGW|’CG). Again, nuts.

If you don’t believe P(AGW|CG) is higher than P(AGW|’CG), then your probability should fall when you kill off a portion of your sum.

Let’s add some numbers. Say P(AGW|CG) = 50%. Say P(AGW|’CG) = 90%. Reasonable scientists might expect priors on academic honesty higher than dishonesty, but let’s be generous.

P(’CG) = 70%; P(CG) = 30%.
oldP(AGW) = P(AGW|CG)P(CG) + P(AGW|’CG)P(’CG)
= 0.5*0.3 + 0.9*0.7 = 0.15 + 0.63 = 0.78, i.e. 78%

But now we’ve collapsed down to that 50%. Show the priors and posteriors you had that led you to think P(AGW|CG) is higher than P(AGW|CG)P(CG) + P(AGW|’CG)P(‘CG).
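
Allison's numbers are easy to reproduce. The sketch below simply restates her calculation, using her own illustrative 50/90/30/70 figures:

    # Allison's mixture, in her notation ('CG means "no Climategate-style dishonesty").
    p_agw_given_cg     = 0.50   # her P(AGW | CG)
    p_agw_given_not_cg = 0.90   # her P(AGW | 'CG)
    p_cg, p_not_cg     = 0.30, 0.70

    old_p_agw = p_agw_given_cg * p_cg + p_agw_given_not_cg * p_not_cg
    print(round(old_p_agw, 2))   # 0.78, her 78%

    # Once the dishonesty is actually observed, the mixture collapses to the first term:
    new_p_agw = p_agw_given_cg
    print(new_p_agw)             # 0.50, lower than 0.78 unless P(AGW|CG) exceeds the old mixture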

Sonic Charmer December 3, 2009 at 3:54 am

The fact that posts such as Justin’s and Allison’s haven’t put this issue to bed and gotten Tyler to see his error leads me to believe he’s confusing something fundamental.

I think it may be the fact that, as some have pointed out, there are tacitly two propositions spoken of interchangeably as having a ‘p’, and they are not the same:

(a) a scientist will risk his reputation and use fraud if he ‘really believes’ there’s a problem

(b) AGW really is a problem

Tyler seems to construct a working argument for p(a) > 0 (but works too hard to do so, because it’s not as if p(a) > 0 is something that anyone actually ever disputed). But then via sleight of hand he pretends that this is equivalent to p(b) > 0 (or doesn’t see that it isn’t). Not only that but he pretends the latter would necessarily be a ‘revision upward’, which is really strange (it seems to confuse a number with its derivative: ‘the baseball is 1mm above the ground, 1mm is greater than 0, therefore the baseball is going upwards’).

Tyler, when you speak of ‘p’, which proposition’s probability are you thinking of, (a) or (b)? And why do you think that proving p(whatever)>0 means a ‘revision upwards’?

Andrew December 3, 2009 at 4:07 am

That this actually happened at least one time also proves that ‘we’ (and I’m using “we” in the sense of ‘apparently everyone on the planet except me’) haven’t cared enough about the integrity of the process and let it happen. If ‘we’ really believe there is a problem are we more likely to defraud ourselves?

The first (and only?) signal ‘we’ should always be sending to scientists is “get it right.” We don’t actually need their data before making a decision if we need to move fast, btw. What have ‘we’ been signaling?

Nigel December 3, 2009 at 8:01 am

>> 1. “These people behaved dishonorably. I will lower my trust in their opinions.”
Another response, not entirely out of the ballpark, is: 2. “These people behaved dishonorably. They must have thought this issue was really important, worth risking their scientific reputations for. I will revise upward my estimate of the seriousness of the problem.” <<

One problem is that you were conflating two issues - whether AGW is happening, and if happening, how serious the consequences are.
Both 1 and 2 are compatible with each other - AGW can be less likely, but at the same time more scary if it is happening. (For example, I’d be happier to accept a 90% chance of breaking my leg than a 15% chance of death in the next 12 months.)

Secondly, the analysis doesn’t address the detail of ‘climategate’ – just that scientists behaved ‘dishonourably’. You are applying it to an incomplete set of facts, when a more complete set is available.
It is the detail which is scientifically damning.

And no, I don’t have an axe to grind. I still think that AGW is more probable than not, though my confidence is considerably less than it was.
There are also very plausible economic reasons for developing alternate sources of energy – particularly mass solar – in any event.

Jay Daniel December 3, 2009 at 2:06 pm

P > 0. But this doesn’t tell us anything. To use a very similar example, creationists regularly misrepresent scientific evidence to convince the faithful that science supports creationism. I suspect that many, if not all, of them do this because they REALLY BELIEVE IN CREATIONISM. But this does not mean that creationism is more likely true.

Neither does climate scientists’ strong belief in climate change make it more likely that climate change is true. It may be true, but the beliefs of those invested in a hypothesis are rarely a good indicator of the strength of the evidence itself — particularly when all sorts of religious, ethical, and political interests are implicated (e.g., Fundamentalist Christianity, Environmentalism).

Paul December 4, 2009 at 2:50 pm

What about ‘good vs. evil’ thinking is so wrong? Good scientists are good because of their process, bad scientists are bad because of their process. The process directly impacts the value of a scientific conclusion. Good scientists ought to be believed because they are good at science, and bad scientists disbelieved because they are bad at science.

The bad scientists at East Anglia CRU are only evil because of the magnitude of the policy issues that turn upon their bad science, which has the same status as deliberate lies. Consider two liars: Falstaff and Iago. Iago is evil, Falstaff not, or much less so. Consequences matter.


