Cancer and Statistical Illusion

The cover of this month’s Wired promises "The Truth About Cancer" but the article inside is a tissue of misleading statistics and faulty logic.  The article begins with fancy graphics telling us "If we find cancer early, 90 percent survive" but "If we find cancer late, 10 percent survive." And this:

Find the disease early "and the odds of survival approach 90 percent…This reality would seem to make a plain case for shifting resources toward patients with a 90 percent, rather than a 10 or 20 percent, chance of survival."

Thus, the opening block of text commands, "Scientists should stop trying to cure cancer and start focusing on finding it early.  It’s the smart way to cheat death." 

The fallacy in all of this is painfully easy to spot. If we measure survival, as these studies do, with a 5- or 10-year survival rate, then obviously people whose cancers are detected early will survive longer, as measured from detection, than people whose cancers are detected late, even if treatment does nothing at all.
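To make the illusion concrete, here is a minimal Python sketch of my own (toy numbers, not from the article or any study it cites). The disease course is identical in both groups and every patient dies at the same age, yet the measured five-year survival rate jumps from 0 to 100 percent simply because the clock starts earlier:

# Lead-time bias in a toy cohort: every patient dies of the cancer at
# age 67, regardless of treatment. Only the age at which the tumor is
# detected differs between the two groups.

def five_year_survival_rate(ages_at_detection, age_at_death=67):
    """Fraction of patients still alive five years after detection."""
    alive_at_five = [age_at_death - a >= 5 for a in ages_at_detection]
    return sum(alive_at_five) / len(alive_at_five)

early_group = [61] * 1000   # found by screening at age 61
late_group = [65] * 1000    # found from symptoms at age 65

print("5-year survival, early detection:", five_year_survival_rate(early_group))  # 1.0
print("5-year survival, late detection: ", five_year_survival_rate(late_group))   # 0.0
# Every patient in both groups dies at exactly age 67: the measured
# "survival" improves, but no one lives a day longer.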

The key question is whether people who are treated early survive longer than people whose cancers are detected early but who are not treated.  In Thomas Goetz’s long article there is not a single piece of evidence which demonstrates that this is true.  Indeed, quite the opposite.  About 9 pages into the article, after the jump, we find this about CT scans for lung cancer:

As with the Action Project, these studies found that, yes, CT scans detected a huge number of early cancers–10 times as many as they would expect to find without scanning. In that regard, the scans did their job as a screening test. And as expected, the number of surgeries based on those diagnoses jumped. But when Bach looked at the resulting mortality rates, he found essentially no difference between those who received a CT scan and those who had not. Despite the additional surgeries, just as many people were dying as before.

Nowhere does the author mention that this finding invalidates just about everything he has told us in the first eight pages.

Addendum 1: Do note that I have nothing against early detection and I am not claiming that it never works. My problem is with misleading statistical analysis.

Addendum 2: Careful readers will note that this is an almost perfect example of the economicitis fallacy that I blogged about late last year.

Comments

Let's also recall that "cancer" is not a homogeneous concept (i.e., "the disease").

You can easily survive late-stage colorectal cancer, so long as the tumor is still encapsulated. Lung cancer is still almost always a death sentence, no matter how early it is detected.

Something to keep in mind when Obama & Co. talk about "jobs" -- as if all "jobs" are the same.

Fine post. This is an important topic I wish I knew more about.

Related is the danger that comes from early diagnosis - unnecessary but risky treatment. There has been some publicity about that problem in Canada recently. Combine the two and the medical industry has some serious explaining to do.

Alex, this is a great observation; it's so obvious but it never occurred to me. Did you invent this, or did you read someone else making the point? Also, have you emailed any doctors about this? Probably not the author of the article, since he would be predisposed to dismiss it as nitpicking, but other people with no large stake in the outcome (besides saving lives!).

I don't know if you've heard of this one, but Mary Meyer (at U of Georgia, I think) claims that the official statistics on how many lives airbags save are wrong because of the way the NHTSA performed the analysis. (They looked only at crashes involving fatalities, rather than all crashes.) Here is a story about it. The NHTSA says she's nuts and that airbags do save lives on net. (Note that Meyer is saying they are dangerous for everybody, not just petite women and children.) If you like mistaken statistical arguments, you might check it out.

While I am an actual, honest to god subscriber to Wired, and I love the magazine, I would say that this article is par for the course with them. I would summarize their type of journalism as "aggressive ignorance". Get a small bit of knowledge, and then blow it all out of proportion by extrapolation.

Reminds me of a stat in American football: whichever team kneels down the most wins the game. :)

To the unnamed commenter at Jan 7, 2009 9:21:50 AM:

The idea is that we need to think about measuring something more fixed than "survival" -- something like "age at death" instead.

For example, if you die at age 40, then it doesn't matter whether the cancer was detected at age 20 and you "survived" for 20 years, or whether it was detected at age 39 and you "survived" for 1. The point is that you still lived to be 40 in either case.

But if it can be shown that people with early detection live to be 41, whereas people with late (or no) detection live to be 39, then you can make the case that early detection extends life expectancy by 2 years.

This would be the more interesting result.
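A minimal Python sketch of the comparison described above, with made-up numbers purely for illustration: compare mean age at death across the two groups, rather than time "survived" since detection.

# Hypothetical ages at death for two groups (invented numbers, not study data).
early_detection_deaths = [41, 40, 42, 41, 39, 43]   # detected and treated early
late_detection_deaths = [39, 40, 38, 39, 41, 37]    # detected late or not at all

def mean(xs):
    return sum(xs) / len(xs)

gain = mean(early_detection_deaths) - mean(late_detection_deaths)
print("Mean age at death, early detection:", mean(early_detection_deaths))  # 41.0
print("Mean age at death, late detection: ", mean(late_detection_deaths))   # 39.0
print("Estimated life extension in years: ", gain)                          # 2.0
# Only a positive gap in age at death (not a longer interval between
# detection and death) would show that early detection extends life.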

Early detection is not the issue. It might be fine to detect something early, but the choice given to the individual after early detection is the issue. If the individual chooses not to undergo a risky treatment, his or her cost for health care can go up enough to force a rethink of the treatment in the first place. I think of this as a form of monopoly where you don't have a choice. As long as the medical profession and health care providers don't offer all the choices to individuals, we will be faced with factors mostly out of our control that decide how we live our lives.

SUGGESTED READING

http://cancertutor.com/

I came across this same point in my cancer reading a couple years ago.

Every time I read the same critique I ask myself "can they really be this bad" and the more I read my answer is "yes."

If you had told me that the world was like this when I was in high school, I might have gone ahead and killed myself, or chosen to major in economics.

To the poster concerned with alerting the medical mainstream: this concept - lead time bias - as well as the myriad other problems that can affect medical research are well understood and appreciated by epidemiologists and others in the medical community. Lead time bias, length time bias, selection bias... are all taught in biomedical statistics courses, medical school, and epidemiology courses. Quality research generally avoids or accounts for such biases. Though, to be sure, there is plenty of poor quality research being published!

I understand the statistical illusion, but it's silly to imply there is a lack of evidence that early detection/treatment improves the odds of survival. Surgery, for example, can be very effective against cancer if the cancer is found early enough that you can cut it out without dying. If Bob Marley had had his cancerous toe amputated, he'd still be with us. Radiation can be effective against a limited spot, too, but it can't be used against cancer that has spread widely, which leaves only chemotherapy, so now you are down to one method.

Thomas Goetz (and others) --

Alex claims that none of the studies you cite in your article support your thesis. Can you please give the citation for an epi study that appropriately adjusts for "lead-time" bias? Not being an epidemiologist, I'd like to be sure that what they view as lead-time bias is actually addressing the concern that Alex raised.

Roger

There was a really nice discussion of this whole issue on Respectful Insolence (a blog written by a surgeon/medical researcher whose area of research is breast cancer). You can find the first part at

http://scienceblogs.com/insolence/2007/04/detecting_cancer_early_part_1_more_compl.php

The series of posts is quite good, and readable with some knowledge of statistics and little knowledge of medicine. (I speak from experience here.)

http://dartmed.dartmouth.edu/summer05/html/hunting.php

These guys are looking closely at the public health issues of "early detection":

"Get screened!" "Find it early!" When it comes to cancer, these dictums are considered self-evident. But what if getting screened for cancer and finding cancer ever-earlier does not save lives? What if too much probing does more harm than good? Several DMS physicians have been asking provocative questions like these for more than a decade.



Comments for this post are closed