Cancer Sucks
After my article showing that prostate cancer screening with the PSA test does more harm than good, I was asked to follow up with an article looking at breast cancer screening with mammography. That turned out to be easier said than done, because virtually all studies of breast cancer screening only report the effect of screening on breast cancer mortality, not on overall mortality.

As I've discussed several times before on this blog, mortality from a specific disease is a meaningless statistic when trying to work out whether an intervention is life saving. The only statistic that matters is overall mortality. If an intervention decreases your risk of dying of breast cancer by 10%, but increases your risk of dying of something else by 10%, you probably wouldn't be interested in that intervention. People care about whether they are dead or alive, not about which cause of death is listed on their death certificate.

Unfortunately, a lot of scientists don't seem to understand this very basic concept, and so continue to produce meaningless studies showing that intervention x decreases the risk of dying of y, without any thought as to what the effect on overall mortality is.

When it comes to screening, there is a second problem, and that is that screening generates a lot of false positives. A false positive is when someone is told that they have a disease when in fact they don't. The problem with false positives is that they lead to follow-up interventions, which can sometimes cause physical harm and even death.

In the case of prostate cancer, the initial follow-up intervention is prostate biopsy, which carries with it a risk of acute prostatitis and sepsis (which can be fatal). After the biopsy, there is a significant risk that the person with a false positive diagnosis will have surgery to remove the prostate or radiation therapy to destroy it. The problem with these interventions is that they often cause sexual impotence and urinary incontinence.

I've met several patients who have had their lives ruined by the treatment they received for prostate cancer. An intervention that may, or may not, have prolonged their lives.

In 2012, the Cochrane Collaboration performed a systematic review to see whether breast cancer screening saves lives, and whether any benefit in terms of lives saved outweighs the human cost in terms of false positives that lead to unnecessary surgery and radiotherapy.

The systematic review identified 11 randomized controlled trials of breast cancer screening, of which three were excluded due to quality issues. Two of these were excluded because they were small trials looking at several interventions, of which breast cancer screening was just one. One study was excluded because it removed patients who died in the screening group after randomization, but didn't do the same for patients in the control group. Obviously if you remove deaths from the screening group but don't do the same for deaths in the control group, this will make screening look better than not screening. It is a form of scientific malpractice.

That leaves eight trials that were included in the review, with a total of over 600,000 women. The trials had slight variations in the ages at which women were included, but none looked at women under the age of 40, and most had an upper cut-off of 65 or 70, so that is the age range for which the results of these studies are applicable.

Three of the eight trials were determined to be adequately randomized, while five were determined to have sub-optimal randomization (i.e. the randomization of participants to either the screening group or the control group was done in such a way that there was significant risk of the results becoming biased in favor of screening). Of the five with sub-optimal randomization, one was determined to be so flawed in how it randomized patients that its results were analyzed separately and not included in the main analysis. So only seven studies were actually included in the analysis.

Let's look at the results.

First we can look at breast cancer deaths, just because that is what all the studies had as their primary end point, even though it tells us nothing about whether breast cancer screening actually saves lives overall.

When all seven studies were included in the analysis, there was a 19% reduction in the relative risk of dying of breast cancer with screening during 13 years of follow-up. The result was statistically significant. However, when only the three studies with adequate randomization were included, that decreased to a 10% relative risk reduction with screening, which was no longer statistically significant.

So, even if we look at the highly flawed metric of breast cancer deaths, we can't actually say with any certainty that screening decreases them. Now let's get to the thing that actually matters: overall deaths.

If we look just at the three adequately randomized trials, there was a 1% relative reduction in the risk of dying by the 13 year mark. However, this marginal reduction was nowhere near statistically significant. In other words, it was likely due to chance. Even if we look instead at the sub-optimally randomized trials with a high risk of bias, the reduction in overall mortality was still only 1%.

Just to give some perspective on what this means in terms of absolute numbers of lives saved, let's assume that the 1% reduction in overall mortality is real and not just the result of chance. Out of 318,515 women in the control group, there were 747 deaths over 13 years of follow-up. A 1% reduction in mortality would thus mean roughly seven fewer deaths over 13 years among those 318,515 women. In other words, you would need to screen over 40,000 women to save one life.
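The arithmetic above can be sketched as a quick back-of-the-envelope calculation. The counts are the review's figures as quoted; the 1% relative risk reduction is the (statistically non-significant) point estimate we are generously assuming to be real:

```python
# Number needed to screen (NNS) to prevent one death over 13 years,
# assuming the 1% relative reduction in overall mortality is real.
n_control = 318_515          # women in the control groups
deaths_control = 747         # deaths over 13 years of follow-up

baseline_risk = deaths_control / n_control            # risk of death without screening
relative_risk_reduction = 0.01                        # assumed 1% relative reduction
absolute_risk_reduction = baseline_risk * relative_risk_reduction
nns = 1 / absolute_risk_reduction                     # women screened per death prevented

print(round(nns))  # roughly 42,600 -- i.e. "over 40,000 women to save one life"
```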

What can we conclude from this?

The highest quality evidence suggests that breast cancer screening does not save lives, or at best, has an extremely marginal effect on mortality. In which case, that extremely small potential reduction needs to be weighed against the harms caused by false positives and also against the cost of screening.

So what are the avoidable harms caused by breast cancer screening?

Apart from the anxiety induced by being told that you have breast cancer, a diagnosis generally leads to surgery and possibly also to radiotherapy. Both are potentially disfiguring, while radiotherapy can cause damage to the heart and lungs (since these organs also get hit by some of the radiation). Occasionally this results in serious complications like lung cancer, pulmonary fibrosis, coronary artery disease, and heart failure.

In the seven included studies, screening resulted in a 35% increase in the relative risk of surgery, and a 20% increase in the relative risk of radiotherapy. Both these differences were highly statistically significant. Since there was no statistically significant effect on overall mortality, we can assume that these increases represent one of two things (possibly a bit of both):

1. A lot of healthy people had surgery and radiotherapy that they didn't need.

2. The harms caused by surgery and radiation therapy were such that, overall, people didn't live longer than they would have without screening.

If you prevent one person from dying of breast cancer but cause another to die of lung cancer, you've really gained nothing. Which is why we should only ever look at overall mortality, never at mortality from a specific cause.

Based on their results, the authors of the review determined that, for every woman who actually has breast cancer and has her life saved through screening, at least ten healthy women will be falsely diagnosed as having breast cancer and treated for it.

There is another aspect to consider here, and that is cost, because screening isn't exactly cheap. A study published in JAMA Internal Medicine in March of this year found that mammography screening in the US costs about 350 dollars per woman screened. Obviously the cost will vary between countries, but even if we assume a much lower cost of 100 dollars per woman screened, and that screening really does cause a 1% reduction in deaths, screening would cost four million dollars per life saved.
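That four-million-dollar figure follows directly from the number needed to screen. A minimal sketch, using the article's generous assumptions ($100 per woman screened, a real 1% mortality reduction, and the control-group counts quoted earlier):

```python
# Cost per life saved = cost per woman screened x women screened per death prevented.
cost_per_woman_screened = 100           # generous assumption (US estimate is ~$350)
nns = 1 / ((747 / 318_515) * 0.01)      # ~42,600 women screened per death prevented

cost_per_life_saved = cost_per_woman_screened * nns
print(f"${cost_per_life_saved:,.0f}")   # a bit over four million dollars
```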

That is an enormous amount of money, and from a societal perspective, it is not ethically defensible. The reason it isn't defensible is that you can save many more lives with other much cheaper interventions. The more money you spend on one thing, as a society, the less you have left to spend on other things.

In the UK, the National Health Service (NHS) is not supposed to spend more than 30,000 pounds per year of life saved by an intervention. Let's assume that the average age of the woman whose life is saved by screening is 50 years old, and that she will on average live another 35 years if her breast cancer is caught in time to treat it. And let's assume breast cancer screening only costs 100 dollars per person (remember, in the US, the cost is actually 350 dollars). That would mean that catching her breast cancer in time to save her would cost roughly 114,000 pounds per year of life saved (4,000,000 / 35 ≈ 114,286).
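Spelled out, the comparison against the NHS threshold looks like this (treating dollars and pounds as roughly interchangeable, as the article does for simplicity):

```python
# Cost per year of life saved, with the article's intentionally generous numbers.
cost_per_life_saved = 4_000_000     # ~four million per life saved
years_of_life_gained = 35           # assumed: life saved at age 50, living to 85
nhs_threshold = 30_000              # NHS limit per year of life saved

cost_per_life_year = cost_per_life_saved / years_of_life_gained   # ~114,286
times_over_threshold = cost_per_life_year / nhs_threshold         # ~3.8x the limit
```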

I'm being intentionally generous towards breast screening in the numbers I'm using here - in reality the number of years saved is probably much lower, and the cost much higher. But even with these intentionally generous numbers, the cost is still far beyond what is considered acceptable at a societal level for other diseases.

So what can we conclude from all this?

As with prostate cancer screening through the PSA test, the probability of harm from breast cancer screening is much bigger than the probability of benefit. You are at least ten times more likely to get treatment that you don't need than you are to get treatment that you do need. And it is highly questionable whether breast cancer screening has any beneficial effect whatsoever on mortality - if it does, the effect is tiny.

And from a societal perspective, the harms are definitely bigger than the benefits, since a huge amount of money is plowed into a highly questionable intervention which could instead have gone to interventions that we know save lives at a fraction of the cost.

You might also be interested in my article about scientific method in health science, or my article about whether statins save lives.