Surprise! Shock!
I know, dude. We're all freakin' out.
The mainstream scientific community apparently hates low-carb diets. So do the mainstream media, who take every opportunity to parrot whatever 'study' gets published that can provide an attention-grabbing click-bait headline, trashing anything that actually goes against mainstream dietary dogma. Last week it was a new 'study' published in the Lancet, and despite the fact that it is a horrible piece of pseudo-scientific nonsense, it was dutifully splashed across the headlines of pretty much all mainstream media sources. And what was the shocking finding? Low-carb diets will shorten your lifespan. Of course they will.

The study appears to attempt a "balanced" perspective by showing that high-carb diets are also dangerous, as if we didn't know that eating lots of sugar is harmful. The researchers claim their findings show that a moderate approach is healthiest: diets deriving less than 40%, or more than 70%, of their calories from carbohydrates carried a higher risk of mortality.
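(For reference, here's a rough sketch in Python of how a "percent of calories from carbohydrate" figure like those cut-offs gets calculated, using the standard 4/4/9 kcal-per-gram conversions. The gram amounts are made-up examples, not data from the study.)

```python
# Rough sketch: percent of calories from carbohydrate, using the standard
# Atwater factors (4 kcal/g for carbs and protein, 9 kcal/g for fat).
# The gram amounts below are made-up example numbers, not data from the study.

KCAL_PER_GRAM = {"carb": 4, "protein": 4, "fat": 9}

def carb_percent(carb_g, protein_g, fat_g):
    calories = {
        "carb": carb_g * KCAL_PER_GRAM["carb"],
        "protein": protein_g * KCAL_PER_GRAM["protein"],
        "fat": fat_g * KCAL_PER_GRAM["fat"],
    }
    total = sum(calories.values())
    return 100 * calories["carb"] / total

# A fairly typical Western day: ~300 g carbs, 90 g protein, 80 g fat
print(round(carb_percent(300, 90, 80)))   # ~53% of calories from carbs
# A genuinely low-carb day: ~30 g carbs, 110 g protein, 150 g fat
print(round(carb_percent(30, 110, 150)))  # ~6% of calories from carbs
```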

But check out these headlines:
Both low- and high-carb diets can raise risk of early death, study finds
Low-carb diet linked to early death, Lancet medical study says
A Low-Carb Diet May Reduce Four Years of Your Life: Lancet
Low-carb diets associated with lower life expectancy, study suggests
Low-carb diets could shorten life, study suggests

Notice that only one of the headlines (from the dietary-dogma-pushing Guardian, oddly enough) mentioned the high-carb association. That's likely because this was an orchestrated hit-piece on LCHF (low-carb, high-fat) diets, an approach growing in popularity due to its effectiveness for weight issues, multiple health markers, and various disease states.

This is because the western mainstream media is essentially a cheerleader for government dietary recommendations, backed up by government-indoctrinated dietitians and nutritionists ("experts" if you will, including the quotation marks), and their headlines rarely step outside the bounds of those recommendations. Even when they do feature an LCHF approach in their pages, the tone is one of high skepticism or mocking derision, and inevitably includes a quote or two from 'experts' telling everyone how dangerous such an approach would be, and that you shouldn't dare step outside the bounds of acceptable dietary conformity, no matter what the internet tells you.

This study and its associated headlines work by appealing to the middle ground fallacy - or argumentum ad temperantiam if you dig Latin - which asserts that "given any two positions, there exists a compromise between them that must be correct". By painting low-carb diets as 'extreme', and tacking on high-carb diets (which few, if any, actually recommend) as the opposite extreme, they argue for the moderate middle ground. And, conveniently, the government dietary recommendations just happen to fit that moderate position.

To get more conspiratorial, Belinda Fettke over at #isupportgary.com says this story was quite likely planted. Quoting an unnamed public relations expert,
"The simultaneous appearance across a wide variety of news sources is the most obvious reason. It's not just coincidence and since it's a health story with a very particular slant, it is not something journalists just picked up from a newswire. Unfortunately #fakenews is nothing new, and in fact has been quite prominent since newspapers first emerged."

He concluded: "The difference is that, in the past, journos used to at least make an attempt to look unbiased."

But, you say, it comes from a study! In the, from what I'm told, highly reputable Lancet! Obviously it's science. There's no argument here. Either it's true, or we need to start questioning the scientific foundations of our entire culture. Science is the unquestionable truth! Science told me so!

The problem is that nutritional studies can be, and often are, designed to give whatever results are desired, especially when those results are intended to enforce dogma. Most people only read the headlines, and even those who read the entire article rarely, if ever, look beyond it to check its veracity. Dietary research, despite being extremely tenuous by nature (see below), seems to exhibit this tendency more than any other field, and it never fails to generate dramatic headlines. The scientific journals get their publicity, the researchers justify their funding, the dietary authorities get their authority bolstered, the news sites get their hits - everybody wins. Except for those actually wanting some approximation of the truth. But who cares about those assholes?

So, What's Wrong With This Study?

Sigh, where to begin. As with almost all nutritional studies, this study is observational. I've harped on about this in the past - observational studies can only show whether there's an association between the variables studied. They watch a bunch of people, put them into categories based on whatever they're interested in, and try desperately to reduce the effectively infinite number of possible confounding factors. Then they see if two variables happen to correlate (for example, those eating the most kumquats having the most instances of seal finger).

But that's where you stop. Seeing that two things correlate does not mean that one caused the other. It may mean that, but we can't know from an observational study. Until someone starts force-feeding subjects kumquats to see if it causes seal finger, we can't say one way or another whether a causal relationship exists. Maybe kumquats happen to grow only in seal-infested areas. Or maybe the only produce vendor in the region who sells kumquats has a bad case of persistent seal finger and is spreading it among his customers. There are simply too many possibilities.

All this to say: implying causation from two associated variables is really bad science.
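If you like toy examples, here's a minimal simulation sketch (mine, not the study's) of how a hidden confounder can make two completely unrelated variables - kumquat eating and seal finger, say - correlate strongly:

```python
# Minimal sketch (my toy example, not from the study): a hidden confounder
# makes two otherwise unrelated variables correlate strongly, which is
# exactly the trap observational studies can fall into.
import random

random.seed(0)

# The hidden confounder: living near the coast.
coastal = [random.random() < 0.5 for _ in range(10_000)]

# Kumquat consumption and seal-finger risk both depend on coastal living,
# but neither causes the other.
kumquats = [random.gauss(5 if c else 1, 1) for c in coastal]
seal_finger = [random.gauss(5 if c else 1, 1) for c in coastal]

def corr(xs, ys):
    """Plain Pearson correlation, no libraries needed."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n
    sx = (sum((x - mx) ** 2 for x in xs) / n) ** 0.5
    sy = (sum((y - my) ** 2 for y in ys) / n) ** 0.5
    return cov / (sx * sy)

# Strong positive correlation (~0.8) with zero causal link between them.
print(round(corr(kumquats, seal_finger), 2))
```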

Arguing for reform in epidemiologic research, Dr. John P. A. Ioannidis writes:
Assuming the meta-analyzed evidence from cohort studies represents life span-long causal associations, for a baseline life expectancy of 80 years, eating 12 hazelnuts daily (1 oz) would prolong life by 12 years (ie, 1 year per hazelnut), drinking 3 cups of coffee daily would achieve a similar gain of 12 extra years, and eating a single mandarin orange daily (80 g) would add 5 years of life. Conversely, consuming 1 egg daily would reduce life expectancy by 6 years, and eating 2 slices of bacon (30 g) daily would shorten life by a decade, an effect worse than smoking. Could these results possibly be true? Authors often use causal language when reporting the findings from these studies (eg, "optimal consumption of risk-decreasing foods results in a 56% reduction of all-cause mortality"). Burden-of-disease studies and guidelines endorse these estimates. Even when authors add caveats, results are still often presented by the media as causal.
To quote Dr. John Schoonbee, Global Chief Medical Officer at Swiss Re, "even though umbrellas are strongly associated with rain, they do not cause rain!" But most people don't know this, so journalists, and often even the researchers themselves (who really have no excuse), will talk about these correlations as if they're causations. And as long as the word "study" shows up in the headline, people believe it. How many people are dutifully eating their 12 hazelnuts a day, thinking they're adding 12 years to their lives, all because an editor either didn't know, or didn't care, that this likely isn't true?

Food Frequency Questionnaires: The Pseudo-Science Behind Most Nutritional Data

Another issue with the current study is that it, like the majority of nutritional studies, relies on Food Frequency Questionnaires (FFQs). Again, this is something I've written about in the past, but it bears repeating. FFQs have repeatedly been shown to be entirely unreliable in assessing a person's diet. How many eggs have you eaten in the last year? What do you mean you don't know? Maybe you have Alzheimer's. How many times in the last year did you eat salad?

Here's a sample of an FFQ. Take a look and see how rigorous and scientific you think the resulting data would be.

Here's an amusing video to drive the point home further:


In an early version of a manuscript for a paper published in the Journal of Clinical Epidemiology, titled "Controversy and Debate: Memory based Methods Paper 1: The Fatal Flaws of Food Frequency Questionnaires and other Memory-Based Dietary Assessment Methods", the authors take apart the use of participants' recall as a method of data collection:
First, the use of M-BMs [memory-based dietary assessment methods] is founded upon two inter-related logical fallacies: a category error and reification. Second, human memory and recall are not valid instruments for scientific data collection. Third, in standard epidemiologic contexts, the measurement errors associated with self-reported data are non-falsifiable (i.e., pseudo-scientific) because there is no way to ascertain if the reported foods and beverages match the respondent's actual intake. Fourth, the assignment of nutrient and energy values to self-reported intake (i.e., the pseudo-quantification of qualitative/anecdotal data) is impermissible and violates the foundational tenets of measurement theory. Fifth, the proxy-estimates created via pseudo-quantification are physiologically implausible (i.e. meaningless numbers) and have little relation to actual nutrient and energy consumption.
The above paper is worth reading in its entirety, if you are a nerd like me, as the authors make the case that the very foundations of the dietary recommendations given to the public are based on pseudo-science. And considering that the majority of the more than 1,000,000 diet and health studies published since 1946 relied on these M-BMs (Ibid.), their point is well-taken.
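To illustrate what that "pseudo-quantification" looks like in practice, here's a hypothetical sketch - the frequency categories, portion sizes and calorie values are invented for illustration, not taken from any real FFQ - of how a vague recollection gets multiplied out into a precise-looking daily intake number:

```python
# Hypothetical sketch of FFQ "pseudo-quantification" (invented numbers, not a
# real FFQ): a fuzzy memory like "2-4 times per week" is mapped to a single
# assumed frequency and portion size, then multiplied out into an exact-looking
# daily energy intake.

FREQUENCY_PER_DAY = {
    "never": 0.0,
    "1-3 times per month": 0.07,
    "1 time per week": 0.14,
    "2-4 times per week": 0.43,   # midpoint assumption
    "1 time per day": 1.0,
}

# Assumed "standard" portion calories (also assumptions, for illustration only)
FOODS = {
    "eggs":  {"kcal_per_portion": 78},
    "bread": {"kcal_per_portion": 160},
}

def estimated_daily_kcal(responses):
    """responses: {food: frequency category recalled by the participant}"""
    return sum(
        FREQUENCY_PER_DAY[freq] * FOODS[food]["kcal_per_portion"]
        for food, freq in responses.items()
    )

# One participant's fuzzy recollection of the past year...
recall = {"eggs": "2-4 times per week", "bread": "1 time per day"}
print(round(estimated_daily_kcal(recall), 2))  # ...becomes a precise 193.54 kcal/day
```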

Making Claims Beyond Its Reach

Getting back to the current study, one of its most glaring flaws is that, despite its multiple references to low-carbohydrate diets throughout, no participants were actually eating low-carb. As Dr. Zoë Harcombe points out:
Low carb diets have not been studied by this paper. Full stop. The average carbohydrate intake of the lowest fifth of people studied was 37%. That's a high carb diet to anyone who eats a low carb diet. As we will see below, the researchers managed to find just 315 people out of over 15,000 who consumed less than 30% of their diet in the form of carbohydrate. The average carb intake of these 315 people was still over 26%. Not even these people were anywhere near low carb eating. Hence, if you do eat a low carbohydrate diet, don't worry - this paper has nothing to do with you.
So what the researchers were labeling as "low-carb" was not an actual low-carb diet. It was simply the lowest carbohydrate consuming group among the participants; in other words, the lower carbohydrate end of the spectrum of a crappy conventional diet. Anyone comparing the "low-carb diet" of the study participants to what someone is actually eating on a low-carb diet would see there is very little resemblance. So the researchers, and all the journalists who reported on their findings, have been completely dishonest in how they've presented this research. Their findings have nothing to do with people doing low-carb diets. But that won't stop your mom from calling you, all in a tizzy, after reading the headlines, telling you to stop this foolishness and go back to eating bread and Cheerios because she can't handle outliving her children, fergawdsakes. Mission accomplished, Lancet.
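To put rough numbers on that gap (my own back-of-envelope arithmetic, not figures from the study), here's what those carbohydrate percentages translate to in grams per day on a 2,000 kcal diet, compared with commonly cited low-carb and ketogenic targets:

```python
# Back-of-envelope sketch (my numbers, not the study's): what the study's
# "lowest carb" groups look like in grams per day on a 2,000 kcal diet,
# compared with commonly cited low-carb / ketogenic targets.

KCAL_PER_DAY = 2000
KCAL_PER_GRAM_CARB = 4

def carbs_grams(percent_of_calories):
    return KCAL_PER_DAY * (percent_of_calories / 100) / KCAL_PER_GRAM_CARB

print(carbs_grams(37))  # 185 g/day: the study's "low-carb" fifth (average)
print(carbs_grams(26))  # 130 g/day: the 315 lowest-carb participants (average)
print(carbs_grams(10))  # 50 g/day:  a commonly cited ceiling for an actual low-carb diet
print(carbs_grams(5))   # 25 g/day:  typical ketogenic territory
```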

Here's NHS cardiologist, and low-carb diet advocate, Dr. Aseem Malhotra calling this study "fatally flawed" and "a miscarriage of science" to a befuddled BBC presenter, who bravely attempts to hold up the mainstream perspective by appealing to the authority of the Lancet:


And Malhotra isn't the only one publicly calling out this study. In fact, there are so many that it would go beyond tedious to cover them all. To get into the minutiae, Dr. Zoë Harcombe has written two articles breaking down the flaws in the study, effectively showing how devious the authors were in their data manipulation (see: Low, moderate or high carbohydrate? and Low carb diets could shorten life (really?!)). Not enough? Chris Kresser has also done a thorough response. As he says of the study, "The devil is always in the details, but details aren't sexy and don't generate clicks."

And here's a quick synopsis from Dr. Georgia Ede, LCHF and carnivore diet advocate, who was quite vocal on social media in pointing out the study's flaws (note: RCT = randomized controlled trial):

[Image: Dr. Georgia Ede's quote]
So, at the end of the day, we have a "study" that relies on notoriously inaccurate FFQs, that can't imply causation because it's observational, and that makes claims about the danger of low-carbohydrate diets despite never actually studying low-carbohydrate diets. But they got their headlines, which is probably exactly what they were after from the beginning: discouraging people from trying an increasingly popular dietary approach that might actually prove very beneficial to their health. Notice that none of the multiple studies showing benefits of LCHF dietary approaches have received even close to the same traction in the media as the current one (and many like it; at least one such study per year for the last decade, by Chris Kresser's count).

Call me cynical, but I think the ultimate purpose of studies like this is to spread propaganda among a public that isn't paying attention, serving as gatekeepers to keep people from veering away from mainstream dietary recommendations and improving their health in the process. They serve to protect the monopolistic structures held in place as 'the unquestionable authorities' on dietary advice, while at the same time protecting the financial interests of their Big Food sponsors. The whole thing is a racket.

When your mom calls, freaking out about how you're killing yourself slowly from a bread deficiency, send her this article. Maybe she'll at least read the headline.