First, to just mention a few points tangential to that: Most nutritional epidemiology is among the biggest jokes in the field -- not as bad as tobacco research, but worse than most other subfields. The big cohort study projects, like the source of this particular study, are notorious for publication bias. In other words, had they not gotten the "right" answer, there is a good chance the result would have been censored, so on average the results overstate the case for the consensus beliefs.
Additionally, almost all nutritional epi is based on "food frequency questionnaires", which ask dozens or hundreds of questions about what someone eats and are notorious for having a huge amount of measurement error (i.e., the data might be useful, but it is always quite wrong). Have you ever noticed that almost every such study takes pains to point out it was a validated food frequency questionnaire? Notice that they never tell you what this impressive-sounding adjective means. (Hint: it means that one time they checked to see whether the instrument produced results close to those from some more careful measurement method; notice that they never tell you how that checking worked out.) One of the more inside/subtle jokes in my "tobacco candy" research parody was a dig at the silly term "validated food frequency questionnaire".
That said, the observation that meat seems to be bad for your longevity, and red meat worse than average, has been replicated enough that it is unlikely to be wrong. Indeed, despite the new round of headlines, the new study really told us nothing new -- which means that it stands a much better chance of being approximately right than something that made a novel claim. So, for today, take the result as True and see how people did at explaining what it means.
The main result was that eating red meat increases the hazard rate for dying by 13% for each serving-per-day that you eat. (I am going to set aside the fact that fixating on the exact 13% implies far more precision than the research provides -- a common error of those who do not understand study error.) Note that this is very different from a lot of the results you see in epidemiology in several ways:
- That "hazard" thing means that whatever the risk of having died would have been this year or next year, it is increased by 13%, and that continues for future years. It does not just mean that the chance of some bad thing occurring sometime in your life has increased by 13%. (Note: usually studies that calculate this "hazard ratio" just assume that this pattern -- the same x% change every year -- and force the data to fit it. In the present case they actually tested that assumption but allowing the curve to wiggle, and while it was clearly not a perfect fit, it was not terribly wrong.)
- Often the risks you hear about are an increase in the chance of getting one particular disease, frequently one that is rather rare, while this is an increase in the risk of death from all causes.
- The reported change in risk was for a realistic level of change in behavior that someone could make. Indeed, they could move by multiple increments, like going from 3 servings down to 1, for two increments of benefit. This contrasts with many studies that only report the comparison of those with the greatest exposure to those with the lowest exposure (ignoring the majority of the population in between), so someone could only see the theoretical change described if they were at the worst extreme and somehow could move clear to the other extreme.
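Here is that sketch: a minimal toy model of what a constant hazard ratio of 1.13 does year by year. The baseline annual death risks are invented for illustration (a rough Gompertz-style curve); only the 1.13 multiplier comes from the study.

```python
# Toy illustration of a constant hazard ratio of 1.13.
# The baseline annual death risks below are invented, NOT taken from
# the study -- only the 1.13 multiplier is.

HR = 1.13  # reported hazard ratio per daily serving of red meat

def prob_die_by(age_end, hazard_ratio=1.0, age_start=40):
    """Chance a 40-year-old dies before age_end when every year's
    baseline risk is multiplied by hazard_ratio."""
    alive = 1.0
    for age in range(age_start, age_end):
        base_risk = 0.00018 * 1.10 ** (age - 30)  # invented baseline curve
        alive *= 1.0 - min(1.0, base_risk * hazard_ratio)
    return 1.0 - alive

for age in (50, 65, 80):
    print(f"die by {age}: {prob_die_by(age):.1%} baseline, "
          f"{prob_die_by(age, HR):.1%} with the extra serving")
```

With these toy numbers, the baseline chance of dying by 65 happens to come out in the neighborhood of the 4.4% figure used further down; the point of the loop is the first bullet above: the 13% multiplies every single year's risk, so the absolute gap keeps compounding as the years accumulate.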
So props to the BBC for taking it seriously and trying to put it in perspective. Too bad about the answer they got:
The easiest way to understand it is to think of how this might affect two friends who live very similar lives, according to David Spiegelhalter, a Cambridge University biostatistician, and the Winton Professor of the Public Understanding of Risk.

So far, that really adds nothing, other than maybe explaining "hazard ratio" and telling you what "a serving" is, if that jargon was neglected in a news report. So, continuing:
Imagine that the two friends are men aged 40, who are the same weight, do the same amount of exercise and do the same job. The only difference between them is that one eats an extra portion of red meat every day - an extra 85g, or 3oz.

"Let's say that every work lunchtime one of them had a hamburger and the other didn't.

"What the study found is that the one who likes the meat had a 13% extra risk of dying. They're both going to die in the end, but one has got this extra annual risk of dying."
But what does that extra risk amount to in practice - for these two average people? The paper doesn't say. Spiegelhalter has been working it out.
"The person who eats more meat is expected to live one year less than the person who doesn't eat so much meat. You'd expect the 40-year-old who does eat the extra meat to live, on average, another 39 years, up to age 79, and the person who doesn't eat so much meat, you'd expect him to live until age 80."
So all those headlines, and it turns out we are talking about whether you might live to age 79 or 80. Maybe you feel willing to sacrifice that year in order to enjoy a life full of roast beef and steak sandwiches.

Unfortunately, that simplification, though tempting, is not a very useful way to think about this risk. Indeed, it is quite misleading. Someone might well make the suggested choice, to sacrifice their 80th year. But that is not the choice. The choice includes having a 13% greater chance than your peer of losing your 50th year (and every one thereafter). Obviously this is still unlikely -- a 13% increase in dying at that age still works out to a small absolute increase, because it is merely 1.13 times a fairly small risk -- but it might result in different motivation. Most people are a lot more willing to give up a year of old age than to risk the same expected value (statistics talk for "the probability averages out to the same total") of loss across their middle and old age. Whatever the merits of that preference, it is the predominant preference, so saying "don't over-worry about it -- it is just one fewer year of retirement" understates the real risk.
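For concreteness, the same toy baseline curve from the earlier sketch can be asked about both framings: the extra absolute risk loaded onto your 50th year, and the expected-years version that produces the "one year" headline. Everything except the 1.13 is invented; with this particular curve the life-expectancy gap comes out a bit over a year, in the same ballpark as Spiegelhalter's figure.

```python
# Two framings of the same 1.13, using the same invented baseline
# curve as the sketch above (nothing here is from the study itself).

HR = 1.13

def base_risk(age):
    """Invented baseline chance of dying during a given year of age."""
    return 0.00018 * 1.10 ** (age - 30)

# Framing 1: the absolute extra risk added to your 50th year alone.
extra_at_50 = base_risk(50) * (HR - 1.0)
print(f"extra chance of dying in your 50th year: {extra_at_50:.3%}")

# Framing 2: expected years of life lost -- the "79 vs 80" version.
def expected_remaining_years(hazard_ratio=1.0, age_start=40, age_max=110):
    """Expected further years of life: the sum of survival probabilities."""
    alive, total = 1.0, 0.0
    for age in range(age_start, age_max):
        alive *= 1.0 - min(1.0, base_risk(age) * hazard_ratio)
        total += alive
    return total

gap = expected_remaining_years() - expected_remaining_years(HR)
print(f"life expectancy cost of the extra serving: about {gap:.1f} years")
```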
But the story is not over yet. The BBC and their consultant go on to propose an error that probably tends in the other direction, to make up for this:
But Spiegelhalter says there is another way to look at the statistics, which might make the issue seem more urgent. That one year off the life of this 40-year-old hypothetical burger eater is equivalent to losing half an hour a day.

That may well make it seem more urgent for some people -- but too much so. Someone who is urgently trying to succeed in school, launch a business, or be a single parent might rationally consider half an hour a day right now to be incredibly precious, such that they would gladly borrow it from later in their life. (I have certainly had those years. You?) The loss of half an hour per day would thus be enormously more daunting than a 13% hazard ratio, let alone losing their potential last year.
"On average, when he's sitting eating his extra burger, that person is losing half an hour of life because of that meal. On average, it's equivalent - scaled up over a lifetime - to smoking two cigarettes a day, which is about half an hour off your life.
On the other hand, many of us who are pretty secure in our day-to-day performance might choose to trade a half hour per day, or even several hours, for getting to see how the next generations turn out for a few extra years (assuming our healthfulness over the years averages out the same). So this simplification does not work either, overstating the loss for someone who is intensely busy with important stuff, but perhaps understating it for others.
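As for the half-hour figure itself, it is just the one lost year spread over the remaining years of daily lunches. A quick check, assuming roughly 40 years of the habit starting at age 40:

```python
# One lost year of life, expressed per daily burger over ~40 years.
minutes_in_a_year = 365.25 * 24 * 60   # ~525,960
daily_meals = 40 * 365.25              # assumed ~40 years of lunches
print(f"about {minutes_in_a_year / daily_meals:.0f} minutes per burger")
```

That lands at about 36 minutes, which is where the "half an hour" comes from.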
The real mistake here, I believe, is assuming that this is something that people cannot understand if you tell it straight. Many reported risks do require some kind of "professor of the public understanding of risk" treatment, because they are of a magnitude that people cannot grasp. People do not understand how truly small something like "a 54% increase in lifetime risk of esophageal cancer" is, and so resorting to one of these misleading simplifications might be an improvement over "ooh, 54% is a big number! -- that must be bad!". Even more extreme are environmental exposure risks that are down in the one-in-a-million range; telling someone "the total lifetime risk from this adds up to losing a minute and a half off the end of your life" is useful because it transforms "there is a risk!!!!" into the rational "oh, never mind."
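The conversion behind that kind of reassurance is simple expected value: the probability of the bad outcome times the life it would cost if it happened. A sketch, with both inputs being illustrative guesses chosen to land near the quoted minute and a half:

```python
# Expected life lost = (lifetime risk from the exposure)
#                    x (years lost if it actually kills you).
# Both inputs are illustrative assumptions, not data.
lifetime_risk = 1e-6          # the "one-in-a-million" exposure
years_lost_if_fatal = 3.0     # assumed; picked to land near 1.5 minutes
minutes_lost = lifetime_risk * years_lost_if_fatal * 365.25 * 24 * 60
print(f"expected loss: about {minutes_lost:.1f} minutes")
```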
But the red meat risk is actually big enough that people can understand the numbers and might legitimately care about the difference. If you tell someone "based on your demographics, there is an X% chance you will die before age 65, and if you eat one fewer serving of meat per day, it will drop to about (X/1.13)%", those are numbers someone can understand. They would be on the order of 4.4% and 4%. Ok, not everyone will be able to understand that, but anyone who cannot probably cannot make much sense out of the suggested equivalencies either.
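That straight-told version is nothing more than a division. (Strictly, dividing a cumulative risk by a hazard ratio is only a small-risk approximation, since the 1.13 applies to each year's hazard rather than to the total, but at these magnitudes the difference is negligible.)

```python
# The straight-told framing: your chance of dying before 65, with and
# without the extra daily serving. The 4.4% is the illustrative figure
# from the text, not a computed demographic value.
risk_with_extra_serving = 0.044
risk_without = risk_with_extra_serving / 1.13  # small-risk approximation
print(f"{risk_with_extra_serving:.1%} with -> {risk_without:.1%} without")
```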
So, if the BBC and their Cambridge consultant cannot figure out how to sum that up, who can? Credit to Rob Lyons at Spiked:
The authors claim that 9.3 per cent of deaths in men and 7.6 per cent of deaths in women could be avoided by eating little or no red meat. To put that into some back-of-an-envelope statistical perspective: multiplying that 9.3 per cent by the 20 per cent who actually died [by age 75 during the course of the study] shows that about 1.8 per cent of red-meat eaters would die by the time they were 75 because of their meat-eating habit. Even if that claim were absolutely accurate (and even the authors call it an estimate), would you really give up your favourite foods for decades on the slim possibility of an extra year or two of old age?

Often the answer is "yes", of course, despite the implication of the phrasing. Indeed, if you are going to change your behavior to try to live longer, as many people do, this change may well have the greatest benefit:effort ratio available. But that aside, if you ask the question this way (and perhaps extend the same calculation to give the numbers for ages 65 and 85 also), you are answering the right question when you make the choice.
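For what it is worth, Lyons' envelope arithmetic checks out; both inputs are the figures quoted above:

```python
# Rob Lyons' back-of-the-envelope: attributable fraction of deaths
# times the share of the cohort that died by 75 during the study.
attributable_fraction = 0.093   # authors' estimate for men
died_by_75 = 0.20               # share of the cohort that died
print(f"{attributable_fraction * died_by_75:.2%}")   # 1.86% -- his "about 1.8"
```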