
Reporting Preliminary Studies

by Steven Novella, Feb 14 2011

A recent study, presented as a poster at the American Stroke Association International Stroke Conference, found a 61% increase in risk of stroke and cardiovascular disease among survey respondents who reported drinking diet soda compared to those who drank no soda. The study has resulted in a round of reporting from the media, and in turn I have received many questions about the study.

Frequent readers of this blog should have no problem seeing the potential flaws in such a study. First – it is an observational study based upon self-reporting. At best such a study could show correlation, but by itself cannot build a convincing case for causation. Perhaps people who are at higher risk of cardiovascular disease and stroke, for whatever reason, are more likely to choose diet sodas because they are trying to avoid unnecessary calories. Questions that should immediately come to mind – what factors were controlled for and how was the information gathered? According to an ABC report:

The researchers used data obtained through the multi-ethnic, population-based Northern Manhattan Study to examine risk factors for stroke, heart attack and other vascular events such as blood clots in the limbs. While 901 participants reported drinking no soda at the start of the study, 163 said they drank one or more diet sodas per day.

The study also controlled for “smoking, physical activity, alcohol consumption and calories consumed per day.”

Some obvious factors were controlled for, but others were not. For example, there did not appear to be any control for BMI or any measure of body fat percentage. This is the most likely confounding factor – overweight people are more likely both to drink diet soda and to have vascular disease. They did later account for metabolic syndrome, but not for weight as an independent variable, nor for other eating habits. People who drink diet soda may also be doing so to offset otherwise less healthy eating habits. This in itself makes it impossible to interpret the study.
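To make the confounding worry concrete, here is a toy simulation (my own invented numbers, not the study's data): a single confounder, "high BMI," drives both diet-soda drinking and stroke risk, while soda itself has no causal effect at all. The crude comparison between soda drinkers and non-drinkers still shows a substantially elevated risk.

```python
import random

random.seed(0)

# Toy model (invented probabilities, not the study's data):
# high BMI raises both the chance of drinking diet soda and the
# chance of stroke; soda itself has NO effect on stroke here.
n = 100_000
counts = {"soda": [0, 0], "none": [0, 0]}  # [strokes, total]

for _ in range(n):
    high_bmi = random.random() < 0.3
    drinks_soda = random.random() < (0.6 if high_bmi else 0.2)
    stroke = random.random() < (0.10 if high_bmi else 0.02)  # independent of soda
    key = "soda" if drinks_soda else "none"
    counts[key][0] += stroke
    counts[key][1] += 1

risk_soda = counts["soda"][0] / counts["soda"][1]
risk_none = counts["none"][0] / counts["none"][1]
print(f"crude relative risk: {risk_soda / risk_none:.2f}")  # well above 1.0
```

Because soda drinkers are disproportionately high-BMI in this toy model, their crude stroke risk is roughly double that of non-drinkers even though soda does nothing – exactly the kind of artifact that controlling for weight would be needed to rule out.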

Further, we have a one-time self report, rather than reporting on soda intake at several points in time. The data itself is not very reliable.

While this study has serious flaws that preclude any confident interpretation, it is a reasonable preliminary study – the kind of study that gets presented as a poster at a meeting, rather than published in a high-impact peer-reviewed journal. Such preliminary research is mostly an exercise in data dredging – looking through data sets for any interesting signals. The purpose of such preliminary research is to determine whether or not more definitive follow-up research is worth the time and effort. If there were no signal in this data, there would be no reason to design and execute a tightly controlled, multi-year prospective trial.

Medical science is full of these preliminary studies. They provide the raw material from which large and expensive trials are derived. We also know from reviewing the literature that most of these preliminary studies will turn out to be wrong. The scientific community understands this.

The problem is in the reporting of these studies. The mainstream media probably should just ignore any study that is deemed preliminary, especially if it’s just an isolated study. Perhaps in a thorough feature article it would be reasonable to give an overview of the state of the research into a question, including preliminary studies, because in a feature time can be taken to put the evidence into perspective. But reporting a single preliminary study as science news is a highly problematic approach.

On this item there was a range of reporting, from fear-mongering to reasonable. The ABC report, for example, was very reasonable and included appropriate background information and balanced quotes from critics of the study. But many people reading the report will come away with just the headline: “Diet Soda: Fewer Calories, Greater Stroke Risk?” (other headlines did not even include the question mark). Even those who read the article and understand that the conclusions are preliminary and that many experts are skeptical are likely, three months from now, to remember only the association between diet soda and stroke risk, and not the fact that the association is likely not true.

Over-reporting of preliminary results also has the effect of confusing the public with lots of noisy information, most of which is not true. This causes people to distrust science in general, because they keep hearing conflicting information.

It is unlikely that the mainstream media will voluntarily forgo the reporting of sensationalistic news just because the information is preliminary and unreliable. It is too easy for them to convince themselves that including a bit of skepticism (or even well-balanced skepticism) is sufficient. While this is better than rank fear-mongering (which also happens) in the end the reporters still get their flashy headline and the public comes away with misconceptions.

While I will continue to advocate for higher standards of science news reporting (including using judgment in terms of what not to report), it seems this needs to be combined with educating the public about the nature of preliminary science research, the nature of observational vs experimental studies, and the need to filter all science news reporting through an informed skeptical filter.

14 Responses to “Reporting Preliminary Studies”

  1. Thank you for this. I have already been making many of these responses to the paper. Too many people depend upon the media for health information and only see the ‘experts’ constantly changing recommendations. They don’t have the inclination or ability to examine the original information.

    This leads to either trusting the reporters, or trusting no-one and leaving the door open for CAM practitioners. Since woo is unchanging and never acknowledges contradictions, it provides certainty for the public.

  2. Max says:

    Last year, there was this
    “Daily Consumption of Diet Soda Linked to Metabolic Syndrome, Type 2 Diabetes”
    http://www.medscape.org/viewarticle/588137

    There’s the theory that drinking diet soda causes weight gain, which in turn causes health problems. In that case, weight gain is not a confounding variable, so the study shouldn’t control for it.

    • Here’s one that always gets me: studies with such a small study population that, given the p-value involved, the study can’t really “prove” anything even close to beyond a reasonable doubt anyway. The latest? A study that claims fMRI can predict how likely a person trying to quit smoking is to succeed.

      People studied? 28. http://news.discovery.com/human/smoking-quitting-brain-activity-110131.html

      “Studies” like these border on pseudoscientific because they’re done in part precisely to get headlines, IMO, which then become “leveraged” as part of a push for more grant money.

      If some pop sci mags would simply stop publishing them, that would be a start.
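The sample-size complaint above can be put in rough numbers with a back-of-the-envelope confidence interval (using a hypothetical observed proportion of my own choosing, not the study's actual result): with only 28 subjects, any estimated proportion carries a margin of error near plus or minus 19 percentage points under the usual normal approximation.

```python
import math

# Back-of-the-envelope sketch (hypothetical proportion, not the
# study's data): 95% CI half-width for a proportion with n = 28,
# using the normal approximation 1.96 * sqrt(p*(1-p)/n).
n = 28
p_hat = 0.5  # hypothetical observed proportion (worst case for width)
half_width = 1.96 * math.sqrt(p_hat * (1 - p_hat) / n)
print(f"95% CI: {p_hat - half_width:.2f} to {p_hat + half_width:.2f}")
```

An interval spanning roughly 0.31 to 0.69 is consistent with almost any effect size, which is why a 28-person study can at best suggest a direction for follow-up work.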

  3. Max says:

    Educate the public with comics

    The Science News Cycle
    http://www.phdcomics.com/comics/archive.php?comicid=1174

  4. Mario says:

    Just as you pointed out, this is a huge problem for people’s trust in science. The media only care about ratings and are more irresponsible every day about what gets published, and most people never take the time either to read a story through or to check the facts against another source; this goes for everything from health to war reports.

  5. CountryGirl says:

    The question should be WHY they did this study. The misinformation on the subject of food and drink is everywhere. So instead of studying the problem and looking for a cause, they came up with what they wanted to blame the problem on and then searched for support for that claim. Arguably you could just as easily show a correlation (I use that word loosely) between eating carrots and stroke. The first question everyone should ask when faced with some new claim is: WHY? Why did they do what they did? What is their motive? What are they trying to accomplish? Who benefits?

    • Somite says:

      In reality these studies are multifactorial and this might be one of many positive or negative correlations observed. Like Steve says this is just helpful to frame and design future studies and should not be considered a conclusion.

    • Max says:

      They must’ve been paid off by Big Water <_<

    • It’s a socialist plot to control our food, CG. But, given what you’ve already evidenced of your political leanings, you already knew that, didn’t you?

  6. BillG says:

    “Studies need further studies” see news at 11.

  7. Brandon Z says:

    Probably snarky – but a line here trips a pet peeve of mine. I normally wouldn’t comment, but since we are talking about communicating science and science education….

    “data” is the plural form of datum. Thus “These data themselves are not very reliable.” is a more appropriate turn of phrase. Also, just for good measure, the number of reports has no bearing on the reliability of these data, but on the reliability of any estimates or generalizations drawn from the data. The reports themselves may be perfectly accurate in reflecting the concept being measured. The self-reporting is probably more of a threat to the validity of the data, as it probably consistently fails to measure what is actually desired.

    In any case, great post. Thanks for letting my “peevish beast inside” grace the comments section.

    • tmac57 says:

      “data” is the plural form of datum.

      Not necessarily. See this usage note from Merriam Webster:

      Data leads a life of its own quite independent of datum, of which it was originally the plural. It occurs in two constructions: as a plural noun (like earnings), taking a plural verb and plural modifiers (as these, many, a few) but not cardinal numbers, and serving as a referent for plural pronouns (as they, them); and as an abstract mass noun (like information), taking a singular verb and singular modifiers (as this, much, little), and being referred to by a singular pronoun (it). Both constructions are standard. The plural construction is more common in print, evidently because the house style of several publishers mandates it.

  8. Karen says:

    Science in the news just drives me up the wall.

    I had a professor who said he never gave interviews for specifically this reason: they always misinterpreted what he said and came out with an article that basically slaughtered his work.

    I understand it’s not really interesting, news-wise, to write up a study’s findings accurately and post the limitations of each study, but it’s SO misleading to just go headline grabbing and post the most sensational piece of a study. I’ve gotten to the point where I refuse to give any credibility to news stories about science breakthroughs unless someone can provide me with the actual research to go along with them. News should really print the facts and not just what will get people to read, but obviously they have to make money somehow.

  9. Al Morrison says:

    Steve, during your latest SGU episode, you pointed to a 20-year longitudinal study indicating an increase in scientific literacy (from 10 to 28%). With this in mind, it would seem now is the time to begin to introduce the method of science during science news stories.

    Adding the nature of the research (preliminary, basic, applied) and how the current research reflects the prevailing scientific consensus would put the findings in context.

    Another interesting technique (if you listen to the Science Magazine podcast you will hear it used often) is to have an expert from the relevant field — who is not affiliated with the study — comment on the findings. It is like having a real-time peer-review session. This is a technique frequently used in print, but used much less during radio and TV news reporting.

    Certainly this would require an investment by the news media. Yet it would seem there is more of an audience today than there was 20 years ago.

    Unfortunately, this may do nothing to offset sensational headlines. As you correctly point out, regardless of the content and integrity of the article or story, its headline (whether written or “dropped and dangled” prior to a commercial break) may be all that is remembered. I am certain it is debatable whether a snappy headline is required at all (an interesting research topic in its own right). Nonetheless, a snappy headline does not have to mislead, create dichotomy, or otherwise subvert the actual report and research.

    Clever writers, directors, and producers can undoubtedly find ways of making science news more interesting while remaining true to the science. There is finally evidence that a significant segment of the population would be receptive to better science reporting regardless of the medium.