Why You Shouldn’t Believe Every Hyped Up Media Headline on Health
Supplement and Nutrient (Flawed) Studies with Big Press
Attacks on vitamins and supplements have been occurring for many years. For example, in December 2013 in the Annals of Internal Medicine, there was much ado about the supposed uselessness of multivitamins for overall, cognitive, and cardiovascular health. This was based on three studies. However, the methodology used in those studies had several flaws. These included:
- using the incorrect form of certain vitamins.
- not considering absorption efficacy of the supplement; i.e. how can it work if it’s not absorbed and assimilated well?
- baseline need for the intervention wasn’t considered. In other words, a “highly nourished” population would make a multivitamin or additional nutrients appear less necessary and effective, because the subjects were already “nutrient sufficient.” (A comparison would be using a drug for a condition you don’t have.)
- poor adherence to the intervention and high dropout rates were found in the cardiovascular study.
I discuss all of these in detail here.
This is just one example of how bad studies can lead to misinterpretations of the efficacy of an intervention. Still, whether it is a drug, supplement, or behavioral approach, these research flaws can taint any conclusion. I reviewed some other examples of methodological flaws leading to negative biases of natural methods here, here, and here.
The Variance Factors
From Cell to Animal to Human
A very interesting 2013 study in Nutrients precisely pointed out another major issue for accurate interpretation of studies: translating cell and animal studies into human outcomes. This can be misleading because various lab animals and cell cultures can have biochemical responses that differ from a human’s. Although such studies are useful for determining mechanisms and examining how various compounds work, caution is warranted in extrapolating the results to Homo sapiens (“wise humans”).
From Person to Person
Taking this further, individuals within the population vary in biochemistry and genetic predispositions. This makes generalizing about specific treatments (drugs and supplements) more complex. Consider MTHFR or COMT single nucleotide polymorphisms (SNPs). These are examples of how differences in the way one’s body uses folate and other nutrient co-factors affect the enzyme pathways the body needs to work optimally.
Still, even within this context of a genetic propensity, it can get more complex for two reasons:
- If diet and other lifestyle factors are in balance, additional supplementation may not be needed, even if your 23andMe test says you may need extra B12 or folate!
- There is some evidence that over-methylation with folate can cause aggravation in those with diagnosed mental health issues (though not so much with autism), due to its effects on neurotransmitter receptors, even if they carry a folate variant!
Basically, it goes deep: both genetic variants and one’s symptoms and health history are needed when someone is off balance and needs to dig deeper.
A Side Note on Preconceived Theories Being Tested
Okay, pause here for a side note…
Let me point you to the rabbit hole of the neurotransmitter-imbalance theory of depression. It is one example of how a theory can spawn many trials that may be looking for how to treat an effect rather than a cause. (There’s a mind bender!) No wonder there are so many mixed studies, right?
Anyway, there’s a whole book on that now, and it has enough research to satisfy any real scientist interested in truth vs. theory. Dr. Brogan has a profound love of geeking out on research and separating bias from validity. You can go cross-reference crazy here to read her synthesis of the evidence supporting her points.
The Issue of Using Bad Quality Supplements for Research
There is a valid issue of quality concerns for the supplements and vitamins on the market. The topic has gotten a lot of attention lately due to the February 2015 press eruption from the NYS attorney general. This first caused a backlash against supplement safety and then a follow-up counter-punch pointing out flawed methodology. Specifically, it related to findings that supplements lacked ingredients listed on the label and that contaminants were found in tested products. Here’s the press release entitled,
A.G. Schneiderman Asks Major Retailers To Halt Sales Of Certain Herbal Supplements As DNA Tests Fail To Detect Plant Materials Listed On Majority Of Products Tested
And Desist Letters Sent To GNC, Target, Walgreens And Walmart As Most Store Brand Supplements Were Found To Contain Contaminants Not Identified On Ingredient Labels; Just 21% Of Supplement Tests Identified DNA From Plant Species Listed On Labels: Schneiderman: Mislabeled Consumer Products Pose Unacceptable Health Hazard
However, the methodology was later found to be flawed. Still, that doesn’t let supplement quality control off the hook. This is why I’m a stickler for professional supplements that are standardized and bought directly from the manufacturer. (Yes, there are people who buy supplements, put placebos in the bottle, and re-sell them online in the same bottle!) Here’s one example of a brand I use due to the exquisite testing I’ve seen firsthand. There are others with similar qualifications (Fullscript distributes through Emerson).
The Problems with Medical Research
It’s interesting that supplements are deemed useless and harmful on the basis of a few unfavorable studies or reviews, whereas medications, which are more commonly prescribed, become a controversy and hot topic rather than being considered a waste and detrimental. This is despite medications being more potent and carrying many safety concerns. Let’s take one of the most controversial issues in both integrative and conventional health: the role of statin medications in reducing cholesterol.
This cholesterol controversy is a whole other topic with many factors to explore, a few of which I dove into here. (Interestingly, as the debate on lowering cholesterol continues, cholesterol was removed as a nutrient of concern in the 2015 dietary guidelines.) In this blog, however, I want to focus more on the clinical studies and the potentially unsound results leading to standard-of-care treatment decisions.
When I did a review of statin trials a while back, I found that studies purporting high success in outcomes were (a) basing efficacy on a surrogate measurement, such as lowering cholesterol, rather than clinical outcomes (decreases in mortality rates (deaths) from cardiovascular disease), or (b) using select populations with specific health histories to establish an effect (such as secondary or primary prevention of a heart attack). There was also the issue of funding-related bias in the studies, which I will get to later.
Regardless of this conflicting science, nearly half of Americans were expected to be recommended statins due to guideline updates in 2013. These guidelines were based on two studies reported in JAMA, found here and here. The New York Times did a good review of the two sides of the story here. The article reported several arguments against the recommendations, including population sizes too small for clinicians to draw conclusions from and concern over an overestimate of risk. (Speaking of risk, the NNT (number needed to treat) website is a great resource worth bookmarking if you want to compare the actual clinical outcomes of a pharmaceutical intervention against its risks of “side effects.”)
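The NNT arithmetic itself is simple, and it shows why “relative risk reduction” headlines can mislead. Here is a minimal sketch; the event rates below are invented for illustration and are not from any real statin trial:

```python
# Hypothetical event rates for illustration only - not from any real trial.
control_event_rate = 0.04    # 4% of untreated patients have the event
treatment_event_rate = 0.03  # 3% of treated patients have the event

# Absolute risk reduction (ARR) and number needed to treat (NNT = 1/ARR)
arr = control_event_rate - treatment_event_rate
nnt = round(1 / arr)

# Relative risk reduction, the number headlines usually quote
rrr = arr / control_event_rate

print(f"ARR = {arr:.2%}, NNT = {nnt}, RRR = {rrr:.0%}")
```

With these made-up numbers, about 100 people would need to be treated for one to avoid the event, even though the relative risk reduction sounds far more impressive at 25%.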
So it appears that both supplement and drug trials have blemished research. This may be one reason why our current treatment methods are producing such sad health statistics. Furthermore, the reproducibility of the evidence-based model in real-world settings itself has limitations within our “proving of efficacy.”
More on the Science of the Flawed Science
Here are more specifics, with a few examples of how science and its interpretations can be biased, misleading, or falsified… and the scary thing is that these studies are guiding treatment recommendations!
1. Suspected survivor bias. According to the investigators of a study, “Survivor bias occurs when exposed cases are less likely to take part in a study (e.g., because they died or became severely ill) than unexposed cases.”
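To see how losing exposed cases before recruitment can hide a real association, here is a minimal simulation sketch. All of the rates (exposure prevalence, disease risk, fatality) are invented for illustration, not taken from any study:

```python
import random

# Invented rates for illustration: exposure roughly doubles the odds of
# being a case, but exposed cases often die before they can be recruited.
def simulate(n=100_000, fatality_if_exposed_case=0.0):
    counts = {"case_exposed": 0, "case_unexposed": 0,
              "ctrl_exposed": 0, "ctrl_unexposed": 0}
    for _ in range(n):
        exposed = random.random() < 0.3
        p_case = 0.10 if exposed else 0.05   # true odds ratio ~ 2
        is_case = random.random() < p_case
        # Survivor bias: some exposed cases never enter the study.
        if is_case and exposed and random.random() < fatality_if_exposed_case:
            continue
        key = ("case_" if is_case else "ctrl_") + \
              ("exposed" if exposed else "unexposed")
        counts[key] += 1
    # Observed odds ratio among the participants actually recruited
    return (counts["case_exposed"] * counts["ctrl_unexposed"]) / (
            counts["case_unexposed"] * counts["ctrl_exposed"])

random.seed(42)
print("Odds ratio, everyone recruited:     ", round(simulate(), 2))
print("Odds ratio, half of exposed cases lost:",
      round(simulate(fatality_if_exposed_case=0.5), 2))
```

With half of the exposed cases lost before recruitment, the observed odds ratio falls from roughly 2 toward 1, making a genuinely harmful exposure look nearly harmless.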
2. Healthy user and related behavioral biases. A 2011 study reported on several observational studies that exaggerated benefits based on participants’ behavior and characteristics. These included: “healthy user effect, the healthy adherer effect, confounding by functional status or cognitive impairment, and confounding by selective prescribing.”
3. Falsifying results or unintentional errors in data. According to another 2011 study reviewing the reasons for retraction of 742 studies (bold emphasis is mine again):
“Error was more common than fraud (73.5% of papers were retracted for error (or an undisclosed reason) vs 26.6% retracted for fraud). Eight reasons for retraction were identified; the most common reason was scientific mistake in 234 papers (31.5%), but 134 papers (18.1%) were retracted for ambiguous reasons. Fabrication (including data plagiarism) was more common than text plagiarism. Total papers retracted per year have increased sharply over the decade (r=0.96; p<0.001), as have retractions specifically for fraud (r=0.89; p<0.001). Journals now reach farther back in time to retract, both for fraud (r=0.87; p<0.001) and for scientific mistakes (r=0.95; p<0.001). Journals often fail to alert the naïve reader; 31.8% of retracted papers were not noted as retracted in any way.”
Here are some examples:
- Two retracted studies in cancer research were due to falsification of data on a protein thought to suppress tumor growth.
- The Wall Street Journal reported on 21 fabricated trials for one anesthesiologist.
4. Issues with preclinical data translation, reproducibility, and patient selection in cancer trials.
5. Inappropriate authorship. A 2011 BMJ review states (bolding mine again), “Inappropriate authorship (honorary and ghost authorship) is an important issue for the academic and research community and is a threat to the integrity of scientific publication. Our findings suggest that 21% of articles published in 2008 in the general medical journals with the highest impact factors had an inappropriate honorary author, and that nearly 8% of articles published in these journals may have had an unnamed important contributor.”
More concerning, the review above had limitations that were themselves potential biases! These included potential recall bias by the respondents, the small number of studies, and criteria for authorship that may not be generalizable to all studies. This could make the true prevalence of inappropriate authorship even larger!
6. Funding effects. Bias can be found in favor of an intervention based on who is funding the study, as this study and this one point out. The second one is linked to the nutrition industry. (See, I’m not out to bash drugs that are properly prescribed. But I may be doing a little bashing of media hype and poor studies leading to bad medical decisions, natural or synthetic!)
This Leads to Several Questions:
- Are most studies false?
- Should we just ignore the data?
- If we look at the data, how do we know what is true?
- Where do we go from here?
Enough Is Enough: Stop Wasting Money on Vitamin and Mineral Supplements. Ann Intern Med. 2013;159(12):850-851. doi:10.7326/0003-4819-159-12-201312170-00011
Michels A, Frei B. Myths, Artifacts, and Fatal Flaws: Identifying Limitations and Opportunities in Vitamin C Research. Nutrients. 2013; 5 (12): 5161 DOI: 10.3390/nu5125161
Pursnani A, Massaro JM, D’Agostino RB, Sr, O’Donnell CJ, Hoffmann U. Guideline-Based Statin Eligibility, Coronary Artery Calcification, and Cardiovascular Events. JAMA. 2015;314(2):134-141. doi:10.1001/jama.2015.7515.
Pandya A, Sy S, Cho S, Weinstein MC, Gaziano TA. Cost-effectiveness of 10-Year Risk Thresholds for Initiation of Statin Therapy for Primary Prevention of Cardiovascular Disease. JAMA. 2015;314(2):142-150. doi:10.1001/jama.2015.6822.
ACC/AHA Release Updated Guideline on the Treatment of Blood Cholesterol to Reduce ASCVD Risk. Am Fam Physician. 2014 Aug 15;90(4):260-265.
Pollock A. 2 Studies Back Guidelines for Wider Use of Statins. New York Times. July 14, 2015.
van Rein N, Cannegieter SC, Rosendaal FR, Reitsma PH, Lijfering WM. Suspected survivor bias in case-control studies: stratify on survival time and use a negative control. J Clin Epidemiol. 2014 Feb;67(2):232-5. doi: 10.1016/j.jclinepi.2013.05.011. Epub 2013 Aug 17.
Honorary and ghost authorship in high impact biomedical journals: a cross sectional survey. BMJ. 2011; 343. doi: http://dx.doi.org/10.1136/bmj.d6128.
Ioannidis JPA. Why Most Published Research Findings Are False. PLoS Med. 2005; 2(8): e124. doi:10.1371/journal.pmed.0020124
Retractions in the scientific literature: is the incidence of research fraud increasing? J Med Ethics. 2011;37:249-253 doi:10.1136/jme.2010.040923
Drug development: Raise standards for preclinical cancer research. Nature. March 29, 2012;483:531-533. doi:10.1038/483531a
Rubenstein S. A New Low in Drug Research: 21 Fabricated Studies. Wall Street Journal. March 11, 2009. http://blogs.wsj.com/health/2009/03/11/a-new-low-in-drug-research-21-fabricated-studies/
Sponsorship bias in clinical research. Int J Risk Saf Med. 2012;24(4):233-42. doi: 10.3233/JRS-2012-0574.
Lesser LI, Ebbeling CB, Goozner M, Wypij D, Ludwig DS. Relationship between Funding Source and Conclusion among Nutrition-Related Scientific Articles. Katan M, ed. PLoS Medicine. 2007;4(1):e5. doi:10.1371/journal.pmed.0040005.
Haneef R, Lazarus C, Ravaud P, Yavchitz A, Boutron I. Interpretation of Results of Studies Evaluating an Intervention Highlighted in Google Health News: A Cross-Sectional Study of News. Courvoisier DS, ed. PLoS ONE. 2015;10(10):e0140889. doi:10.1371/journal.pone.0140889.
Hanneman SK. Design, Analysis and Interpretation of Method-Comparison Studies. AACN advanced critical care. 2008;19(2):223-234. doi:10.1097/01.AACN.0000318125.41512.a3.
Thanks for the cool images, Pixabay - check them out!