Observational Studies, Bad Science, and the Media
by Steven E. Nissen, MD
The headlines glare at the public with a characteristic sense of drama: “Just one soda per day can cause heart attacks in men.” Or worse yet: “Consuming red meat tied to early death risk” and “Too Little and Too Much Sleep Linked With Heart Problems.”
The latter revelation was particularly well-detailed. This news article claimed that, “the findings suggest that people who didn't get enough sleep had a doubled risk of stroke or heart attack and a 1.6-times higher risk of congestive heart failure. Meanwhile, people who got too much sleep (more than 8 hours) had a doubled risk of angina…” Wow! I’m setting my alarm clock now.
Some of these news stories describe observational studies reported in reputable medical journals. The “soft drink” story described a study published in Circulation and the “red meat” manuscript appeared in Archives of Internal Medicine. However, many overly-hyped medical studies, such as the “sleep story,” are simply abstracts presented at a medical meeting. Many are never published. The more dramatic and outlandish, the more coverage these “research studies” receive.
In a world of instant news delivered via the internet, poor-quality medical news stories now dominate the airwaves and print media. CNN actually devoted an entire program to a wacky retired physician who claimed that his diet could make patients “heart attack–proof.” The public hears these dramatic news reports and worse yet, actually believes them.
In my humble opinion, this type of observational “research” is almost always scientific nonsense. Do we really believe that short sleepers have double the risk of MI? If true, sleep duration would be a more powerful risk factor than smoking or diabetes. I’m no fan of sugared soft drinks, but do we really believe that consuming 140 calories worth of sugar daily in a soft drink increases the risk of heart attack by 20%? If true, then the adverse effects of sugar are equivalent in magnitude (but opposite in direction) to the benefits of some of the most effective drugs used in cardiovascular medicine.
There is a sad reality here, a problem well known to most clinical trialists—observational studies are usually unreliable. Nevertheless, the reporting of such studies in the media can lead to mass hysteria or promote a rush to judgment about unproven therapies. A spate of effervescent news reports on the miraculous benefits of vitamin D has resulted in a tsunami of patients demanding that their physicians place them on this wonder-cure for everything from cancer to heart disease. However, when you look closely at the scientific data, the documentation of the reported benefits of vitamin D is almost non-existent. Not wanting to antagonize their patients, many physicians willingly comply with these requests, placing their patients on high dosages of vitamin D after finding a low blood level.
How do we reverse the ever-increasing trend toward a glut of low-quality “research” confusing the public? In my view, the inability of the media and the public to understand the limitations of observational studies requires that physician leaders speak out clearly about the unreliability of such studies. The limitations of observational studies are myriad, but the most common flaws are easily understood and explained. Since patients are not randomly assigned to a treatment group, there always exist differences in characteristics between the study groups. The best observational studies attempt to adjust for these “confounders,” but often consider only the most common demographic variables, such as age and gender. Statistical adjustment can never fully compensate for all of the differences in patient characteristics, leading to a common problem known as “residual confounding.”
This was a rather obvious flaw in the study that claimed that drinking a single sugared soft drink per day increased the risk of MI. Individuals who drink sugared soft drinks likely have many other unhealthy habits that are potentially responsible for their high CV event rates. We can easily picture a “couch potato” sipping a soft drink while munching on junk food and avoiding exercise. Confounded observational studies have resulted in huge societal errors in interpretation of the benefits of therapies. When the Women’s Health Initiative (WHI) first proposed to study the cardiovascular effects of administration of hormone replacement therapy (HRT), some practitioners decried the study as unethical, because everyone knew from observational studies that HRT protected post-menopausal women from heart disease. In fact, when completed, the WHI showed exactly the opposite. The observational studies were wrong because they failed to consider that women who took HRT were different from those who did not (and had many other favorable health habits).
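For readers who like to see the mechanism rather than take it on faith, the “couch potato” problem can be demonstrated in a few lines of simulation. This is a minimal illustrative sketch with entirely hypothetical numbers: soda has no causal effect on MI at all, but an unmeasured “unhealthy lifestyle” factor makes people both more likely to drink soda and more likely to have an MI—and a naive comparison still shows soda drinkers at elevated risk.

```python
import random

random.seed(0)

# Hypothetical scenario: soda has ZERO causal effect on MI.
# An unmeasured "unhealthy lifestyle" confounder drives both
# soda consumption and MI risk.
n = 100_000
mi_soda = mi_no_soda = n_soda = n_no_soda = 0

for _ in range(n):
    unhealthy = random.random() < 0.5                       # unmeasured confounder
    drinks_soda = random.random() < (0.7 if unhealthy else 0.3)
    mi_risk = 0.10 if unhealthy else 0.05                   # risk depends ONLY on lifestyle
    had_mi = random.random() < mi_risk
    if drinks_soda:
        n_soda += 1
        mi_soda += had_mi
    else:
        n_no_soda += 1
        mi_no_soda += had_mi

# Naive "observational" comparison, with no adjustment for lifestyle
rr = (mi_soda / n_soda) / (mi_no_soda / n_no_soda)
print(f"Apparent relative risk of MI for soda drinkers: {rr:.2f}")
```

With these made-up parameters the apparent relative risk comes out around 1.3—a “30% increased risk of heart attack from soda”—even though, by construction, soda causes nothing. Randomization breaks the link between soda drinking and lifestyle; statistical adjustment can only break it for confounders that were actually measured.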
In responding to patients and the media, we must explain to the public the difference between showing an “association” and proving “causation.” Eating red meat may be associated with a higher rate of MI, but that does not prove that it causes MIs. We must also speak publicly about the harms associated with unsubstantiated claims of medical benefits. When a patient hears on CNN that a vegan diet can melt away plaques in the coronaries, he or she may well stop taking statins, with catastrophic consequences.