
Bad Correlation Study

Here is a typical example of a bad correlation study. I've pointed out a couple flaws, which are typical.

http://www.ncbi.nlm.nih.gov/pmc/articles/PMC3039704/
Chocolate Consumption is Inversely Associated with Prevalent Coronary Heart Disease: The National Heart, Lung, and Blood Institute Family Heart Study
These data suggest that consumption of chocolate is inversely related with prevalent CHD in a general population.
Of 4,679 individuals contacted, responses were obtained from 3,150 (67%)
So they started with a non-random sample. The two-thirds of people who responded were not a random subset of the people contacted.

The non-random sample they studied may have some attribute, X, much more often than the general population does. It may be chocolate+X interactions which offer health benefits, in which case the result wouldn't carry over to the general population the conclusion talks about. This is a way the study conclusions could be false.
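To make that concrete, here's a toy simulation (invented numbers, nothing to do with the actual data): chocolate only helps people who have X, and X-people are much more likely to respond, so the responders show a bigger chocolate benefit than actually holds in the general population the conclusion refers to.

```python
# A toy simulation (invented numbers, not the study's data). Chocolate only
# helps people with attribute X, and X-people are much more likely to respond
# to the questionnaire, so the responders overstate the benefit that would
# hold in the general population the paper's conclusion refers to.
import numpy as np

rng = np.random.default_rng(0)
n = 500_000

x = rng.random(n) < 0.10                     # attribute X: 10% of the population
chocolate = rng.random(n) < 0.5              # half eat chocolate, independently of X
risk = 0.10 - 0.06 * (chocolate & x)         # chocolate lowers risk only when X is present
chd = rng.random(n) < risk
respond = rng.random(n) < np.where(x, 0.90, 0.40)   # X-people respond far more often

def benefit(mask):
    """CHD rate difference, no-chocolate minus chocolate, within the given group."""
    return chd[mask & ~chocolate].mean() - chd[mask & chocolate].mean()

print("apparent chocolate benefit among responders:  %.4f" % benefit(respond))
print("actual chocolate benefit in whole population: %.4f" % benefit(np.ones(n, bool)))
```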

They used a "food frequency questionnaire". So you get possibilities like: half the people reporting they didn't eat chocolate were lying (but very few of the people admitting to eating chocolate were lying). And liars overeat fat much more than non-liars, and this fat-eating differential (not chocolate eating) is the cause of the study results. This is another way the study conclusions could be false.
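Here's the same kind of toy sketch for the misreporting scenario (invented numbers again): chocolate does nothing, overeating fat causes CHD, and half of the fat-overeaters who eat chocolate deny it on the questionnaire. The reported chocolate groups still come out looking protected.

```python
# A toy simulation of misreporting (invented numbers, not the study's data).
# Chocolate has NO effect here; overeating fat raises CHD risk; and half of
# the fat-overeaters who eat chocolate report that they don't. The *reported*
# chocolate groups still show an inverse association with CHD.
import numpy as np

rng = np.random.default_rng(1)
n = 500_000

chocolate = rng.random(n) < 0.5                      # true chocolate eating (harmless here)
high_fat = rng.random(n) < 0.3                       # fat overeating drives CHD
chd = rng.random(n) < np.where(high_fat, 0.15, 0.07)

lie = chocolate & high_fat & (rng.random(n) < 0.5)   # these people deny eating chocolate
reported = chocolate & ~lie

for label, group in [("true    ", chocolate), ("reported", reported)]:
    print("%s  chocolate: CHD %.3f   no chocolate: CHD %.3f"
          % (label, chd[group].mean(), chd[~group].mean()))
```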

They say they "used generalized estimating equations", but do not provide the details. There could be an error there which would make their conclusions false.
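To give a sense of how much is left unstated, here's a rough sketch of what a GEE analysis of this general kind looks like in Python's statsmodels library, run on made-up data with guessed settings; it is not their model. Each commented choice (which covariates, how they're coded, link function, working correlation structure) can change the result, and those are exactly the details we aren't given.

```python
# A rough sketch, NOT the study's model: made-up data and guessed settings,
# just to show how many unstated modeling choices a GEE analysis involves.
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
n = 2000
df = pd.DataFrame({
    "family_id": rng.integers(0, 500, n),          # participants cluster within families
    "chocolate_freq": rng.integers(0, 6, n),       # servings per week? the coding is a choice
    "age": rng.integers(40, 80, n),
    "chd": (rng.random(n) < 0.10).astype(int),
})

model = smf.gee(
    "chd ~ chocolate_freq + age",                  # which covariates, in what form?
    groups="family_id",                            # clustering unit: family, presumably
    data=df,
    family=sm.families.Binomial(),                 # logit link? not stated
    cov_struct=sm.cov_struct.Exchangeable(),       # working correlation? not stated
)
print(model.fit().summary())
```

Swapping the working correlation structure, the link function, or the covariate coding can move the chocolate coefficient around, which is why leaving these details out matters.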

They talk about controls:
adjusting for age, sex, family CHD risk group, energy intake, education, non-chocolate candy intake, linolenic acid intake, smoking, alcohol intake, exercise, and fruit and vegetables
As you can see, this is nothing like a complete list of every possible relevant factor. There are many things they did not control for. Some of those may have been important, so this could ruin their results.

And they don't provide details of how they controlled for these things. For example, take "education". Did they lump together high school graduates (with no college) as all having the same amount of education, without factoring in which high school they went to and how good it was? Whatever they did, there will be a level of imprecision in how they controlled for education, which may be problematic (and we don't know, because they don't tell us what they did).
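As a toy illustration of why that matters (invented numbers again): suppose chocolate does nothing, people with more schooling both eat more chocolate and get less CHD, and "controlling for education" is just a single college/no-college split. The inverse association survives the adjustment in both strata.

```python
# A toy simulation of residual confounding (invented numbers, not the study's
# data). Chocolate has NO effect; more schooling means more chocolate and less
# CHD; and "controlling for education" is just a coarse college/no-college
# split. The inverse association survives inside both strata.
import numpy as np

rng = np.random.default_rng(3)
n = 500_000

education = rng.uniform(8, 20, n)                        # years of schooling
chocolate = rng.random(n) < (education - 8) / 15         # more schooling -> more chocolate
chd = rng.random(n) < (0.25 - 0.01 * education)          # more schooling -> less CHD

def benefit(mask):
    """CHD rate difference, no-chocolate minus chocolate, within the given group."""
    return chd[mask & ~chocolate].mean() - chd[mask & chocolate].mean()

print("crude association:             %.4f" % benefit(np.ones(n, bool)))
print("within 'no college' (<16 yrs): %.4f" % benefit(education < 16))
print("within 'college' (>=16 yrs):   %.4f" % benefit(education >= 16))
```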


This is just a small sample of the problems with studies like these.


People often reply something like, "Nothing's perfect, but aren't the studies pretty good indications anyway?" The answer is: if they're pretty good indications anyway, the researchers ought to understand these weaknesses, write them down, and then write down why their results are pretty good indications anyway. Then that reasoning would be exposed to criticism. One shouldn't assume the many weaknesses of the research can be glossed over. Write them down, thoroughly, write down in full why the results are OK despite them, and then see if there are criticisms of that analysis.

Elliot Temple on October 20, 2014
