About half of the 70,000 total participants got omega-3 supplements. One of the things I like about the press releases found at ScienceDaily is that they usually link to the abstract. Here's part of it:
Data Extraction: Descriptive and quantitative information was extracted; absolute and relative risk (RR) estimates were synthesized under a random-effects model. Heterogeneity was assessed using the Q statistic and I². Subgroup analyses were performed for the presence of blinding, the prevention settings, and patients with implantable cardioverter-defibrillators, and meta-regression analyses were performed for the omega-3 dose. A statistical significance threshold of .0063 was assumed after adjustment for multiple comparisons.

First, it's a meta-analysis, so caveat emptor. They picked 20 studies out of 3,635 citations. Were they cherry-picked? The usual level of statistical significance is that you would expect the results to happen by chance five times or less out of one hundred. I have no idea why they went for 63 times out of 10,000.
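For what it's worth, the numbers in the abstract can be sketched out. A threshold near .0063 is what a Bonferroni-style correction produces when 0.05 is split across eight comparisons (the eight-comparison count is my guess, not something the excerpt states), and the I² heterogeneity statistic is a simple function of Cochran's Q. A minimal sketch, with illustrative values only:

```python
# Sketch: where a .0063 threshold could come from, and how I² relates to Q.
# Assumption: a Bonferroni-style adjustment over 8 comparisons (not stated
# in the abstract; 0.05 / 8 = 0.00625, which rounds to .0063).

alpha = 0.05
n_comparisons = 8  # assumed, not from the abstract
adjusted_threshold = alpha / n_comparisons
print(f"Bonferroni-adjusted threshold: {adjusted_threshold}")

# Heterogeneity: I² is derived from Cochran's Q and the degrees of
# freedom (number of studies minus one), floored at zero.
def i_squared(q: float, k: int) -> float:
    """Percent of between-study variability beyond what chance predicts."""
    df = k - 1
    return max(0.0, 100.0 * (q - df) / q)

# Illustrative only: Q = 38 across the 20 included studies gives I² = 50%.
print(f"I² for Q=38, k=20: {i_squared(38.0, 20):.1f}%")
```

If that reading is right, the unusual threshold is just the ordinary 5% level spread across several outcome tests rather than anything exotic.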
It could be that only those studies contained the data they were specifically analyzing and met certain inclusion criteria. The number of methodologically flawed studies is truly astounding. No offense, but I have found that most physicians who venture into the realm of research have no clue how to properly design a study or how to interpret the results.
The reported P value doesn't concern me; it was probably the exact P value that they calculated. I've calculated extremely low P values (P < 0.0001) but reported them as P < 0.05, because that's the standard accepted level of significance.
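The reporting convention described above (an exact computed P value collapsed to a conventional cutoff for publication) can be sketched as a small helper. The function name and the set of cutoffs here are my own illustration, not anything from the study:

```python
def report_p(p: float, thresholds=(0.0001, 0.001, 0.01, 0.05)) -> str:
    """Report an exact P value against conventional cutoffs (hypothetical helper).

    A very small exact value such as 3.2e-6 is reported as "P < 0.0001";
    a value that clears no cutoff is reported exactly to three decimals.
    """
    for t in thresholds:
        if p < t:
            return f"P < {t}"
    return f"P = {p:.3f}"

print(report_p(3.2e-6))  # an extremely low exact value
print(report_p(0.03))    # significant only at the usual 5% level
print(report_p(0.2))     # not significant; reported exactly
```

The point of the comment stands either way: the published figure may be a cutoff the authors adopted rather than the exact value their calculation produced.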