Basically, what this ended up being was nothing more than a knowledge-of-brand question. CNN has been around for 22 years, while MSNBC and FNC have only been around for six. So more people know about CNN and automatically said, "Well, uh, yeah, I guess CNN is pretty trustworthy," even if they haven't watched it for a single minute in years. In fact, I would almost consider the question a push poll.

It's common knowledge in consumer research that you can't just dump an "are they trustworthy/a good brand/whatever" question into your pollee's lap, because a huge percentage of respondents never really think about the "trustworthy" part, and only interpret it as a "have you ever even heard of these guys?" question. The only way to get a truly accurate reading on such a question is to word it exceedingly carefully, and then make the person answer a few other questions to verify they even know what they're talking about ... perhaps a few queries along the lines of "Name some of the shows currently on CNN." If they respond "Take Two" and "Freeman Reports," they've just proven themselves unqualified, and their responses should be thrown out. The survey should also ask them to name a show that has only been on CNN within the last year.

Lastly, it should include what's known as an "open-ended" question, where the respondent is put on the spot and asked to explain, in their own words, WHY they think network X is trustworthy. An extremely experienced pollster then picks through those responses and determines which people have the slightest clue what they're talking about, and which ones don't and should have their answers tossed as well. Pew didn't do any of this.
Besides, when only 40% of the nation trusts you, is that REALLY something to crow about?