  • Photo: Oleksiy Mark/Photos.com

Writing about the New York Times is a lot like what writing about Mike Royko was during his last few years at the Tribune. Much as I admired him, I never seemed to find an opportunity to say so; instead, he had me picking one nit after another.

In several recent posts I’ve commented on the Times’s slapdash use of numbers. My latest nit to pick is a little different. It caught my eye because the Times ran a column of statistics that contradicted what the article on those statistics was reporting. But that numerical blunder led me to something more curious.

The article was an op-ed by Charles Blow in the Saturday Times, “The Penance of Glenn Beck.” Beck had admitted to making “an awful lot of mistakes” at Fox News, and Blow was dwelling on them and finding the worst of them impossible to forgive. “Beck and his colleagues at Fox did their viewers and the country a tremendous disservice,” Blow wrote, “not only riling folks up but outrightly misinforming them.”

  • Chart: from the New York Times

This brought Blow to his authority, a 2012 PublicMind poll by Fairleigh Dickinson University. In Blow’s words, the poll “found that people who watched or listened to no news were better informed than those who watched Fox.”

Watching Fox News actually increased its audience’s ignorance!

Because Blow was interested in the poll for only one reason—what it said about Fox News—he didn’t get into its methodology. In fact, what the pollsters did was ask 1,185 people nine questions about current events, five of them on domestic affairs and four on international affairs. Fairleigh Dickinson separated the results into two categories—domestic and international—but Blow paid no attention to the international results. You would have thought from his story that the poll’s references to “current events knowledge” and knowledge of “domestic questions” were talking about the same thing—though they weren’t.

So what went wrong?

Blow, trying to be fair, acknowledged parenthetically at the end of his article “that watching MSNBC also had a ‘negative impact on people’s current events knowledge,’ according to the poll, although it was not as large as the effect of watching Fox.” Blow accurately quoted the poll; but MSNBC’s “negative impact” only showed up in the questioning on international affairs. Blow didn’t point that out, and I’m willing to bet he didn’t read the poll results carefully enough to notice it himself.

The column of figures that illustrates this Bleader post was designed by the Times to illustrate Blow’s story. But look carefully at the small print at the top and you’ll see that these figures show how the audiences for various radio and TV programs scored on domestic affairs. And on domestic affairs, MSNBC scored above “No News” and the same as CNN.

Which means Blow said one thing about MSNBC in his article while the illustration that ran with it said another.

That’s not sound journalism. But it caught my eye, and sent me to the PublicMind report to see for myself what was going on.

The report’s called “What you know depends on what you watch: Current events knowledge across popular news sources,” and to my mind, there’s a half-baked quality to it that Blow ignored because the results suited his purposes. Isn’t there something fast and dirty about a poll that claims to take the public’s measure on the strength of just nine questions? Isn’t there something sketchy about its claim that it could isolate each respondent’s “source of news”—pretending, in effect, that no one who watches, say, The Daily Show also listens to talk radio?

That brings me to the poll’s most startling finding, which Blow ignored and which also seems to have escaped the pollsters.

As you can see from the Times’s column of numbers, NPR topped the radio/TV pack. Its audience led with an average of 1.51 correct answers to the domestic news questions, and it also led with 1.97 correct answers to the international questions.

But square that with this—a line I had to read three times before I was sure what it said: “On average, people were able to answer correctly 1.8 of 4 questions about international news, and 1.6 of 5 questions about domestic affairs.”

Despite leading the pack, the NPR audience finished below average in its grasp of domestic affairs!

How could that be? The PublicMind report neither tells us nor shows any awareness of having raised the question. The only thing I can think of is that despite the poll’s premises, people do get their news from more than one source, making the grade given each separate source fairly meaningless. Moreover, when we read the report carefully, we discover a single reference to the complete list of “news sources” respondents were asked about. These also included local and network TV news, local and national newspapers, and blogs and political websites. And more respondents said they got news from local TV (76 percent) and newspapers (72 percent) than cited any other news source.

With that one mention, those sources of news disappear completely from the report. It’s up to us to keep in mind that 72 percent of the respondents who supposedly got their news from Fox News, or MSNBC, or NPR, or The Daily Show also got it from their local newspaper. And apparently a full mix of news sources is informative enough that even the NPR audience, once its other news sources are separated out, scores below average on domestic affairs.
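The arithmetic behind that oddity can be sketched with invented numbers (none of these figures come from the PublicMind data): if the respondents who relied only on local TV or newspapers, the sources missing from the Times chart, scored well, then every audience that does appear in the chart can fall below the overall average, even the one that leads the pack.

```python
# Hypothetical illustration with made-up scores, not the poll's data.
# Each respondent: (domestic score, set of charted sources they cited).
# Respondents with an empty set cited only uncharted sources
# (local TV, newspapers), so they count toward the overall average
# but toward no charted audience.
respondents = [
    (2.0, {"NPR"}),
    (1.4, {"NPR", "Fox"}),
    (0.8, {"Fox"}),
    (1.2, {"MSNBC"}),
    (2.8, set()),
    (2.6, set()),
]

# Overall average across every respondent.
overall = sum(score for score, _ in respondents) / len(respondents)

def group_mean(source):
    """Average score of respondents who cited the given charted source."""
    scores = [s for s, srcs in respondents if source in srcs]
    return sum(scores) / len(scores)

for source in ("NPR", "Fox", "MSNBC"):
    print(source, round(group_mean(source), 2), "vs. overall", round(overall, 2))
# NPR tops the charted pack at 1.7, yet every charted group,
# NPR included, sits below the overall average of 1.8.
```

The point is not that this is what happened in the poll, only that nothing in the arithmetic prevents it: once the high scorers belong to categories left out of the comparison, "leading the pack" and "below average" can describe the same audience.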

Who would have guessed?—getting news from a lot of sources beats getting it from one. My advice to Blow would be to resist the temptation to cadge a welcome conclusion from a study that it’s hard to take seriously.