Plain Dealer dumps PolitiFact … and about time, too
The Plain Dealer and its stable-mate, the Northeast Ohio Media Group, are finally doing the right thing about PolitiFact. They’re quitting. It’s for the wrong reason, but I’ll count my blessings.
The operation’s Ted Diadiun reports:
I should note that although some editors shared my dismay over the Truth-O-Meter, that did not figure in the decision by The Plain Dealer and NEOMG to part company with PolitiFact. Nobody (including me) disputes the overall quality of the enterprise.
… So why leave? In short, the reorganization of the newsgathering operation last August resulted in a smaller news staff and an unwillingness to remain in a partnership that required several PolitiFact investigations per week.
So, correct decision to say goodbye to the increasingly awkward partnership. But it’s a decision that I think should have been made earlier, and for different reasons. Among them:
- The PolitiFact deal put the full Plain Dealer reports on their own site, separate from cleveland.com, which hosts all the other content of The PD (and, now, of NEOMG). Since part of my job was to find ways to increase traffic to cleveland.com, this never sat well with me.
- Instead of providing the full reports on cleveland.com, the site ran only teaser summaries that focused on the hopelessly flawed Truth-O-Meter ratings, with links to the full reports. As the guy in charge of monitoring the comments, I knew that most critics didn’t bother to click the link to read the detailed, footnoted report. Heck, our standing joke (not a joke) was that many commenters barely read to the end of the headline before they started taking shots.
- The Truth-O-Meter’s distinctions made no sense to readers (including me). As Diadiun wrote:
To show you how silly it could get, a couple of years ago there was a lot of internal hand-wringing over whether one of the ratings should be called “barely true” or “mostly false.” What’s the difference, you ask again? Good question.
- Even if one could parse out the differences, the Truth-O-Meter mixed apples and oranges. Its ratings combined two questions: whether a statement was true and whether it was misleading. Where the balance between those two values was struck in picking a rating was crucial. And as far as I could tell, looking at PolitiFact ratings from the national site as well as local ones, the final choices were coin flips. Much-debated coin flips conducted by honest journalists trying to be fair — but coin flips, nonetheless.
- Despite all those flaws, the Truth-O-Meter was the real star of PolitiFact, not the endlessly long, thoroughly researched full reports. Diadiun, in an early column, noted:
Bill Adair, editor of PolitiFact, does not see the Truth-O-Meter as a gimmick. “It’s the heart of PolitiFact,” he said. “We are not putting our opinion in our work; we are doing solid, journalistic research, and then reaching a conclusion. That’s not the same as opinion.”
Diadiun disagreed; so do I. There were just too many variables and value judgments, and too much simplification, for the Truth-O-Meter not to end up a matter of opinion. And eliminating opinion from the equation was, I always assumed, what PolitiFact was supposed to be all about.
- There was no consistency to what PolitiFact weighed in on. The mix included ads, speeches and other material, from politicians, pundits and others. It looked at serious comments and fluff. The need to select from such a wide list of possibilities wiped out any legitimate way to compare the truthfulness of individuals — even though that doesn’t stop PolitiFact arms from doing just that.
- PolitiFact zeroed in on single statements often plucked from much longer contexts. That added to the possibility of selection bias. It also was a built-in contradiction. PolitiFact reports are supposed to dig deep and provide all the background — yet a politician could be zinged for a single factual error inside a much longer, completely accurate speech.
In the end, PolitiFact is a pair of strange bedfellows. The reporters who do the work dig hard for information, and they show their work so readers can see for themselves where the facts lie. But that serious work plays second banana to a show-bizzy format that emphasizes meaningless distinctions and the hideous “pants-on-fire” rating. I made these arguments during my time at The PD, but I’m certainly not the only critic. The Plain Dealer/NEOMG decision to sever ties with PolitiFact didn’t have anything to do with those criticisms, it would seem, but I still applaud my former colleagues for it.
John, as a long-time (conservative) critic of PolitiFact, I greatly appreciate this post.
“Bill Adair, editor of PolitiFact, does not see the Truth-O-Meter as a gimmick.”
I trust that you’re correct that PolitiFact staffers do their reporting honestly to the best of their ability. But can you shed any (additional) light at all on Adair & company’s steadfast insistence on saying the Truth-O-Meter doesn’t introduce subjectivity to PolitiFact’s body of work?
I wonder how any reporter could consider PolitiFact’s structure and think “Yeah, that’s objective.” To me, and apparently to you, it’s obviously fertile ground for the introduction of subjective judgments. Are you aware of any deeper arguments Adair used to defend the “Truth-O-Meter”?
First, remember that conservatives aren’t the only ones to criticize PolitiFact. From both sides, it works the same way: If the ruling goes against your guy because it says his facts are wrong, complain that the overall context was accurate. If the ruling says he mishandled legitimate facts, argue that the accuracy is all that matters.
My problem with PolitiFact is that its system, mixing facts and intention, makes such criticism inevitable. And given the vagueness of the standards for differences between ratings, it’s inevitable as well that there will be some substance to at least some of those criticisms.
I can only speak to the character of the people I worked with at The Plain Dealer who worked on PolitiFact Ohio. I am well aware of some harsh personal attacks, particularly from the right, on some of the reporters. I think those were attempts to distract attention from damning facts, which were laid out in detail in the full reports. But I also think the cheesy nature of the Truth-O-Meter (which I believe the national organization has defended as necessary to gain attention) and the attempt to cram too much under the simple label of “truth” combine to undercut the serious work that goes into the reports.
When I say elements of PolitiFact are subjective, I am not saying they bear evidence of a deliberate, or even consistent, tipping of the scales toward one political philosophy or another. Rather, I mean that the system leaves too many openings for individual judgments, on whatever grounds. Some statements are going to be judged because they sound funny or they just happen to catch someone’s eye, but there’s not even an attempt to determine whether the statements selected are a representative sample of the speaker’s output. Some rulings are going to shift toward one rating or another based on gut feelings — which can be political, but can also be simply that the judges had a bad day. Indeed, some politicians might get off easy because the editors making the ruling are consciously trying to balance out their personal leanings.

In journalistic terms, the PolitiFact reports are objective — that is, the reporters seek out the facts regardless of their personal biases; I truly believe that. The rulings are arguably objective in that the accuracy of the statement — something that in most cases can be objectively determined — plays a major role. The context is harder to judge without recourse to personal opinion, but I accept that the editors who judge do their best to put their personal beliefs aside. However, if you take “subjective” to mean influenced in any way by personal opinions or feelings, then of course those rulings are subjective.

As a teacher, I’m grading my students in part on the quality of their contributions to class discussions. I will aim to be objective — that is, not play favorites, not let my judgment be affected by whether one student is nicer to me or another wears a T-shirt promoting a political cause I oppose. But no matter how effective I am at avoiding those traps, my judgment will still necessarily be subjective, because gauging the quality of contributions is not as simple as counting up how many times they raised their hands.
That said, Adair is correct that the interminably long PolitiFact reports would gain far less attention without the Truth-O-Meter gimmick. I know at least some of the PD reporters appreciated the power of the meter to get the attention of those they covered. If fact-checking doesn’t have an audience, it doesn’t make a sound no matter how many trees are felled to print it. In my editing career, I’ve argued in favor of similar gimmicks, albeit concerning issues less crucial to the republic. I don’t think the idea of a specific fact-checking program is wrong. Nor do I think the rulings of PolitiFact overall show a consistent political bias. I do think the Truth-O-Meter is much too gimmicky; I think the misfit between the label and the way it’s determined is too great; and I think its flaws leave it open to legitimate criticism on almost every ruling.
“First, remember that conservatives aren’t the only ones to criticize PolitiFact.”
Right, no question. I helped start a blog, PolitiFact Bias, and we’re often automatically dismissed as partisans when in truth we also highlight quality criticisms of PolitiFact from the left. But just to provide one of the examples that’s hard to explain away, PolitiFact has tended to find it true that Republicans in Congress vote to raise their own pay while finding it false that Democrats in similar circumstances raise their own pay. PolitiFact has some running double standards that make their archives cry out for correction.
“When I say elements of PolitiFact are subjective, I am not saying they bear evidence of a deliberate, or even consistent, tipping of the scales toward one political philosophy or another.”
I don’t expect you to say that and I don’t think that’s what you’re saying. I’ve met liberals who write for the (Tampa Bay) Times, and almost without exception they seem to be of good character. But it takes a special type of vigilance to monitor one’s own bias. The Truth-O-Meter, I believe, helps serve as an amplifier.
On the other hand, PolitiFact’s defense of its “Truth-O-Meter” and its defenses of questionable ratings (every single “Lie of the Year,” for example) have been so exceptionally weak that I simply have to suspect that at least some of the insiders know not everything’s on the up-and-up.
Consider the “Lie of the Year” for 2013. PolitiFact has tended to emphasize that it rated the president’s “You can keep your plan” pledge as its Lie of the Year. Yet PolitiFact didn’t even rate that pledge during 2013, and rated it no lower than “Half True” in years past. How did it get to be the “Lie of the Year”? PolitiFact used its “Pants on Fire” rating for Obama’s “What we said was” explanation as a shoehorn to jam the former claim in as a candidate.
Read what I wrote about the “What we said was” claim. Something doesn’t add up. What we’re ending up with from PolitiFact often isn’t fact checking.
http://www.zebrafactcheck.com/what-we-said-was/