Selection Bias? PolitiFact Rates Republican Statements as False at 3 Times the Rate of Democrats
PolitiFact has assigned “Pants on Fire” or “False” ratings to 39 percent of Republican statements since January 2010, compared to just 12 percent of statements by Democrats
PolitiFact, the high-profile political fact-checking operation at the St. Petersburg Times, has been criticized by those on the right from time to time for alleged bias in its grading of statements made by political figures and organizations.
The organization (and now its more than a half dozen state offshoots) grades statements made by politicians, pundits, reporters, interest groups, and even the occasional comedian (anyone ‘driving the political discourse’) on a six-point “Truth-O-Meter” scale: True, Mostly True, Half True, Barely True, False, and Pants On Fire for “ridiculously” false claims.
But although PolitiFact provides a blueprint as to how statements are rated, it does not detail how statements are selected.
While there is no doubt that members of both political parties make accurate statements, inaccurate statements, and everything in between, there remains a fundamental question of which statements (by which politicians) are targeted for analysis in the first place.
A Smart Politics content analysis of more than 500 PolitiFact stories from January 2010 through January 2011 finds that current and former Republican officeholders have been assigned substantially harsher grades by the news organization than their Democratic counterparts.
In total, 74 of the 98 statements by political figures judged “false” or “pants on fire” over the last 13 months were given to Republicans, or 76 percent, compared to just 22 statements for Democrats (22 percent).
First, it should be acknowledged that the number of public officials subjected to PolitiFact’s Truth-O-Meter lens from each party is fairly even during the period under analysis.
Of the 511 statements put through the Truth-O-Meter test from January 1, 2010 through January 31, 2011, PolitiFact devoted 74 percent of its attention to current and former political officeholders and elected officials (379 statements), 17 percent to ideological organizations and individuals not holding political office (85 statements), and 5 percent to other groups and individuals without a partisan or ideological agenda (28 statements). Another 20 statements came from chain e-mails, public opinion polls, bumper stickers, or “bloggers” generally (4 percent).
For those current or former political officeholders, PolitiFact has devoted a roughly equal amount of attention to Republicans (191 statements, 50.4 percent) as to Democrats (179 statements, 47.2 percent), with a handful of stories tracking statements by independents (9 statements, 2.4 percent).
Assuming for the purposes of this report that the grades assigned by PolitiFact are fair (though some would challenge this assumption), there has nonetheless been a great discrepancy in which party’s officials and officeholders receive the top ratings and which are accused of not telling the truth.
Republican statements were graded in the dreaded “false” and “pants on fire” categories 39 percent of the time, compared to just 12 percent for statements made by Democrats.
That means a supermajority of falsehoods documented by PolitiFact over the last year – 76 percent – were attributed to Republicans, with just 22 percent of such statements coming from Democrats.
As a consequence, Democrats have been presented as much more truthful – with over 75 percent of their statements receiving the top three grades of True (16 percent), Mostly True (27 percent), or Half True (33 percent).
Fewer than half of the Republican statements graded by PolitiFact were regarded as half truths or better – just 90 out of 191 (47 percent).
Republicans were also assigned a larger percentage of “Barely True” statements than Democrats, bringing the tally of falsehoods or near falsehoods in the bottom three categories to 52.9 percent of Republican statements, compared to just 24.6 percent of those made by Democrats.
PolitiFact Ratings of Current and Former Political Officials, January 2010 – January 2011
Party | Pants on Fire | False | Barely True | Half True | Mostly True | True
GOP   | 23 | 51 | 27 | 30 | 31 | 29
Dem   | 4  | 18 | 22 | 59 | 48 | 28
Ind   | 0  | 2  | 0  | 1  | 1  | 5

Party | % Pants on Fire | % False | % Barely True | % Half True | % Mostly True | % True
GOP   | 12.0 | 26.7 | 14.1 | 15.7 | 16.2 | 15.2
Dem   | 2.2  | 10.1 | 12.3 | 33.0 | 26.8 | 15.6
Ind   | 0.0  | 25.0 | 0.0  | 12.5 | 12.5 | 50.0
Data compiled by Smart Politics.
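As a quick arithmetic check, the headline rates follow directly from the raw counts in the table above. A minimal, illustrative Python sketch (the data is simply the table, hard-coded):

```python
# Illustrative only: re-derive the percentages reported in this article
# from the raw Truth-O-Meter counts compiled by Smart Politics.
counts = {
    "GOP": {"Pants on Fire": 23, "False": 51, "Barely True": 27,
            "Half True": 30, "Mostly True": 31, "True": 29},
    "Dem": {"Pants on Fire": 4, "False": 18, "Barely True": 22,
            "Half True": 59, "Mostly True": 48, "True": 28},
}

for party, ratings in counts.items():
    total = sum(ratings.values())                        # 191 GOP, 179 Dem
    worst = ratings["Pants on Fire"] + ratings["False"]  # 74 GOP, 22 Dem
    print(f"{party}: {worst}/{total} = {worst / total:.1%} False or worse")
# GOP: 74/191 = 38.7% ("39 percent"); Dem: 22/179 = 12.3% ("12 percent")
```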
During the last 13 months, the Republicans who have led the way with the largest number of Barely True, False, and Pants On Fire grades are Sarah Palin with eight, Michele Bachmann with seven, and John Boehner, Mike Pence, and the National Republican Congressional Committee with four each.
Whereas Boehner received six “True,” two “Mostly True,” and one “Half True” rating during this span, Pence and the NRCC received none in these categories, Bachmann only two, and Palin just four.
What is particularly interesting about these findings is that the political party in control of the Presidency, the US Senate, and the US House during almost the entirety of the period under analysis was the Democrats, not the Republicans.
And yet, PolitiFact chose to highlight untrue statements made by those in the party out of power.
But this potential selection bias – if there is one at PolitiFact – seems to be aimed more at Republican officeholders than conservatives per se.
An examination of the more than 80 statements PolitiFact graded over the past 13 months by ideological groups and individuals who have not held elective office shows that conservatives received only slightly harsher ratings than liberals.
Half of the statements made by conservatives received ratings of Pants on Fire (12.5 percent), False (16.1 percent), or Barely True (21.4 percent), compared to 41 percent for liberals.
PolitiFact Ratings of Non-Officeholder Ideologues, January 2010 – January 2011
Ideology     | % Pants on Fire | % False | % Barely True | % Half True | % Mostly True | % True
Conservative | 12.5 | 16.1 | 21.4 | 25.0 | 14.3 | 10.7
Liberal      | 6.9  | 24.1 | 10.3 | 24.1 | 17.2 | 17.2
The table compiles PolitiFact statement ratings (in percent) of ideological organizations and individual ideologues who have not held political office or worked for political parties (e.g., commentators and talk show hosts). Data compiled by Smart Politics.
These findings raise the central unanswered question: what is the process by which PolitiFact selects the statements that it ultimately grades?
When PolitiFact Editor Bill Adair was on C-SPAN’s Washington Journal in August of 2009, he explained how statements are picked:
“We choose to check things we are curious about. If we look at something and we think that an elected official or talk show host is wrong, then we will fact-check it.”
If that is the methodology, then why is it that PolitiFact takes Republicans to the woodshed much more frequently than Democrats?
One could theoretically argue that one political party has made a disproportionately higher number of false claims than the other, and that this is subsequently reflected in the distribution of ratings on the PolitiFact site.
However, PolitiFact offers no evidence that this is its calculus in deciding which statements to rate.
Nor does PolitiFact claim on its site to present a ‘fair and balanced’ selection of statements, or that the statements rated are representative of the general truthfulness of the nation’s political parties or the elected officials involved.
And yet…
In defending PolitiFact’s “statements by ruling” summaries – tables that combine all ratings given by PolitiFact to an individual or group – Adair explained:
“We are really creating a tremendous database of independent journalism that’s assessing these things, and it’s valuable for people to see how often is President Obama right and how often was Senator McCain right. I think of it as like the back of a baseball card. You know – that it’s sort of someone’s career statistics. You know – it’s sort of what’s their batting average.” (C-SPAN Washington Journal, August 4, 2009)
Adair is also on record lamenting the media’s kneejerk inclination to treat both sides of an issue equally, particularly when one side has the facts wrong.
In an interview with the New York Times in April 2010, Adair said:
“The media in general has shied away from fact checking to a large extent because of fears that we’d be called biased, and also because I think it’s hard journalism. It’s a lot easier to give the on-the-one-hand, on-the-other-hand kind of journalism and leave it to readers to sort it out. But that isn’t good enough these days. The information age has made things so chaotic, I think it’s our obligation in the mainstream media to help people sort out what’s true and what’s not.”
The question is not whether PolitiFact will ultimately convince skeptics on the right that it has no ulterior motives in selecting which statements are rated, but whether the organization can give a convincing argument that either a) Republicans in fact do lie much more than Democrats, or b) if they do not, that it is immaterial that PolitiFact covers political discourse with a frame that suggests this is the case.
In his August 2009 C-SPAN interview, Adair explained how the Pants on Fire rating was the site’s most popular feature, and the rationale for its inclusion on the Truth-O-Meter scale:
“We don’t take this stuff too seriously. It’s politics, but it’s a sport too.”
By levying 23 Pants on Fire ratings against Republicans over the past year compared to just 4 against Democrats, it appears the sport of choice is game hunting – and the game is elephants.
Politifact’s job is not to search out obscure statements that are not even making waves in the media at large in order to provide party balance. Their job is to fact-check what’s being said, and often controversial and questionable statements by Republicans have driven the debate in the past 2 years.
For instance, you are close to suggesting that Politifact should ignore Sarah Palin since she isn’t an “officeholder” at all (you suggest that as a criterion for the amount of ratings) — however, that is patently ridiculous given the influence that even a Twitter message from her gets on the day’s news. She and Michele Bachmann are covered widely for their statements, revered by the right and loathed by the left — given the press attention it is not only Politifact’s option but their DUTY to report on the truthfulness of these statements that are driving media debate. Replacing them with comments from (for example) some relatively anonymous Democratic committee chairman who isn’t being covered in the news anyway in order to provide sufficient “rating balance” would just be silly.
In order to prove bias at Politifact, I argue that you would have to demonstrate either 1) systematic and repeated omissions from their ratings of widely heard false statements from Democrats or 2) varied standards for rating statements, according to party. You have failed to do either.
Politifact has bent over backwards to find statements of little or no consequence to rate as false in a cynical attempt to present their fact-checking as “fair” to both sides. The result is the false equivalency so often seen in the media today. Republicans have false ratings at 3 times the rate of Democrats? Maybe it’s for the simple reason they publicly tell lies of consequence 3 times as often. Today they have two “Pants on Fire” consequential statements from the former President that were out-and-out lies that can and likely will incite violence among his supporters. Next to that they have a “False” rating for Vice President Harris, who claimed fentanyl flow had been reduced by half. There is not enough data to support that, but neither is there enough to refute it, and it isn’t about to cause a riot.
Nonsense.
Republicans have given us “death panels,” “Obama is a Muslim,” and “Obama is out to destroy America as part of his anticolonialist revenge.”
Not to mention “Our intelligence confirms there are weapons of mass destruction in Iraq” and “Saddam Hussein and bin Laden are in cahoots.”
Republicans’ efforts to preserve the privilege and power of the rich, white, conservative minority know few bounds of truth. Of course Republicans lie more than Democrats, at least on policy issues. Why would we expect that both parties must lie equally??
(Perhaps different party members lie equally when it comes to their personal lives, although given how many prominent Republicans have turned out to have hidden gay affairs/sex scandals/inappropriate relationships with interns, I’m not sure about that.)
And look at where we are today.
It is not hyperbolic to say truth has a liberal bias.
Pulitzer = Progressive
So your argument is that there must be bias at PolitiFact, based on the assumption that Republicans can’t possibly be lying that much more than Democrats. That’s the extraordinary claim here, and it’s incumbent upon you to prove it, not PolitiFact to disprove it. Unless you can come up with some damning examples of liberals’ controversial statements going unchallenged or conservatives’ words being twisted, I’m inclined to side with PolitiFact over you.
Yes it’s biased. I was at first delighted to find an “independent non-biased (their words)” site to giggle with. After a few minutes of the Pants on Fire section, I started to notice that many of them were taken out of context or not sufficiently excavated. I’ll look for something less “progressive” in the bias department.
By levying 23 Pants on Fire ratings to Republicans over the past year compared to just 4 to Democrats, it appears the sport of choice is lie hunting – and the liars are elephants.
Eric Ostermeier here smears Politifact with being prejudiced against the Republican Party because PolitiFact found lies to be far commoner from Republican politicians than from Democratic ones. No evidence of bias was presented by this professor; there was just the repeated insinuation here that PolitiFact was biased, no allegation at all that conservatives are more prone to lie than are non-conservatives (which is obviously the case). This liberal professor was pumping conservative deception, rather than condemning it. Ugh!
Right, I agree with these commenters here.
Politifact should worry about people who only have twitter; or a TV show; and ignore people who write and pass legislation.
I mean if the government passes a law restricting my freedoms and rights; that’s no big deal. But if Palin gets something wrong in a twitter post the world must know immediately.
This is rational prioritization. Who cares if the government drives you out of business by new legislation; so long as someone can call Glenn Beck a liar.
Who cares what elected officials in the halls of power with majority control say; who’d bother to check that… Palin updated her Facebook… go through it with a fine tooth comb.
Yeah, that’s not bias; that’s perfectly rational; presuming you define “rational” in a new and inventive way.
If that is the methodology, then why is it that PolitiFact takes Republicans to the woodshed much more frequently than Democrats?
Maybe because they lie more? Duh.
Death panels, anyone?
Great job, Ostermeier! Pay no attention to the liberal wagon-circlers. PolitiFact clearly frames its work to suggest conclusions that would only be reasonably supported if its subject matter was the result of random story selection.
I suggest an empirical study of the way PolitiFact handles “False” statements vs. “Pants on Fire” statements. PolitiFact’s grading system calls the latter “ridiculous” statements–apparently that’s the deciding criterion between the two. That’s objective, right?
Eric,
Anyone who has ever visited Politifact could tell you that Republicans tend to have more statements rated false. Stating this alone is not sufficient to conclude that they have a selection bias. Both selection bias and Republicans lying more in public venues would explain the trend. In order to strengthen your own hypothesis, you first need to prove the competing hypothesis false. This is basic hypothesis-driven research and I’m frankly shocked that any Ph.D. would ignore such an obvious counterpoint.
Aww of course my favorite thing that Conservatives do has even come up in the comments. If anything conflicts with your view or party then it must have a liberal bias. Damn you science and your liberal bias! Curses to you facts and research! The results are unfavorable to Conservatives? Well then it must not be true, the truth tends to have a liberal bias anyways.
> Both selection bias and Republicans lying more in public
> venues would explain the trend. In order to strengthen your
> own hypothesis, you first need to prove the competing
> hypothesis false. This is basic hypothesis driven research
> and I’m frankly shocked that any Ph.D would ignore such an
> obvious counterpoint.
This very point – that a party such as the GOP could be lying more – was acknowledged in the report itself:
“One could theoretically argue that one political party has made a disproportionately higher number of false claims than the other, and that this is subsequently reflected in the distribution of ratings on the PolitiFact site.”
If you are suggesting that a rigorous methodology is employed by PolitiFact when selecting its stories, capturing the real-world ratio of lies by political parties, then you also skipped this point made in the article – a quote from their Editor:
“We choose to check things we are curious about. If we look at something and we think that an elected official or talk show host is wrong, then we will fact-check it.”
You acknowledge the point that Republicans could be lying more and then dismiss it without providing any evidence other than you don’t like the way PolitiFact selects its stories. You need to show that they have ignored Democratic lies. There may be a bias there, but just because you don’t like the results doesn’t prove it.
Often I have said that Republicans can’t win by using logical arguments so they resort to lying. Now I see my theory was correct.
I misread part of the piece — Mr. Ostermeier’s contention is that Republican officeholders are being unfairly targeted, not conservatives in general. My confusion was in the highlighting of the ratings of Sarah Palin, who was once an officeholder but hasn’t been now for some time.
That said, I don’t really believe there is a rigorous, objective way to select statements other than “curiosity”, and Mr. Ostermeier doesn’t suggest any alternative. Maybe you could use a computer program to highlight quotes that are being said frequently in a Nexis search, and then draw from that list. However there are many problems with that, and at that point you would be fact-checking the Associated Press’s quote selection more than the politicians themselves.
You could try to install a sort of quota system and say “oh, today we highlighted a Boehner quote, now let’s look at what Pelosi said today,” but that’s not very interesting. What if tomorrow Pelosi says 3 false things and you miss it because you are allocating your time to balance out something you found that a Democrat already said?
Mr. Ostermeier argues that since Democrats were in power, Politifact should have been able to dig up more “Pants on Fire” statements from Democrats than Republicans. But why should this be true? You could equally well make a case that parties in power are more careful with their language because when you are governing you are more heavily scrutinized — furthermore, parties out of power are often led by more extreme voices who may play more fast and loose with the truth. They are apt to make broad, controversial statements to get media attention or play to their supporters rather than the nation at large. You might have found a similar “bias” against Democrats if Politifact existed in 2003.
What would be indisputably wrong is if Politifact claimed something to the effect of “Republicans lie 3 times as much as Democrats”. But they don’t — that’s Mr. Ostermeier’s own data, and nothing like that’s ever been published on the Politifact site. I guess the author’s contention is that casual readers of Politifact will go to the website, not read anything in particular, but pull up the Pants on Fire section and say “wow, Republicans sure lie a lot!” and call it a day.
Perhaps to avoid this, Politifact should put a disclaimer up that says that no conclusions should be drawn about the honesty of one political party or another based on the quantity of False or Pants on Fire rulings. Furthermore, the site should push to more actively solicit feedback from readers, so that if (as Ostermeier implicitly suggests) there are dozens of “Pants on Fire” statements out there from Democratic officeholders that are going unchecked, readers can bring it to Politifact’s attention. If they then chose to ignore investigating those statements, one might be able to prove bias in their selection. Or bias could be demonstrated if they investigated those statements but then rated them improperly (holding Republicans to a different standard than Democrats, etc.).
But unfortunately, just saying that Politifact is biased because they found an unequal number of false statements from each party is advocating the worst kind of equivocation that pervades our media today. (i.e. “On one hand, but on the other hand…”) If Politifact is constantly looking over its shoulder to try to rate an equal number of untrue statements for each party, that will certainly leak into the quality of the factchecking.
Thanks for helping make the point; Republicans are addicted to lying.
How do you know which items are rated Pants on Fire, etc., so that we can check the checkers and see what they are checking? From what I have seen, particularly from reading the media, the balance is far more that the Dems are lying in what they say, and the president in particular is guilty of this, as are Pelosi and Reid.
Perhaps a better way to find if the disparity in Pants on Fire ratings reflects a bias by Politifacts or that Republican politicians lie more would be to do a content analysis of the phrasing assigned the Pants on Fire rating to see if it’s a one-off statement by a single politician or if it reflects a group talking point, whose use by one politician reflects the fact that everyone’s reading the memo. I honestly can’t think of liberal equivalents to egregious group lies such as death panels, or here in the last three weeks, a caliphate.
PolitiFact covers the statements that have the most impact, often a repeated talking point or something the media latches onto. So while you might point at PolitiFact as being biased, it is in fact the sources that they draw from that reflect the bias.
Yeah, that must be it.
And what agenda does Smart Politics have? Why don’t you do your own study of lies vs truths. I bet you’ll come up with an even higher difference between lies and truth than PolitiFact.
Why write an article hinting at bias? It’s easy enough to prove or disprove Politifact’s rankings. IF YOU REALLY WANT TO KNOW THE TRUTH.
Personally a 3:1 ratio seems too low to me.
The difference can easily be attributed to Fox News’s 24×7 propaganda operation. There’s been nothing like it in American history.
You write:
“One could theoretically argue that one political party has made a disproportionately higher number of false claims than the other, and that this is subsequently reflected in the distribution of ratings on the PolitiFact site.”
But then you offer no argument which refutes that.
And I would, in fact, argue that one party has made more false claims. Pick a topic like, say, the healthcare bill, and it’s very simple to see that Republicans did, in fact, obfuscate and dissemble about the bill to an amazing degree, suggesting the existence of plans to “kill Granny,” to “take over healthcare,” etc.
Please be smarter. It’s not enough to merely imply bias. You should either prove it or STFU.
This is smart politics? I’d characterize it as one of the worst analyses of potential bias out there. You don’t advance any support for your hypothesis that they select worse statements by Republicans. Perhaps you should look at the statements and do your own breakdown before making accusations.
Regarding the purported need for Smart Politics to prove the GOP isn’t lying more (or at the PolitiFact ‘rate’) than Democrats:
You can’t use the data published here – coded from PolitiFact’s own stories (which attribute more false statements to the GOP) – as proof that the GOP lies more and thus that there is de facto no selection bias. That’s circular reasoning. While this report does not definitively prove there is such bias, the data published shifts the burden, I would argue, to PolitiFact. In any event, greater transparency in their selection methodology would shine a light on this very question.
Actually, Dr. Ostermeier, since you are the one suggesting–without any supporting evidence, mind you–there is a bias, the burden of supporting that accusation falls with you. You have not done so.
Republicans are reported to lie more than Democrats do.
Foxes are reported to eat chickens more than groundhogs do.
Clearly, the explanation is bias. What else could it be?
Eric, if you want to make a case of selection bias against Politifact, then you need to demonstrate at least one case of a “pants-on-fire” lie by a Democrat that was not covered and should have been to keep their sample unbiased.
In fact, you would need to demonstrate enough such lies to roughly balance the discrepancy reported by Politifact.
Since you failed to do this (and in fact, cannot), all you are doing is making a baseless libel against Politifact. Grow up and learn some basic scientific methodology and honesty.
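To put a rough number on “balance the discrepancy”: a back-of-the-envelope sketch, assuming (hypothetically) that every missed Democratic falsehood would have been rated False or Pants on Fire and added to the pool of rated Democratic statements.

```python
# Back-of-the-envelope, using the counts in the article's table: how many
# additional Democratic falsehoods would PolitiFact have had to rate
# before the Democratic rate matched the Republican one?
# Find the smallest integer x with (22 + x) / (179 + x) >= 74 / 191.
gop_rate = 74 / 191   # 38.7% of GOP statements rated False/Pants on Fire
x = 0
while (22 + x) / (179 + x) < gop_rate:
    x += 1
print(x)  # prints 78 under these assumptions
```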
Without establishing either a provable hypothesis or corroborating evidence to further the insinuation of selection bias, it would seem this article is in fact promoting the more sinister false-equivalency bias that has swept both reporting and commentary. I am baffled by this site’s contribution to public disinformation. What interest does Smart Politics have in questioning the integrity of the relatively non-inflammatory work being done by PolitiFact? Instead of offering clarity or insight, this article seeks to cast doubt through aspersion.
> since you are the one suggesting–without any supporting
> evidence, mind you–there is a bias, the burden of supporting
> that accusation falls with you.
As made quite clear above, because PolitiFact’s methodology of statement selection is unknown…
“We choose to check things we are curious about.” (Bill Adair, Editor of PolitiFact)
…it is not practical to replicate PolitiFact’s test of the universe of Dem vs. GOP falsehoods to determine the firm existence (or degree) of selection bias at that news organization.
republicans have a talk radio monopoly that is routinely used to spread lies and distortions into politics and media when needed by the GOP. it is successful in doing so because most of the radio liars have call screeners and don’t take calls from those who would correct them. much of the material comes from think tanks now since karl rove’s gang is out of the white house. it is fed to all the main talkers and can be repeated from 1000 radio stations until they become “truth”, with widespread coordinated unchallenged repetition that is not possible with print, TV, or internet. i suspect many of the untruths have origins with the limbaughs and hannitys. many of the more common myths, lies, and untruths would not exist or be acceptable on large scale without the talk radio monopoly that reagan made possible when he killed the fairness doctrine in 1987.
Reality has a well-known liberal bias.
Dr. Ostermeier,
You keep stating that you cannot replicate Politifact’s selection criteria because the quote you provide indicates a very subjective criterion (i.e., personal curiosity).
To be very blunt, this is why many PoliSci people (Ph.D. or otherwise) should not be allowed to attempt to create original research. It shows a distinct lack of imagination with respect to research design. In the time it took me to read your article, I had the first bits of an A-B test design in my head already.
In addition to the poor design, this article also suffers from specious logic, which has been discussed quite thoroughly by others. I just want to re-emphasize one point: you cannot draw a conclusion of bias on the part of Politifact without a control sample. As you are not publishing in a peer-reviewed journal, Dr. Ostermeier, you have quite a bit more freedom to develop even a rough control group to draw directional inferences (provided you caveat appropriately) that will allow tentative conclusions to be drawn.
In short, this is a lazy piece of research and a poorly thought-out article.
By the way, here is the initial kernel of my research design. Feel free to use it in a revised version of this article.
Sample criterion: match Democratic and Republican statements in the Politifact database to statements by members of both parties not in the database for “prevalence in the media.” Lexis-Nexis would probably be a good place to start. This is similar to a match-market test in media research.
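For concreteness, here is a minimal sketch of what the matching step in that design might look like. Everything here is hypothetical – the Statement fields, the greedy pairing, and the 20 percent tolerance are illustrative choices, and the media_hits values are assumed to come from a separate Lexis-Nexis-style search.

```python
# Hypothetical sketch of the matched-sample design suggested above (this is
# not PolitiFact's method, which is unknown). Pair each statement PolitiFact
# rated with an unrated statement of similar media prevalence, so that
# party comparisons are made only within matched pairs.
from dataclasses import dataclass

@dataclass
class Statement:
    party: str        # "GOP" or "Dem"
    text: str
    media_hits: int   # assumed: article count from a Lexis-Nexis-style search
    rated: bool       # True if PolitiFact issued a Truth-O-Meter ruling

def match_by_prevalence(rated, unrated, tolerance=0.2):
    """Greedily pair each rated statement with the unrated statement
    whose media prevalence is closest, within a relative tolerance."""
    pairs, pool = [], list(unrated)
    for s in sorted(rated, key=lambda r: r.media_hits, reverse=True):
        if not pool:
            break
        best = min(pool, key=lambda c: abs(c.media_hits - s.media_hits))
        if abs(best.media_hits - s.media_hits) <= tolerance * max(s.media_hits, 1):
            pairs.append((s, best))
            pool.remove(best)
    return pairs
```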
Could it just very possibly be that a political party consumed by religious zealots makes more false statements?
Can you honestly give me a Democratic lie propagated by powerful, office-holding members of the Democratic Party on par with “Obama is a Muslim” and “Death panels”?
Your article makes me question the ability of public education in Minnesota.
I don’t think you can have a passing familiarity with truth or facts or reality and be a conservative, let alone a Republican, at this time. I wish it weren’t true, but it is. How many Democrats were on the Sunday shows this last weekend? None. Not one. But John McCain is on every Sunday Morning at least once, spouting whatever his people tell him to say. Lucky we have a liberal media biased against Republicans to such an extent that they only allow Republicans to be on the shows. Funny that Olbermann was fired for telling the truth too often, and the network he left has been hounded as the “liberal” network because it has three hours of “liberals” and five hours of conservatives on per day.
The truth would seriously burn a Republican like daylight tweaks vampires.
What a disappointing article. As has been noted in the comments, the point is to provide fact checking on issues that are being bandied back and forth in the public discourse. While you could certainly find enough examples where M. Bachmann said “it is raining” and it was, isn’t it more reasonable to check facts on statements such as “death panels” and then look at who said them? Is it not also reasonable for the GOP to be the ones on the attack to regain power and therefore commenting on policy in hyperbolic terms? Wouldn’t this necessarily lead to someone being right and someone being wrong, when the response “there are no death panels” is given? Then doesn’t one conclude that on the issues under current debate, one party has answered more truthfully than the other? I for one do not believe that Palin inherently lies more than Obama. But on the issue of health care reform I believe she does because she really doesn’t like it. This article is the classic example of using a statistical analysis to generate an unfortunate drama that doesn’t exist. The “truth o’ meter” can only be applied to the truth or falsehood of statements being made on topics we the voting public are currently discussing. It can not be used to draw sweeping conclusions regarding the parties or individuals involved generally. However, the result of such analysis is to encourage people to discount fact-finding altogether because such fact-finding may imply bias. Doesn’t this just encourage truth being determined by the loudest voice? Disenfranchisement of the media is the first step to controlling the minds of the public.
> The “truth o’ meter” can only be applied to the truth or
> falsehood of statements being made on topics we the voting
> public are currently discussing. It can not be used to draw
> sweeping conclusions regarding the parties or individuals
> involved generally.
Not according to PolitiFact Editor Bill Adair. As quoted above:
“We are really creating a tremendous database of independent journalism that’s assessing these things, and it’s valuable for people to see how often is President Obama right and how often was Senator McCain right. I think of it as like the back of a baseball card. You know – that it’s sort of someone’s career statistics. You know – it’s sort of what’s their batting average.” (C-SPAN Washington Journal, August 4, 2009)
This is the first time I enjoyed reading an article as well as the comments section. Good points made by all.
Definitely difficult to adjust for bias in this type of study. I like the idea from Corey to use the Lexis-Nexis database to control for dissemination of the comments under study.
Eric: As you know, there is another website, FactCheck.org, which essentially does the same thing as PolitiFact but without the “truth-o-meter” rulings, making it less quantifiable.
For the most part, when they “rate” the same claim, the two sites are often in agreement except for PolitiFact’s blessing it with a “ruling”.
I have been following and quantifying PolitiFact’s rulings for some time now, and out of curiosity did a review of FactCheck to see if they were selecting more Republicans than Democrats and finding more Republican statements false. It was more difficult to quantify, of course, but on examining about six months of their fact-checking, it appears they mostly select statements that appear to be false, and they also select far more Republicans — for the sample I selected, even more so than PolitiFact.
You may also wish to take a look at PolitiFact’s newspaper partnerships, for there appears to be a correlation between the newspapers’ political endorsements and their PolitiFact score of “truthfulness”– in other words, if the paper endorsed more Republicans, its PolitiFact rating would favor the Republicans as well, as compared to the overall averages.
I have reached the conclusion it would be very difficult if not impossible to prove PolitiFact has bias in a scientific, objective manner. As noted by Corey above, “you cannot draw a conclusion of bias on the part of Politifact without a control sample” (and all those other caveats).
Your work is definitely appreciated and I’m pleased to see more people seeing the power of the database that PolitiFact has created.
This website appears to be another vehicle for the left to express their hatred toward anyone that doesn’t agree with them. Most of the “lies”, as the site frames them are opinions, and not presented as facts. Has anyone seen the new MoveOn.org commercial regarding the proposed budget cuts to Planned Parenthood funding, and how it “relates” to women’s health? Talk about being insincere… the lies these people pass off as facts are atrocious! Sending women back to the days of coat-hanger abortions? Please give me a break!
The comments here are illustrative of exactly why the Left is incapable of creating a fair “fact check” site. They believe certain things are “lies” no matter what they are told.
“Death panels” is a characterization of something that is very real. You may not like the characterization, but it is what it is. To rate it as a “lie” is disingenuous or ignorant, at best.
To label the talk of a caliphate as a “lie” is even worse. There’s not even any mischaracterization going on, except from the Left as they build straw men, putting words in conservatives’ mouths, all while ignoring the WRITTEN CHARTERS of organizations like The Muslim Brotherhood, as well as the words given by all sorts of experts (not just conservative “talking heads”) on all the major news channels, not just Fox News, about how the Brotherhood is in the best position to seize significant amounts of power and that a caliphate is one of their end goals.
This is pretty interesting, and I agree that it’s not actually saying there is bias, or that the ratings or findings are wrong – just that of the facts checked, GOP statements are found to be false a lot more often than the statements of Democrats. So based on this I wondered how the fact checking of PoliGraph compared to what PolitiFact is finding in comparing the parties. I used all facts listed here – http://www.hhh.umn.edu/centers/cspg/poligraph.html – which totaled 30 checks. Some of the entries had more than one check and I counted each check separately. Of course PoliGraph is looking at Minnesota politics vs. the national level.
Here is what I found.
Dems had 14 statements checked plus one on a Dem source that was not a political candidate/representative – so 15 Dem checks total. The GOP had 11, and 4 were on the Independent Party.
PoliGraph scores the results as Accurate, Inconclusive, or False. Here’s how the groups fared – I’ll list in Accurate, Inconclusive, False order.
DEM – 9 (60%), 4 (26.67%), 2 (13.33%)
GOP – 4 (36.36%), 1 (9%), 6 (54.54%)
Ind – 1 (25%), 2 (50%), 1 (25%)
So PoliGraph’s findings at the Minnesota level are consistent with what we see of PolitiFact’s finds at the National level.
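Given how small these PoliGraph samples are (15 Dem and 11 GOP checks), a quick significance check is worth running before reading much into the percentages. A minimal sketch – collapsing the three ratings to False vs. not-False is my simplification, not PoliGraph’s:

```python
# Illustrative only: ask whether the Dem/GOP gap in "False" rates in the
# PoliGraph counts above could plausibly be chance, using Fisher's exact
# test on a collapsed 2x2 table (Accurate + Inconclusive vs. False).
from scipy.stats import fisher_exact

#            not-False  False
table = [[9 + 4, 2],   # Dem: 13 Accurate/Inconclusive vs. 2 False
         [4 + 1, 6]]   # GOP:  5 Accurate/Inconclusive vs. 6 False
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.1f}, p = {p_value:.3f}")
```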
This is classic Republican twisting of facts. If you don’t like what the mainstream media says, start your own biased network “Fox”. If you don’t like the fact that the World Climate Council has found that carbon emissions are causing global warming, dispute it as a conspiracy and quote a minority of scientists. Don’t like the fact that we have an African-American president, challenge his citizenship. Don’t like the fact that we are going to finally have some sort of gov’t healthcare, say they have death panels. Don’t like the science of evolution, insist on teaching intelligent(???) design, and only endorse candidates who believe the earth is 6000 years old. I used to be an independent, but the Republicans have not only lost their minds, but any sense of decency and fair play. Yeah, there has always been B.S. in politics, but the Republicans have taken it to a new low. I applaud Politifact for their work, and please keep calling them as you see them, and let the facts speak for themselves. If the Republicans don’t want to be viewed as liars, I have an idea: quit lying. There are too many important issues facing this country, but I think they know if people really know the facts and what is best for them individually, fewer Republicans will be elected.
I read through these responses and rated them according to a sliding scale. It was “sliding”, because I started off with ‘Defamatory’, ‘Automatically Negative’, ‘Automatically Positive’, and ‘Reasoned’. The “sliding” came about because I found responses actually fell into broader categories, and sometimes I couldn’t reconcile the results with a category I already had.
The results are:
Automatically Negative (showing a visible negative bias and not much else) : 6
Automatically Positive: 1 (ditto, but the other way)
Reasoned : 8 (almost, but not all, by the article’s author)
Apologizing for PolitiFact: 2
Misread the Paper: 2 (Commenting, but not about what was actually said)
Generally Confused about the topic of discussion, but talking anyway: 6 (I may qualify here)
Condescending : 2 (This appears to have actually been ‘Liberal Condescending to Obviously Less Intelligent Conservatives’, but I had broader hopes when I added the category. I know Conservatives who could return the favor. But they do seem to be rarer).
Took a Broader View : 2 (Seemed to understand better how science works)
Attacked the Reporter/Carrier of the report : 5
Attacked Somebody Else : 2
Methodology : 0 (There were a few attempts, but all wandered off elsewhere)
Invoked a Conspiracy To Account For It All : 1
Other : 2 (Not on topic, not understanding the discussion, didn’t read the same article I did, and otherwise not connecting with the discussion at all).
Without actually counting (I’d have to re-read this thread and I have my limits), here are my general impressions of the conversation:
Most respondents other than the author appeared to be Liberal (or at least Anti-Conservative). I can’t tell from what I’ve read (without interposing my own political bias as a filter) which way the author votes. I’d guess Conservative simply because he brought up the topic, but I suspect that would be projecting a point of view, not judging from the available data.
Most, but not all ‘negative’ comments were made by people with obvious Liberal views. But then, most respondents were clearly Liberal, and intending to be clearly identified as such, so maybe that’s to be expected. Is that a bias for Smart Politics?
As for PolitiFact itself, the cause of this discussion, my opinion is as follows:
By weighing in on the political scene the way they do, feigning disinterest in the field and using the assorted keywords and watchwords of science (measure, balance, meter, random, prize-winning, count, etc.), PolitiFact pretends to science while denying and deliberately undermining the method. This is what the author of this paper has brought to our attention. In the shadow of Science, PolitiFact has dug a trench to undermine critical thought, and to provide sound bites and ‘factoids’ for political use elsewhere. They have trivialized real political discussion by focusing on (what else?) trivia. They are clearly laughing up their sleeves at the rest of us, and not in a polite way. Anyone quoting their product or giving them prizes for it either doesn’t get the joke, or intends to make some use of them. And they certainly don’t understand the science that is being parodied, apparently for political purpose.
The author of this article does understand the science, at least well enough to point out where PolitiFact is being unscientific, or hiding the information that would allow other people to be scientific. Pointing out the holes in PolitiFact’s facade without providing proof that they are in fact actually biased is not unscientific. It is like pointing out that an experimenter hasn’t provided the raw data and documented methodology to a scientific journal prior to publication. It says nothing about the data: it says that there is no visible data to justify the experimenter’s assertions. The truth is in the data, and an open review of the method used to generate the results, and whether those results justify the experimenter’s assertions.
What the data does say is that PolitiFact does and says whatever they want, but they have provided no means for determining whether or not we should trust their judgment, selection of data, or statements about the data. Since they refuse to provide a rational methodology (or perhaps have none) for choosing who and what they measure, their measurements mean absolutely nothing beyond the obvious: their product has a clear political bias. I mean, after 30 seconds on the site this was patently obvious. On the little screen on my phone. I dug in further simply to see if it was (a) for real and/or (b) today was a quirk (the Republicans being more oblique than usual).
In my opinion, anyone who can’t see this bias is finding just what they want and hope to see on this website. And despite this rant, I can’t really help them.
PolitiFact Editor Bill Adair was on C-SPAN’s Washington Journal for a return visit this morning (Monday, Aug. 22). Although I didn’t get to see the whole interview, it didn’t take long to figure out that he and his contemporaries at the St. Petersburg Times don’t seem to have learned much about research methodology. He repeated his “disclaimer” about not “taking this stuff too seriously” but then in the next breath, he tried to pass it off as “solid journalism”, whatever that means these days. Then about a minute later, he gave a few examples, focusing most of the time on Republican Primary candidates for President. Later he said “we’re not in the commentary business.” Is that so? You expect us to believe that? Of course, the C-SPAN moderator that morning basically threw him softball questions that weren’t in any way probing.
William (above) basically said about everything I was thinking while reading the responses. Thanks for being more dedicated than I am, Bill. The only thing I might add is that it is amazing how many people will start responding after half-reading an article such as this. A twitter responder during the program remarked how much they “love” the pants on fire rating. So much for hoping for a thoughtful electorate.
And Eric is to be commended for putting up with and attempting to defend his well-researched points about the lack of substance behind this website’s “ratings”, to the plethora of idiots trying to shift the responsibility for THE WEBSITE’s FAILURES onto himself as the article’s author. Well done, Mr. Ostermeier.
FYI, the dates chosen for this study are not arbitrary. One would think that, if Politifact was truly guilty of selection bias, this would be consistent throughout their history. To see if this is the case, I took a look at ratings from 2007 to see if there was a difference from 2010:
“Let’s compare these results to the findings of Ostermeier for 2010:
– Out of 39 statements rated “False” or “Pants on Fire,” each main party received approximately 50% of the rulings. This is much different from 2010, where Republicans received 75.5% of these ratings.
– Politifact has devoted approximately equal time between Republicans (124, 52%) and Democrats (110, 46%). This is nearly the same as 2010.
– Republicans were graded in the “False” or “Pants on Fire” categories 16% of the time (39% in 2010) while Democrats were rated in these categories 16% of the time as well (12% in 2010). This is absolutely nowhere close to the “super-majority” found in 2010.
– 73% of Republican statements in 2007 received one of the top 3 ratings (47% in 2010) as opposed to 75% of Democratic statements (75% in 2010). When looking at “True” statements alone, Republicans beat Democrats with 5% more of their statements receiving the coveted award than Democrats. It is hard to see how one party was considered more or less truthful than the others. It depends on how much credit you give the “Half True” rating, as opposed to the “True” rating.
– Republicans received a slightly larger percentage of “Mostly False” ratings than Democrats (1.79%). This is the same as in 2010. However, this only results in a 2% difference between Republicans and Democrats for the bottom 3 ratings. This is MUCH different from the 28% difference in 2010.
As you can see, the results from 2007 can seriously undermine many possible conclusions that a person could draw from the 2010 data. The fact that the kind of results you see depends on the dates chosen shows this is not a random sample. In fact, focusing solely on 2010 to hint at the possibility of Democrat-centered selection bias within Politifact would actually be a great example of cherry-picking data, a well known fallacy.”
http://contentinreality.blogspot.com/2012/01/supposed-politifact-bias-in-2010-non.html
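One more rough check on this 2007-vs-2010 point: with the sample sizes quoted above, the year-to-year shift in the GOP’s False/Pants on Fire rate can be tested directly. A sketch – the 2007 count is reconstructed from the quoted 16 percent, so treat the result as approximate:

```python
# Two-proportion z-test (approximate: the 2007 GOP count is back-derived
# from the quoted 16% of 124 statements; 2010 uses 39% of 191 statements).
from math import sqrt

p1, n1 = 0.39, 191   # GOP False/Pants on Fire rate, 2010
p2, n2 = 0.16, 124   # GOP False/Pants on Fire rate, 2007
p = (p1 * n1 + p2 * n2) / (n1 + n2)                # pooled proportion
z = (p1 - p2) / sqrt(p * (1 - p) * (1 / n1 + 1 / n2))
print(f"z = {z:.2f}")  # |z| > 1.96 would mark a real shift between the years
```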
@ Eric
Actually, there is an alternative explanation for why Politifact’s coverage might have changed. An ideological shift or the level of rigorousness of research could have changed when the Poynter Institute and Congressional Quarterly split with Politifact, which I believe was around 2010.
I think the most succinct explanation for why Politifact is more likely to rate conservatives as “False” or “Pants on Fire” is really fairly simple, though Ostermeier doesn’t specifically state this. Adair admits that he and his staff pick stories to ‘fact-check’ based on their own personal interest. If you allow that Adair and his staff are fairly representative of journalists everywhere (that is, that they mostly identify as either moderate or liberal and generally vote for more liberal candidates), then of course they’ll pick more statements by conservatives to fact-check, because statements that are ‘interesting’ are the ones that stick out, and the ones that stick out are the ones that conflict with our own preconceived notions about how the world works. We also hold the statements of those who disagree with us to a higher standard than we hold our own beliefs. If you hold a liberal worldview, you are already starting from a point of view that anything a conservative says is likely to be wrong, and because it’s wrong, he or she must be lying. Whereas if someone with whom you generally agree says something you know is wrong, it’s because he or she is simply mistaken.
All polls and studies can be manipulated. This is like that travesty, PIPA. If I get to choose what the question is AND what the correct answer is, I can get whatever result I want. I don’t think there are any truly fair political studies out there. The fact Politifact thinks overlaying a timeline of who holds the presidency over various economic factors is a viable test of how well economic strategies work worries me more than any bias. I give them some credit for mentioning lag time. Something most journalists seem to have no inkling of.
The problem is CONGRESS writes law, spends money, oversees the economy, sets tax rates, etc. I think for reasons of ignorance or bias politifact has totally overlooked when Dems took over Congress as well as when most of the economy tanked and the debt exploded. Obama is STILL blaming Bush for everything when Dems took control of Congress in the election of 2006. As if Pelosi and Reid don’t even exist.
You silly DUMBocrats! I got a good laugh when I read the posts about how “reality has a well-known liberal bias.” That’s actually hilarious considering our LIBERAL president has exercised his executive privilege to keep the truth from the American public. How can you all claim to be aware of what is true and what is false when we’re all in the dark about these things? Do you have some lefty-liberal sixth sense that makes your lefty-liberal left eye twitch twice if politicians are lying? No! You don’t. You’re all just drinking the Kook-aid and clapping as this country gets financially raped from within. Congratulate yourselves.
I hope you all enjoy paying for, not only your own health care, but healthcare for everyone else that doesn’t feel like working hard enough to afford it. A much better solution would be to make doctors post prices on their websites and their front doors. Make insurance companies carry everyone, w/o excluding high-risk people. Then make patients pay 10% of the cost if they choose medical providers with prices in the middle 50%. Pay the full amount for medical providers whose prices are in the bottom 25%. And no insurance help if you get medical services from the most expensive 25% of providers. That’s called an economically driven solution. We’d see medical costs plummet and nobody would die for lack of access to medicine.
Or you can all just keep arguing with each other about which prick in which office is the bigger scum-bag. I couldn’t care less.
Politifact is run by journalists in the LSM, which most intelligent people have stopped having any faith in. It makes sense to me that Politifact would reflect the overall media bias, which journalists can then use to confirm that bias.
Fortunately many Americans are able to “check facts” and come to conclusions themselves.
If you’re 80 years old and need chemotherapy for cancer and are denied treatment because the cost/benefit analysis determines that the cost of treatment isn’t warranted based on the statistical life expectancy of the patient, then what do you call the organization that decides to withhold the treatment recommended by the medical providers? That person may live another 20 years or die next week with the treatment, but it is a fact that we don’t know. “Death Panel” may seem a harsh title, but it is what it is. Liberals have a problem accepting reality like “death panels” and the 16 trillion dollar debt that we now face.
“Obama is STILL blaming Bush for everything when Dems took control of Congress in the election of 2006”
Tell me, how many times did Bush use that veto pen?
I strongly suspect there is no “selection criteria” at politifact they could expose beyond the “criteria” used in every legacy media editorial department. They simply see what they want to see, cherry pick the stories that make a tingle shoot up their leg.
There’s an awful lot of comments here from folks who clearly have no clue what’s being said in the article–even less so in the research itself. I hope y’all are still students, and that you stick with it until you can process the logic. Here’s a summary that might help in this instance:
1) Politifact shows a clear leaning in its results toward finding more dishonesty in one group over another.
2) The organization promotes (and profits from) their results, likely swaying public opinion and possibly elections, without disclosing the details of their methodology.
3) The available evidence suggests that their “methodology” is only slightly better than Punxsutawney Phil’s annual weather forecast.
Give it another read. Maybe you’ll see it this time.
Has anyone considered that Republicans just lie a lot more than Democrats?
“One could theoretically argue that one political party has made a disproportionately higher number of false claims than the other, and that this is subsequently reflected in the distribution of ratings on the PolitiFact site.
However, there is no evidence offered by PolitiFact that this is their calculus in decision-making.”
You seem to make a priori assumptions that both parties should be assumed to be comparably honest and that anything that reflects a significant difference is a red flag for bias. Failing to show any bias in the actual ratings you suggest that the selection process is biased.
You missed an opportunity to be much more rigorous and scientific in your approach. If you constrained your samples to things like debates and State of the Union addresses, which are basically guaranteed to be fact-checked in their entirety, you’d have a dataset wherein the selection variable was removed.
Instead, you make unsubstantiated accusations that the selection process is biased and then proceed to make speculations about the alleged bias.
You should be looking at data like this – https://www.politifact.com/truth-o-meter/article/2016/sep/27/trump-clinton-first-debate-fact-checks/
If you did that, you’d get valuable data about whether one party lies much more than the other.
[…] 2011, University of Minnesota political science professor Eric Ostermeier analyzed 511 PolitiFact stories from the previous 12 months. He found Republicans were assigned “Pants on […]
[…] the left-leaning PolitiFact debunked this claim the very day the advertisement ran, saying, “Indeed, if you look […]
Well, almost 13 years later, and this whole article aged like fine milk.
They put a con artist on a pedestal and started giving him so much leeway that we even get the Big Lie nowadays.
And it’s still a republican pastime to lie their giant elephant asses off…
[…] with Eric Ostermeier’s PolitiFact study, Trends stops short of bluntly charging PolitiFact with liberal bias. Instead, Trends uses the data […]