Checking how fact-checkers check?

That title made me think of the woodchuck…anyway, here is the abstract:

Fact-checking has gained prominence as a reformist movement to revitalize truth-seeking ideals in journalism. While fact-checkers are often assumed to code facts accurately, no studies have formally assessed fact-checkers’ performance. I evaluate the performance of two major online fact-checkers, PolitiFact at the Tampa Bay Times and Fact Checker at the Washington Post, comparing their interrater reliability using a method that is regularly utilized across the social sciences. I show that fact-checkers rarely fact-check the same statement, and when they do, there is little agreement in their ratings. Approximately 1 in 10 statements is fact-checked by both fact-checking outlets, and among claims that both outlets check, their factual ratings have a Cohen’s κ of 0.52, an agreement rate much lower than what is acceptable for social scientific coding. The results suggest that difficulties in fact-checking elites’ statements may limit the ability of journalistic fact-checking to hold politicians accountable.
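
For readers unfamiliar with the statistic, Cohen’s κ measures agreement between two raters after correcting for the agreement you would expect by chance. The sketch below is purely illustrative (the ratings are made up, not from the paper), but it shows how the standard κ formula works for two fact-checkers coding the same statements:

```python
from collections import Counter

# Hypothetical ratings: how two outlets might score the same set of
# statements on a shared scale (labels are illustrative only).
politifact   = ["true", "false", "half-true", "false", "true", "half-true"]
fact_checker = ["true", "half-true", "half-true", "false", "false", "half-true"]

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater_a)
    # Observed agreement: share of items both raters coded identically.
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    # Expected agreement under independence, from each rater's label frequencies.
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    labels = set(counts_a) | set(counts_b)
    p_e = sum((counts_a[l] / n) * (counts_b[l] / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

print(round(cohens_kappa(politifact, fact_checker), 2))  # 0.5 on this toy data
```

A κ of 0.52 means the two outlets agree only modestly more often than chance would predict, which is why the paper calls it low by social-science coding standards.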

That paper (pdf) is by Chloe Lim, political science at Stanford.  For the pointer I thank Andrew Hall; there are some interesting political science papers on his home page.  Here is his very interesting book manuscript on how the devaluing of political offices drives polarization, worthy of a top publisher…
