An example was our check on a prominent MP’s claim that 80 percent of Australia’s grocery market is controlled by two big supermarket chains. Our academic looked into that and said, well, if you just include dry packaged groceries, it’s up to around 80 percent, but if you include fresh fruit and vegetables and meat and other groceries it’s more like 55 to 60 percent, which is the industry estimate. We talked that through and discussed giving the statement a “Mostly False” rating, a “Mostly True” rating, or a “Half True” rating. It depends on how literally you are going to take the statement.

Most statements need context to give them meaning, we found, and we saw that other factcheckers in Australia were getting pushback about their ratings, which takes the debate away from the substance of the piece. We’re not criticizing that approach, it just didn’t suit us. The aim of our site is to be useful to voters, to help them understand the basic information about contentious issues, and we found that bald ratings weren’t especially useful.

Our other difference is that The Conversation, which is a non-profit, operates under a Creative Commons license, which means anyone can run our pieces at no cost. So a big or small media player, a blogger or small website, even someone running a community newsletter, can publish our factchecks if they think their readers might be interested. They just have to credit us and check with the author if they want to make substantial changes.

I love the way you are blending academic and journalistic approaches. What substantive difference do you think it makes in practice to the content you produce? Have your expert authors come to different conclusions than PolitiFact Australia (which uses a journalistic approach) or traditional media outlets in any cases yet? Are there issues where you expect the academic approach to add value?

We’ve been going for three weeks. So far, I think the difference has been in the sorts of statements we check. We’re avoiding ones that are clearly more suited to a journalistic approach. That doesn’t limit us too much, but it does mean we are policy-focused so far. We have checked three statements that have also been checked by PolitiFact Australia. One reached the same conclusion as PolitiFact did, using similar arguments; another approached the statement in a different way. The latter case was a claim by an opposition politician that Australia was losing one manufacturing job every 19 minutes. PolitiFact rated that “Half True”—they said that while the figure was correct, it was wrong to blame the government for it. Our check also crunched the numbers and found the statement to be true, but put it in a historical and international context and said that the loss of manufacturing jobs was common in developed nations. So both PolitiFact and The Conversation’s Election FactCheck gave the statement context, which I think is valuable. The third one was about opposition claims that it now took more than three years to get approval for a mine in Australia, up from 12 months a few years ago. PolitiFact rated that one “Half True”; we found it false.

Brendan Nyhan is an assistant professor of government at Dartmouth College. He blogs at brendan-nyhan.com and tweets @BrendanNyhan.