JAMIE DUNKIN: Facebook is ditching fact-checkers, but did you notice them in the first place?

Jamie Dunkin
The Nightly
Mark Zuckerberg’s latest token gesture has fallen flat, and nobody is surprised. In fact, nobody even noticed the fact-checker was there in the first place.

One of the 21st century’s greatest providers of token gestures has struck again.

Meta, the company arguably most responsible for the global decline in trust in news and science, and the slow death of social cohesion and truth, has axed its oh-so-famous-and-definitely-worthwhile fact-checking program.

The system, which rose from the ashes of the 2016 Trump election, a global pandemic, and numerous... shall we say curious electoral events across the globe, is dead.


But let’s not beat around the bush: the fact-checking did not work in the first place. As per usual with Mark Zuckerberg and Facebook, it was nothing more than halfhearted lip service to correcting itself and answering its critics. Perhaps it was even designed not to work.

How often would you see clearly, egregiously wrong material get flagged and taken down? How often would content that went beyond disinformation actually get pinged?

The answer is rarely, if ever — and I’m speaking as someone with experience working in social media.

Instead of doing its job, the system was easily manipulated by bad actors, who got genuine, real, and important breaking news stories and events flagged for points-scoring and nuisance-making.

And once content is flagged by the fact-checker, good luck appealing it — even if you cite the fact-checked and approved material back to Meta.

Like a lot of the content pumped out on the platform, Meta’s fact-checking appeals appeared to be run entirely by artificial intelligence in lieu of human staff, who were long since culled from the company.

At the time of writing, The Nightly’s Facebook account still has an outstanding appeal waiting to be seen from August.

If the system is so clogged up by false reports, how could it possibly have done anything about real misinformation? With the endless open sewer of AI-generated slop on its platforms — whether that be clickbait, misinformation, engagement bait, or scams — a fact-checking system that used AI instead of humans never stood a chance on Facebook.

The other element that led to the failure of Facebook’s attempt at fact-checking is that people who are most likely to fall for mis- and disinformation are highly unlikely to accept being corrected or challenged.

Once you’re in a space where you accept what you want to be true, you’re probably not going to believe a fact-checker, especially if it’s from a mainstream source you already reject.

Instead of taking the correction as ‘fact’, you’re far more likely to hunker down and go further down the rabbit hole that is the self-contained content silo of your algorithm.

One way I can see a form of fact-checking working (at least a tad more effectively) is if Meta follows Elon Musk’s X in introducing “community notes” to posts.

For all the faults with X, community notes have allowed egregiously wrong claims and dangerous advertisements to be easily and quickly debunked, with a mostly well-meaning army of armchair detectives sorting fact from crap.

Meta’s key failing in fact-checking is also its key failing in general: an over-reliance on AI, and a very deliberate culling of human talent to maintain these ineffective AI tools.

This situation is yet more proof that artificial intelligence really isn’t all that advanced, at least not enough to do jobs requiring judgment and subjectivity.

Community-driven sleuths will always work better, because the human element is precisely what was missing from Meta’s original attempt at managing disinformation.

But, with all this said, we’re also dealing with a company that will always put profit and eyeballs over what is right or just. It’s a sad indictment on Meta that its solution to a problem caused by axing human staff is to get unpaid users to mop up the mess.

So long Meta fact-checking, we hardly noticed you.
