EDITORIAL: Meta and child sexual exploitation material — the worst corporate crime of our generation

The Nightly
Facebook and Instagram represent a huge percentage of the 32 million reports of child sexual exploitation and abuse - but CEO Mark Zuckerberg has shown little desire to do anything about it. Credit: Tom Williams/CQ-Roll Call, Inc via Getty Images

When Westpac was revealed to have turned a blind eye to transactions made by customers involved with child exploitation, the reaction was swift and the punishment substantial.

An investigation had turned up overseas transfers by 262 paedophiles or people suspected of child exploitation.

In 2020, the bank agreed to pay a record penalty of $1.3 billion to settle legal actions arising out of those and other allegations relating to money laundering levelled against it by financial intelligence agency Austrac.


Heads rolled at the bank’s highest levels. Casualties included Westpac’s chairman Lindsay Maxsted and chief executive Brian Hartzer.

Australians were, rightly, appalled and outraged at the idea of a corporation profiting off the most despicable of crimes — the sexual abuse of children.

And as shocking as the allegations against Westpac were, and continue to be, they pale in comparison with the claims made against the big tech giants complicit in turning the internet into a predator’s paradise.

In 2022, the National Center for Missing & Exploited Children — a centralised global reporting system for the online exploitation of children — received 32 million reports of child sexual exploitation and abuse from tech companies, including 49.4 million images and 37.7 million videos.

And according to Australia’s eSafety Commissioner Julie Inman Grant, that’s “just the tip of a very large iceberg”.

She said authorities are struggling to keep pace with an explosion in abuse material. Offenders who are banned from one platform for disseminating illegal content are often back at it immediately under multiple different accounts.

More than 90 per cent of those 32 million reports came from Facebook, Instagram, Google, WhatsApp and Omegle. Mark Zuckerberg’s Facebook alone accounted for more than 21 million reports of child sexual exploitation material, and his Instagram for more than 5 million.

Yet despite clear evidence that the billionaire owners of these companies understand that paedophiles are using their platforms to share depraved material with others, they are showing little appetite to do anything about the crimes occurring under their noses.

Last year Google and X (formerly Twitter) chose not to comply with notices from Australia’s eSafety Commissioner relating to crimes against children.

In X’s case, the non-compliance was so egregious that it simply did not respond to some questions at all, and to others it supplied information that was incomplete or incorrect.

A fine of $610,500 was issued to X. Predictably, that fine hasn’t been paid.

The fact is that these tech companies simply don’t care.

They’re happy to continue profiting off the misery of children and thumbing their noses at the law enforcement agencies trying to protect those children from harm.

Until those authorities have the power to enforce penalties that actually hurt, these multinational villains will continue perpetrating the worst corporate crime of this generation.

Responsibility for the editorial comment is taken by The Nightly Editor-in-Chief Anthony De Ceglie
