EXCLUSIVE

The Nightly survey by Painted Dog finds Australians think tech firms failing on child exploitation material

Josh Zimmerman
The Nightly
An overwhelming majority of Australians believe tech giants like Meta and TikTok are failing to stem the wave of child sexual exploitation material. Credit: NurPhoto via Getty Images

An overwhelming majority of Australians believe tech giants like Meta and TikTok are failing to stem the wave of child sexual exploitation material on their platforms.

A national survey commissioned by The Nightly and conducted by Painted Dog Research found 71 per cent of respondents believed social media titans were not doing enough to stamp out the rapidly growing scourge.

It comes amid growing scrutiny of companies like Meta, X, TikTok and Snap over the steps they are taking – or failing to take – to stop the spread of harmful content.


The Australian Centre to Counter Child Exploitation, led by the Australian Federal Police, received more than 40,000 reports of child sexual exploitation last year.

Not only are children the victims of the abhorrent abuse captured in videos and images, they can also be exposed to that content when platforms fail to proactively remove it.

The Painted Dog Research survey of 1014 Australians aged over 18 – carried out on March 27 and 28 – found broad dissatisfaction with the sluggish response of tech firms.

Across the board, 71 per cent of respondents believed social media platforms were “not at all” committed to the task of protecting children from harmful content.

That rose to 75 per cent among women but was slightly lower at 66 per cent for men.

The results were comparable across States and Territories but there was some variation among household types: 80 per cent of empty nesters believed tech giants were sitting on their hands compared to 65 per cent of young families.

Sentiment was also similar among income groups with one notable exception. Respondents earning above $250,000 per year were substantially less likely (62 per cent) to think companies like Meta were failing in their obligations to protect kids than those on lower incomes.

Mark Zuckerberg, CEO of Meta, is sworn in at the Senate Judiciary Committee hearing “Big Tech and the Online Child Sexual Exploitation Crisis” on January 31, 2024. Credit: Tom Williams/CQ-Roll Call, Inc via Getty Images

The Nightly this week revealed Human Rights Commissioner Lorraine Finlay favoured hauling the executives of big tech companies before a public inquiry to explain how they proposed to address online child sexual exploitation material.

“It’s clear at the moment that they’re not doing enough,” Ms Finlay said.

“They’ve chosen not to provide the resources that are necessary to combat this and it’s important to make sure they’re aware of the human impacts that are occurring as a result.”

In 2022, the National Center for Missing & Exploited Children – a centralised global reporting system – received 32 million reports about online child sexual exploitation.

Facebook and Instagram – both of which are operated by Mark Zuckerberg’s Meta – accounted for 26 million of those reports between them.

There is bipartisan political support for stronger enforcement action on child exploitation content.

Australia’s Online Safety Act empowers Federal eSafety Commissioner Julie Inman Grant to hold digital platforms to account for this content, forcing them to remove it from their services or face a $156,500 fine.

The eSafety Commissioner also has the power to compel transparency from big tech companies, requiring them to report on how they are meeting the Government’s Basic Online Safety Expectations, and has overseen the introduction of six legally binding industry codes to address seriously harmful content, including child sexual exploitation and abuse material.

Failure to meet a direction from eSafety to comply with an industry code, or failure to comply with an industry standard, carries civil penalties of up to $782,500.

Despite that, Ms Inman Grant said the issue was “getting worse with every passing day” and “spreading at a pace, scale and volume we have not seen before”.
