Exclusive

Social media bosses should face public inquiry in Australia over rampant child sexual exploitation material

Kristin Shorten
The Nightly
Human Rights Commissioner Lorraine Finlay, left, says tech bosses like Meta CEO Mark Zuckerberg need to be held to account over the fact that they’re not doing enough to combat the proliferation of child sexual exploitation material online. Credit: The Nightly

Executives of big tech companies should be hauled before a public inquiry in Australia and forced to confront the proliferation of horrific online child sexual exploitation material on their platforms because “they’re not doing enough”.

The CEOs of social media giants Meta, X, TikTok, Snap and Discord were last month grilled before the US Congress over the “plague of online child sexual exploitation” on their sites.

Human Rights Commissioner Lorraine Finlay told The Nightly she wants to see tech titans face similar scrutiny here in Australia.

“Absolutely because it’s clear at the moment that they’re not doing enough,” she said.

“They’ve chosen not to provide the resources that are necessary to combat this and it’s important to make sure they’re aware of the human impacts that are occurring as a result.”

The Australian Centre to Counter Child Exploitation, led by the Australian Federal Police, now receives more than 40,000 reports of child sexual exploitation each year.

And in 2022, the US-based National Center for Missing & Exploited Children — which runs a centralised global reporting system for the online exploitation of children — received 32 million reports about online child sexual exploitation and abuse.

More than 21 million of those reports related to child sexual exploitation material on Facebook, followed by more than 5 million relating to Instagram.

Ms Finlay, a former lawyer and academic, said that while the statistics were “so big”, the true scale of the problem was “unfathomable”.

“When you see those statistics, presented as numbers and files and data, it’s so easy to forget that there are real victims underpinning that and they are innocent children,” she said.

“Behind a number like 32 million reports are individual children who are having to deal with this, and that’s just heartbreaking.

“Those numbers really sanitise it … the statistics don’t properly capture the damage that’s been caused to the children who lie behind those numbers.”

Online child sexual exploitation material is proliferating at an alarming rate and victims are retraumatised every time images or videos of their abuse are shared, sold or viewed online.

Tech bosses like Mark Zuckerberg are not doing enough to protect children online. Credit: Tom Williams/CQ-Roll Call, Inc via Getty Images

“We have to ensure we hold to account, not only the people producing the material, but the tech companies that are allowing it to occur,” Ms Finlay said.

“The damage this causes is so substantial and so profound, that it affects young children for their entire life.

“And the truth of it is, there is no penalty you can apply that removes or reverses the damage that’s been caused.

“That’s not to say we shouldn’t ensure that there are penalties but we also have to stop this from occurring in the first place. You don’t want children to suffer these harms at all.”

Ms Finlay said the Government was responsible for ensuring Australia has a robust legal framework and for adequately resourcing the eSafety Commissioner’s office and the law enforcement agencies that work tirelessly in this area.

Under Australia’s Online Safety Act, the eSafety Commissioner has the power to hold digital platforms to account for this content and to have it removed from their services.

Since December, six binding codes have been registered to address seriously harmful content, including child sexual exploitation and abuse material.

“In Australia, we’re using some important levers to compel companies to be transparent through the Online Safety Act and the Government’s Basic Online Safety Expectations,” Commissioner Julie Inman Grant recently said.

“We’ve issued transparency notices to 13 companies covering 27 different services.

“Industry has made real improvements due to this scrutiny. However, our recent transparency reports show some of the biggest tech companies still aren’t doing enough to tackle the proliferation of horrific and harmful material.”

Ms Finlay said the responses to the notices showed there was “work being done”.

“Just not as much as there should be,” she said.

“There are tools that the tech companies have, tools that they’ve developed, so they are thinking about this, but they need to do a lot more in terms of actually implementing it.

“They need to put kids as the top priority in terms of ensuring that safety by design is not just an afterthought, but is actually front and centre.”

Ms Finlay said digital technologies are constantly evolving and predators are usually ahead of the game.

“There’s no ‘set and forget’ answer to this because we know that the people who produce child exploitation material are constantly adapting and changing and innovating and so we need to be doing exactly the same,” she said.

“You can’t just say that you’ve solved this problem and never have to think about it again.

“That legal framework has to be really robust, but then also have the flexibility for law enforcement to be able to use the latest technologies, to be able to be innovative and to be able to respond as things change and transform.”

Communications Minister Michelle Rowland said the Government has brought forward a statutory review of online safety legislation “to ensure it is fit-for-purpose” to meet new and emerging harms.

Ms Finlay welcomed the review, which will report back to the Government later this year.

Meanwhile, Meta CEO Mark Zuckerberg was among the five executives questioned at last month’s US Senate Judiciary Committee hearing, titled Big Tech and the Online Child Sexual Exploitation Crisis.

In the week before the hearing, all of the big platforms, including Elon Musk’s X, made major child safety announcements.

X, formerly known as Twitter, pledged to build a new trust and safety centre and to hire more than 100 moderators to deal with child sexual exploitation and abuse, but none of X’s staff are based in Australia.

“I think what it shows is that companies are making decisions about where they allocate resources, not based on necessity, but based on the decisions they’re making,” Ms Finlay said.

“That’s the decision they’ve made, rather than something that’s been imposed on them, because they’ve got the resources to do more and they’re choosing not to.”
