National Center for Missing & Exploited Children reveals devastating toll of abuse ignored by tech giants
Big tech companies “aren’t doing enough” to tackle the unprecedented explosion in online child sexual exploitation, with the 32 million reports of horrific criminal content on major platforms each year representing “just the tip of a very large iceberg”.
Australia’s eSafety Commissioner, Julie Inman Grant, said the proliferation of online child sexual abuse material is “getting worse with every passing day”, while many of the social media platforms and services that Australian children use enable the production, storage and spread of this illegal material.
Behind these millions of images and videos are our community’s most vulnerable — innocent children — being subjected to heinous crimes.
“Child sexual abuse material is spreading at a pace, scale and volume we have not seen before,” Ms Inman Grant said this week.
Child sexual exploitation or abuse material (CSEM) — which is any content that presents a child in a sexual context — is illegal to create, share or keep.
In 2022, the National Center for Missing & Exploited Children (NCMEC), which operates a centralised global reporting system for the online exploitation of children, received 32 million reports of child sexual exploitation and abuse, including 49.4 million images and 37.7 million videos from tech companies.
The NCMEC’s CyberTipline offers the public and electronic service providers (ESPs) an easy way to quickly report suspected incidents of online child sexual exploitation.
In 2022, 99 per cent of CyberTipline reports were submitted by ESPs, and just five of those companies (Facebook, Instagram, Google, WhatsApp and Omegle) provided more than 90 per cent of the reports.
More than 21 million reports were about CSEM on Facebook, followed by more than five million reports about Instagram. There were more than two million reports of CSEM on Google and one million on WhatsApp.
By comparison, there were only 98,000 reports about this illegal material on Elon Musk’s company X, formerly known as Twitter.
While the high numbers may seem indicative of the amount of CSEM on particular social media platforms, the NCMEC says “one company’s reporting numbers may be higher because they apply robust efforts to identify and remove abusive content from their platforms”.
Ms Inman Grant said reports of online child abuse material began to spike in early 2020 as COVID-19 lockdowns took hold, and that since then she has seen reports of online CSEM double year on year.
“Every day, eSafety investigators see the same offenders create multiple new accounts, even after they have been banned by a platform,” she said recently.
“They see platforms being used to distribute thousands of links to child sexual exploitation and abuse sites.
“While these numbers are incredibly confronting, we also know they are just the tip of a very large iceberg and fail to tell the full story when it comes to the true scale and scope of the issue.”
In Australia, an investigation team at eSafety reviews content reported to its office before referring it to the NCMEC and the Australian Federal Police.
Determining where victims are located, and whether they are Australian, is difficult and painstaking work. Most of this material is believed to be hosted offshore because Australia’s regulatory environment is hostile to hosting it.
“If child sexual exploitation and abuse material is found to be hosted in or provided from Australia, eSafety will notify the relevant police force first and, once we are certain that their investigation, and the potential rescue of a child, will not be compromised, we will direct the relevant online service to remove the material,” Ms Inman Grant said.
In December, five new industry codes were enacted in Australia, forcing social media companies, app stores, internet service providers, hosting providers, device manufacturers and suppliers to take meaningful action to tackle the ‘worst-of-the-worst’ online content.
The codes also require services to provide safety information and reporting tools to respond to user complaints. If a complaint is not resolved, Australians can seek assistance from eSafety through the industry codes complaints form.
The eSafety Commissioner has the power to investigate possible non-compliance, direct a service to comply with an industry code and take enforcement action if necessary.
“In Australia, we’re using some important levers to compel companies to be transparent through the Online Safety Act and the Government’s Basic Online Safety Expectations,” Ms Inman Grant said.
“We’ve issued transparency notices to 13 companies covering 27 different services.
“Industry has made real improvements due to this scrutiny; however, our recent transparency reports show some of the biggest tech companies still aren’t doing enough to tackle the proliferation of horrific and harmful material.”
Last year, two of the biggest and most widely known companies in the world, Google and Twitter/X, failed to comply with eSafety’s notices, leaving a number of key questions about crimes against children unanswered.
Google has been given a formal warning to deter it from future non-compliance.
“Twitter/X’s non-compliance was more serious,” Ms Inman Grant said.
“For some questions, Twitter/X failed to provide any response, leaving some boxes entirely blank.
“In other instances, Twitter/X provided a response that was otherwise incomplete or inaccurate.”
Twitter/X did not answer questions about whether tools were used to detect CSEM in live streams or how long it takes for the platform to remove CSEM once a user reports it.
For that reason, Twitter/X was issued with an infringement notice of $610,500, which it did not pay.
As a result, eSafety has since launched civil proceedings in the Federal Court against the company, which is owned by billionaire Elon Musk. The matter is listed for a first case management hearing on June 7.
In the US, legislation intended to hold social media companies responsible for material posted on their platforms is currently going through Congress.
Last month, lawmakers were given a rare opportunity to question the bosses of Meta, TikTok, Snap, X and Discord about what they were doing to protect children from online sexual exploitation.
During the fiery four-hour US Senate hearing on February 1, Meta CEO Mark Zuckerberg, whose company owns Instagram and Facebook, was invited to apologise to families of children who had been harmed by social media.
“I’m sorry for everything you’ve all gone through, it’s terrible,” Zuckerberg said to the families present at the hearing.
“No one should have to go through the things that your families have suffered.”
Meanwhile, back in Australia, Ms Inman Grant said all tech companies need to “do more and do better” to combat this growing global problem.
“The cost of continued inaction is simply too high.”
To report online child sexual abuse material to the eSafety Commissioner, visit https://www.esafety.gov.au/report