'Nudifying' AI deepfake tools used by students blocked

Three of the most widely used “nudify” services, linked to AI-generated sexual exploitation material of school children, have been removed from Australia.
The UK-based company behind the services withdrew access after the eSafety Commissioner issued an official warning in September over fears the services were allowing users to create artificially generated child sexual exploitation material.
This contravened Australia’s mandatory code, which requires all online industry members to take meaningful steps to tackle the worst-of-the-worst online content.
About 100,000 Australians were visiting the “nudify” services every month, and the services have featured in high-profile cases of students creating fake nude images of their classmates.
The takedowns showed Australia’s world-leading codes and standards were working to make the online world safer, eSafety Commissioner Julie Inman Grant said.
“We know ‘nudify’ services have been used to devastating effect in Australian schools,” Ms Inman Grant said.
“With this major provider blocking their use by Australians, we believe it will have a tangible impact on the number of Australian school children falling victim to AI-generated child sexual exploitation.”
She said the provider had failed to prevent its services being used to create child sexual exploitation material, after marketing features like “undressing any girl” and options for “schoolgirl” image generation and “sex mode”.
Reports to eSafety about digitally altered images, including deepfakes, from people under the age of 18 have doubled in the past 18 months.
Four out of five reports involved the targeting of women and girls.
The action follows global AI model-hosting platform Hugging Face changing its terms of service after warnings Australians were misusing some of its generative tools to create child sexual exploitation material.
Hugging Face’s new terms require users to minimise the risks associated with models that they upload, specifically to prevent generating child sexual exploitation or pro-terror material.
The company must enforce the terms when it becomes aware of breaches or face fines of up to $49.5 million.
Ms Inman Grant said her organisation was working with the government on reforms to restrict access to ‘nudify’ tools.
1800 RESPECT (1800 737 732)
National Sexual Abuse and Redress Support Service (1800 211 028)
