
Survivors and advocates plead for Labor’s deepfake image-based abuse laws to go further and be stronger

Ellen Ransley
The Nightly

Survivors and advocacy groups have warned the Federal Government’s proposed deepfake legislation does not go far enough to deal with the alarming trend of gendered violence and the role tech companies are playing.

A parliamentary committee probing the proposed Bill on Tuesday heard that the rise of generative AI technology, which allows users to create or alter images, had fuelled a scourge of fake sexualised images and videos that has led some victims to take their own lives.

Global superstar Taylor Swift became the most high-profile celebrity to be subjected to such material earlier this year.


But eSafety Commissioner Julie Inman Grant told the committee there was compelling data that explicit deepfakes had increased on the internet by as much as 550 per cent year-on-year since 2019.

She noted pornographic videos “shockingly” made up 98 per cent of the material online, and 99 per cent of that imagery was of women and girls.

“Deepfake image-based abuse is not only becoming more prevalent, but it is also very gendered, and incredibly distressing to the victim survivor,” she said.

Under the Albanese Government’s proposed amendments to the Commonwealth Criminal Code, adults found to have shared sexually explicit material without consent — whether deepfake or not — face up to six years in prison.

Where the person who shared the material without consent also created the deepfake, a proposed aggravated offence carries a penalty of seven years’ imprisonment.

A panel of advocates told the committee the Bill failed to go far enough.

Rachel Burgin, chief executive officer of Rape and Sexual Assault Research and Advocacy, warned that unless the Bill was amended to create an offence explicitly for the creation of deepfake sexual material without consent, a “culture of gendered violence where women’s bodies are easily accessed as the property of all” would continue to exist.

“In Australia, we have a problem with violence against women. We cannot allow women’s safety to play second fiddle to men’s interests,” she said.

She said surveys had found perpetrators admitted the biggest deterrent to committing the abuse would have been criminal penalties.

“As it stands, an offence is only committed if images are distributed, and then it becomes an aggravated offence, but creation alone carries no offence. The creation of these images, whether or not they are distributed, is a significant harm,” she said.

She also called for the creation of a criminal offence for threatening to create or share deepfake materials.

“Commonly, threats to circulate intimate images, including deepfakes, are a tactic used by abusers to instil fear and exert control and power over another person,” she said.

The Attorney-General’s department later told the committee the Commonwealth only had “limited constitutional powers” in terms of the criminal offences it could enact.

“The non-consensual creation or sharing of intimate images ... the offences of the pure creation of the adult material or child sexual abuse material are typically dealt with by the states and territories,” assistant secretary Parker Reeve said.

He said if someone made a threat over a carriage service, they would likely be subject to existing legislation.

Noelle Martin, a survivor and “continuous target” of image-based abuse, said the implications of image-based abuse were “life destroying and shattering” and could impact a person in “every aspect”.

“It impacts your dignity, your autonomy, your agency. It compromises and threatens people’s capacity to self-determine,” she said.

She said the Government’s proposed laws were a start but were too “limited and narrow” and failed to address how truly far-reaching the issue was.

She was also critical of the role that tech companies played and called for a greater crackdown.

“We really need to tackle the root causes here. The perpetrators aren’t being held to account. Tech companies aren’t being held accountable, and are allowed to enable this and profit from this,” she said, pointing to websites that allow users to “nudify” an image and are easily accessible through search engines.

Ms Inman Grant later cited examples of how open source AI websites advertised their “nudify” services.

“Some might wonder why apps like this are allowed to exist at all, given their primary purpose is to sexualise, humiliate, demoralise, denigrate and create sexual abuse material,” she said.

She said that, as the law currently stands, eSafety does not have the power to regulate the apps that target adult women.

“There are some gaps in terms of our codes and standards,” she conceded but suggested the ongoing Online Safety Act review could better capture the emerging harm.

Queensland Sexual Assault Network executive officer Angela Lynch called for greater funding in addition to strengthened legislation.

“We would also like consideration of the funding impacts of these new crimes ... which can have a number of victims ... and ultimately, as people will be seeking assistance and support through the specialist sexual violence services who at the moment already have lengthy wait lists for people,” she said.

She said she was also concerned the Bill as it stands didn’t go far enough and should cover animation, avatars, or alter egos created to resemble people.

Professor Jonathon Clough later told the inquiry that while avatars or cartoons could be “terribly degrading and repugnant”, they should not fall within the provision.

He suggested the Bill’s scope might be too broad on that point.

Lifeline 13 11 14

beyondblue 1300 22 4636

Kids Helpline 1800 55 1800 (for people aged 5 to 25)
