EDITORIAL: We need agile laws to stop deepfake porn. The new tool of abusive creeps

Editorial
The Nightly
Pop megastar Taylor Swift considered legal action after deepfake AI-generated porn of her appeared on social media platforms. Credit: Supplied

When sexually explicit images of American pop superstar Taylor Swift appeared on social media platforms in January, millions rushed to their devices to see them.

It didn’t matter that the images were completely fake, generated using artificial intelligence programs. The object was to humiliate and demean her.

The incident prompted Microsoft, the company whose products were believed to be used in the creation of the images, to add extra protections to try to prevent further abuses. Even the White House weighed in, calling the controversy “alarming”.

Still, by the time the images were eventually expunged from the internet, one of the posts bearing Swift’s deep-faked likeness had been viewed 47 million times.

Swift was reported at the time to be considering legal action, but as yet none has been taken.

If even the most powerful woman in the world, with the might of the White House behind her, is vulnerable to such abuse, imagine the struggles faced by those without Swift’s immense influence and resources.

Most victims of deepfake pornography aren’t Taylor Swift.

They are ordinary women (because they are overwhelmingly women) who become the target of campaigns of hate from ex-partners, associates or even strangers.

In one recent example, “incredibly graphic” and “sickening” doctored images of 50 students at a Victorian high school were allegedly circulated online.

As AI technologies gather pace, so does the risk.

Speaking to a Senate inquiry looking into proposed laws to punish such abuses, eSafety Commissioner Julie Inman Grant said there had been a 550 per cent increase in reports since 2019. Unsurprisingly, 99 per cent of material online targets women and girls.

Under the Government’s proposed laws, adults convicted of sharing sexually explicit images without consent — regardless of whether those images were real or faked — would face up to six years in prison.

Aggravated offences carrying up to seven years in prison would be created for repeat offenders, and for those who used technology to create or alter the shared material without consent.

Advocates say the laws don’t go far enough: they want them to also cover threats to circulate or create such images, which they say abusers frequently use to intimidate and control victims.

The internet has become the newest arena for gender-based abuse. Technologies are developing rapidly, and we need laws agile enough to keep pace with the creeps who will seek to abuse them to harass and humiliate.

More responsibility too must be borne by the tech companies whose services are used to both create and circulate non-consensual images.

The wellbeing of a generation of women may depend on it.

Lifeline 13 11 14

beyondblue 1300 22 4636
