Taylor Swift considering legal action over explicit deepfake images
Taylor Swift is considering legal action against a website that featured explicit – but fake – photographs of her.
The US singer is said to be “fuming” that the highly realistic deepfake pictures, created by artificial intelligence (AI), were then circulated across social media platforms such as X and Telegram and viewed 47 million times.
In the dozens of images originally uploaded to Celeb Jihad, Ms Swift is depicted in a series of sexual acts while dressed in merchandise from the American football team Kansas City Chiefs, which her boyfriend Travis Kelce plays for.
Sources close to singing sensation Ms Swift, 34, say she is “deeply upset” at the scandal but she is also “determined” to put a stop to it happening to others.
One said: “This is exploitative, and very, very abusive. To do this to a woman, or in fact anyone, without their consent is cruel.”
It is illegal to create deepfakes in the UK under the new Online Safety Act, but there are no US federal laws against it.
However, Ms Swift can sue in individual states.
Her case has prompted calls for nationwide legislation, with Congressman Joe Morelle calling the spread of the pictures “appalling”.
In a statement, X said it was “actively removing” the images and taking “appropriate actions” against accounts spreading them.
It added: “We’re closely monitoring the situation to ensure that any further violations are immediately addressed, and the content is removed.”
The photographs are also understood to have been removed from the Celeb Jihad website.
Last year, a study found a 550 per cent rise in the creation of doctored images since 2019, fuelled by AI.