Meta Murders: Instagram removes violent accounts of 20-year-old American following social media investigation
Instagram’s parent company Meta has shut down five Instagram accounts belonging to a 20-year-old American and filled with horrific content, following an investigation by The Nightly.
In a special series first published last week, this masthead reported that graphic content that would normally be banned by the Australian Classification Board was being boosted by Instagram’s algorithm to users, including young children.
Hiding behind some of those accounts was a young man on the other side of the world, who said it was his “mission” to “spread awareness” of violent footage.
Experts warn these videos can cause serious psychological trauma.
“I do not personally think there is a set age (where) something like this should be seen. I was exposed to it at an extremely young age myself,” he told The Nightly via the encrypted messaging platform Telegram. “I think hiding the truth from people is far worse.”
Five accounts linked to the young man were shut down late on Tuesday evening, with a Meta spokesperson saying they all breached the company’s community guidelines on violent content.
But the social media giant did not answer questions about why it took so long to act, despite some of his videos and accounts having previously been taken down for breaching its guidelines.
A previous account belonging to the online agitator, which was shut down for continued content violations in June, had reached more than 260,000 Australians in just 30 days.
Two other accounts with violent footage gained more than 123,000 and 103,000 followers in just under a week. One of those accounts survived the initial crackdown and was sitting at 116,000 followers under the name bruteshootings before it was shut down on Tuesday.
Since the initial crackdown in June, The Nightly has recorded at least three more accounts created by the same user, one of which grew to 26,000 followers in a matter of months.
Meta did not respond to questions about how many content-violation reports were reviewed by humans, instead saying artificial intelligence “proactively detects” accounts exhibiting suspicious patterns of behaviour.
The company said it reviews both metadata and public reports before taking action against certain account holders or posts.
Meta did not disclose how many public reports it had received against accounts linked to the user or what measures it currently had in place to stop people from creating more accounts once they were deleted.
And since the beginning of The Nightly’s investigation, dozens of other accounts filled with extreme violence have popped up on Instagram.
After five of his accounts were shut down on Tuesday evening, the 20-year-old posted to a Telegram group chat, where some of its 457 members expressed their anger at Meta’s decision.
The group chat hosts content so traumatic we are unable to publish its details. It contains several videos of satanic rituals, women being shot dead for sexual gratification and others being beaten.
When speaking to The Nightly, the man said nothing would stop him from creating more accounts if they were ever deleted. He said he had been doing so since 2018.
Curtin University Professor of Internet Studies Tama Leaver said Instagram’s poor automated detection and its algorithms’ inability to effectively differentiate between real and fake violence were keeping harmful content on the platform.
“Meta and the Instagram products tend to be the ones that sound like they’re going to be the most reliable and the most well-policed, but we also know they play hard and fast with their own rules sometimes,” he said.
According to Meta’s latest community standards enforcement report, between January and March this year four in every 10,000 content views on Instagram were of graphic violent content, double the rate recorded in the same period in 2022.
Between October and December last year, that number was between five and six out of every 10,000 views, and in early 2021 it was between one and two out of every 10,000 views.
Instagram says it “took action” on 12.1 million pieces of graphic violent content between January and March, 99.4 per cent of which it claims was found and actioned by the company itself.