EXCLUSIVE

Meta murders: Kids being fed unfiltered footage of people being murdered, mutilated and tortured

Caleb Runciman
The Nightly
Shocking images of people being murdered are proliferating on social media. Credit: The Nightly

Horrific unfiltered footage of people being murdered, mutilated and tortured is slipping into the feeds of anyone with a smartphone, including children who are potentially being normalised to violent content they should never see.

Graphic videos so shocking they would be R-rated or banned under the Australian Film Classification system now flood the internet unprompted.

And Australia’s online safety watchdog, the office of the eSafety Commissioner, admits it “doesn’t have the perfect tools” to stop the gore that suddenly appears on our screens.


One of the country’s top psychologists says young people could even develop “post-traumatic stress disorder after witnessing” pictures that are so “distressing”.

The disturbing revelations come as Australia’s national terrorism threat level was last week raised from possible to probable, partly on the back of the internet and social media being the primary platform for radicalisation.

In a special investigation by The West Australian, it took just minutes for graphic footage of shootings, violence and cold-blooded killings to appear on our Meta-owned Instagram feed.

Moments after engaging with an account, the algorithm keeps pumping screens full of horrific images — giving users little reprieve from content historically found on the darkest parts of the internet.

Despite sensitive content warnings, the sickening videos and images are still available to view by anyone of any age. And once you watch a video or engage with an account, the platform’s algorithm bombards you with similar distressing content.

Shocking examples include:

  • one reel posted on a public Instagram account named @bruteshootings in June shows a man being shot in the head before blood pours from his face;
  • another video on an account run by the same user shows a dead body — believed to be that of a male — stripped naked before the person is tied up and hung upside down by their feet;
  • a man in a blood-soaked coat can then be seen using a saw to cut the person in half. A part of their body is then wrapped in a package.

Other videos — including a man being electrocuted before falling off a roof — will fill your feeds without warning on Instagram’s reels feature, which was created following the success of TikTok’s For You Page.

Instagram users expressed their disgust by commenting on another video which showed a man tied to a wooden stake before he was set alight. He was wearing a fuel-soaked ski mask as he was pushed around with his hands bound to the stake — all while burning alive.

In another video, the screams of a woman echo from a car as two males attack the vehicle. One man armed with an axe can be seen hacking into a passenger who was trying to kick them away.

A man is shot dead on the pavement in another video — before the shooter returns and discharges more bullets into his limp body.

Photos from autopsies, dead celebrities, people being eaten alive by dogs and gangland executions are some of the videos available to people on a platform used by schoolchildren.

Countless videos seen by this masthead are far too graphic to write about.

Professor of Internet Studies Tama Leaver says algorithms aimed at detecting harmful content struggle to differentiate real from fake violence, making its complete removal impossible. While some footage may be removed automatically, Instagram still relies mostly on users reporting videos for them to be taken down.

One account-holder dedicated to posting “gore” — who is based in the US and aged just 20 — revealed he had reached more than 260,000 Australians with his graphic content in the past 30 days.

He believes about “10,000” similar accounts exist around the world. He said the number will continue to grow unless more people report his videos.

His response to any of his content being removed, or one of his accounts being shut down, is to just post more videos — or simply make more accounts.

Australian Psychological Association President Dr Catriona Davis-McCabe said violent videos viewed by young people could normalise dangerous behaviours in the community.

“(Social media) has been concerning around excessive use, social comparison and cyberbullying . . . but I think more worrying is that we need to consider that social media can really amplify and normalise behaviour that might be illegal for young people,” she said.

“This is a massive risk for young people, because they are now able to view things online that their brains are not developed enough to be witnessing, and it can be incredibly traumatic for them.”

Dr Davis-McCabe said sudden exposure to violent or distressing content online can lead young people to develop mental health disorders, even though the exposure does not happen in person.

“When someone sees something graphic, a young person’s brain has no context for it, they don’t understand it, and it can be incredibly traumatic for them whereby they end up with really distressing symptoms and quite difficult mental health,” she said.

“We can even think about them developing a mental health disorder like post-traumatic stress disorder, after witnessing something graphic and distressing.”

When asked about the prevalence of murder, suicide and shooting videos on a platform such as Instagram, Dr Davis-McCabe said children are known to impulsively return to the content as a means of trying to understand it — which can lead to more “feelings of anxiety and depression”.

“It can normalise (the feelings) of witnessing distressing events,” she said. “What’s happening is that they are watching this over and over, and that can normalise behaviours for people that are not normal.”

Australia’s eSafety Commissioner Julie Inman Grant told The West Australian that “no country” has perfect tools to hold all of the big social media companies to account — including Australia.

“I think we could double the size of eSafety and still be challenged to keep up with all the work, it means we have to ruthlessly prioritise,” she said.

“I think it’s no secret that having greater enforcement powers — in terms of fines, just so that they’re aligned with other domestic and international regulators is important.”

Dr Davis-McCabe says schoolyard dynamics are also making the situation worse, with children failing to grasp the violent nature of a video before showing it to their friends.

“There can be a lot of peer pressure, and we particularly see that in schools today, where people are witnessing and watching things as a group,” she said.

“There’s nobody explaining to them that this isn’t what happens in real life, or this isn’t normal and what should not happen every day . . . there’s a real lack of understanding for young people.

“As parents and teachers, we’ve got a responsibility to try and respectfully talk to the children about the harms of social media, and what is normal behaviour and what is not. Because the reality now is that young people have access to anything online at any time, and that’s not something we used to have.”

Dr Davis-McCabe said while a lack of support for more psychologists in school wasn’t helping, the Government and tech companies needed to “come to the table” and “do more” to ensure children are “shielded” from graphic content.

“If a young person has witnessed something distressing, they need to be able to access support and talk about it,” she said.

“If someone feels like they need support with this, then reach out to your GP or your psychologist if you can . . . just reach out and ask for that support, because you can talk through this, you can work on this.”

The effect on our youth of violent and extremist online content was singled out earlier this month when Australian Security Intelligence Organisation director-general Mike Burgess raised the level of the country’s terror threat.
