New local study reveals alarming intersection between terrorism, violent extremism and child sexual exploitation
New research reveals that the majority of Australians who view child sexual abuse material online — often on mainstream social media platforms — also view fringe and radical content.
A new report from the Australian Institute of Criminology shows there is a “considerable overlap” between those who view CSAM and who view extreme content online.
“In recent years, online dissemination of radical material by extremist groups has increased, resulting in exposure to a wider audience,” its authors say.
“Simultaneously, the growth of the internet and related technologies has increased the availability of child sexual abuse material.
“Access to both types of content has proliferated in recent years; however, they have typically been viewed as separate problems involving different people.”
But the report points out that there is growing acknowledgement of a nexus between CSAM and radical online content.
The report cites a public service announcement from the US Federal Bureau of Investigation, which The Nightly had previously reported on, that warned of violent online groups which produce CSAM and share it with other group members in what Australian Federal Police call ‘sadistic sextortion’.
This new study — which drew on a survey of 13,302 online Australians — examined the characteristics and behaviours of respondents who viewed CSAM or fringe or radical content online, or both.
The research showed that, within a 12-month period, 65 percent of respondents who viewed CSAM had also viewed fringe or radical content, and seven percent of those who accessed fringe or radical content also viewed CSAM.
Respondents who viewed only CSAM or only fringe or radical content were similar to one another.
And those who viewed both types of content were more likely to be younger and male, to have had contact with the criminal justice system and to have been diagnosed with certain mental illnesses.
The report’s authors — Timothy Cubitt, Anthony Morgan and Rick Brown — said that although a large proportion of both CSAM and radical content exists on the surface web and social media, to date there had been no research into individuals who engage with both types of content.
“This research was prompted by observations from law enforcement and terrorism professionals that there may be individuals prone to accessing both types of content online, as well as separate consultation with law enforcement and academia undertaken by the authors,” they said.
“Using a large‐scale survey of Australian internet users, this study examines the characteristics and behaviours of respondents who viewed CSAM and fringe or radical content online, with a focus on respondents who viewed both types of content.”
This research uses data from a large sample of online Australians surveyed about their political and social beliefs.
“Whether respondents had intentionally or unintentionally viewed one or both types of content was unclear — it could be that where one type of content was sought, the other was also present,” the report said.
“However, it could also be that a large proportion of the sample were seeking both types of content, particularly those who were looking for CSAM online.”
Alarmingly, child sexual abuse material is widely available online.
“While it is a significant problem on the darknet, CSAM is also found in indexed content on surface web search engines; searches for adult pornography; peer‐to‐peer networks; and image boards, file‐storage sites and widely used social media sites,” the report said.
“While internet relay chat and email were once common methods for privately sharing CSAM, encrypted messaging apps have become popular for sharing content.
“Among the pathways into CSAM offending, the first exposure is an important point in the onset of offending, with impulsivity and risk‐taking behaviour an established contributory factor.”
In a separate report published by the AIC this month, the same authors said fringe or radical content was often accessed through messages, discussions and posts online.
“Mainstream social media and messaging platforms were the platforms most frequently used to share fringe or radical content,” it said.
The researchers found that those who viewed CSAM were more likely to frequently use TikTok, Twitter and Reddit, while the fringe or radical content group were more likely to frequently use all platforms other than WhatsApp and WeChat.
“Frequent use of Discord was more than three times as common among the overlap group compared with respondents who did not view either type of content,” the report said.
“The group who viewed neither type of content were the least likely to frequently use all platforms other than Signal.”