Social media companies told to step up after Australia’s domestic terror threat level is raised to probable
Australia’s online watchdog says tech companies must step up as security agencies grapple with the increasingly dangerous radicalisation of young people.
Calls for greater regulation of social media platforms have been renewed after Prime Minister Anthony Albanese and ASIO boss Mike Burgess pointed to online radicalisation as a key driver of the increased terror threat level.
Mr Burgess said the internet and social media were exacerbating the challenge, particularly for young people.
He believes there is a role for parents, governments and social media companies.
eSafety Commissioner Julie Inman Grant said the regulator held “very real concerns” about how violent extremists weaponised technology like live-streaming, algorithms, and systems that recommended or promoted “hugely harmful material”.
“The tech companies that provide these services have a responsibility to ensure that these features and their services cannot be exploited to perpetrate such harm,” she told The Nightly.
The commissioner sent legal notices to Google, Meta, X/Twitter, WhatsApp, Telegram and Reddit in March to find out what they are doing to protect Australians from extremist material and activity.
But Ms Inman Grant said it was disappointing that none of the companies had already provided such information voluntarily.
“This shows why regulation and mandatory notices are needed to truly understand the true scope of challenges and opportunities to make these companies more accountable for the content and conduct they are amplifying on their platforms,” she said.
John Coyne from the Australian Strategic Policy Institute (ASPI) said while some social media companies had been “incredibly proactive” in dealing with extremist content, others were falling behind.
He said the circumstances were challenging, given that neither social media companies nor governments wanted to be the “thought police”.
Digital policy organisation Reset.Tech Australia wants new laws to create a duty of care requiring tech companies to look after the public.
Under such a duty, companies would be expected to address upfront any harms exacerbated by their platforms, such as financial losses from scams or mental health and addiction issues.
Executive director Alice Dawkins said these harms had been going on for some time but Australian regulation was not strong enough to compel the necessary accountability.
She pointed to the anti-immigrant violence in the UK this week, which was fuelled by false information spread on social media.
“(It) provides that really clear causal link between viral hoaxes online and organised offline violence; it should be a bit of a warning sign to Australia that we need better channels into social media companies,” she told The West.
“We need accountabilities from social media companies that are not just these informal mechanisms that only run on goodwill.”
Her arguments are backed by YouGov polling from July showing nearly three-quarters of people were worried about social media platforms failing to remove unsafe content or misinformation.
Almost seven in 10 people believed digital platforms exploited their market domination to skirt Australian safety laws.
Reset also advocates for much bigger fines for breaches, pointing out that the UK regulator recently hit TikTok with a $3.8 million penalty, while the largest fine handed out here was $600,000.
Mr Albanese pointed to the Government’s accelerated review of the Online Safety Act, which gives Ms Inman Grant her powers, and its planned trial of age verification for social media.
Ms Dawkins warned that big tech was likely to attempt to “dilute” any responsibilities imposed through legislation.