Google, Yahoo and Bing ordered to stop serving up child sexual exploitation material in search engine results
Access to child sexual exploitation material through Google and other search engines will be “significantly disrupted” under new laws, commencing today, that force internet giants to close “key gateways” to the illegal content.
Big tech companies — including Google, Bing and Yahoo — have been ordered to stop serving up child sexual abuse material in their search results or face fines of up to $782,000 a day.
Under a new online safety code, search engines are now required to prevent child sexual exploitation material, including deep-faked versions of harmful content, from appearing in their search results.
Australia’s eSafety Commissioner Julie Inman Grant said today’s launch of the Internet Search Engine Services Code was another significant step in the protection of children online.
Ms Inman Grant said it demands the tech industry play its part in restricting the growing global trade in the “worst-of-the-worst” online content.
“The commencement of the search code is really significant as it helps ensure one of the key gateways to accessing this material, through online search engines, is closed,” she said.
“It will target illegal content and I will be able to seek significant enforceable penalties if search engines fail to comply with a direction to comply with the code.”
The new regulations will force a significant shift at Google, which holds about 93 per cent of the market, followed by Bing and then by alternative search engines Yahoo and DuckDuckGo.
Search engine providers whose end-users are in Australia will now be required to take important steps to prevent child sexual abuse material from being returned in search results and ensure AI incorporated into the search engines is not misused.
The code is Australia’s first set of AI regulations and applies to “artificial intelligence features integrated into the search functionality that may be used to generate” illegal content.
Generative AI can and has been misused to create “synthetic” child sexual abuse material, which is not only extremely harmful, but can also divert precious resources away from tackling real child abuse material and rescuing children at risk.
This code, developed by industry, is the sixth registered online safety code to be introduced in Australia in recent months.
The first five industry codes, which commenced in December, cover social media, app stores, internet service providers, hosting providers, device manufacturers and suppliers.
All six codes are enforceable and require participants to take appropriate measures to address the risk of “class 1 material” on their services in Australia.
Class 1 material includes child sexual exploitation material, material that advocates the doing of a terrorist act and material that promotes, instructs or incites in matters of crime and violence.
If a search engine fails to comply with this newest code, eSafety can apply to the court for civil penalties of up to $782,000 a day.
Ms Inman Grant said creating the new search engine code, under the Online Safety Act, was not “entirely smooth sailing” and followed months of negotiations with big tech companies.
“The sudden and rapid rise of generative AI and subsequent announcements by Google and Bing that they would incorporate AI functionality into their search engine services meant the original code would have been out of date before it commenced,” she said.
“But I want to give thanks to the industry associations and the key search engine providers for their hard work and willingness to go back and redraft the code to make it fit for purpose.
“What we’ve ended up with is a robust code that delivers broad protections for Australians.”
End-users do not have obligations under the new code. However, it is a crime to possess child abuse material obtained through a carriage service.
eSafety is currently preparing draft industry standards for two further industry sectors.
The first is Relevant Electronic Services which includes a range of private messaging and other communication services.
The second is Designated Internet Services, which includes websites and apps not falling within other categories, as well as file and photo storage services.
These standards, to be tabled for consideration by Parliament later this year, will also address the risk associated with generative AI.