Australia clamps down on ‘nudify’ sites used for AI-generated child abuse | Social Media News


The watchdog says three websites used to create images of abuse had received 100,000 monthly visits from Australians.

Internet users in Australia have been blocked from accessing several websites that use artificial intelligence to create child sexual abuse material, the country’s internet regulator has announced.

eSafety Commissioner Julie Inman Grant said on Thursday that three “nudify” sites were withdrawn from Australia following an official warning.


Grant’s office said the sites were receiving approximately 100,000 visits per month from Australians and had been used in high-profile cases of AI-generated child sexual abuse imagery involving Australian schoolchildren.

Grant said such “nudify” services, which allow users to create nude images of real people using AI, have had a “devastating” impact on Australian schools.

“We took enforcement action in September because this provider failed to take safeguards to prevent its services being used to create child sexual exploitation material and was even marketing itself with features like ‘any girl’ undressing and ‘schoolgirl’ image creation options and a ‘sex mode,'” Grant said in a statement.

The development comes after Grant’s office issued a formal warning to the United Kingdom-based company behind the sites in September, threatening a civil penalty of up to 49.5 million Australian dollars ($32.2 million) if it did not take safeguards to prevent image-based abuse.

Hugging Face, a hosting platform for AI models, has also taken separate steps to comply with Australian law, including changing its terms of service to require account holders to take steps to reduce the risks of abuse associated with its platform, Grant said.

Australia has been at the forefront of global efforts to prevent harm to children online, banning social media for children under 16 and cracking down on apps used to stalk and create deepfake images.

The use of AI to create non-consensual sexual images has been a growing concern amid the rapid proliferation of platforms capable of creating photo-realistic content at the click of a mouse.

In a survey conducted last year by US-based advocacy group Thorn, 10 percent of respondents aged 13-20 reported that they knew someone who had had a deepfake nude image made of them, while 6 percent said they had been a direct victim of such abuse.
