Australia has shut down three “nudify” websites that used artificial intelligence to produce sexually exploitative images of children.
According to international media reports, Australia’s eSafety Commissioner said platforms that use AI to turn images of people into nude photos are proving devastating for Australian schools.
eSafety Commissioner Julie Inman Grant said formal warnings were issued before the three websites were shut down, cautioning that violations could result in fines of up to 49.5 million Australian dollars.
She said that, despite the warnings, the websites failed to introduce proper safeguards against child exploitation; instead, some features were marketed in ways that encouraged the misuse of images of underage girls.
The websites were attracting around 100,000 Australian users per month and were linked to several high-profile cases involving fake sexual images of school students.
Australia has been taking significant steps to protect children online. Recently, a ban was imposed on social media use for children under 16, and action was taken against deepfake applications.
International surveys indicate that non-consensual AI deepfake images are rapidly increasing among teenagers.
According to the US-based organisation Thorn, 10 percent of young people aged 13 to 20 know someone who has had a fake nude image created of them, while 6 percent have been victims themselves.