Children are being routinely targeted with AI-powered technology that allows users to generate sexual images of them from online photos, prompting a push for urgent legal changes.
These technologies, which let users upload a person's photos and use generative artificial intelligence to sexualise them and create child abuse material, are proliferating.
There has been a 1325 per cent surge in AI-related child sexual exploitation reports, with more than 67,000 made in 2024, according to the US National Center for Missing and Exploited Children.
Independent MP Kate Chaney has introduced legislation to criminalise having access to technology whose sole purpose is generating child sexual abuse material.
Possessing such technology would carry a prison term of up to 15 years under her bill.
Ms Chaney brought together government representatives, law enforcement and child protection advocates at a roundtable at Parliament House on Tuesday to discuss the issue.
Former Australian of the Year Grace Tame said offenders were using increasingly sophisticated technology to produce child abuse material, such as AI apps that allow perpetrators to create content en masse.
“The nature of it is worsening because of AI tools that are enabling offenders to make their methods of abuse more sophisticated, as well as make their methods of evading justice more sophisticated,” she told reporters.

Dr Sarah Napier from the Australian Institute of Criminology, whose research focuses on child exploitation, said she had seen offenders take non-sexual images of children from social media and turn them into child abuse material.
“So even if a child hasn’t been a contact victim of child sexual abuse, they are still being victimised without their parents knowing, without them knowing, and their photos are circulating around on the internet,” she said.
One in 10 adolescents had been sexually extorted, and about 40 per cent of that cohort said it happened using AI-generated material, Dr Napier said, pointing to research from the institute.
“Most of them were sexually extorted under the age of 15 years old and the most common age to be sexually extorted was around 14, 15,” she said.

International Centre for Missing and Exploited Children chief executive Colm Gannon said the technology could already be tracked using systems in place in Australia, with the help of universities and researchers.
Open-source AI models are commonly uploaded online, providing people with public access to a back-end they can build on.
Specialised add-ons that enable the generation of child abuse material can be attached to these open-source models, and this is the technology law enforcement would be able to target if it were criminalised, Mr Gannon said.
“Criminal intent is always going to be there but there also has to be criminal responsibility,” he said.
Communications Minister Anika Wells pledged to work with the industry on swift reforms that would be “as tight as possible” to stop people accessing apps that sexualise kids.

“One child in every classroom in Australia has now been a victim of deepfake imagery,” she said.
“These apps are only designed to abuse, to belittle, to humiliate and to harass, and we are determined to restrict them.”
The onus would also be on tech companies to prevent the availability of “nudifying” tools, and there would be consequences for failures, Ms Wells added, although she did not outline a timeline for reform.
“Maybe the eSafety commissioner could block these apps from being on app stores or websites, maybe we could direct the platforms to take them down or to be responsible for taking them down in the first place,” she said.
1800 RESPECT (1800 737 732)
National Sexual Abuse and Redress Support Service 1800 211 028
Lifeline 13 11 14
Kids Helpline 1800 55 1800 (for people aged 5 to 25)