Surge in AI-Generated Child Abuse Material in Past Year, AFP Warns
9News
Details
- Date Published
- 28 Jan 2025
- Priority Score
- 4
- Australian
- Yes
- Created
- 10 Mar 2025, 10:27 pm
Description
<p>Some of the illegal material is being made by students to humiliate and attack their peers, police say.</p>
Summary
The Australian Federal Police (AFP) has raised alarms about a significant rise in AI-generated child abuse material, marking an urgent need for parental awareness and dialogue about online safety. This phenomenon highlights how AI technologies, including deepfakes, are being misused by students, sometimes to harm peers, which underscores the broader issue of AI misuse for malicious purposes. The AFP is actively involved in educational initiatives like ThinkUKnow to combat this trend by promoting awareness among students, parents, and carers. This development is crucial in Australia's context, as it pertains to safeguarding children against new-age digital threats, advocating for swift policy responses to address these ethical challenges posed by AI advancements.
Body
The use of AI to create child abuse material is surging, prompting a warning to parents about the dangers of its use.

The Australian Centre to Counter Child Exploitation (ACCCE), led by the Australian Federal Police, has witnessed an increase in illegal AI-generated material in the past year.

Part of that increase has been a higher incidence of students creating images such as deepfakes for numerous reasons, including to harass or embarrass classmates.

Last year, one Australian man was jailed for possession of AI-generated child abuse material, and another was jailed for using AI to produce child abuse images.

AFP Commander Helen Schneider urged parents to have open and non-judgemental conversations with their children, and said many young people might not be aware that using AI to create material featuring their classmates could be criminal.

"Children and young people are curious by nature, however, anything that depicts the abuse of someone under the age of 18 – whether that's videos, images, drawings or stories – is child abuse material, irrespective of whether it is 'real' or not," Schneider said.

"The AFP encourages all parents and guardians to have open and honest conversations with their child on this topic, particularly as AI technology continues to become increasingly accessible and integrated into platforms and products."

She said an AFP-led education program, ThinkUKnow, offered free resources in this area for parents and carers.

Research conducted by the ACCCE in 2020 revealed only about half of parents talked to their children about online safety.

"These conversations can include how they interact with technology, what to do if they are exposed to child abuse material, bolstering privacy settings on online accounts, and declining unknown friend or follower requests," Schneider said.

In the 2023-24 financial year, ThinkUKnow delivered 2218 presentations about online child sexual exploitation to 202,905 students across Australia.

The program, run by the AFP, state and territory police and industry volunteers, also delivered 317 presentations to more than 21,500 parents, carers and teachers during the same period.

People seeking support, resources or ways to report child abuse material should visit the ACCCE website.

Support is available from the National Sexual Assault, Domestic and Family Violence Counselling Service at 1800RESPECT (1800 737 732).