The AI Toolkit for K-12 Education

The Misuse of AI to Create Child Sexual Abuse Material

Child Sexual Abuse Material (CSAM) is the modern term for child pornography. Unfortunately, AI tools are being used to create CSAM. As AI technology improves and becomes more readily available, the number of cases of AI-generated CSAM is rising quickly.

Reports of AI-generated CSAM (January – June):
2024: 6,835
2025: 440,419

AI Deepfakes and Sextortion

Sextortion is a serious crime in which someone threatens to distribute private, sensitive material unless the victim provides money or other goods or services. Young people are frequent targets of these scams because they are easier to manipulate. AI can produce fabricated images and videos that depict minors unclothed and participating in sexual activities. This can inflict serious psychological damage on young victims, leading to bullying, anxiety, and self-harm. The trauma experienced by children and their families remains genuine and severe, regardless of whether the exploitative content was artificially generated rather than photographed. Learn more by reading The Growing Concern of Generative AI and Child Exploitation from the National Center for Missing & Exploited Children.
