The AI Toolkit for K-12 Education

Watch: AI in Education – Safety Considerations

In April 2024, MSBA’s Director of K-12 Safety, Bob Klausmeyer, presented a one-hour webinar on the safety threats related to AI use in today’s world.

Using AI for Child Sexual Abuse Material

Child Sexual Abuse Material (CSAM) is the modern term for child pornography. Unfortunately, AI tools are being used to create CSAM. In 2023 alone, the National Center for Missing & Exploited Children (NCMEC) tipline received 4,700 reports of CSAM generated with AI tools. Sometimes these materials are used in an attempt to extort a child or their family for financial gain. To learn more, read Generative AI CSAM is CSAM on NCMEC’s website.