Human Accountability
When it comes to using AI in schools, humans must always be “in the loop.” While AI tools can be incredibly helpful for making decisions and solving problems, they cannot have the final say. Humans must use the information AI tools provide to make the decisions themselves. If a decision aided by AI leads to negative consequences, a human will be held responsible. Saying, “Don’t blame me, the robot did it!” won’t hold up in court.
This slide from a 1979 IBM presentation (“A computer can never be held accountable, therefore a computer must never make a management decision”) is more true today than ever.
AI’s Impact on Federal Laws
FERPA – AI use must not disclose student education records.
COPPA – Additional protections apply to users under 13. Can a district consent on parents’ behalf?
ADA – AI must be accessible and offered on an equal basis to people with disabilities.
IDEA – It is unclear whether AI can be sufficiently private and effective to assist in creating IEPs.
Rehabilitation Act – What goes for IEPs likely goes for Section 504 plans.
Title VII – Providing unequal access to AI based on race, sex, or other protected characteristics may be a violation.