In November 2024, the United States Department of Education’s Office for Civil Rights (“OCR”) issued an informational resource regarding the discriminatory use of artificial intelligence. The resource emphasizes the importance of ensuring that all artificial intelligence software used in educational programs is transparent, equitable, and accountable in order to avoid discriminatory practices.
In this article, KingSpry’s Investigation and Compliance Services and Education Law Attorneys Taisha K. Tolliver-Duran and Brian T. Taylor discuss OCR’s latest guidance and offer recommendations to Pennsylvania public school districts on avoiding the discriminatory use of artificial intelligence.
Purpose
For purposes of OCR’s guidance, artificial intelligence (“AI”) is defined as a “machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments.”
While AI has the capability of positively enhancing educational programs, it may simultaneously subject students to discrimination in violation of federal civil rights laws. Such laws include Title VI of the Civil Rights Act of 1964 (“Title VI”), Title IX of the Education Amendments of 1972 (“Title IX”), Section 504 of the Rehabilitation Act of 1973 (“Section 504”), and Title II of the Americans with Disabilities Act of 1990 (“Title II”).
OCR breaks down the aforementioned civil rights laws and offers examples of potentially discriminatory practices that schools must avoid.
Title VI: Race, Color, or National Origin Discrimination
Title VI prohibits discrimination on the basis of race, color, or national origin in any program or activity receiving federal financial assistance.
1. Plagiarism and AI Detectors. Schools must use plagiarism and AI detectors cautiously, as these tools may unintentionally discriminate against English Learners and other individuals with limited English proficiency. If schools use online services to scan students’ work for plagiarism or use of generative AI, they must be cognizant of the services’ error rates. For example, if a service has a high error rate for work written by English Learners and a low error rate for work written by native English-speaking students, English Learners may be wrongly punished for academic dishonesty. This could result in an OCR investigation.
2. Facial Recognition Technology. Additionally, the use of AI facial recognition technology for safety purposes may result in discrimination against students. For example, if such technology relies on discriminatory data, and it consistently misidentifies students of color, said students may be falsely flagged and subject to a hostile environment. This could result in an OCR investigation.
Title IX: Sex Discrimination
OCR also enforces Title IX, which prohibits discrimination on the basis of sex in federally funded education programs.
1. AI-created Nudity. As AI-created “deepfakes” continue to spread across the internet, AI poses great concerns for the safety and privacy of students. OCR discusses the instance of an anonymous social media account posting AI-created nude photos of students. If such conduct is reported to a school, and the school fails to conduct a proper investigation, OCR may have reason to open an investigation.
2. Scheduling. Additionally, if schools use AI to develop students’ academic or athletic schedules, they must be cognizant of the data on which those tools rely. For example, an AI tool used to generate students’ academic schedules could rely on discriminatory historical data. If the data suggests that boys are more likely to enroll in computer science courses than girls, and girls are consequently denied the opportunity to enroll in those courses, there may be grounds for OCR to open an investigation. The same principle applies to the scheduling of athletic practices and events. If an AI tool relies on discriminatory data and consistently schedules boys’ teams on preferred dates, e.g., Friday nights, the school may not be providing equal athletic opportunity to all students.
Section 504: Disability Discrimination
Section 504 protects qualified individuals from disability discrimination and requires public schools to provide students with disabilities with a free appropriate public education (“FAPE”).
1. Drafting Section 504 Plans. OCR highlights concerns with using AI to develop students’ Section 504 Plans. If schools use such technology without procedural safeguards in place to ensure the accuracy and effectiveness of each Plan, the result may be Plans that do not meet students’ specific needs. If reported, this may lead to an OCR investigation, as the school may not be providing its students with disabilities a FAPE.
2. Moderation Software. Additionally, schools may face issues with the use of content moderation software. OCR provides the example of a school using content moderation software to alert it when any explicit language is used on school-issued devices. This may be problematic if a student with a disability has a compulsion to say certain explicit words and has a Section 504 Plan in place to address such behavior. If the student is flagged and incorrectly disciplined for their behavior, the school may violate the Section 504 Plan and OCR may have reason to open an investigation.
Multiple Bases Discrimination
Multiple bases discrimination is discrimination that occurs on the basis of more than one perceived characteristic. OCR provides several examples as to how a school’s use of AI could result in multiple bases discrimination.
For example, some schools may use AI to streamline the screening process for speech and language disorders. If the AI tool falsely flags students who are English Learners as having a speech disorder, and the school takes no further action to evaluate those students, it may be denying qualified students with disabilities a proper evaluation and a FAPE.
Bottom Line for Schools
Schools must be cautious and cognizant of the AI tools used across their districts. This includes careful evaluation and ongoing oversight to ensure that each district remains compliant with all federal civil rights laws. Reported violations must be investigated.
For more information regarding the integration of AI in schools, check out our recent article, U.S. Department of Education Releases Toolkit for AI Integration in Schools.
If your school has questions regarding its use of artificial intelligence, please consult your local counsel or one of the Education attorneys at KingSpry. School Law Bullets are a publication of KingSpry’s Education Law Practice Group. They are meant to be informational and do not constitute legal advice.