Office for Civil Rights Weighs In on Discriminatory Use of Artificial Intelligence
Posted on January 7th, 2025
by Avery E. Smith
Co-author Taisha K. Tolliver-Duran
In November 2024, the United States Department of Education’s Office for Civil Rights (“OCR”) issued an informational resource on the discriminatory use of artificial intelligence (“AI”).
The resource emphasizes the importance of ensuring that all AI software used in educational programs is transparent, equitable, and accountable in order to avoid discriminatory practices.
For purposes of OCR’s guidance, AI is defined as a “machine-based system that can, for a given set of human-defined objectives, make predictions, recommendations, or decisions influencing real or virtual environments.”
While AI can enhance educational programs, it may also subject students to discrimination in violation of federal civil rights laws. Such laws include Title VI of the Civil Rights Act of 1964 (“Title VI”), Title IX of the Education Amendments of 1972 (“Title IX”), Section 504 of the Rehabilitation Act of 1973 (“Section 504”), and Title II of the Americans with Disabilities Act of 1990 (“Title II”).
OCR breaks down these civil rights laws and offers examples of potentially discriminatory practices that institutions receiving federal financial assistance (hereinafter, “institutions”) must avoid.
Title VI: Race, Color, or National Origin Discrimination
Title VI prohibits discrimination on the basis of race, color, or national origin in any program or activity receiving federal financial assistance.
1. Plagiarism and AI Detectors. Institutions must use plagiarism and AI detectors cautiously, as these tools may unintentionally discriminate against English Learners and other individuals with limited English proficiency. Institutions that use online services to scan students’ work for plagiarism or use of generative AI must be cognizant of those services’ error rates. For example, if a service has a high error rate for work written by English Learners and a low error rate for work written by native English speakers, English Learners may be wrongly punished for academic dishonesty. This could result in an OCR investigation.
2. Facial Recognition Technology. Additionally, the use of AI facial recognition technology for safety purposes may result in discrimination against students. For example, if such technology relies on discriminatory data and consistently misidentifies students of color, those students may be falsely flagged and subjected to a hostile environment. This could result in an OCR investigation.
Title IX: Sex Discrimination
OCR also enforces Title IX, which prohibits discrimination on the basis of sex in federally funded education programs.
1. AI-created Nudity. As AI-created “deepfakes” continue to spread across the internet, AI raises serious concerns about the safety and privacy of students. OCR discusses the example of an anonymous social media account posting AI-created nude photos of students. If such conduct is reported to an institution, and the institution fails to conduct a proper investigation, OCR may have reason to open an investigation of its own.
2. Scheduling. Additionally, institutions that use AI to develop students’ academic or athletic schedules must be cognizant of the data the AI tool relies on. For example, an AI tool used to generate students’ academic schedules could rely on discriminatory historical data. If that data suggests that men are more likely than women to enroll in certain types of courses, such as computer science, and women are consequently denied the opportunity to enroll in those courses, there may be grounds for OCR to open an investigation. The same principle applies to the scheduling of athletic practices and events. If an AI tool relies on discriminatory data and consistently schedules men’s teams on preferred dates, e.g., Friday nights, the institution may not be providing equal athletic opportunity to all students.
Section 504: Disability Discrimination
Section 504 protects qualified individuals from disability discrimination in any program or activity receiving federal financial assistance. Further, Section 504 requires institutions to provide equal opportunities to students with disabilities. This may include academic adjustments, modifications, and/or auxiliary aids.
1. AI Proctoring Software. Institutions must use AI proctoring software with caution, as such software may unintentionally discriminate against students with disabilities. OCR provides an example of an institution that uses AI test proctoring software to track students’ eye movements for signs of cheating and consequently accuses a student with a vision impairment of cheating. If that student requests an academic adjustment and appeals the discipline, and the institution does not appropriately respond to the request, OCR may have grounds to open an investigation.
2. Auxiliary Aids. Additionally, institutions must be aware of potential issues with auxiliary aids. OCR offers an example of an institution that provides a deaf student with an AI-aided closed captioning transcription application for class. If the student repeatedly informs the institution that the application does not accurately capture information essential to their program, and the institution does not provide the student with an alternative auxiliary aid, OCR may have reason to open an investigation.
Multiple Bases Discrimination
Multiple bases discrimination is discrimination that occurs on the basis of more than one perceived characteristic. Institutional administrators must be aware of how AI tools are used across their campuses and whether those tools may be discriminating against students on multiple bases.
What This Means for Your College or University
Institutions must be cautious and cognizant of the AI tools used across their campuses. This requires careful evaluation and oversight to ensure that each institution remains compliant with all federal civil rights laws.
Higher education administrators with questions or concerns should contact their legal counsel or an attorney at KingSpry for guidance.