
AI and Education

Teaching Tech to Close Digital Gaps



Overview


The use of AI in educational technology continues to grow, especially since the COVID-19 pandemic. From grade schools to universities, AI is used for tasks like enrollment, grading, remote testing, and data administration. However, many students are unaware of how these systems work or whether their information is being shared, for example with law enforcement. Moreover, without transparent and careful oversight, these algorithms can make mistakes and widen educational achievement gaps.

It is possible that these algorithms contribute to colleges meeting less student financial need, higher debt burdens, student dropout, and racial disparities.

Brookings.edu


Enrollment algorithms are contributing to the crises of higher education



In the World

Resource Allocation

School systems are using algorithms to evaluate students and make decisions about resource allocation.

  • 🏆The Houston Federation of Teachers won an injunction against the Houston Independent School District to pause the use of an opaque algorithm called EVAAS, which helped make decisions about teacher pay and termination.
  • In 2024, Nevada’s school system faced criticism for using an opaque machine learning system which diverted critical funding away from some students. The tool, created by a company called Infinite Campus, was meant to identify at-risk students but inexplicably excluded some low-income and even homeless students.

Automated Grading

Artificial intelligence algorithms can help evaluate students’ work and provide feedback, but these systems may be unfairly biased against some students.

  • During the height of the COVID-19 pandemic in 2020, International Baccalaureate and A-level students felt cheated when seemingly biased algorithms assigned their exam grades, sometimes with serious implications for their futures.

Admissions

Predictive algorithms use past student data to rank applicants for admission and recommend them for scholarships. However, this can screen out students who could still enroll and succeed even if they don’t match the profile of past students.

  • An investigative report by Brookings.edu showed how automated tools made way for racial and socioeconomic bias in college admissions and financial aid distribution.

E-Proctoring

E-proctoring software uses AI to monitor students and prevent cheating. However, it has limited accuracy for some demographic groups and raises equity concerns.

  • 🏆Facing numerous challenges with the rise of remote proctoring tools during the COVID-19 pandemic, students went online to share stories and push back against unfair practices.
  • A reporter explored the impact of remote monitoring tools after a Florida high schooler was wrongly accused of cheating.
  • A report by the CDT explores how using automated test proctoring tools during the pandemic created additional challenges for disabled students.

Data Sharing and Surveillance

Schools collect personal and academic data about students through various apps. This raises concerns when data is shared or used to make decisions without students’ knowledge. 

  • 🏆In 2023, the FTC sued Edmodo, an educational technology company, for using children’s personal information for advertising without proper consent.
  • An investigative report by The74 dives into how schools used digital monitoring tools to discipline students for minor offenses rather than help them.

AI-Detection Tools

Teachers are seeking tools to tell whether assignments are AI-generated. However, current AI detection software is unreliable, raising concerns about wrongly accusing students of cheating.

  • 🏆Just six months after its release, OpenAI discontinued its AI text detection tool due to its “low rate of accuracy.”
  • Parents of a U.S. high school student sued their son’s school after he was wrongly accused of cheating by an AI detection tool.

Amplify Your Voice. Support the Movement.

Report Harm

You believe you have been harmed by AI

If you believe you’ve been harmed by Artificial Intelligence, please fill out our harms report form. We will get back to you within 48 hours on weekdays and 72 hours over the weekend.

You are seeking advice

If you are seeking legal advice or representation, consider reaching out to the ACLU office in your state.



Resources

Coded Bias Documentary

The Coded Bias film explores the fallout of AJL founder Dr. Joy Buolamwini’s discovery that facial recognition technologies don’t always work well for darker skin tones or female-appearing faces. Through the story of a Houston teacher who was almost fired, it warns about the risks of over-relying on automated tools to make important decisions about education.

Federal Trade Commission

The FTC published a statement on EdTech and the Children’s Online Privacy Protection Act, making it clear that it’s illegal for companies to compromise children’s privacy rights when using educational technology.

Center for Democracy and Technology

The Center for Democracy and Technology has published numerous resources on protecting student privacy.

Defend Digital Me

Defend Digital Me is a non-profit organization that provides research about student privacy and the use of AI. In 2022, they released “The State of Biometrics 2022: A Review of Policy and Practice in UK Education” report.

Digital Promise

Digital Promise, a global nonprofit, created the AI Digital Equity Framework to help schools make informed decisions about using AI technology responsibly.

The Student Data Privacy Project

This project, organized by parent advocates, provides templates parents can use to ask their children’s schools for information on how data is being used by educational technology.

EdTech Equity Project

The EdTech Equity Project offers toolkits to help schools, tech developers, and community members make sure AI technology is fair and works for everyone.

The Red Flag Machine

The Electronic Frontier Foundation’s Red Flag Machine quiz and accompanying research report show how student monitoring software like GoGuardian makes significant errors in the online content it flags.

Office of Educational Technology

After the release of Biden’s Executive Order on Artificial Intelligence, the U.S. Department of Education began publishing guidance to help schools use AI technology to benefit all students while also protecting their privacy.

National Disabled Law Students Association Report

The NDLSA’s Report on Concerns Regarding Online Administration of Bar Exams highlights the challenges disabled students faced with e-proctoring tools when taking bar exams remotely during the COVID-19 pandemic. It focuses on concerns like AI bias and privacy.

Similar Harms

AI Surveillance



©Algorithmic Justice League 2025