
AI and Employment

Advancing Worker Empowerment, Not Displacement


AI and Employment

overview


AI systems are fundamentally changing the workplace. These technologies are used for recruiting, hiring, and firing, and they may discriminate against some workers. The growing use of biometric workplace surveillance tools also raises concerns about workers' privacy, especially since workers may not be aware that these technologies are being used on them.

The nature of the workplace is ever-changing: gig workers may be exploited through black box systems, automation is increasingly causing job displacement, and the growing use of generative AI is resulting in creatives losing their jobs.

AI is coming for the jobs that were supposed to be automation-proof.

The Washington Post

ChatGPT took their jobs. Now they walk dogs and fix air conditioners.


AI and Employment

In the World

Job Displacement

Recent advancements in large language models (LLMs), like ChatGPT, have reignited concerns about job displacement. AI systems can automate tasks previously performed by humans, leading to job displacement across a range of industries.

  • Olivia Limpkin, a 25-year-old copywriter in San Francisco, was slowly pushed out of a tech startup by managers looking to replace people in creative roles with cost-saving AI tools.
  • In 2022, the National Eating Disorders Association reportedly replaced unionizing staffers with an AI-powered chatbot. However, the chatbot was taken offline after reports of it providing harmful advice.

Human Resources

Applicant tracking systems used for recruiting, screening, and hiring may inadvertently discriminate against candidates. AI systems may prioritize candidates whose backgrounds resemble those of past hires, excluding other qualified candidates.

  • In 2020, a UK-based makeup artist was evaluated for a role using an AI screening system called HireVue after being furloughed during the pandemic. Although she performed well on her skills evaluations, the tool scored her body language negatively and cost her the job.
  • 🏆 In 2019, the Electronic Privacy Information Center (EPIC) filed a complaint with the FTC against HireVue for its use of facial analysis to assess job candidates. HireVue stopped using the technology.
  • 🏆 The Houston Independent School District settled with educators who brought a case against the district’s use of the Educational Value Added Assessment System (EVAAS), which used algorithms to evaluate teachers, award bonuses, and make firing decisions.

Workplace Surveillance

The rise of wearable tech, biometric systems, and computer monitoring software raises privacy concerns about employer surveillance. Though intended to boost productivity, such monitoring can create high-pressure environments and erode trust between employers and employees.

  • A 2024 report from Oxfam found that Amazon and Walmart’s excessive warehouse surveillance erodes workers’ rights. 

Gig Economy

Gig economy platforms use algorithms to assign tasks and to reward or punish workers based on their performance. The black box nature of these technologies makes it easier for companies to surveil and exploit workers.

  • In 2024, Pa Edrissa Manjang, backed by the Equality and Human Rights Commission and the App Drivers and Couriers Union, brought a case against Uber after his account was disabled due to failed identity verification checks. The app used a facial recognition tool that repeatedly failed to recognize his face, illegally depriving him of income.


Amplify Your Voice. Support the Movement.


You believe you have been harmed by AI

If you believe you’ve been harmed by Artificial Intelligence, please fill out our harms report form. We will get back to you within 48 hours on weekdays and 72 hours over the weekend.

You are seeking advice

If you are seeking legal advice or representation, consider reaching out to the ACLU office in your state.


AI and Employment

resources

Harms resource
#MyWorkMyRights Campaign

In 2024, AJL launched the #MyWorkMyRights Campaign to advocate for Consent, Compensation, Control, and Credit for writers in the age of generative AI. Share your story with AJL, post social media content online, or add your name to the Authors Guild Open Letter.

Harms resource
Coded Bias Documentary

Coded Bias explores the fallout of AJL founder Dr. Joy Buolamwini’s discovery that facial recognition struggles to see dark-skinned faces accurately. The film underscores the dangers of relying on AI to make employment decisions through the story of an award-winning teacher who nearly loses his job because of a poor assessment from an automated tool.

Harms resource
U.S. Equal Employment Opportunity Commission

In 2021, the EEOC launched an initiative to examine the effects of employment-related AI tools and offer guidance on how to ensure algorithmic fairness. Their joint statement outlines how federal agencies will ensure employers use these tools fairly and responsibly.

Harms resource
Upturn

Upturn is a nonprofit that drives policy change to advance equity and protect people’s opportunities in the design, governance, and use of technology. Their 2018 report on fairness in hiring algorithms is a key resource for understanding the landscape of these tools.

Harms resource
Georgetown Law Draft Legislation

Georgetown Law’s Center on Privacy and Technology drafted the Worker Privacy Act, a bill that outlines protections against the invasive collection of employees’ data.

Harms resource
Have I Been Trained

Have I Been Trained allows users to discover if their work has been used to train AI. Users can then opt out of future training by adding their work to the Do Not Train registry.

Harms resource
The Worker Info Exchange

The Worker Info Exchange helps gig workers access data related to their employment at companies such as Uber, Amazon Flex, Bolt, and others. Their published research on tech and the gig economy provides insights and recommendations for advocacy.

Harms resource
Coworker.org

Coworker.org published a framework, Little Tech is Coming for Workers, for reclaiming and building worker power. Their Bossware and Employment Tech database compiles more than 500 tech products that impact employees.

Similar Harms

harm
AI and Finance
harm
AI Surveillance
harm
AI Deepfakes
See all harms
