Alexa? Siri? Cortana? Are you listening?
WATCH VOICING ERASURE
VOICE RECOGNITION

WHOSE VOICE DO YOU ACTUALLY HEAR?

Is it okay for machines of silicon and steel, or flesh and blood, to erase our contributions? Is it okay for a machine to erase you and me? Is it okay for machines to portray women as subservient? Is it okay for Google and others to capture data without our knowledge? These questions, and new research led by Allison Koenecke, inspired the creation of “Voicing Erasure”: a poetic piece recited by champions of women’s empowerment and leading scholars on race, gender, and technology.



VOICE SYSTEMS HAVE BIASES

A recent study led by Allison Koenecke reveals large racial disparities in the performance of five popular speech recognition systems, with the worst performance for speakers of African American Vernacular English. See Original Research

VOICES ARE BEING SURVEILLED

Voice recognition devices are known to "listen in" on our conversations and store that information, often without our knowledge.

VOICE SYSTEMS REINFORCE STEREOTYPES

These systems are frequently given women's voices and subservient "personalities," reinforcing the negative stereotype that women are submissive.

CONTRIBUTIONS IN THE FIELD ARE BEING ERASED

A New York Times article highlighting the research on bias in speech recognition systems failed to mention the lead researcher, Allison Koenecke, as well as the other women on the research team.

LET’S PUT A FACE TO THE HARMS OF AI

HAVE YOU WITNESSED UNJUST ARTIFICIAL INTELLIGENCE IMPACTING YOU OR OTHERS?

Help us shed light on how AI harms affect civil rights and people's lives around the world. You can share your story using the hashtag #CodedBias or send us a private message.

SHARE YOUR STORY
GET INVOLVED IN THE FIGHT AGAINST ALGORITHMIC BIAS

WHAT IS YOUR #CODEDBIAS STORY?

We cannot let the promises of AI overshadow real and present harms. Like facial recognition, voice recognition systems reflect the biases of their creators and our society. Now, more than ever, we must fight back. If you're aware of any algorithmic biases that impact you or others in your communities, please share them with the world.

TWEET YOUR #CODEDBIAS STORY
RESEARCH THAT INSPIRED VOICING ERASURE

Racial disparities in automated speech recognition

Automated speech recognition (ASR) systems, which use sophisticated machine-learning algorithms to convert spoken language to text, have become increasingly widespread, powering popular virtual assistants, facilitating automated closed captioning, and enabling digital dictation platforms for health care. Over the last several years, the quality of these systems has dramatically improved, due both to advances in deep learning and to the collection of large-scale datasets used to train the systems. There is concern, however, that these tools do not work equally well for all subgroups of the population. Here, we examine the ability of five state-of-the-art ASR systems—developed by Amazon, Apple, Google, IBM, and Microsoft—to transcribe structured interviews conducted with 42 white speakers and 73 black speakers.
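
The disparities in the study are reported in terms of the word error rate (WER), the standard ASR metric: the minimum number of word insertions, deletions, and substitutions needed to turn a system's transcript into the human reference transcript, divided by the length of the reference. Below is a minimal sketch of that computation; the function name and example sentences are illustrative, not taken from the study's code.

```python
# Word error rate (WER): word-level Levenshtein distance between a
# human reference transcript and an ASR hypothesis, divided by the
# number of reference words. Illustrative sketch only.

def wer(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # d[i][j] = edit distance between the first i reference words
    # and the first j hypothesis words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(1, len(ref) + 1):
        d[i][0] = i  # i deletions
    for j in range(1, len(hyp) + 1):
        d[0][j] = j  # j insertions
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            sub = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + sub)   # substitution or match
    return d[-1][-1] / len(ref)

# Hypothetical example: one substitution ("fox" -> "box") and one
# deletion ("jumps") against a five-word reference give WER = 2/5.
print(wer("the quick brown fox jumps", "the quick brown box"))  # 0.4
```

A higher average WER for one group of speakers than another, on comparable audio, is the kind of disparity the study documents across all five commercial systems.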

READ OFFICIAL RESEARCH REPORT
CODED BIAS FILM

SUNDANCE FILM FESTIVAL CODED BIAS DOCUMENTARY

Coded Bias illuminates our mass misconceptions about AI, emphasizes the urgent need for legislative protection, and follows the Algorithmic Justice League's journey to push for the first-ever legislation in the U.S. to place limits on facial recognition technology. Coded Bias weaves together the personal stories of people whose lives have been directly impacted by unjust algorithms. You can make an impact by helping us spread the word about the film, hosting a screening, and/or sharing your #CodedBias story with your network.

LEARN MORE

Join the Algorithmic Justice League in the movement towards equitable and accountable AI.

Technology should serve all of us. Not just the privileged few.

Whose voice do you hear when you think of intelligence, innovation, and ideas that shape our worlds?
IN ADDITION TO AJL'S FOUNDER JOY BUOLAMWINI, DEPICTED ABOVE, HERE ARE THE SIX VOICES HIGHLIGHTED IN VOICING ERASURE:

Allison Koenecke
Lead Author of the “Racial Disparities in Automated Speech Recognition” Study
@allisonkoe

Kimberlé Crenshaw
Professor of Law at UCLA and Columbia Law School
@sandylocks

Megan Smith
CEO of Shift7
@smithmegan

Safiya Noble
Author of Algorithms of Oppression
@safiyanoble

Ruha Benjamin
Author of Race After Technology
@ruha9

Sasha Costanza-Chock
Author of Design Justice
@schock

FROM THE ORIGINAL RESEARCH:

“Automated speech recognition (ASR) systems are now used in a variety of applications to convert spoken language to text, from virtual assistants, to closed captioning, to hands-free computing. By analyzing a large corpus of sociolinguistic interviews with white and African American speakers, we demonstrate large racial disparities in the performance of five popular commercial ASR systems.”

MEDIA OPPORTUNITIES

Please contact comms@ajlunited.org or download our media kit. We appreciate every opportunity that helps us unmask the imminent harms and biases of AI.

CONTACT US

@AJLUNITED

FOLLOW US ON SOCIAL
HASHTAGS

#CodedBias #EquitableAI #AccountableAI #InclusiveAI #ResponsibleAI #EthicalAI #AIbias #AIharms #MachineBias #ArtificialIntelligence #InclusiveTech #AJL #AlgorithmicJusticeLeague
