To date, we’ve educated millions of individuals about the harms and biases of AI, helped technology companies change their processes, and advised policymakers locally and globally. We wouldn’t be a movement without the support for our art, advocacy, research, and outreach initiatives.
If you have a face, you have a place in the conversation about how AI shapes our lives. The film Coded Bias reminds us of the power of individual voices coming together to spark collective action to resist harmful uses of tech. If you’re aware (as an employee, creator, consumer, or in any role) of AI harms or biases, we want to hear from you.
Our workshops, like #DRAGVSAI, are designed to educate people about AI bias and harms. We also produce materials for educators. Whether you're a curious individual, an educator, or a team leader in an organization that develops algorithms, leave us your information below and we'll keep you in the loop about our workshop offerings.
Follow us on Twitter, Instagram, and Facebook, and remember to tag @AJLunited when you share one of our posts or craft your own. Here are some hashtag ideas: #CodedBias #EquitableAI #AccountableAI #InclusiveAI #ResponsibleAI #EthicalAI #AIbias #AIharms #MachineBias #ArtificialIntelligence #InclusiveTech #AJL #AlgorithmicJusticeLeague
If you’re an organization involved in the creation or deployment of artificial intelligence, you can request a system audit to help shift your company culture towards equitable and accountable AI.
Leveraging inclusive datasets to test algorithms and AI models can surface issues before technologies are widely deployed. If you’re a researcher, you can request access to our Pilot Parliaments Benchmark Dataset (for non-commercial use only).
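To give a sense of what that kind of testing can look like, here is a minimal, hypothetical Python sketch of disaggregated evaluation: comparing a model's accuracy across demographic subgroups of a benchmark. The subgroup labels and results below are placeholders for illustration only, not Pilot Parliaments Benchmark data or an AJL audit tool.

```python
# Minimal sketch: disaggregated evaluation of a model's accuracy by subgroup.
# The records are hypothetical placeholders, not real benchmark results.
from collections import defaultdict

# Each record: (subgroup label, whether the model's prediction was correct)
results = [
    ("darker-skinned female", False),
    ("darker-skinned female", True),
    ("darker-skinned male", True),
    ("lighter-skinned female", True),
    ("lighter-skinned male", True),
    ("lighter-skinned male", True),
]

totals = defaultdict(lambda: [0, 0])  # subgroup -> [correct, total]
for subgroup, correct in results:
    totals[subgroup][0] += int(correct)
    totals[subgroup][1] += 1

# Reporting accuracy per subgroup, rather than one aggregate number,
# is what lets large performance gaps surface before deployment.
for subgroup, (correct, total) in sorted(totals.items()):
    print(f"{subgroup}: {correct / total:.0%} accuracy ({correct}/{total})")
```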