
AI and Transportation

Steering Safe and Accessible Mobility


Overview

AI technologies are increasingly used in transportation for surveillance, movement tracking, and biometric data collection. Transit agencies, police, and self-driving car companies build these systems to improve safety and convenience, but they create new risks for the people the technology does not work well for. Using AI in transportation can also threaten our freedom to move around on roads and in the air.

“They’re taking information that I didn’t realize was going to be shared and screwing with our insurance.”

ProPublica, “Automakers Are Sharing Consumers’ Driving Behavior With Insurance Companies”


In the World

Biometric Security

Transportation services are using facial recognition and other biometric systems for security, but these technologies can make mistakes, treat certain racial groups unfairly, and raise data privacy concerns.

  • The TSA’s new face scanners in security lines have made some travelers worry about being pressured to share their biometric data just to avoid delays.

Body Scanners

Body scanners at airports were designed to improve and speed up security checks, but they give more false alarms for transgender people, people with disabilities, and those wearing certain hairstyles or religious head coverings, leading to unfair treatment.

  • Olivia was flagged for extra screening, leading to a painfully uncomfortable experience, because airport security practices are not inclusive of transgender travelers.
  • Dorian shares how she, like other women of color, has faced extra security checks because body scanners flag their hairstyles.

Automated License Plate Readers (ALPR) Surveillance

ALPRs let law enforcement track drivers’ movements, raising privacy issues, especially for people in communities that are already heavily policed.

  • During the 2020 Black Lives Matter protests, police used ALPRs and other surveillance tech to identify protesters.

Auto Insurance

AI used by insurance companies to track how people drive can raise concerns about privacy, consent, discriminatory treatment based on where people live, and unfairly high rates for certain groups.

  • After a sudden jump in his insurance rate, Ken discovered that his car insurance company was tracking his driving through a data company called LexisNexis.

Autonomous Vehicles

Self-driving car designers hope to help people who can’t drive or easily access vehicles, but safety remains a major challenge. The cars sometimes struggle to detect people or objects in unusual or adverse conditions, and can show bias against certain groups.

  • 🏆Researchers are working with disability advocates to make self-driving vehicles more accessible for people with disabilities.
  • Disabled people voice concerns about self-driving cars not working well for them due to inaccessible vehicle designs and cars not recognizing them as pedestrians.
  • In 2023, the U.S. auto safety regulator, the National Highway Traffic Safety Administration (NHTSA), began investigating Tesla Inc.'s driver-assist technology following a fatal crash. The NHTSA has also investigated vehicles run by Zoox, Waymo, and Ford.

REPORT HARM

Amplify Your Voice. Support the Movement.


You believe you have been harmed by AI

If you believe you’ve been harmed by Artificial Intelligence, please fill out our harms report form. We will get back to you within 48 hours on weekdays and 72 hours over the weekend.

You are seeking advice

If you are seeking legal advice or representation, consider reaching out to an ACLU office in your state.


Resources

Harms resource
AJL’s Freedom Flyers Campaign

The #FreedomFlyers campaign raises awareness about the TSA's expanding use of facial recognition at airports. Make your voice heard by filling out a TSA Scorecard.

Harms resource
Design Justice by Sasha Costanza-Chock

In Design Justice, Dr. Sasha Costanza-Chock, Senior Research Advisor to AJL, discusses how to design technology that works for everyone, including how airport security scanners are biased against transgender people because they are built on outdated social assumptions.

Harms resource
Racial Bias in Object Detection

In Predictive Inequity in Object Detection, Georgia Tech researchers showed that self-driving cars' systems work differently depending on pedestrians’ skin type.

Harms resource
Bias in Autonomous Driving

In Bias Behind the Wheel, researchers based in China, London, and Singapore analyzed how self-driving cars sometimes have trouble detecting pedestrians based on attributes like age and gender.

Harms resource
Alabama Automated Vehicle Law

In 2024, Alabama passed a law to regulate the use of self-driving cars, requiring companies and drivers to register vehicles with automated driving systems.

Harms resource
EFF Car Data Resources

The EFF has compiled a list of resources that helps individuals figure out what data their cars are tracking and how to opt out of sharing where possible.

Harms resource
Mozilla’s Privacy Not Included

This project helps people track the privacy practices of car manufacturers, providing ratings of car companies and their privacy habits to help consumers make informed choices.

Harms resource
Examining ALPR Data

The Electronic Frontier Foundation (EFF) studied eight days of Automated License Plate Reader (ALPR) data to show how ALPRs work and push for more police accountability.

Similar Harms

  • AI and Finance
  • AI Surveillance

See all harms


©Algorithmic Justice League 2025