
AI and Housing

Constructing Access To Housing Justice For All


AI and Housing

Overview


The use of artificial intelligence in housing, sometimes called "Landlord Tech," can take the form of illegal tenant surveillance, discriminatory rental denials, or automated rent increases. These systems may encode bias and give landlords an unfair advantage, making it harder for everyone to access suitable housing.

The risks of this technology affect both renters and those looking to own homes. Constant surveillance threatens residents' privacy and autonomy. Additionally, errors in biometric systems and automated property valuation models can worsen existing inequalities in the housing market.

"These tools 'are not foolproof,' and their mistakes can adversely impact public housing residents… this is the type of technology that the department is cautioning against."

— The Washington Post, "Eyes on the poor: Cameras, facial recognition watch over public housing"

AI and Housing

In the World

Rental Increase

Landlords may use AI algorithms to set rental prices, often leading to inflated and discriminatory rent increases (a minimal sketch of this dynamic follows the example below).

  • 🏆 In 2023, the U.S. Department of Justice filed suit against landlords who used RealPage's AI-driven rent-setting software to illegally inflate rents.
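As a rough illustration, the sketch below shows how a rent-recommendation tool that pools competitors' pricing data can ratchet rents upward. This is a minimal, hypothetical model, not RealPage's actual algorithm; the recommend_rent function, the percentile rule, and all figures are invented.

```python
# A minimal, hypothetical sketch of algorithmic rent-setting; this is not
# RealPage's actual algorithm. It only illustrates how pooling competitors'
# pricing data and pegging recommendations to a high percentile can push
# every landlord's rent toward the top of the market.

def recommend_rent(own_rent: float, pooled_rents: list[float],
                   target_percentile: float = 0.9) -> float:
    """Recommend a rent pegged to a high percentile of the pooled market."""
    ranked = sorted(pooled_rents)
    index = min(int(target_percentile * len(ranked)), len(ranked) - 1)
    suggestion = ranked[index]
    # A common "ratchet" heuristic: never recommend lowering the rent.
    return max(own_rent, suggestion)

# Five landlords feed the same tool their (otherwise private) rents.
market = [1500.0, 1550.0, 1600.0, 1650.0, 1700.0]
print("before:", market)
market = [recommend_rent(rent, market) for rent in market]
print("after one pricing cycle:", market)  # every unit jumps toward the top
```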

Rental Applications

Landlords may use tenant screening and selection systems to make rental decisions. However, the databases these systems draw on for credit scores, eviction records, and criminal records can be inaccurate, leading to discrimination against certain applicants (one common failure mode, loose name matching, is sketched after the example below).

  • Screening software has unfairly rejected applicants for reasons that lack context, such as evictions stemming from domestic abuse or criminal records that belong to other people with the same name. Because these tools are opaque, rejected applicants often cannot find out why they were turned down.
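The sketch below illustrates one way these errors arise: fuzzy name matching against court records. The loose_match function, the records, and the threshold are hypothetical, not any vendor's actual matching logic.

```python
# A hypothetical sketch of why loose record matching in tenant screening
# produces false positives; not any screening vendor's actual logic.

from difflib import SequenceMatcher

COURT_RECORDS = [
    {"name": "Maria L. Garcia", "record": "eviction filing, 2019"},
    {"name": "Mario Garcia",    "record": "misdemeanor, 2015"},
]

def loose_match(applicant: str, record_name: str,
                threshold: float = 0.8) -> bool:
    """Fuzzy name comparison: the shortcut that merges strangers' files."""
    similarity = SequenceMatcher(None, applicant.lower(),
                                 record_name.lower()).ratio()
    return similarity >= threshold

applicant = "Maria Garcia"  # has no eviction or criminal history
for record in COURT_RECORDS:
    if loose_match(applicant, record["name"]):
        print(f"flagged: {record['record']} "
              f"(matched against '{record['name']}')")
# Both records attach to the applicant, with no context such as whether
# the eviction stemmed from domestic abuse or was later dismissed.
```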

Tenant Surveillance

Biometric systems, such as facial recognition and fingerprint scanners, may control access to homes and monitor activity inside and outside them. These systems can mistakenly lock people out of their homes and can be used to unfairly target residents (a sketch of the door-entry decision follows the examples below).

  • Public housing residents across the U.S. have faced targeting, eviction, and arrest as housing officials have used expanded camera surveillance to strictly enforce lease rules, even for minor infractions.
  • 🏆 In 2019, tenants in a Brooklyn housing complex successfully fought their landlord's effort to install a facial recognition entry system for their building, calling it a violation of their privacy rights.
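The sketch below shows how a biometric entry system reduces access to one's home to a single score-versus-threshold comparison. The scores and threshold are invented; NIST's demographic testing of face recognition has documented higher error rates for some groups, which makes failures like this more likely for them.

```python
# A hypothetical sketch of a biometric door-entry decision; the scores and
# threshold are invented. It shows how access to one's own home can hinge
# on a single similarity score crossing a fixed cutoff.

MATCH_THRESHOLD = 0.80  # assumed vendor setting

def door_unlocks(similarity_score: float) -> bool:
    """One scalar comparison decides whether a resident gets inside."""
    return similarity_score >= MATCH_THRESHOLD

# Scores a face matcher might emit for the SAME enrolled resident under
# different conditions; systems with higher error rates for darker-skinned
# faces produce low scores like these more often.
attempts = {"daylight": 0.91, "dim hallway": 0.78, "rainy evening": 0.74}
for condition, score in attempts.items():
    status = "unlocked" if door_unlocks(score) else "LOCKED OUT"
    print(f"{condition}: score={score:.2f} -> {status}")
```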

Targeted Advertising

Tech companies can serve housing ads that offer opportunities only to certain potential residents while excluding others based on specific demographics, leading to discrimination against protected groups (a sketch of how a simple audience filter accomplishes this follows the examples below).

  • In 2016, ProPublica reported that Facebook's microtargeted advertising enabled advertisers to exclude protected groups from seeing housing ads.
  • 🏆 In 2019, Facebook agreed to overhaul its targeted advertising system after it was found that landlords and advertisers were using it to exclude prospective renters and buyers by age, race, gender, disability status, and other demographic factors.
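The sketch below shows how a facially neutral audience filter can keep protected groups from ever seeing a housing ad. The User class, the filter rules, and the data are hypothetical, not Facebook's actual ad-targeting API.

```python
# A hypothetical sketch of demographic ad targeting; not any real ad
# platform's API. It shows how an "audience filter" quietly becomes
# housing discrimination when the product is an apartment.

from dataclasses import dataclass

@dataclass
class User:
    age: int
    zip_code: str

def in_audience(user: User, max_age: int, excluded_zips: set) -> bool:
    """Rules like these look neutral but can proxy for race, familial
    status, and other protected classes."""
    return user.age <= max_age and user.zip_code not in excluded_zips

users = [User(29, "11205"), User(58, "11205"), User(34, "11212")]
# Campaign: show the listing only to younger users outside certain zips.
shown = [u for u in users
         if in_audience(u, max_age=45, excluded_zips={"11212"})]
print(f"{len(shown)} of {len(users)} users ever see the housing ad")
```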

Mortgages

Algorithmic systems that decide who can get a mortgage and how much to charge may penalize applicants from certain zip codes. Because zip codes encode past discrimination, this can lead to minority borrowers being unfairly denied loans or charged more (see the sketch after the example below).

  • A study by UC Berkeley researchers found that otherwise-equivalent Black and Latino/Latine/Latinx borrowers were charged more for mortgages, costing them $765M yearly. The same study found that over 1.3 million creditworthy applicants were denied loans due to factors related to race.
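The sketch below shows how a pricing model that never sees race can still reproduce redlining when it uses zip code as a feature. The rate formula and adjustments are invented for illustration and are not drawn from the Berkeley study.

```python
# A hypothetical sketch of mortgage rate-setting that never looks at race
# yet reproduces redlining, because zip code correlates with race due to
# historical segregation. All figures are invented.

BASE_RATE = 6.0  # percent

# A learned "zip code risk adjustment" fit on historical data that itself
# reflects decades of discriminatory lending.
ZIP_ADJUSTMENT = {"60614": -0.10, "60644": +0.45}

def quoted_rate(credit_score: int, zip_code: str) -> float:
    score_adj = (720 - credit_score) * 0.005  # identical for everyone
    return BASE_RATE + score_adj + ZIP_ADJUSTMENT.get(zip_code, 0.0)

# Two applicants with the SAME credit score, different neighborhoods.
for zc in ("60614", "60644"):
    print(f"zip {zc}: {quoted_rate(740, zc):.2f}%")
# The 0.55-point spread compounds over a 30-year loan into tens of
# thousands of dollars paid by the borrower in the redlined zip code.
```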

Automated Valuation Model

Algorithmic systems that estimate property values may undervalue homes in certain zip codes, especially in historically underserved neighborhoods, because they learn from sales data shaped by historical patterns of exclusion. This can widen existing wealth gaps (the feedback loop is sketched after the example below).

  • In 2020, the Urban Institute released a report, "How Automated Valuation Models Can Disproportionately Affect Majority-Black Neighborhoods," documenting this effect.
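The sketch below shows the core feedback loop of a comparable-sales valuation: if a neighborhood's past sale prices were depressed by exclusion, the "comps" carry that history into every new estimate. Real AVMs are far more complex; the function and all figures here are invented.

```python
# A hypothetical sketch of a comparable-sales AVM. Real AVMs are far more
# sophisticated, but the feedback loop is the same: biased historical
# prices become the baseline for every future estimate.

def avm_estimate(comps_price_per_sqft: list[float], sqft: int) -> float:
    """Value a home at the average price per square foot of nearby sales."""
    return sqft * (sum(comps_price_per_sqft) / len(comps_price_per_sqft))

# Identical 1,800 sq ft homes; only the neighborhoods' histories differ.
comps_a = [310.0, 325.0, 298.0]  # $/sq ft, invented figures
comps_b = [212.0, 198.0, 225.0]  # depressed by past redlining

print(f"estimate A: ${avm_estimate(comps_a, 1800):,.0f}")
print(f"estimate B: ${avm_estimate(comps_b, 1800):,.0f}")
# The model "learns" the legacy of exclusion and projects it forward,
# widening the wealth gap the Urban Institute report documents.
```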

Amplify Your Voice. Support the Movement.


You believe you have been harmed by AI

If you believe you've been harmed by artificial intelligence, please fill out our harms report form. We will get back to you within 48 hours on weekdays and within 72 hours over the weekend.

You are seeking advice

If you are seeking legal advice or representation, consider reaching out to the ACLU office in your state.


AI and Housing

Resources

Harms resource
Coded Bias Documentary

The Coded Bias documentary, released in 2020, tells the story of Dr. Joy Buolamwini's discovery that facial recognition does not see dark-skinned faces accurately. The film highlights the story of a building management company in Brooklyn that planned to implement facial recognition technology as the way tenants enter their homes.

Harms resource
Amicus Letter In Support Of The Brooklyn Tenants

In 2019, algorithmic bias researchers, including Dr. Joy Buolamwini, Dr. Timnit Gebru, and Inioluwa Deborah Raji, submitted an amicus letter in support of the Brooklyn tenants who were pushing back against the use of facial recognition technology in their building.

Harms resource
National Low Income Housing Coalition

In 2023, the National Low Income Housing Coalition (NLIHC) submitted a report on unjust automated screening processes to the Consumer Financial Protection Bureau and the Federal Trade Commission. The report outlines the ways these systems discriminate, particularly against low-income renters.

Harms resource
Countering Tenant Screening Initiative

The Countering Tenant Screening Initiative collects tenant screening reports to hold tenant screening algorithms accountable and to teach individuals about how tenant screening works.

Harms resource
National Fair Housing Alliance

The National Fair Housing Alliance published a report, Method for Improving Mortgage Fairness, on making mortgage underwriting models fairer through methods like distribution matching.

Similar Harms

  • AI and Finance
  • AI Surveillance
