AI tenant screening tool will stop scoring tenants after class action lawsuit

The rapid expansion of AI tools in property management is facing increased scrutiny as discriminatory practices come to light through legal challenges.

Settlement overview: SafeRent, a prominent AI tenant screening service, has agreed to stop using algorithmic scoring for housing voucher applicants following a discrimination lawsuit in Massachusetts.

  • The company will pay approximately $2.3 million to Massachusetts residents who were denied housing due to their SafeRent scores while using housing vouchers
  • U.S. District Judge Angel Kelley granted final approval for the settlement on Wednesday
  • The agreement stems from a 2022 class action lawsuit that alleged discrimination against Black and Hispanic applicants

System mechanics and concerns: SafeRent’s algorithmic scoring system evaluated potential tenants using factors like credit history and non-rental debts, but faced criticism for its lack of transparency and potential bias.

  • SafeRent did not disclose its scoring methodology to landlords
  • Critics argued the system disproportionately assigned lower scores to Black and Hispanic tenants and housing voucher recipients
  • The lawsuit claimed these practices violated both Massachusetts law and the federal Fair Housing Act

Key changes under settlement: The five-year agreement implements significant modifications to SafeRent’s screening process nationwide.

  • The company must stop displaying tenant screening scores for all housing voucher applicants
  • SafeRent cannot provide “accept” or “deny” recommendations for voucher holders
  • Landlords will need to evaluate voucher recipients based on their complete rental application rather than relying solely on an algorithmic score

Expert perspective: Industry experts have raised doubts about the validity of using credit-based scoring for rental decisions.

  • Shennan Kavanagh, director of the National Consumer Law Center, noted that credit score-style evaluations have only been validated for credit obligations, not rent payment prediction
  • SafeRent maintains its scoring system complied with applicable laws but chose to settle to avoid expensive, time-consuming litigation

Broader industry implications: The SafeRent case represents part of a larger trend of legal challenges to AI-driven property management tools.

  • The Department of Justice recently sued RealPage over its algorithmic rent-pricing software
  • This settlement could set precedent for how AI tools are regulated in housing applications
  • The case highlights growing concerns about AI bias in high-stakes decision-making processes

Looking ahead: While this settlement addresses discrimination in tenant screening, the broader challenge of ensuring fairness in AI-powered housing tools remains unresolved. More legal and regulatory scrutiny is likely as the industry works to balance technological innovation with equitable access to housing.

