AI tenant screening tool will stop scoring tenants after class action lawsuit

The rapid expansion of AI tools in property management is drawing increased scrutiny as legal challenges bring discriminatory practices to light.

Settlement overview: SafeRent, a prominent AI tenant screening service, has agreed to stop using algorithmic scoring for housing voucher applicants following a discrimination lawsuit in Massachusetts.

  • The company will pay approximately $2.3 million to Massachusetts residents who used housing vouchers and were denied housing because of their SafeRent scores
  • U.S. District Judge Angel Kelley granted final approval for the settlement on Wednesday
  • The agreement stems from a 2022 class action lawsuit that alleged discrimination against Black and Hispanic applicants

System mechanics and concerns: SafeRent’s algorithmic scoring system evaluated potential tenants using factors like credit history and non-rental debts, but faced criticism for its lack of transparency and potential bias; a hypothetical sketch of how such a composite score works follows the list below.

  • SafeRent did not disclose its scoring methodology to landlords
  • Critics argued the system disproportionately assigned lower scores to Black and Hispanic tenants and housing voucher recipients
  • The lawsuit claimed these practices violated both Massachusetts law and the federal Fair Housing Act
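
To make the transparency concern concrete, here is a minimal, purely hypothetical sketch of how a weighted composite screening score might work. The inputs, weights, and cutoff below are illustrative assumptions, not SafeRent's actual (undisclosed) methodology; the point is that a landlord who sees only the final label has no visibility into any of them.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    credit_score: int       # traditional credit bureau score (300-850)
    non_rental_debt: float  # e.g. medical or credit card debt, in dollars

# Hypothetical weights and cutoff -- not SafeRent's; the real model was undisclosed.
WEIGHTS = {"credit": 0.7, "debt": 0.3}
CUTOFF = 600.0

def composite_score(a: Applicant) -> float:
    """Collapse several financial signals into a single number."""
    # Map debt onto a 300-850-style band so the two factors are comparable.
    debt_component = 850.0 - min(a.non_rental_debt / 100.0, 550.0)
    return WEIGHTS["credit"] * a.credit_score + WEIGHTS["debt"] * debt_component

def recommendation(a: Applicant) -> str:
    """The landlord sees only this label, not the inputs, weights, or cutoff."""
    return "accept" if composite_score(a) >= CUTOFF else "deny"

# Under these hypothetical weights, this applicant scores 572 and is labeled "deny".
print(recommendation(Applicant(credit_score=560, non_rental_debt=25_000)))
```

Because inputs like credit history and non-rental debt can correlate with race and voucher status, even a facially neutral formula of this kind can push protected groups below the cutoff, which is the disparate-impact concern the plaintiffs raised.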

Key changes under settlement: The five-year agreement implements significant modifications to SafeRent’s screening process nationwide.

  • The company must stop displaying tenant screening scores for all housing voucher applicants
  • SafeRent cannot provide “accept” or “deny” recommendations for voucher holders
  • Landlords will need to evaluate voucher recipients based on their complete rental application rather than relying solely on an algorithmic score

Expert perspective: Industry experts have raised doubts about the validity of using credit-based scoring for rental decisions.

  • Shennan Kavanagh, director of litigation at the National Consumer Law Center, noted that credit score-style evaluations have only been validated for credit obligations, not rent payment prediction
  • SafeRent maintains its scoring system complied with applicable laws but chose to settle to avoid expensive, time-consuming litigation

Broader industry implications: The SafeRent case represents part of a larger trend of legal challenges to AI-driven property management tools.

  • The Department of Justice recently sued RealPage over its algorithmic rent-pricing software
  • This settlement could set a precedent for how AI tools are regulated in housing decisions
  • The case highlights growing concerns about AI bias in high-stakes decision-making processes

Looking ahead: While this settlement addresses discrimination in tenant screening, the broader challenge of ensuring fairness in AI-powered housing tools remains unresolved. More legal and regulatory scrutiny is likely as the industry works to balance technological innovation with equitable access to housing.
