AI tenant screening tool will stop scoring tenants after class action lawsuit

The rapid expansion of AI tools in property management is facing increased scrutiny as discriminatory practices come to light through legal challenges.

Settlement overview: SafeRent, a prominent AI tenant screening service, has agreed to stop using algorithmic scoring for housing voucher applicants following a discrimination lawsuit in Massachusetts.

  • The company will pay approximately $2.3 million to Massachusetts residents who were denied housing due to their SafeRent scores while using housing vouchers
  • U.S. District Judge Angel Kelley granted final approval for the settlement on Wednesday
  • The agreement stems from a 2022 class action lawsuit that alleged discrimination against Black and Hispanic applicants

System mechanics and concerns: SafeRent’s algorithmic scoring system evaluated potential tenants using factors like credit history and non-rental debts, but faced criticism for its lack of transparency and potential bias. A simplified sketch of this kind of weighted scoring appears after the list below.

  • SafeRent did not disclose its scoring methodology to landlords
  • Critics argued the system disproportionately assigned lower scores to Black and Hispanic tenants and housing voucher recipients
  • The lawsuit claimed these practices violated both Massachusetts law and the federal Fair Housing Act
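
To make the concern concrete, here is a minimal, purely hypothetical sketch of the kind of credit-weighted scoring described above. SafeRent’s actual model, features, and weights are not public; the Applicant fields, weights, and cap used here are invented for illustration. The point is structural: a score built from credit history and non-rental debt can rank a voucher holder poorly even though the voucher guarantees most of the rent.

```python
# Hypothetical illustration only: SafeRent's real model and weights are not public.
# This toy composite score shows how credit-based factors can drive the outcome
# while rent-payment history and voucher-backed income are never considered.

from dataclasses import dataclass


@dataclass
class Applicant:
    credit_score: int          # traditional credit bureau score (300-850)
    non_rental_debt: float     # e.g. medical or auto debt, in dollars
    has_voucher: bool          # housing voucher covering most of the rent


def screening_score(a: Applicant) -> float:
    """Toy composite score: higher means 'safer' in this illustration."""
    credit_component = (a.credit_score - 300) / 550        # normalize to 0-1
    debt_component = min(a.non_rental_debt / 20_000, 1.0)  # cap at $20k of debt
    # Note what is missing: rent-payment history and the guaranteed portion of
    # rent paid by the voucher (a.has_voucher) never enter the calculation.
    return 100 * (0.7 * credit_component - 0.3 * debt_component)


applicant = Applicant(credit_score=580, non_rental_debt=12_000, has_voucher=True)
print(round(screening_score(applicant), 1))  # low score despite voucher-backed rent
```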

Key changes under settlement: The five-year agreement implements significant modifications to SafeRent’s screening process nationwide. A sketch of how a screening pipeline might apply these terms follows the list below.

  • The company must stop displaying tenant screening scores for all housing voucher applicants
  • SafeRent cannot provide “accept” or “deny” recommendations for voucher holders
  • Landlords will need to evaluate voucher recipients based on their complete rental application rather than relying solely on an algorithmic score
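
Below is a minimal, hypothetical sketch of how a screening pipeline could apply these settlement terms: suppressing the score and the accept/deny recommendation for voucher applicants and deferring to a full-application review. It is not SafeRent’s code; the function names and the accept/deny threshold are assumptions made for illustration.

```python
# Hypothetical sketch of the settlement's terms as screening logic; SafeRent's
# internal systems are not public and this is not their code.

from dataclasses import dataclass
from typing import Optional


@dataclass
class ScreeningResult:
    score: Optional[int]           # None when no score may be shown
    recommendation: Optional[str]  # "accept"/"deny", or None
    note: str


def screen(applicant_uses_voucher: bool, raw_score: int) -> ScreeningResult:
    if applicant_uses_voucher:
        # Settlement terms: no score displayed and no accept/deny recommendation
        # for housing voucher applicants; the landlord reviews the full application.
        return ScreeningResult(
            score=None,
            recommendation=None,
            note="Review the complete rental application manually.",
        )
    recommendation = "accept" if raw_score >= 70 else "deny"  # illustrative threshold
    return ScreeningResult(score=raw_score, recommendation=recommendation, note="")


print(screen(applicant_uses_voucher=True, raw_score=55))
```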

Expert perspective: Industry experts have raised doubts about the validity of using credit-based scoring for rental decisions.

  • Shennan Kavanagh, director of the National Consumer Law Center, noted that credit-score-style evaluations have been validated only for predicting credit obligations, not rent payment
  • SafeRent maintains its scoring system complied with applicable laws but chose to settle to avoid expensive, time-consuming litigation

Broader industry implications: The SafeRent case represents part of a larger trend of legal challenges to AI-driven property management tools.

  • The Department of Justice recently sued RealPage over its algorithmic rent-pricing software
  • This settlement could set precedent for how AI tools are regulated in housing applications
  • The case highlights growing concerns about AI bias in high-stakes decision-making processes

Looking ahead: While this settlement addresses discrimination in tenant screening, the broader challenge of ensuring fairness in AI-powered housing tools remains unresolved. More legal and regulatory scrutiny is likely as the industry works to balance technological innovation with equitable access to housing.
