LIEB BLOG

Legal Analysts


Thursday, November 21, 2024

Discriminatory Tenant-Screening Tool Results in $2.275MM Payment

On November 20, 2024, the Honorable Angel Kelley of the United States District Court for the District of Massachusetts issued a Final Approval Order for a $2.275 million settlement involving SafeRent Solutions, LLC ("SafeRent"). The lawsuit alleged that SafeRent's tenant-screening algorithm, used to evaluate rental applicants, disproportionately disadvantaged housing voucher recipients, particularly Black and Hispanic applicants.


Under the settlement, SafeRent committed to:

  • Stop using scoring models for applicants with housing vouchers unless the models have been validated by a third party such as the National Fair Housing Alliance.
  • Educate landlords on the differences between its scoring models and the implications for housing voucher applicants.

In addition, SafeRent will pay $1.175 million into a settlement fund for affected applicants and $1.1 million in attorneys’ fees. Landlords using SafeRent’s screening products must also certify whether applicants are housing voucher recipients; if that certification isn’t provided, SafeRent’s tenant-screening scores will be excluded.


For those using tenant-screening services, this case highlights the risk of relying on AI-driven tools without thoroughly understanding or auditing their impact. Algorithms that inadvertently reinforce biases, whether based on income, race, or another protected characteristic, can create significant legal and financial liability under the Fair Housing Act and state and local anti-discrimination laws.


As this technology rapidly evolves, landlords and PropTech companies should commission regular audits from trusted third-party validators to guard against discrimination.


Landlords and PropTech companies should take this case as motivation to review their screening processes, including the questions below (a simple illustrative check follows the list):

  • Do your tools account for biases in their data or design?
  • Are they validated for compliance with anti-discrimination laws?
  • Are you confident they don’t inadvertently exclude protected groups?
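To make the audit question concrete, here is a minimal sketch of one common first-pass check: the four-fifths (80%) adverse impact ratio, applied to hypothetical screening outcomes. The column names, sample data, and 0.8 threshold are illustrative assumptions, not terms of the SafeRent settlement; the four-fifths rule originates in employment-selection guidance and serves only as a rough screen, not a legal safe harbor.

```python
# Minimal sketch of a disparate-impact ("four-fifths rule") screen for a
# tenant-screening tool's outcomes. Column names, data, and the 0.8
# threshold are illustrative assumptions, not part of the settlement.
import pandas as pd

def adverse_impact_ratios(df: pd.DataFrame,
                          group_col: str = "group",
                          approved_col: str = "approved") -> pd.Series:
    """Approval rate of each group divided by the highest group's rate.

    Ratios below roughly 0.8 are a common rough flag for disparate
    impact and warrant closer statistical and legal review.
    """
    rates = df.groupby(group_col)[approved_col].mean()
    return rates / rates.max()

if __name__ == "__main__":
    # Hypothetical screening outcomes for illustration only.
    data = pd.DataFrame({
        "group":    ["A", "A", "A", "B", "B", "B", "B", "C", "C"],
        "approved": [1,   1,   0,   1,   0,   0,   0,   1,   1],
    })
    ratios = adverse_impact_ratios(data)
    print(ratios)
    flagged = ratios[ratios < 0.8]
    if not flagged.empty:
        print("Potential disparate impact for:", list(flagged.index))
```

A ratio below roughly 0.8 for any group is a signal to dig deeper with proper statistical testing and legal counsel, not proof of discrimination on its own.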

As SafeRent’s case demonstrates, the stakes are high. It’s not just about avoiding lawsuits; it’s about ensuring equitable access to housing and fostering trust in the rental process. Invest in a third-party audit of the AI tools you use, update your policies, and ensure your practices align with Federal, State, and Local fair housing laws.