Habitational property owners using AI tenant screening (most do so through third-party software to screen applicants) may be opening themselves up to additional risk. AI tenant screening uses artificial intelligence (AI) and machine learning to evaluate potential tenants. The technology analyzes a range of data, including rent scores, rental history and social media profiles, to help landlords decide whom to rent to. The benefits to landlords include:

Reduce risk – AI can help landlords identify tenants who are the most likely to default on payment or damage property.

Save time – AI can automate the review of applications and background checks, saving landlords time and money.

However, AI tenant screening has also been criticized for potentially worsening housing discrimination. AI tools can be biased and draw unfair conclusions, depending on the quality of their data. Several lawsuits have been filed. In the suit outlined below, the prospective tenant was denied even though she had evidence of 16 years of timely rent payments. In addition, her government voucher for rent assistance was not considered.

Companies providing this service include those below, but there are many more. Checking out their websites will give you an idea of how these services are being used.

  • RealPage (which has been under siege with other class action lawsuits involving collusion to fix rents)
  • SafeRent (which is no longer screening low-income tenants)
  • Propertyware Tenant Screening
  • Turbo Tenant LLC
  • Beam AI

What is more troubling is that we are starting to see our clients investigated by the Department of Justice (DOJ) and/or served with a Civil Investigative Demand (CID) from the Consumer Financial Protection Bureau (CFPB), piggybacking on these civil lawsuits. Unfortunately, investigations and fines by the government are not covered by professional policies. There are standard policy exclusions for intentional misconduct, fraudulent acts or violations of law, which are often the basis for government investigations and fines. Attached is a brief explanation of Real Estate Professional coverage and examples of claims covered.

On occasion, depending on the nature of the investigation and the policy language, some aspects of a government investigation may be covered, usually legal defense, even if the final fine is not. Also, professional policies often have an enhancement endorsement that throws in small sub-limits for things typically not covered or not considered a claim. For instance, these endorsements often include $10K to $30K of subpoena expense coverage. Not a lot, but something for a situation typically not considered a claim. When a government investigation is thrust upon your client, don't give them the wrong impression that the Professional policy picks up coverage for it; most likely there will only be a small sub-limit for some expenses, due to the standard policy exclusions. Typical policy wording is below:
C. Claim means:

1. A written demand for monetary or non-monetary relief made against an Insured;

2. A civil proceeding brought against an Insured;

3. A request to toll or waive an applicable statute of limitations; alleging a Wrongful Act for which coverage is provided under this Policy.

A claim does not include: (i) a complaint to, or an inquiry, investigation or proceeding initiated by, a governmental, administrative or regulatory body; (ii) a criminal investigation or proceeding; (iii) a Disciplinary Proceeding; (iv) a Subpoena or Records Request; or (v) a Public Relations/Crisis Management Event.


Recently Settled Class Action Suit
https://www.insurancejournal.com/news/national/2024/11/25/###-###-####.htm
Mary Louis’ excitement to move into an apartment in Massachusetts in the spring of 2021 turned to dismay when Louis, a Black woman, received an email saying that a “third-party service” had denied her tenancy.

That third-party service included an algorithm designed to score rental applicants, which became the subject of a class action lawsuit, with Louis at the helm, alleging that the algorithm discriminated on the basis of race and income.

A federal judge approved a settlement in the lawsuit, one of the first of its kind, on Wednesday, with the company behind the algorithm agreeing to pay over $2.2 million and roll back certain parts of its screening products that the lawsuit alleged were discriminatory.

The settlement does not include any admissions of fault by the company SafeRent Solutions, which said in a statement that while it “continues to believe the SRS Scores comply with all applicable laws, litigation is time-consuming and expensive.”

While such lawsuits might be relatively new, the use of algorithms or artificial intelligence programs to screen or score Americans isn’t. For years, AI has been furtively helping make consequential decisions for U.S. residents.

When a person submits a job application, applies for a home loan or even seeks certain medical care, there’s a chance that an AI system or algorithm is scoring or assessing them like it did Louis. Those AI systems, however, are largely unregulated, even though some have been found to discriminate.

“Management companies and landlords need to know that they’re now on notice, that these systems that they are assuming are reliable and good are going to be challenged,” said Todd Kaplan, one of Louis’ attorneys.

The lawsuit alleged SafeRent’s algorithm didn’t take into account the benefits of housing vouchers, which they said was an important detail for a renter’s ability to pay the monthly bill, and it therefore discriminated against low-income applicants who qualified for the aid.

The suit also accused SafeRent’s algorithm of relying too much on credit information. They argued that it fails to give a full picture of an applicant’s ability to pay rent on time and unfairly dings applicants with housing vouchers who are Black and Hispanic partly because they have lower median credit scores, attributable to historical inequities.

Christine Webber, one of the plaintiff’s attorneys, said that just because an algorithm or AI is not programmed to discriminate, the data an algorithm uses or weights could have “the same effect as if you told it to discriminate intentionally.”

When Louis’ application was denied, she tried appealing the decision, sending two landlords’ references to show she’d paid rent early or on time for 16 years, even if she didn’t have a strong credit history.

Louis, who had a housing voucher, was scrambling, having already given notice to her previous landlord that she was moving out, and she was charged with taking care of her granddaughter.

The response from the management company, which used SafeRent’s screening service, read, “We do not accept appeals and cannot override the outcome of the Tenant Screening.”

Louis felt defeated; the algorithm didn’t know her, she said.

“Everything is based on numbers. You don’t get the individual empathy from them,” said Louis. “There is no beating the system. The system is always going to beat us.”

While state lawmakers have proposed aggressive regulations for these types of AI systems, the proposals have largely failed to get enough support. That means lawsuits like Louis’ are starting to lay the groundwork for AI accountability.

SafeRent’s defense attorneys argued in a motion to dismiss that the company shouldn’t be held liable for discrimination because SafeRent wasn’t making the final decision on whether to accept or deny a tenant. The service would screen applicants, score them and submit a report, but leave it to landlords or management companies to accept or deny a tenant.

Louis’ attorneys, along with the U.S. Department of Justice, which submitted a statement of interest in the case, argued that SafeRent’s algorithm could be held accountable because it still plays a role in access to housing. The judge denied SafeRent’s motion to dismiss on those counts.

The settlement stipulates that SafeRent can’t include its score feature on its tenant screening reports in certain cases, including if the applicant is using a housing voucher. It also requires that if SafeRent develops another screening score it plans to use, it must be validated by a third party that the plaintiffs agree to.

Louis’ son found an affordable apartment for her on Facebook Marketplace that she has since moved into, though it was $200 more expensive and in a less desirable area.

“I’m not optimistic that I’m going to catch a break, but I have to keep on keeping, that’s it,” said Louis. “I have too many people who rely on me.”

Associated Press 2024