AI in Housing May Worsen Discrimination Without Oversight, Experts Warn

Written By

Mathew Abraham

Mathew Abraham, editor of Century Homes America, brings his passion for architectural history to explore the stories behind America’s most iconic homes.


Artificial intelligence is playing a growing role in the tenant screening process across the U.S., promising faster and more detailed evaluations of prospective renters. However, as these tools gain popularity, concerns are mounting among experts and advocates that AI may reinforce discrimination, especially when transparency and human oversight are lacking. In an increasingly tight housing market, these screening systems could mean the difference between stable housing and repeated rejections.

Millions Remain Invisible to Traditional Credit Systems


According to Snappt, a PropTech company specializing in AI-based tenant screening, 28 million adults in the U.S. are “credit invisible,” and another 21 million are “unscorable.” Many of these individuals are financially responsible but use nontraditional financial tools like debit cards or peer-to-peer apps instead of credit cards. This is especially true for younger renters, including many from Gen Z, who are often missed by legacy credit-based evaluations.

Beyond the Credit Score: A Wider Financial Lens


Snappt’s platform aims to address this issue by evaluating more than just credit scores. It looks at a renter’s overall financial health, including rent-to-income ratios, cash flow, account stability, and expense habits. This allows leasing agents to assess applicants based on a fuller financial picture. For example, a gig worker with a steady income and no overdraft history might qualify where traditional systems would have failed them.
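To make the idea concrete, here is a minimal sketch of how metrics like rent-to-income ratio and cash flow could be combined into a summary for a leasing agent to review. This is not Snappt’s actual code; the field names, thresholds (such as the common 30 percent rent-to-income rule of thumb), and flags are assumptions for illustration only.

```python
from dataclasses import dataclass


@dataclass
class ApplicantFinancials:
    """Hypothetical snapshot of an applicant's verified finances."""
    monthly_income: float        # verified net income per month
    monthly_rent: float          # rent on the unit being applied for
    avg_monthly_expenses: float  # average recurring outflows
    overdrafts_last_12mo: int    # count of overdrafts in the past year


def screening_summary(app: ApplicantFinancials) -> dict:
    """Compute illustrative metrics for human review.

    The thresholds below are generic rules of thumb, not any vendor's
    actual screening criteria.
    """
    rent_to_income = app.monthly_rent / app.monthly_income
    monthly_cash_flow = app.monthly_income - app.avg_monthly_expenses
    flags = [
        msg
        for cond, msg in [
            (rent_to_income > 0.30, "rent exceeds 30% of income"),
            (monthly_cash_flow < app.monthly_rent, "thin cash-flow cushion"),
            (app.overdrafts_last_12mo > 2, "repeated overdrafts"),
        ]
        if cond
    ]
    return {
        "rent_to_income": round(rent_to_income, 2),
        "monthly_cash_flow": round(monthly_cash_flow, 2),
        "overdrafts_last_12mo": app.overdrafts_last_12mo,
        "flags": flags,
    }


# Example: a gig worker with steady income and no overdrafts
print(screening_summary(ApplicantFinancials(4200, 1300, 2600, 0)))
```

In a setup like this, the output is informational: the flags describe why an application might deserve a closer look rather than delivering an approve-or-deny verdict.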

Fraudulent Documents on the Rise


Fraudulent rental applications have become more common in recent years. A Snappt survey found that over 85 percent of property managers had encountered fake pay stubs or altered financial records. In response, Snappt partners with financial data services like Finicity and Argyle to directly verify income from banks, gig platforms, and payroll systems, with user consent. This verification process helps flag inconsistent data and reduces the risk of fraud.
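As a rough illustration of this kind of cross-check, a verification step might compare the income stated on an application against deposits reported directly by a bank or payroll connection. The function below is a sketch only; the 15 percent tolerance and the data shapes are assumptions, not the APIs or thresholds of Snappt, Finicity, or Argyle.

```python
def income_discrepancy_flag(stated_monthly_income: float,
                            verified_deposits: list[float],
                            tolerance: float = 0.15) -> bool:
    """Flag an application when stated income diverges from verified deposits.

    `verified_deposits` stands in for monthly income figures pulled, with the
    applicant's consent, from a bank or payroll data source. The 15% tolerance
    is an arbitrary illustration, not a real vendor threshold.
    """
    if not verified_deposits:
        return True  # nothing to verify against; route to manual review
    avg_verified = sum(verified_deposits) / len(verified_deposits)
    return abs(stated_monthly_income - avg_verified) > tolerance * avg_verified


# A pay stub claiming $6,000/month against roughly $3,500 in verified
# deposits would be flagged for human follow-up.
print(income_discrepancy_flag(6000, [3400, 3600, 3500]))  # True
```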

Balancing Data With Human Judgment


Snappt emphasizes that its technology does not make final decisions about applicants. Instead, it equips leasing teams with reliable data to make informed choices. The importance of human oversight is underscored by cases like that of one tenant who was initially flagged for low credit stemming from a divorce but, after further evaluation, went on to be a reliable resident for eight years.

Widespread Use of Algorithms With Limited Oversight


A recent survey by TechEquity Collaborative revealed that two-thirds of landlords in California use automated screening tools. Of these, about 20 percent pay for services that generate predictive risk scores, and 37 percent rely solely on the system’s recommendation without reviewing supporting data. Alarmingly, only 3 percent of renters surveyed knew which company had provided the screening report that may have led to their denial.

Federal Scrutiny and Legal Action


The growing role of AI in tenant screening has led to increased regulatory attention. In 2023, the U.S. Department of Housing and Urban Development (HUD) issued guidelines urging property owners to incorporate human review into AI-assisted decisions. That same year, the Consumer Financial Protection Bureau fined TransUnion $23 million for inaccurate data used in tenant screenings. Several lawsuits followed, including one involving CoreLogic and another against an AI chatbot that rejected a housing inquiry.

Regulatory Backlash Under New Administration


With the new administration in place, industry lobbying has intensified. The National Apartment Association and the National Multifamily Housing Council recently sent a letter to President Trump requesting the rollback of more than 30 housing regulations. One specific request was the withdrawal of HUD’s guidance on AI use in screening, on the grounds that such policies raise development costs and slow housing availability. Without someone to contextualize the data, however, seasonal income fluctuations or one-off expenses can wrongly flag an applicant as risky.
