Artificial Intelligence Is Making The Housing Crisis Worse - The Lever

[The rental application of a 75-year-old man named Chris Robinson was rejected by a screening program that evaluates “higher-risk renters” with so-called artificial intelligence. The rejection was based on a past conviction for littering, a crime committed by a 33-year-old halfway across the country. Despite correcting this error, Robinson] lost the apartment and his application fee nonetheless, according to a federal class-action lawsuit that moved toward settlement this month. The credit bureau TransUnion, one of the largest actors in the multi-billion-dollar tenant screening industry, agreed to pay $11.5 million to resolve claims that its programs violated fair credit reporting laws.

Landlords are increasingly turning to private equity-backed artificial intelligence (AI) screening programs to help them select tenants, and resulting cases like Robinson’s are just the tip of the iceberg. The prevalence of incorrect, outdated, or misleading information in such reports is increasing costs and barriers to housing, according to a recent report from federal consumer regulators.

Even when screening programs turn up real data, housing and privacy advocates warn that opaque algorithms are enshrining high-tech discrimination in an already unequal housing market — the latest example of how AI can end up amplifying existing biases.

Alongside the TransUnion lawsuit, at least four other tenant screening companies, many of which purport to predict “rental risk” through the use of AI, are currently facing more than 90 federal civil rights and consumer lawsuits, according to a Lever review of court records.

The Consumer Data Industry Association, a lobbying group for screening and credit reporting companies, has reported spending more than $400,000 so far this year lobbying in states considering legislation to increase transparency in the development and use of AI.

An estimated 2,000 third-party screening companies offer mega-landlords, who often lack staff on the ground, a faster alternative to traditional background checks.

The technology has also attracted the interest of private equity and venture capital, with billions of dollars pouring into companies with names like Turbo Tenant, RentSpree, and LandlordStation — the latter of which proclaims, “We work hard to make your life a little bit easier!”

The costs of screening reports vary, but they’re often paid for by tenants, and those who receive scores low enough for a “conditional acceptance” are often forced to pay higher deposits, according to reporting by ProPublica.

Most screening companies say their algorithms rely on the same types of records that many landlords would otherwise check themselves, including credit score reports and criminal histories. That longtime practice has already increased barriers to high-quality housing for many people of color. Available research has found that criminal records are generally not a good predictor of how someone will behave as a tenant, whereas housing instability is closely associated with recidivism.

Read the full story here

Organizations: TransUnion Consumer Data Industry Association Turbo Tenant RentSpree LandlordStation ProPublica Consumer Financial Protection Bureau (CFPB) National Multifamily Housing Council SafeRent Solutions National Housing Law Project 

People: Chris Robinson 

Tags: Housing Automation Discrimination Crime 

Type: Headlines