This article is part of the Under the Lens series
Top Takeaways
Tenant screening software has become an industry standard. It sifts through publicly available and proprietary data to gauge the suitability of potential renters. An applicant’s rent-paying history is often not part of a screening report.
Eviction-related records are a big problem. Data brokers often scrape eviction information from publicly available court websites, but eviction filings don’t always result in judgments against tenants. There are fights across the U.S. to pass laws that seal eviction records so that data brokers can’t sell them.
Some protections are in place, but tenants and landlords frequently aren’t aware of them. The Trump administration has shown signs that it won’t help enforce existing guidance and regulations.
Mary Louis had been renting a home in Malden, Massachusetts, for 17 years. She and her landlord had a good relationship, but she felt it was time for a change. After some scouting, she found a complex nearby with a vacant unit big enough for her and her adult son. It seemed perfect.
Louis had always worked, sometimes two jobs at once, and her longtime landlord even offered to write her a letter of reference. She thought she was a strong candidate for the apartment, even though her credit wasn’t very good.
She was upfront about it with the complex’s property manager. “I told the lady, ‘My credit is no good.’ She said, ‘Oh, that’s OK’ . . . she said my son’s good credit would balance out my bad credit.”
Louis, who now works as a receptionist in state government, waited for over two months for approval to move in. It seemed like every day she was asked for more documentation—her birth certificate, Social Security number, information about her son and granddaughter. “I was uploading, uploading, so many times.” The one thing they never asked for was information proving she could pay her rent.
Finally, she got an email from the rental management company: her credit score was too low to qualify for the apartment.
At the time, Louis felt betrayed. She’d already given notice at her current rental because the complex’s manager said her credit wouldn’t be an issue. It was only months later that a lawyer with Greater Boston Legal Services explained that this wasn’t a personal decision: the complex’s owners had used a tenant screening service offered by SafeRent Solutions to determine her trustworthiness as a renter. It was all done by algorithm, the lawyer clarified, using criteria that might not even be relevant to her status as a tenant. “If you’re low income and your credit is not good . . . even if I showed her that I lived in one place for 20 years, that’s not enough,” says Louis.
But she was lucky. She found another place to live soon after the complex rejected her application, and the legal services lawyer connected her to a law firm that was gathering tenants to sue SafeRent for violation of fair housing laws. The firm won the case, and Louis walked away with several thousand dollars—and the knowledge that she’d helped make things a little better for other renters.
But the tenant screening field is huge and marked by major problems that remain under-regulated. Around the U.S., thousands of companies offer tenant screening tools that promise to make life easier for landlords and property managers by vetting prospective tenants for them. The software sifts through publicly available data and proprietary data to assess the suitability of potential renters.
But the data these companies use is often riddled with errors, the algorithms that screening programs use to evaluate prospective tenants could have biases baked into them, and the results of a report may have no actual bearing on whether someone will be a good tenant. And unlike Louis, many tenants never find out why they were declined—and those who were rejected in error frequently find it difficult to correct their records.
Nonetheless, these tools have surged in popularity. While the Trump administration has shown signs that it won’t help enforce even existing guidance and regulations, housing advocates and state and local governments are working to limit the power of tenant screening companies.
Tenant Screening: A Housing Industry Standard with Little Oversight
Landlords have been checking potential tenants’ histories for decades. A majority of landlords use some form of screening technology to choose tenants, according to the Consumer Financial Protection Bureau (CFPB), which has received thousands of complaints about the practice and has reported deeply on it. That number could be as high as 90 percent. Usually the cost of a report is covered by apartment seekers through application fees.
Screening software examines details like an applicant’s employment and income, as well as their criminal record, rental history, eviction status, and credit background. Sometimes the resulting information is provided to a property manager or landlord, but many screening tools simply offer a score or a thumbs-up/thumbs-down recommendation, without disclosing the reason for their decision.
The field is large, opaque, and constantly in flux. “The companies are springing up rapidly, to the point that no one really knows how many there are,” says Hannah Holloway, vice president of housing programs at TechEquity, which has been scrutinizing the tech industry’s effects on housing and labor for years.
Housing advocates estimate the market is worth more than $1.3 billion, with roughly 2,000 companies offering screening software. Some are big firms like CoreLogic and the giant credit reporting agency TransUnion; others are smaller startups. “It’s a little bit of the Wild West,” says Holloway.
In part, that’s because the barrier to entry is low. So much personal data is now available online that a team with even basic coding skills can get into the business by gathering information and selecting for certain characteristics using proprietary algorithms. Some incorporate artificial intelligence, or AI, and that’s expected to grow.
Private-equity firms have taken an energetic and increasing interest; in 2021, NBC News reported that private-equity investments in the field grew from $1.7 billion in 2018 to $6.6 billion in 2020. That number is likely larger now.
After all, 45 million Americans rent their homes, close to an all-time high in the last half-century, and the very tight housing market means landlords can afford to be picky about who they let in. Increasingly, apartments and even single-family rentals are owned and managed by large corporations that want to standardize the entire process.
“If they can do it all through technology and automate it, that makes it a lot more profitable,” says Holloway.
In response, tenant screening firms have convinced the real estate industry that they can be relied on to select the very best tenants.
Eric Dunn isn’t buying it. Dunn is the director of litigation at the National Housing Law Project, a housing law and advocacy group. “None of the information that the companies provide to landlords is of meaningful value. No studies show it has any real benefit,” he says, echoing findings by the CFPB.
When they’re short on money, most people will pay their rent first and scrimp on other necessities, say advocates for low-income people. Therefore, the only indicator that really illustrates whether someone will pay their rent on time is their history of paying rent. But that’s something few screening tools include, say housing advocates.
Nonetheless, tenant screening software has become an industry standard. While landlords may find them useful, they can have a very real impact on people who need housing.
“The increasing utilization of algorithmic technologies in this sector is just automating discrimination.”
Jasmine Rangel, PolicyLink
“They’re rife with errors, these reports,” says Ariel Nelson, a senior attorney at the National Consumer Law Center who has been litigating and advocating on the issue.
It’s part of the industry’s business model. The CFPB found that most screening companies get their answers through automated searches of online material—either directly, or through data brokers that gather public information and resell it. The screening services rarely confirm that information manually, because it would be more expensive and time consuming. As a result, the information provided to landlords frequently contains inaccuracies. The findings might be for someone with a similar name or address as the applicant, or could include arrests that never led to convictions, or convictions that were irrelevant or occurred many years ago.
The biggest problem is with eviction records, which are almost always featured on screening reports. Data brokers often scrape eviction information from publicly available court websites. But the reality is that eviction filings don’t always result in judgments against tenants.
In Washington, D.C., in 2018, for example, only 5.5 percent of eviction filings led to a formal eviction. Other cases were dropped, or the two parties came to an agreement, or the tenant won, or the landlord was in the wrong—but the cursory information on the screening report doesn’t reflect that. Instead, a single eviction dispute can lead to a tenant being denied housing, not just once but for years to come.
Writ large, that kind of outcome makes the housing crisis worse, as tenants struggle to find shelter and pay ever-larger sums in application fees. Renters of color—who due to systemic bias and historical disparities have statistically worse credit histories, higher eviction numbers, and more interactions with the justice system—may be particularly affected by negative screening reports. So are households with children and older adults.
A TechEquity survey of screening outcomes in California found that Black and Latino applicants were around half as likely to have their rental applications accepted as whites—suggesting, says the report, “that automated tenant screening has the potential to deepen racial bias in housing.”
Some protections are in place. Like conventional credit bureaus, tenant screening companies are covered under the 1970 Fair Credit Reporting Act, which limits the amount of time that evictions can be reported to seven years. It also requires landlords to provide denied rental applicants with contact information for the screening company that was used and to tell them that they’re entitled to get a copy of the screening report and to dispute the errors.
In practice, however, tenants and landlords frequently aren’t aware of these mandates. And many screening companies are unresponsive when tenants try to correct their records, according to advocates who work with low-income tenants. Even if the companies do respond, weeks or even months may have passed by the time the process is done, and the apartment most likely has been taken. And the same error could pop up at the next apartment building, which may be using different screening software. Meanwhile, the companies are rarely held accountable for using false information.
“The increasing utilization of algorithmic technologies in this sector is just automating discrimination,” says Jasmine Rangel, a senior associate at PolicyLink, a nonprofit research and advocacy group that is examining tenant screening programs.
Last year, under the Biden administration, the U.S. Department of Housing and Urban Development posted guidance on its website that outlined fair housing practices in the tenant screening process. The guidance encouraged landlords to independently verify whether a rental applicant is a strong candidate, rather than relying solely on automated results. It also made plain that landlords themselves, even if they use screening technology, are responsible for ensuring that their choices are compatible with fair housing laws. In particular, certain practices—like turning away anyone with a low credit score—may violate the disparate impact principle that’s part of the Fair Housing Act. That is, they could be discriminatory, even if a policy seems neutral on its face.
Advocates hailed the guidelines, saying they created some guardrails in a field that sorely needs them. “It was really great guidance, really incredible,” says Rangel.
But in early 2025, the Trump administration removed the guidance from HUD’s website. In addition, it’s been firing most of the CFPB’s staff and weakening enforcement of the Fair Housing Act. And in late April, President Trump ordered federal agencies to “deprioritize enforcement” of statutes and regulations covering the disparate impact principle.
“We’re all really nervous,” says Rangel.
Protecting Renters
The good news is that, for now, state and local governments remain free to pass regulations that rein in screening companies, and an increasing number are doing so. In some cases, federal guidelines have provided a road map for state policymakers, even when the guidelines themselves were later revoked.
For example, in 2016 under the Obama administration, HUD issued guidelines discouraging landlords from turning people away from housing because of past criminal records, arguing that it’s discriminatory under the disparate impact standard. That guidance was removed by the first Trump administration. Nonetheless, in the years after it was issued, jurisdictions like Seattle, Oakland, Portland, Detroit, Minneapolis, and Washington, D.C., passed Fair Chance Housing laws that ban or discourage landlords from considering applicants’ criminal histories. The laws have been generally successful, improving formerly incarcerated renters’ ability to secure a home and reducing recidivism.
More recently, state and local governments have passed legislation that seals eviction records so that data brokers can’t sell them to tenant screening companies. The ideal law, advocates say, should render the information unavailable from the moment an eviction notice is filed. Around 17 states and six jurisdictions have policies that seal eviction records, according to PolicyLink, and other states are currently debating them.
California has had a comprehensive eviction sealing law since 2016 that only makes eviction information publicly available if the landlord prevails within 60 days of filing; otherwise, screening companies can’t view it. The law has been found to promote renter stability in the state.
Colorado passed a robust eviction-sealing law in 2020, and the state has also limited the screening companies’ ability to consider prospective tenants’ credit history, income, and criminal background.
And Washington, D.C., passed an eviction sealing law in 2022 that included protections for rental applicants using housing vouchers.
Philadelphia has taken a different tack to address screening problems. Its 2021 Renters’ Access Act outlines some mandates for landlords: they must conduct an individualized assessment of tenant applications, provide rejected applicants with a copy of their screening report, and allow them to correct errors within seven days. If the tenant can demonstrate that the information is wrong, the landlord—if they own more than five rental units in the city—must offer them the next available apartment.
“The relief included SafeRent changing its practices broadly—so now all of the landlords using that screening company will be in better compliance with the law.”
Christine Webber, Cohen Milstein
It’s a good law, advocates say, but enforcement has been difficult. In that sense, it’s not unique. “All tenant screening legislation is challenging to enforce,” says Natasha Duarte, a senior project director with Upturn, which helped pass D.C.’s legislation. The nonprofit research and advocacy group examines how technology affects an individual’s opportunities.
Litigation is one way to enforce the legislation. In the case of Mary Louis, the Massachusetts tenant whose rental application was turned down based on an algorithm, lawyers from the firm Cohen Milstein gathered a class of Black and Latino low-income tenants to sue SafeRent. In the end, the judge ruled that under the disparate impact principle, the company’s denial was discriminatory against Black and Hispanic renters, who are more likely to have subprime credit scores due to lending discrimination, lower household wealth, and thin credit histories. In addition to making a $2.3 million payment to the 400-plus plaintiffs, the company agreed to stop scoring applicants who used housing vouchers, as Louis does.
“The relief included SafeRent changing its practices broadly—so now all of the landlords using that screening company will be in better compliance with the law,” says Christine Webber, a Cohen Milstein lawyer who worked on Louis’s case.
In an emailed statement, SafeRent told Shelterforce that while it “continues to believe that the SRS score complies with all applicable laws, litigation can be time consuming and costly. It became increasingly clear that continuing to defend this case would divert time and resources from our core mission of giving housing providers the tools they need to screen applicants.”
It may seem like tenant screening problems are being addressed by a patchwork of laws and legal actions. And that’s accurate. But credit scores, too, went through a series of ultimately successful public policy and legal battles in the 1970s, as leaders and advocates fought to enforce consumer protections. Tenant screening may be following a similar path.
The companies are paying attention. “We’re seeing legislation being proposed, in places that are normally not as tenant-friendly”—such as Kansas, Oklahoma, and South Carolina—”to completely restrict eviction and criminal records,” says Alexandra Alvarado, director of marketing and education at the American Apartment Owners Association, which represents more than 100,000 landlords and property managers. The organization has offered its own screening program, TenantAlert, for decades, and Alvarado says it takes accuracy very seriously. But as a result of some less-ethical companies, she observes, things are changing.
“I see the writing on the wall,” she says. “I see these laws just spreading.”