Washington – RealEstateRama – Congresswoman Maxine Waters (D-CA), the top Democrat on the House Financial Services Committee, sent a letter to the Government Accountability Office (GAO) expressing concerns about instances in which the use of Artificial Intelligence (AI) and other housing and property technology (PropTech), such as automated valuation models (AVMs), online housing platforms, tenant screening companies, and rent-setting companies, has been shown to lead to increased housing costs, discrimination, and other barriers to fair and affordable housing. In the letter, Ranking Member Waters asks the GAO to study and assess the effects that AI and PropTech may have on consumers and the U.S. housing market and report to Congress with its findings as well as any policy recommendations, as appropriate.
“Many point to technological developments as a source of innovation and efficiency in the housing market. However, there is nothing innovative about technologies that generate corporate profits while neither improving consumer affordability nor increasing access to housing for all,” said Ranking Member Waters. “In light of these mounting concerns, and the lack of transparency among many of these companies, I ask that GAO analyze the effect these entities may have on consumers and the U.S. housing market and report to Congress. I also request that in its reporting, GAO provide policy recommendations, as appropriate, for relevant Federal agencies and Congress to consider in addressing these effects, as well as any gaps in federal data, oversight, and regulation.”
See the full text of the letter below.
The Honorable Eugene Dodaro
Comptroller General of the United States
Government Accountability Office
441 G St., NW
Washington, DC 20548
Dear Comptroller General Dodaro:
I write to express my concerns regarding the effects that online platforms, tenant screening companies, rent-setting companies, and other housing or property technology (PropTech) companies are having on consumers’ ability to access fair and affordable housing. To better understand this issue, I am requesting that the Government Accountability Office (GAO) carry out a body of work to study and assess:
- The role of online platforms and tenant screening companies in the housing market, including their purchase of homes, mortgage lending activities, and provision of housing-related services to landlords and consumers, including tenants, homeowners, and prospective homebuyers;
- Relevant federal anti-trust, fair housing, fair lending, and consumer financial protection laws and regulations, including the Fair Housing Act, the Equal Credit Opportunity Act, the Fair Credit Reporting Act, and other relevant statutes and regulations, and in consideration of such laws and regulations, steps that PropTech companies take in developing and administering their digital advertising, digital listing, rent setting, and tenant screening products, policies, and practices;
- How such entities are supervised by Federal agencies for compliance with the aforementioned laws, including any differences that are based on the type, size, and complexity of the entity;
- How such entities are currently using artificial intelligence (AI), including machine learning (ML) and generative AI, in their services, and how these technologies are being assessed for compliance with appropriate anti-trust, fair housing, and fair lending laws;
- The extent to which federal regulators, such as the Federal Housing Finance Agency (FHFA) and the Department of Housing and Urban Development (HUD), currently allow for the use of AI, including ML and generative AI, in their policies and practices, and how these technologies are being assessed for compliance with appropriate anti-trust, fair housing, and fair lending laws; and
- The potential benefits and implications these entities and technologies have on the availability and affordability of housing (both rental and purchase) and mortgage lending on fair and equitable terms, especially for low- and moderate-income communities and communities of color.
For years, major companies like Zillow and Redfin have grown their role in the real estate market. Today, they offer housing search, mortgage lending, marketing, and even contract agreement services that are used by homebuyers, renters, and landlords. For example, we have heard of landlords who are increasingly relying on tenant screening services and leasing contracts generated by Zillow, which may include imbalanced terms and agreements that benefit landlords, such as additional rent clauses, while leaving renters with limited protections.
Many online platforms also rely on algorithms, as well as machine learning, to scale up their business services that help advertise housing, determine home values through Automated Valuation Models (AVMs), and make loan pricing decisions. However, it is not always clear how the algorithms reach the decisions they make or whether the algorithms are furthering bias. In some cases, the use of such algorithms and other company policies and practices has been demonstrated to have discriminatory outcomes. A 2022 redlining and housing discrimination lawsuit against Redfin demonstrated how an online real estate services company offered disproportionately fewer or no services in zip codes with predominantly non-White residents compared to zip codes with predominantly White residents due to company policies, such as offering services based on a minimum housing price. This shows why it is essential that when entities rely on artificial intelligence, the development and implementation of these technologies are evaluated for transparency, explainability, privacy, and fairness.
Despite a lack of robust quality control standards over AI technologies, federal housing regulators also allow for the use of PropTech. For example, for more than 20 years, FHFA has allowed Fannie Mae and Freddie Mac (the Enterprises) to maintain proprietary AVMs. Lenders are also permitted to use AVMs to determine home values in certain circumstances, such as when appraisal waivers are granted. While AVMs can help reduce costs and expedite the origination process, they have also been found to produce inaccurate valuations in rural areas, as well as “racially disparate outcomes—namely, higher error as a percentage of value in majority-Black neighborhoods.” Yet federal regulators only recently proposed quality control standards that include a nondiscrimination factor. Similarly, despite HUD’s recognition that the use of facial recognition technology in the name of security in public housing is not an acceptable use of government funds, it has been reported that public housing agencies and other owners of HUD-assisted housing continue their use of biometric technologies that may have adverse outcomes on residents, especially residents of color.
Many in the real estate market are also turning to third-party tenant screening companies to make decisions about who has access to housing. Evidence shows that the data that fuels these screening technologies can disproportionately affect the ability of people of color to access housing equitably. Numerous lawsuits also highlight the ways in which tenant screening technologies are prone to pulling inaccurate data that may not even be connected to the applicant in question. Furthermore, according to the Consumer Financial Protection Bureau (CFPB), 68% of renters pay application fees when seeking housing, which are often used to cover the cost of a tenant background check report. However, renters have hardly any visibility into that information before a rental decision is made, and they have few options to correct their report if it contains wrong, misleading, or outdated information. Tenant screening reports often pull eviction filings even when a tenant was not ultimately evicted from their home. Given that Black women are more likely to have an eviction filed against them that ended up being dismissed, tenant screening errors may have a disparate impact on the ability of Black women and other women of color to access housing on fair and unbiased terms.
I am also concerned about skyrocketing housing costs that are often a cause of evictions and homelessness. Today, median rents are topping $2,000 per month and the average U.S. renter is paying more than 30% of their income on rent, the highest levels recorded. Meanwhile, recent reporting has highlighted the concerning role that rent-setting algorithms, like those offered through RealPage’s YieldStar software, may play in using private data to artificially inflate housing costs in ways that may also pose anti-trust concerns.
I am pleased that the CFPB and Federal Trade Commission (FTC) are investigating these issues and have sought public feedback, but there is more that can be done. Many point to technological developments as a source of innovation and efficiency in the housing market. However, there is nothing innovative about technologies that generate corporate profits while neither improving consumer affordability nor increasing access to housing for all. In light of these mounting concerns, and the lack of transparency among many of these companies, I ask that GAO analyze the effect these entities may have on consumers and the U.S. housing market and report to Congress. I also request that in its reporting, GAO provide policy recommendations, as appropriate, for relevant Federal agencies and Congress to consider in addressing these effects, as well as any gaps in federal data, oversight, and regulation.
I look forward to your analysis and recommendations and thank you for your attention to this critical issue. Please contact Alia Fierro (), Director of Housing and Insurance Policy, with any questions.
U.S. House of Representatives
House Committee on Financial Services
CC: The Honorable Patrick McHenry, Chair, House Committee on Financial Services
Curbed, Housing discrimination goes high tech (Dec. 17, 2019); Robert Bartlett et al., Consumer-Lending Discrimination in the FinTech Era (Mar. 19, 2018); National Fair Housing Alliance, National Fair Housing Alliance and Redfin Agree to Settlement Which Greatly Expands Access to Real Estate Services in Communities of Color (Apr. 29, 2022).
Freddie Mac, Home Value Explorer® (accessed Aug. 11, 2023); see also Fannie Mae, Appraising the Appraisal (Feb. 2022).
FHFA Office of Inspector General, An Overview of Enterprise Appraisal Waivers (Sep. 14, 2018); see also FHFA Office of Inspector General, FHFA Ensured that Fannie Mae Submitted Required Property Valuation Data to the Agency’s Mortgage Loan Integrated System (May 31, 2022).
FHFA, Appraisal Accuracy and Automated Valuation Models in Rural Areas (Mar. 2019); see also Urban Institute, Revisiting Automated Valuation Model Disparities in Majority-Black Neighborhoods (May 18, 2022).
House Financial Services Committee, Ranking Member Waters, Congresswoman Pressley Urge HUD to Prohibit Use of Racially Biased Surveillance Technology in Federally Assisted Housing (May 26, 2023).
Housing Matters, an Urban Institute Initiative, How Tenant Screening Services Disproportionately Exclude Renters of Color from Housing (Dec. 21, 2022).
New York Times, How Automated Background Checks Freeze Out Renters (May 28, 2020).
CFPB, CFPB Reports Highlight Problems with Tenant Background Checks (Nov. 15, 2022).
ACLU, Clearing the Record: How Eviction Sealing Laws Can Advance Housing Access for Women of Color (Jan. 10, 2020).
AP News, Eviction filings are 50% higher than they were pre-pandemic in some cities as rents rise (Jun. 17, 2023).
NPR, Rents across the U.S. rise above $2,000 a month for the first time ever (Jun. 9, 2022); see also Moody’s Analytics, Moody’s Analytics: Spending 30% of Income on Rent is the New Normal (May 16, 2023).
ProPublica, Rent Going Up? One Company’s Algorithm Could Be Why. (Oct. 15, 2022).
CFPB, Tell us about your experiences with rental background checks and fees (Feb. 28, 2023).