AI, Proptech, and Fair Lending: GAO’s Warning Shot for the Digital Homebuying Era
Property technology—“proptech”—has moved from niche innovation to core infrastructure in U.S. homebuying. A recent report by the Government Accountability Office (GAO) analyzes how online real estate platforms, automated valuation models, automated underwriting systems, and electronic closing tools are reshaping the mortgage market, and what that means for fair lending and consumer protection.
The report describes a homebuying ecosystem in which nearly every stage of the transaction is now mediated by software. Online platforms aggregate listings, neighborhood data, and financing options into a one-stop interface. Automated valuation models estimate property values in seconds using large datasets. Lenders rely on the proprietary underwriting systems of Fannie Mae and Freddie Mac to evaluate loan eligibility. Electronic closing tools, meanwhile, digitize documents and signatures, promising faster and more accurate closings. Collectively, these tools can lower transaction costs, reduce processing times, and widen informational access for prospective borrowers.
Yet GAO’s central argument is that efficiency gains are tightly intertwined with new forms of risk. Online platforms can collect highly sensitive consumer data and deploy AI-driven advertising or search tools that may “steer” users in ways that conflict with the Fair Housing Act. Algorithms that filter listings or target ads can invisibly exclude certain protected groups from seeing specific homes or neighborhoods, making discrimination harder to detect and contest.
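To make the steering concern more concrete, here is a deliberately simplified sketch, using entirely hypothetical ZIP codes, demographic shares, and engagement scores, of how a facially neutral ad-targeting rule can expose different groups to a listing at very different rates. It is an illustration of the mechanism GAO describes, not a depiction of any actual platform's system.

```python
# Illustrative sketch only: hypothetical data showing how a facially neutral
# targeting rule (serving ads only in high "engagement score" ZIPs) can expose
# two groups to a listing at very different rates. All values are invented.

# Each record: (zip_code, share_of_population_in_group_A, engagement_score)
ZIP_PROFILES = [
    ("00001", 0.85, 0.9),   # hypothetical ZIP where group A predominates
    ("00002", 0.80, 0.8),
    ("00003", 0.20, 0.4),   # hypothetical ZIP where group B predominates
    ("00004", 0.15, 0.3),
]

SCORE_CUTOFF = 0.5  # the "neutral" rule: only advertise in high-engagement ZIPs

def exposure_by_group(profiles, cutoff):
    """Return the share of each group's population that is shown the ad."""
    shown_a = shown_b = total_a = total_b = 0.0
    for _zip_code, share_a, score in profiles:
        total_a += share_a
        total_b += 1 - share_a
        if score >= cutoff:          # the ad is served only in these ZIPs
            shown_a += share_a
            shown_b += 1 - share_a
    return shown_a / total_a, shown_b / total_b

if __name__ == "__main__":
    exp_a, exp_b = exposure_by_group(ZIP_PROFILES, SCORE_CUTOFF)
    # In this toy data, group A sees the ad at several times the rate of group B,
    # even though no protected characteristic appears anywhere in the rule.
    print(f"Group A exposure: {exp_a:.1%}")
    print(f"Group B exposure: {exp_b:.1%}")
```

Because the rule never references a protected characteristic, the resulting disparity is invisible to the people affected, which is precisely why GAO flags algorithmic targeting as hard to detect and contest.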
Automated valuation models illustrate a further tension between innovation and equity. On one hand, they can reduce some human biases observed in traditional appraisals—for example, the tendency of appraisers to anchor values to contract prices. On the other hand, models trained on historical sales data risk reproducing and entrenching long-standing patterns of undervaluation in communities that have already suffered from discrimination and disinvestment. The opacity of many commercial models, combined with limited public disclosure of methodology and inputs, complicates efforts by regulators and consumers to evaluate fairness or accuracy.
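As a deliberately simplified illustration rather than a description of any actual AVM, the sketch below uses hypothetical neighborhoods and sale prices to show why a comparables-based estimate carries historical price patterns forward.

```python
# Illustrative sketch only: a toy "comparable sales" valuation model showing how
# estimates anchored to historical transactions carry past pricing patterns
# forward. Neighborhood names and prices are hypothetical.

from statistics import mean

# Hypothetical recent sales (neighborhood -> sale prices). Assume the two homes
# being valued are physically similar; only the neighborhoods' price histories differ.
HISTORICAL_SALES = {
    "neighborhood_x": [410_000, 425_000, 398_000],   # history of full-market pricing
    "neighborhood_y": [255_000, 262_000, 249_000],   # history shaped by past disinvestment
}

def comps_based_estimate(neighborhood: str) -> float:
    """Estimate value as the mean of recent sales in the same neighborhood."""
    return mean(HISTORICAL_SALES[neighborhood])

if __name__ == "__main__":
    for hood in HISTORICAL_SALES:
        print(f"{hood}: estimated value ${comps_based_estimate(hood):,.0f}")
    # The model is "accurate" relative to its own training data, yet it reproduces
    # the historical gap between the neighborhoods rather than correcting for it.
```

Real AVMs use far richer data and modeling techniques, but the structural point is the same: a model evaluated only against past sales can score well while still embedding the undervaluation that GAO describes.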
Automated underwriting systems offer similar trade-offs. They can enhance data quality, standardize risk assessment, and, with responsible design, potentially expand access to credit through the use of alternative data. But where AI-based models are used, the “black box” problem looms large. Lenders remain legally obligated to provide specific reasons for adverse actions, yet complex machine-learning models may not yield explanations that are intelligible to borrowers—or even to institutions themselves. This threatens both effective enforcement of fair lending laws and borrowers’ ability to understand and challenge denials.
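The explainability gap is easier to see against a transparent scorecard. The sketch below uses entirely hypothetical features, weights, and cutoffs, not any real underwriting system, to show why simple models map naturally to adverse-action reasons in a way that complex machine-learning models do not.

```python
# Illustrative sketch only: a transparent, points-based scorecard for which the
# "specific reasons" behind a denial can be read directly off the model.
# Features, weights, and the cutoff are hypothetical.

SCORECARD = {
    # feature: (weight, description usable as an adverse-action reason)
    "debt_to_income": (-2.0, "Debt-to-income ratio too high"),
    "credit_history": ( 1.5, "Limited or weak credit history"),
    "loan_to_value":  (-1.0, "Loan-to-value ratio too high"),
}
APPROVAL_CUTOFF = 0.0

def score_and_reasons(applicant: dict[str, float]):
    """Score an applicant and rank the features that pulled the score down."""
    contributions = {
        name: weight * applicant[name]
        for name, (weight, _desc) in SCORECARD.items()
    }
    total = sum(contributions.values())
    approved = total >= APPROVAL_CUTOFF
    # For a denial, the most negative contributions map directly to reason codes.
    reasons = [
        SCORECARD[name][1]
        for name, contrib in sorted(contributions.items(), key=lambda kv: kv[1])
        if contrib < 0
    ]
    return approved, total, reasons

if __name__ == "__main__":
    applicant = {"debt_to_income": 0.48, "credit_history": 0.3, "loan_to_value": 0.97}
    approved, _total, reasons = score_and_reasons(applicant)
    print("approved" if approved else f"denied; top reasons: {reasons[:2]}")
    # A deep or ensemble model offers no comparably direct mapping from inputs to
    # outcome, which is what makes adverse-action explanations hard to produce.
```

The point is not that complex models cannot be explained at all, but that the explanation has to be reconstructed after the fact, which is exactly the gap GAO identifies between legal obligations and technical practice.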
GAO also highlights an evolving and uneven oversight landscape. Responsibility for fair lending and consumer protection is shared among the Consumer Financial Protection Bureau, the Department of Housing and Urban Development, the Federal Trade Commission, the Department of Veterans Affairs, the Federal Housing Finance Agency (FHFA), and state regulators. However, with the notable exception of FHFA's examinations of Fannie Mae and Freddie Mac, oversight has rarely focused directly on specific proptech products. While a 2024 interagency rule on quality control standards for automated valuation models provides an emerging regulatory framework, broader supervisory practice is still catching up to the technological reality.
The report’s most pointed criticism is directed at FHFA. In 2025, FHFA revised key aspects of its fair lending oversight—changing its examination approach, waiving components of its fair lending rule, and rescinding related guidance—without clearly communicating new compliance expectations to the government-sponsored enterprises. GAO concludes that written direction is needed so that Fannie Mae and Freddie Mac understand how they are now expected to comply with fair lending requirements and how FHFA will supervise that compliance. Without such clarity, the risk is that proptech will continue to evolve faster than the regulatory architecture designed to ensure that innovation supports, rather than undermines, equitable and sustainable access to housing.
Disclaimer: This blog post is for informational and educational purposes only and does not constitute legal, financial, or other professional advice. Readers should consult qualified counsel or advisors before taking any action based on the issues discussed here.