Trump Administration Launches Long-Promised Challenge to Fair Housing Law
The Trump administration will introduce a new rule on Monday that may reshape the way the government enforces fair housing law, making it harder for people to bring forward discrimination complaints under the Fair Housing Act.
The proposed regulation from the U.S. Department of Housing and Urban Development would replace an Obama-era rule on disparate impact, a legal theory that has guided fair housing law for more than 50 years. Disparate impact refers to practices or policies that have an adverse impact on minorities without discriminating against them in explicit terms. The Supreme Court has recognized this form of bias as prohibited under the Fair Housing Act. But the new rule from HUD would substantially raise the burden of proof for parties claiming discrimination.
“This is a proposal to very dramatically revise and effectively destroy an existing 2013 civil rights regulation,” says Megan Haberle, deputy director for the Poverty & Race Research Action Council. “This is a core part of the Fair Housing Act, and very early fair housing cases across the country have recognized the discriminatory effects standard.”
Housing Secretary Ben Carson signaled that the department was rethinking the disparate impact doctrine last June. The new rule, a version of which was leaked to Politico, will be published in the Federal Register on Monday, triggering a 60-day comment period before it can be officially implemented.
Under the current rule, disparate impact cases proceed under a three-part burden-shifting framework: the plaintiff makes an allegation, the defendant offers a rebuttal, and the plaintiff responds. The new rule would set a five-point prima facie evidentiary test on the plaintiff side alone. This means that a party looking to bring a discrimination case under the Fair Housing Act would need to establish some level of evidence at the pleading stage. To bring forward an accusation of implicit discrimination, plaintiffs would need to demonstrate—before any discovery process—that the policy itself is flawed.
In addition, the new HUD rule would establish three new defenses for landlords, lenders, and others accused of discrimination based on models and algorithms. The first defense would enable defendants to indicate that a model isn’t the cause of the harm. The second would allow the defendant to show that a model or algorithm is being used as intended, and is the responsibility of a third party. Finally, the new rule would allow the defendant to call on a qualified expert to show that the alleged harm isn’t a model’s fault.
Critics say that this new development gives lenders and landlords a big loophole. Many if not most financial institutions are not capable of developing their own in-house credit-risk algorithms; instead, they turn to third-party vendors. By putting the onus of fairness on these vendors, HUD is establishing a perverse incentive for banks and vendors alike to decline to study the outcomes of automated decision-making systems, according to Jacob Metcalf, a researcher for the nonprofit research institute Data & Society and founder of Ethical Resolve, a data-ethics consultancy.
“As long as the bank or lender is buying this tool from a third party that claims it has been adequately tested for algorithmic fairness, then the bank or lender is shielded from liability,” Metcalf says. “That’s a problem because there are no established standards—and the HUD rule doesn’t set out to establish any standards—about disparate impact.”
Meanwhile, a plaintiff has no way of knowing what data a vendor uses to model credit risk. A plaintiff might not be able to determine which vendors are responsible for what algorithmic effects. Third parties would be able to shield their practices behind trade secrets; any plaintiff looking to suss out whether an algorithm has a discriminatory impact might wind up “lost in a web of vendor relationships,” Metcalf says—with little recourse, especially prior to the discovery stage.
This is the first federal regulation to directly address algorithms and disparate impact. Attorneys couldn’t point to any caselaw that addresses algorithmic models and disparate impact, either. It’s not a wholly unreasonable idea for a regulation, Metcalf says: Many banks don’t have the resources to gauge the liability of an algorithm, after all. But without sufficient due-diligence standards, vendors will have every incentive to drag their feet. And as long as their models aren’t blatantly discriminatory, then the vendors likely wouldn’t be held responsible for disparate impacts, either.
But under the proposed rule, it falls on the plaintiff to determine, case by case, how an algorithm affects them by suing the company or companies responsible for making the algorithm—without any standard in place for what algorithmic fairness means.
“How do you build a model to avoid disparate impact?” Metcalf says. “How often should it be tested? When does it need to be retested? How do you know if it’s appropriate from one population to another? Maybe it’s fair for the population of Ann Arbor. Maybe it’s unfair for the population of Detroit. How do you know which population it was trained on?”
He adds, “If HUD isn’t going to answer those questions, it’s a get-out-of-jail-free card. They’re creating the liability loopholes that all of the potential plaintiffs will fall through by default.”
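One rough answer to Metcalf's first question exists outside housing law: the "four-fifths" adverse-impact ratio from federal employment guidelines, which compares approval rates across groups and flags a practice when the ratio drops below 0.8. The proposed HUD rule adopts no such threshold; the sketch below, in Python with hypothetical lending data, is purely illustrative of what even a minimal disparate-impact screen looks like.

```python
# Illustrative only: a "four-fifths" adverse-impact check, borrowed from
# EEOC employment guidelines. The proposed HUD rule sets no comparable
# standard; the data below is hypothetical.

def approval_rate(decisions):
    """Fraction of applicants approved (True = approved)."""
    return sum(decisions) / len(decisions)

def adverse_impact_ratio(protected, reference):
    """Ratio of the protected group's approval rate to the reference
    group's. Values below 0.8 are conventionally treated as a red flag."""
    return approval_rate(protected) / approval_rate(reference)

# Hypothetical outcomes from a credit-risk model for two groups:
reference_group = [True] * 80 + [False] * 20   # 80% approved
protected_group = [True] * 55 + [False] * 45   # 55% approved

ratio = adverse_impact_ratio(protected_group, reference_group)
print(f"adverse impact ratio: {ratio:.2f}")    # 0.55 / 0.80 = 0.69
if ratio < 0.8:
    print("below the four-fifths threshold: possible disparate impact")
```

The sketch also makes Metcalf's deeper point concrete: the same model could pass this check on one population (Ann Arbor) and fail it on another (Detroit), and the proposed rule specifies no threshold, no retesting cadence, and no population standard at all.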
Civil rights organizations are already gearing up for a fight over the rule. The National Fair Housing Alliance, Leadership Conference on Civil and Human Rights, NAACP Legal Defense Fund, and others are joining forces under the banner of Defend Civil Rights. This new coalition aims to oppose efforts by the Trump administration to dial back regulations that safeguard minorities from discrimination, according to a civil rights attorney familiar with the project who couldn’t speak on the record before the group’s launch on Monday.
“When it comes to policymaking, most institutions, whether they’re lending institutions or landlords, have long since abandoned explicit racial [or other] discrimination. Disparate impact is really the best tool we have to level the playing field,” Greene says.
Defenders of the administration’s efforts say that it’s necessary to bring the department’s regulations in line with the Supreme Court’s 2015 decision in Texas Department of Housing and Community Affairs v. The Inclusive Communities Project. Francis Riley, a partner at Saul Ewing Arnstein & Lehr who represents defendants in the civil rights arena, says that the new rule will constrain claims from plaintiffs.
“It puts the courts front-and-center to control claims that move on to discovery,” Riley says. “If [plaintiffs] are using a defendant’s [Home Mortgage Disclosure Act] data, or regional HMDA data, that shows that a particular area is not being served by the defendant, that is not enough. They have to actually assert, and in more than just a perfunctory way, that the lender has a policy or practice that they are effectively enforcing which has the goal of discriminating against those individuals.”
“We know what HUD’s doing,” Riley says. “What’s the [Consumer Financial Protection Bureau] going to do? What’s the [Federal Housing Administration] going to do? All of these departments have fair housing divisions.”
The language of the new regulation relies heavily on the text of former Justice Anthony Kennedy’s 5–4 decision for the majority in Inclusive Communities. The court ruled that disparate impact is “cognizable under the Fair Housing Act,” affirming prior decisions by eleven federal appellate courts that relied on this doctrine. Kennedy’s decision did not rely on the Obama-era HUD rule on disparate impact, which codified practices across the department. But the Trump administration saw the decision as a reason to revise the housing department’s rule.
In the Inclusive Communities decision, the court considered an ongoing challenge from the Dallas area. There, housing authorities had long been distributing housing tax credits, which are used to build low-income housing, in a way that consolidated construction in mostly black areas. This is a straightforward example of disparate impact, with none of the complications of machine learning or artificial intelligence that the new HUD rule anticipates.
On Friday, advocacy groups such as the National Low Income Housing Coalition and the National Community Reinvestment Coalition condemned the rule in strong terms. Civil rights attorneys worry that the new standard will unwind the protections afforded by the Fair Housing Act.
“Fundamentally, if this rule is adopted, and disparate impact is no longer available as a legal bulwark against facially neutral or unintentionally discriminatory policies,” Greene says, “we’re in a lot of trouble.”