In Public Comments to Proposed Rule, Booker Blasts HUD for Eroding Fair Housing Act Protections

Rule would subject millions of Americans to housing discrimination, Booker says

 

WASHINGTON, D.C. – U.S. Senator Cory Booker (D-NJ) today released public comments he submitted opposing a proposed rule that dismantles vital federal protections for victims of housing discrimination. The proposed Department of Housing and Urban Development (HUD) rule dramatically weakens the Fair Housing Act’s (FHA) disparate impact standard, which protects some of the country’s most vulnerable populations by banning practices that have a discriminatory impact, not just a discriminatory intent.

 

The proposed HUD rule would gut the disparate impact standard in two key ways. First, defendants in housing discrimination claims would only need to demonstrate that their algorithms do not use substitutes or close proxies for protected classes. Yet algorithms draw inferences from many interacting data points about people, and algorithmic bias rarely stems from a single protected characteristic or an obvious substitute for one; more often it arises from incomplete data sets and historical human biases reflected in the data.
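
To make that concern concrete, the following is a minimal, hypothetical sketch (the synthetic data and feature names are invented for illustration and do not come from the rule or from Booker's comments) of how inputs that are individually innocuous can jointly encode a protected class, so that a model trained on them reproduces a disparate impact even though no single input is a substitute or close proxy:

```python
# Hypothetical illustration (synthetic data, invented feature names): no single
# input is a "close proxy" for the protected class, but the inputs in
# combination still encode it.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 20_000

# Synthetic protected attribute (never given to the model directly).
protected = rng.integers(0, 2, size=n)

# "Neutral-looking" features, each only mildly correlated with the protected
# attribute (e.g., as a legacy of segregated housing patterns).
zip_income    = rng.normal(50 + 8.0 * protected, 20, size=n)
years_at_addr = rng.normal(6 - 1.5 * protected, 4, size=n)
retail_credit = rng.normal(3 + 1.0 * protected, 2, size=n)

X = np.column_stack([zip_income, years_at_addr, retail_credit])
X_train, X_test, y_train, y_test = train_test_split(
    X, protected, test_size=0.3, random_state=0
)

# How well can the protected class be recovered from the "neutral" inputs?
clf = LogisticRegression(max_iter=1000).fit(X_train, y_train)
auc = roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1])
print(f"AUC for recovering the protected class from neutral inputs: {auc:.2f}")
# An AUC well above 0.5 means the feature set jointly encodes the protected
# class, so a model trained on these inputs can produce a disparate impact
# without any single forbidden "substitute" appearing among them.
```

In other words, checking each input against a list of prohibited proxies, as the proposed rule contemplates, would not detect this kind of bias; only examining the model's outcomes would.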

 

Second, the proposed rule would allow housing providers to avoid liability for algorithmic discrimination if they use algorithmic tools developed by a third party. Housing providers could thus sidestep responsibility for algorithmic bias as long as they have not directly “caused” a discriminatory effect. With this proposed rule, HUD, the nation’s main civil rights enforcer against housing discrimination, effectively provides a road map for landlords and lenders who wish to discriminate by algorithm.

 

“Should the administration enact the proposed rule, millions of Americans could be subject to housing discrimination without recourse,” Booker wrote. “Allowing lenders, landlords, and realtors to outsource the liability for housing discrimination ensures they will ignore discriminatory outcomes and actually incentivizes them to avoid asking their vendors questions. Rather than absolving housing stakeholders from responsibility, HUD should be encouraging actors to take steps to address biases.”

 

“This burden-shifting framework establishes lines of defense for accused violators of the FHA that are fundamentally flawed and demonstrate a lack of understanding of how machine learning technologies work,” Booker continued. “As financial institutions, landlords, and other housing providers increasingly rely on the use of algorithmic decision-making, this proposal weakens protections for consumers, instead of ensuring industries adhere to best practices as they implement new technologies.”

 

Housing discrimination is personal for Booker. In 1969, Booker’s parents faced what was known as “real estate steering” – in which black couples were steered away from certain neighborhoods – when they tried to buy a home in an affluent suburb of New Jersey. Only after enlisting the local Fair Housing Council, which helped set up an elaborate sting operation, were they able to purchase a home in Harrington Park, New Jersey.

 

The full text of Booker’s comments on the proposed rule is available here and below.

 

Background on Booker’s record fighting algorithmic bias:

 

Booker was one of the first lawmakers to call for increased antitrust scrutiny of major tech companies, and he has consistently fought to ensure that extant discrimination and bias in our society do not become augmented and automated in the future.

 

Earlier this year, along with Senator Ron Wyden (D-OR) and Congresswoman Yvette Clarke (D-NY), Booker introduced the Algorithmic Accountability Act, a bill that would require companies to fix discriminatory algorithms and that outlines methods the federal government can use to mitigate the impacts of such algorithms.

 

Last year, he secured a commitment from Facebook CEO Mark Zuckerberg — for the first time — to conduct a civil rights audit at Facebook. Among other things, the audit is meant to address algorithmic bias on Facebook’s platforms.

 

In July 2018, Booker partnered with Senator Wyden to survey 39 federal law-enforcement agencies about their use of facial recognition technology and about what policies, if any, they have put in place to prevent abuse, misuse, and discrimination. That same month, he also joined Senators in requesting that the Government Accountability Office study commercial and government uses of facial recognition technology and examine what safeguards companies and law enforcement have implemented to prevent misuse and discrimination based on race, gender, and age.

 

In October 2018, Senator Booker secured language in the FAA Reauthorization Act requiring the TSA to report to Congress on methods to eliminate bias based on race, gender, and age as the agency begins deploying facial recognition technology for domestic flights.

 

October 18, 2019

 

The Honorable Ben Carson
Secretary
U.S. Department of Housing and Urban Development
451 7th Street, S.W.
Washington, D.C. 20410

 

RE: FR-6111-P-02

Dear Secretary Carson:

 

I am writing to express my opposition to the U.S. Department of Housing and Urban Development’s (HUD) proposal to amend the Fair Housing Act’s (FHA) disparate impact standard, which would erode federal protections for victims of housing discrimination. Not only does the proposed rule strip away existing protections for some of the most vulnerable populations in our country, but it also removes critical safeguards on emerging technologies. In particular, regarding the use of models and algorithms, the new guidance significantly undermines FHA protections while disregarding the mechanics of machine learning. As financial institutions, landlords, and other housing providers increasingly rely on the use of algorithmic decision-making, this proposal weakens protections for consumers, instead of ensuring industries adhere to best practices as they implement new technologies.

 

Although I share HUD’s desire to increase access to credit for underserved communities, this proposal essentially exempts lenders and landlords using algorithmic models from the disparate impact standard. This burden-shifting framework establishes lines of defense for accused violators of the FHA that are fundamentally flawed and demonstrate a lack of understanding of how machine learning technologies work.

 

First, under the proposal, a defendant need only demonstrate that their inputs in a model or algorithm are not a protected characteristic, or a substitute for a protected characteristic, in order to dismiss an accusation of an FHA violation.[1] However, it is well established that algorithmic bias rarely stems from a single protected characteristic or a substitute, but rather arises from incomplete data sets and historic human biases.[2] There are countless examples of this,[3] not the least of which was HUD’s own decision to charge Facebook with housing discrimination earlier this year.[4] Even though Facebook removed suspect categories from its ad-targeting options, recent research indicates that the algorithmic model in the ad delivery system itself was also responsible for bias in delivering housing advertisements.[5] This shows that HUD should delve deeper into the mechanics of the decision-making tools used by lenders, landlords, and realtors in order to determine whether these tools cause a disparate impact. HUD’s position in the Facebook case is plainly inconsistent with the proposed rule.

 

Second, the proposal removes liability from offenders by shifting responsibility to the third party that provides the decision-making tool. Should a bank or rental company engage in unfair practices through a biased algorithm, the housing provider could simply point to its vendor, and in practice no party would be held accountable. Allowing lenders, landlords, and realtors to outsource the liability for housing discrimination ensures they will ignore discriminatory outcomes and actually incentivizes them to avoid asking their vendors questions. Rather than absolving housing stakeholders from responsibility, HUD should be encouraging actors to take steps to address biases. In the Algorithmic Accountability Act, which I recently introduced with Senator Ron Wyden and Congresswoman Yvette Clarke, we outline methods with which the federal government can mitigate the impacts of biased and discriminatory algorithms. HUD should require housing providers to adopt these policies, which include third-party audits of algorithmic systems that study impacts on accuracy, fairness, bias, discrimination, privacy, and security.
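
As one hypothetical illustration of the kind of outcome-based check such an audit might include (the group labels, counts, and 0.8 threshold below are assumptions for this sketch; the four-fifths figure echoes a rule of thumb from employment-discrimination analysis rather than anything in the FHA or the proposed rule), a simple test compares a model's approval rates across groups regardless of which inputs the model used:

```python
# Hypothetical sketch of one outcome-based check a fairness audit might run:
# compare a model's approval rates across groups and flag large gaps.
# The 0.8 cutoff mirrors the "four-fifths" rule of thumb from employment
# settings; it is an illustrative assumption, not an FHA requirement.
from collections import defaultdict

def disparate_impact_report(decisions, threshold=0.8):
    """decisions: iterable of (group_label, approved) pairs."""
    approved, total = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        total[group] += 1
        approved[group] += int(ok)

    rates = {g: approved[g] / total[g] for g in total}
    best_rate = max(rates.values())
    return {
        group: (rate, rate / best_rate, rate / best_rate < threshold)
        for group, rate in rates.items()
    }

# Toy data: the model never saw the group labels, yet its outcomes differ.
sample = ([("A", True)] * 720 + [("A", False)] * 280
          + [("B", True)] * 470 + [("B", False)] * 530)

for group, (rate, ratio, flagged) in disparate_impact_report(sample).items():
    print(f"group {group}: approval {rate:.0%}, ratio {ratio:.2f}, "
          f"{'FLAG' if flagged else 'ok'}")
```

On this toy data the audit flags group B, whose approval rate is roughly 65 percent of group A's; that is precisely the kind of outcome-level disparity that the proposed rule's input-focused defenses would not surface.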

 

Should the administration enact the proposed rule, millions of Americans could be subject to housing discrimination without recourse. At a time when the country faces an affordable housing crisis,[6] the proposal not only weakens the federal government’s ability to enforce the Fair Housing Act, but could also perpetuate biased algorithmic decision-making across industries. I urge you to reconsider and rescind this proposal, and I appreciate your consideration of my request.

 

Sincerely,

 

Cory A. Booker
United States Senator

 

[1] https://www.federalregister.gov/d/2019-17542/p-95

[2] https://www.brookings.edu/research/algorithmic-bias-detection-and-mitigation-best-practices-and-policies-to-reduce-consumer-harms/#footnote-27

[3] In one example, Amazon’s same-day delivery service was found to discriminate against primarily black communities, even though Amazon did not use race or a proxy for race in its decision-making process; see https://www.bloomberg.com/graphics/2016-amazon-same-day/

[4] https://www.hud.gov/press/press_releases_media_advisories/HUD_No_18_085

[5] Ali, M., Sapiezynski, P., Bogen, M., Korolova, A., Mislove, A. and Rieke, A. (2019), Discrimination Through Optimization: How Facebook’s Ad Delivery Can Lead To Skewed Outcomes. arXiv preprint arXiv:1904.02095.

[6] https://www.citylab.com/perspective/2019/10/affordable-housing-crisis-cities-rent-zoning-development/599758/?utm_campaign=citylab&utm_term=2019-10-11T14%3A33%3A22&utm_content=edit-promo&utm_medium=social&utm_source=twitter
