Booker, Wyden Demand Answers on Biased Health Care Algorithms


Letters to the FTC, CMS, and health care companies follow disturbing revelations of flawed algorithms impacting care for black patients


WASHINGTON, D.C. – U.S. Senators Cory Booker (D-NJ) and Ron Wyden (D-OR) today called on the Federal Trade Commission (FTC), the Centers for Medicare and Medicaid Services (CMS), and five of the largest health care companies in the nation to provide more information on how they are addressing bias in algorithms used in many health care systems. The series of letters follows recent revelations, published in the journal Science, that a widely used software program severely underestimated the health care needs of black patients because of its racially biased algorithm.


In the letters to CMS Administrator Seema Verma, FTC Chairman Joseph Simons and top executives of UnitedHealth Group, Blue Cross Blue Shield, Cigna Corporation, Humana, and Aetna, the lawmakers highlighted the profound threat biased algorithms can pose to marginalized communities by systematically denying them care.


“In using algorithms, organizations often attempt to remove human flaws and biases from the process,” the lawmakers wrote. “Unfortunately, both the people who design these complex systems, and the massive sets of data that are used, have many historical and human biases built in. Without very careful consideration, the algorithms they subsequently create can further perpetuate those very biases.”


The lawmakers asked the FTC and CMS a series of questions about the steps they are taking to address algorithmic bias in the health care system, how well equipped their current enforcement mechanisms are to handle algorithmic bias, and the scope of the challenge. They also asked the FTC to commit to investigating the ways that algorithms unfairly discriminate against marginalized communities.


In their letter to the executives, the lawmakers asked for specific details about the algorithms their companies use to improve patient care and what safeguards the companies have put in place to prevent bias.


The letter to CMS can be found here. The letter to the FTC can be found here. And the letter to the executives can be found here.


Background on Booker’s record fighting algorithmic bias:


Booker was one of the first lawmakers to call for increased antitrust scrutiny of major tech companies, and he has consistently fought to ensure that extant discrimination and bias in our society do not become augmented and automated in the future.


Booker recently introduced the No Biometric Barriers to Housing Act, which would prohibit the use of facial recognition and other biometric identification technology in public housing units. The spread of such technology in public housing is an alarming trend that poses significant privacy risks for vulnerable communities.


In October, Booker submitted public comments opposing a proposed Department of Housing and Urban Development (HUD) rule that would allow housing providers to avoid liability for algorithmic discrimination if they use algorithmic tools developed by a third party. With this proposed rule, HUD, the nation's primary enforcer of civil rights laws against housing discrimination, would effectively provide a road map for landlords and lenders who wish to discriminate by algorithm.


Booker and Wyden authored the Algorithmic Accountability Act, which requires companies to fix discriminatory algorithms and outlines methods the federal government can use to mitigate the impacts of such algorithms.


Last year, Booker secured a commitment from Facebook CEO Mark Zuckerberg, for the first time, to conduct a civil rights audit that, among other items, would address the algorithmic biases on its platforms. Also last year, he secured language in the Federal Aviation Administration (FAA) Reauthorization Act requiring the TSA to report to Congress on methods to eliminate bias based on race, gender, and age as it begins deploying facial recognition technology for domestic flights.


Booker also partnered with Senator Wyden in July 2018 to survey 39 federal law enforcement agencies about their use of facial recognition technology and what policies, if any, they have put in place to prevent abuse, misuse, and discrimination. That same month, he also joined other senators in requesting that the Government Accountability Office study commercial and government uses of facial recognition technology and examine what safeguards companies and law enforcement have implemented to prevent misuse and discrimination based on race, gender, and age.


###
