Biased Artificial Intelligence has Sinister Consequences for Marginalized Communities, Argue Panelists – BroadbandBreakfast.com

WASHINGTON, February 13, 2020 – Biased artificial intelligence poses obstacles for marginalized communities trying to access financial services, such as applying for a mortgage loan, said panelists speaking before the House Committee on Financial Services.

In a statement before the committee on Wednesday, privacy and AI advisor Bärí A. Williams wrote, "Data sets in financial services are used to determine home ownership and mortgage, savings and student loan rates; the outcomes of credit card and loan applications; credit scores and credit worthiness; and insurance policy terms."

In practice, biased AI could mean that Black homeowners were confined to specific areas of a city and that their assessed credit worthiness led to higher interest rates, Williams said.

Rayid Ghani, of the Machine Learning Department at Carnegie Mellon University's Heinz College of Information Systems and Public Policy, said that it is not enough to create an equitable AI. Rather, there needs to be equity across the entire decision-making process.

"Machine bias is not inevitable, nor is it final," concurred Brookings Institution Fellow Makada Henry-Nickie.

"This bias, though, is not benign. AI has enormous consequences for racial, gender, and sexual minorities," said Henry-Nickie.

University of Pennsylvania Professor Michael Kearns said biased AI is generally not the result of human malfeasance, such as racist or incompetent software developers.

However, Williams argued that if AI is being fed historical data, "it's already biased."

To create an equitable AI system, Ghani suggested building steps toward an equitable process into the actual construction of the AI itself.

