Battling Bias in AI

Researcher's work explores how artificial intelligence can avoid inequity

As artificial intelligence becomes increasingly embedded in everyday life, concerns have grown around bias in its programming. Many of the tasks performed by AI are simple and innocuous, but as its capabilities expand, so does its potential for wide-ranging impact. Bias can cause artificial intelligence to make decisions that are systematically unfair to particular groups of people, and researchers have found this can cause real harm. Rutgers–Camden researcher Iman Dehzangi, whose most recent article, "A review on deep learning approaches in healthcare systems," was published in the Journal of Biomedical Informatics, believes institutions must carefully balance the advantages and drawbacks of AI.

“Artificial intelligence and machine learning are poised to create valuable opportunities by automating or accelerating many different tasks and processes,” said Dehzangi, assistant professor of computer science in the Camden College of Arts and Sciences. “One of the challenges, however, is to overcome potential pitfalls such as bias.” 

Biased AI can produce consistently different outputs for some groups than for others, discriminating on the basis of race, gender, biological sex, nationality, social class, or many other factors. Human beings choose the data that algorithms learn from, and even when they make conscious efforts to eschew bias, it can still be baked into the data they select. Extensive testing and diverse teams can act as effective safeguards, but even with these measures in place, bias can still enter machine-learning processes, and once it does, AI systems automate and perpetuate those biased patterns.
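One simple way to surface this kind of disparity is to compare a model's positive-outcome rate across groups. The sketch below is illustrative only, assuming a hypothetical loan-approval scenario; the data and the "group" and "approved" column names are invented for the example and are not drawn from Dehzangi's work.

```python
# A minimal sketch of a group-wise disparity check on model outputs.
# The data, column names ("group", "approved"), and the loan-approval
# scenario are hypothetical and used only for illustration.
import pandas as pd

# Toy predictions from a hypothetical approval model.
predictions = pd.DataFrame({
    "group":    ["A", "A", "A", "A", "B", "B", "B", "B"],
    "approved": [1,   1,   1,   0,   1,   0,   0,   0],
})

# Positive-outcome (approval) rate for each group.
rates = predictions.groupby("group")["approved"].mean()
print(rates)

# A large gap between groups is one simple signal of biased outputs.
gap = rates.max() - rates.min()
print(f"Approval-rate gap between groups: {gap:.2f}")
```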

Iman Dehzangi, assistant professor of computer science

"Because machine learning is dependent upon data, if the data is biased or flawed, the patterns discerned by the program and the decisions made as a result will be biased, too," said Dehzangi, pointing to a common saying in the tech industry: "garbage in, garbage out." He suggests taking the time to deconstruct and understand the problem looking to be solved, which can help identify potential bias before any technology solution is implemented.

Organizations and corporations have jumped on the opportunity presented by AI and Big Data to automate processes and increase the speed and accuracy of decisions large and small. Market research has found that 84 percent of C-suite executives believe they must leverage artificial intelligence to achieve their growth objectives. Three out of four believe that if they don't take advantage of AI in the next five years, they risk going out of business entirely.

“There is not a successful business in operation today that is not using AI and machine learning,” said Dehzangi. Whether it is making financial investments in the stock market, facilitating the product development life cycle, or maximizing inventory management, forward-thinking businesses are leveraging this new technology to remain competitive and ahead of the curve. However, if they fail to account for bias in these emerging technologies, they could fall even further behind, remaining mired in the flawed data of the past. Research has revealed that if care is not taken in the design and implementation of AI systems, longstanding social biases can be embedded in the systems' logic.

When it comes to AI, businesses and other organizations should maintain their enthusiasm about new technology but understand the challenges that come with it. Dehzangi believes potential biases must be addressed as soon as a new technology is adopted, rather than after a problem has already occurred.

“Businesses should look to engage data scientists and other individuals from across the organization as early as possible,” said Dehzangi. “It is worth investing the time to understand the process being solved with the technology, to ensure the models accurately reflect the decision-making process and data is weighted properly.”
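One common reading of weighting data properly is to reweight samples so that smaller groups are not drowned out by larger ones. The sketch below shows that idea with scikit-learn; the toy data, the column names, and the choice of logistic regression are assumptions made for illustration, not a method prescribed by Dehzangi.

```python
# A minimal sketch of per-sample weighting so an underrepresented group
# carries proportionate influence during training. Hypothetical data.
import pandas as pd
from sklearn.linear_model import LogisticRegression

data = pd.DataFrame({
    "group":   ["A"] * 90 + ["B"] * 10,
    "feature": list(range(90)) + list(range(10)),
    "label":   [1, 0] * 45 + [0, 1] * 5,
})

# Weight each row inversely to its group's frequency, so group B's
# 10 rows count as much in aggregate as group A's 90.
group_counts = data["group"].map(data["group"].value_counts())
sample_weights = len(data) / (data["group"].nunique() * group_counts)

model = LogisticRegression()
model.fit(data[["feature"]], data["label"], sample_weight=sample_weights)
```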

Creative Design: Beatris Santos
