A systematic review of socio-technical gender bias in AI algorithms

Paula Hall (School of Management, IT and Governance, University of KwaZulu-Natal - Pietermaritzburg Campus, Pietermaritzburg, South Africa)
Debbie Ellis (School of Management, IT and Governance, University of KwaZulu-Natal - Pietermaritzburg Campus, Pietermaritzburg, South Africa)

Online Information Review

ISSN: 1468-4527

Article publication date: 14 March 2023

Issue publication date: 8 November 2023

Abstract

Purpose

Gender bias in artificial intelligence (AI) must be addressed as a priority before AI algorithms become ubiquitous and perpetuate and accentuate existing bias. Although the problem is an established research and policy agenda, a cohesive review of existing research specifically addressing gender bias from a socio-technical viewpoint is lacking. The purpose of this study is therefore to determine the social causes and consequences of, and proposed solutions to, gender bias in AI algorithms.

Design/methodology/approach

A comprehensive systematic review followed established protocols to ensure accurate and verifiable identification of suitable articles. The process identified 177 articles within the socio-technical framework, of which 64 were selected for in-depth analysis.

Findings

Most previous research has focused on the technical rather than the social causes of, consequences of, and solutions to AI bias. From a social perspective, gender bias in AI algorithms can be attributed equally to algorithmic design and training datasets. Social consequences are wide-ranging, with amplification of existing bias the most commonly cited (28%). Proposed social solutions concentrated on algorithmic design, specifically improving diversity in AI development teams (30%), increasing awareness (23%), human-in-the-loop approaches (23%) and integrating ethics into the design process (21%).

Originality/value

This systematic review is the first of its kind to focus on gender bias in AI algorithms from a social perspective within a socio-technical framework. Identification of key causes and consequences of bias and the breakdown of potential solutions provides direction for future research and policy within the growing field of AI ethics.

Peer review

The peer review history for this article is available at https://publons.com/publon/10.1108/OIR-08-2021-0452

Citation

Hall, P. and Ellis, D. (2023), "A systematic review of socio-technical gender bias in AI algorithms", Online Information Review, Vol. 47 No. 7, pp. 1264-1279. https://doi.org/10.1108/OIR-08-2021-0452

Publisher

Emerald Publishing Limited

Copyright © 2023, Emerald Publishing Limited
