Research Article
Open Access
CC BY

Algorithmic Bias and the Power of Code: Investigating the Reproduction of Gender Structures in the Development of Computer Technologies

Heying Bai 1*
1 Sun Yat-sen University, Guangzhou, China
*Corresponding author: rara481846778@gmail.com
Published on 20 July 2025
ACE Vol. 173
ISSN (Print): 2755-2721
ISSN (Online): 2755-273X
ISBN (Print): 978-1-80590-231-7
ISBN (Online): 978-1-80590-232-4

Abstract

This study examines how everyday programming practices reproduce and amplify gender bias in software systems. By analyzing five widely used open-source repositories in recruitment and financial analysis (totaling over 200,000 lines of code) and conducting in-depth interviews with ten developers, we identified three recurring patterns of bias: hard-coded binary gender labels, stereotype-driven feature engineering (such as “vacancy years”), and the erasure of non-binary gender identities. Static code analysis flagged these patterns, while dynamic testing on a gender-balanced dataset showed that the false-negative rate for female users could be up to 15 percentage points higher and that their risk scores were inflated by roughly 10%. The interviews revealed that organizational pressures (tight deadlines, the absence of built-in fairness tools, and a lack of procedural guidelines) left developers unable to mitigate bias. Based on these findings, we propose a taxonomy of “bias-prone code features,” a strategy for integrating automated bias detection into continuous-integration pipelines, and targeted code-review guidelines. This work not only provides empirical evidence of code-level unfairness but also offers practical recommendations for embedding fairness into the software development lifecycle, thereby promoting more equitable socio-technical systems.
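The abstract references two measurement steps: a static scan for bias-prone code features and a dynamic test comparing false-negative rates across gender groups. The sketch below illustrates both in Python under simplifying assumptions; the regex patterns, function names (flag_binary_gender_labels, fnr_gap_points), and any thresholds are illustrative inventions, not the authors' actual tooling or taxonomy.

```python
import re
from typing import List, Sequence

# Hypothetical pattern list for illustration only; the paper's full
# "bias-prone code feature" taxonomy is not reproduced here.
BINARY_GENDER_PATTERNS = [
    # e.g.  if gender == "male": ...
    re.compile(r'gender\s*(==|!=)\s*[\'"](male|female|m|f)[\'"]', re.IGNORECASE),
    # e.g.  GENDERS = ["male", "female"]  (no third option)
    re.compile(r'\[\s*[\'"]male[\'"]\s*,\s*[\'"]female[\'"]\s*\]', re.IGNORECASE),
]

def flag_binary_gender_labels(source: str) -> List[int]:
    """Return line numbers containing hard-coded binary gender checks."""
    return [
        lineno
        for lineno, line in enumerate(source.splitlines(), start=1)
        if any(p.search(line) for p in BINARY_GENDER_PATTERNS)
    ]

def false_negative_rate(y_true: Sequence[int], y_pred: Sequence[int]) -> float:
    """FNR = FN / (FN + TP), where 1 is the positive (favorable) label."""
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    return fn / (fn + tp) if (fn + tp) else 0.0

def fnr_gap_points(y_true: Sequence[int], y_pred: Sequence[int],
                   group: Sequence[str]) -> float:
    """Largest between-group FNR difference, in percentage points."""
    rates = []
    for g in sorted(set(group)):
        idx = [i for i, v in enumerate(group) if v == g]
        rates.append(false_negative_rate([y_true[i] for i in idx],
                                         [y_pred[i] for i in idx]))
    return (max(rates) - min(rates)) * 100.0
```

A continuous-integration step in the spirit of the proposed strategy could fail the build whenever flag_binary_gender_labels returns any hits or fnr_gap_points exceeds a chosen threshold, gating merges on fairness checks just as test suites gate them on correctness.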

Keywords:

Algorithmic bias, gender structures, code review, socio-technical systems, equitable computing


References

[1]. López, P. (2021). Bias does not equal bias: A socio-technical typology of bias in data-based algorithmic systems. Internet Policy Review, 10(4). Retrieved from https://policyreview.info/articles/analysis/bias-does-not-equal-bias-socio-technical-typology-bias-data-based-algorithmic

[2]. Shrestha, S., & Das, S. (2022). Exploring gender biases in ML and AI academic research through systematic literature review. Frontiers in Artificial Intelligence, 5, Article 976838. https://doi.org/10.3389/frai.2022.976838

[3]. Ntoutsi, E., Fafalios, P., Gadiraju, U., Iosifidis, V., Nejdl, W., Vidal, M. E., Ruggieri, S., Turini, F., & Papadopoulos, S. (2020). Bias in data-driven artificial intelligence systems—An introductory survey. Wiley Interdisciplinary Reviews: Data Mining and Knowledge Discovery, 10(3), e1356. https://doi.org/10.1002/widm.1356

[4]. Prates, M. O., Avelar, P. H., & Lamb, L. C. (2020). Assessing gender bias in machine translation: A case study with Google Translate. Neural Computing and Applications, 32(9), 6363–6381. https://doi.org/10.1007/s00521-020-04974-5

[5]. Tang, R., Du, M., Li, Y., Liu, Z., Zou, N., & Hu, X. (2021). Mitigating gender bias in captioning systems. In Proceedings of The Web Conference 2021 (pp. 633–645). https://doi.org/10.1145/3442381.3449840

[6]. D’Amour, A., Srinivasan, H., Atwood, J., Baljekar, P., Sculley, D., & Halpern, Y. (2020). Fairness is not static: Deeper understanding of long-term fairness via simulation studies. In Proceedings of the 2020 Conference on Fairness, Accountability, and Transparency (pp. 525–534). https://doi.org/10.1145/3351095.3372857

[7]. González, J. A., & Smith, K. (2023). Dealing with gender bias issues in data-algorithmic processes: A social-statistical perspective. Algorithms, 15(9), 303. https://doi.org/10.3390/a15090303

[8]. Sultana, S., Turzo, A. K., & Bosu, A. (2022). Code reviews in open source projects: How do gender biases affect participation and outcomes? arXiv preprint arXiv:2210.00139.

[9]. Sultana, S., & Bosu, A. (2021). Are code review processes influenced by the genders of the participants? arXiv preprint arXiv:2108.07774.

[10]. Chun, J. S., De Cremer, D., Oh, E. J., & Kim, Y. (2024). What algorithmic evaluation fails to deliver: Respectful treatment and individualized consideration. Scientific Reports, 14, Article 25996. https://doi.org/10.1038/s41598-024-76320-1

[11]. Park, J., & Lee, H. (2025). FairCode: Evaluating social bias of large language models in code generation. arXiv preprint arXiv:2501.05396.

[12]. Alvarez Ruiz, L. (2023, August 25). Gender bias in AI: An experiment with ChatGPT in financial inclusion. Center for Financial Inclusion. Retrieved from https://www.centerforfinancialinclusion.org/gender-bias-in-ai-an-experiment-with-chatgpt-in-financial-inclusion

Cite this article

Bai, H. (2025). Algorithmic Bias and the Power of Code: Investigating the Reproduction of Gender Structures in the Development of Computer Technologies. Applied and Computational Engineering, 173, 50–56.

Data availability

The datasets used and/or analyzed during the current study will be available from the authors upon reasonable request.

About volume

Volume title: Proceedings of the 7th International Conference on Computing and Data Science

ISBN: 978-1-80590-231-7 (Print) / 978-1-80590-232-4 (Online)
Editor: Marwan Omar
Conference website: https://2025.confcds.org/
Conference date: 25 September 2025
Series: Applied and Computational Engineering
Volume number: Vol. 173
ISSN: 2755-2721 (Print) / 2755-273X (Online)