Full-paper deadline: 2025-03-20
Impact factor: 2.726
CCF classification: Class B
CAS JCR ranking:
• Major category: Computer Science - Q2
• Subcategory: Computer Science, Information Systems - Q2
• Subcategory: Computer Science, Software Engineering - Q1
Website:
http://www.journals.elsevier.com/information-and-software-technology/
Technology plays a crucial role in people’s lives, influencing several aspects of modern society, such as work, education, politics, and leisure. If software engineering does not strive to be inclusive in all its facets (i.e., education, research, and industry), software products might unintentionally constrain groups of users. The expectation that software is effective in representing the multifaceted characteristics of our society has transcended technical needs and now stands as an ethical obligation for developing algorithms and systems that are both equitable and inclusive. In this context, the concept of software fairness emerges as a crucial non-functional requirement and a quality attribute for software, especially those based on data-driven processes.
Software fairness refers to the ethical principle and practice of ensuring that software systems, algorithms, and their outcomes are just, equitable, and unbiased across different groups of people, regardless of their gender, race, ethnicity, sexual orientation, cultural background, or any other aspect of their identity. In software engineering, fairness typically involves preventing discrimination, promoting inclusivity, and mitigating potential biases in the design, development, deployment, and usage of software systems. Though not entirely new to software development, this concept has only recently gained traction, fueled by the escalating discussions surrounding software engineering for artificial intelligence and ethics in machine learning, a scenario that highlights its essential role in understanding the impact of biased software in modern software engineering practices. However, this debate is still evolving slowly. It seems counter-intuitive, but the area responsible for creating innovative software solutions for billions of users worldwide does not reflect the diversity of the society it serves: algorithms exhibit racial bias, technical forums exhibit sexism, and the software industry is not welcoming to underrepresented groups.
Many research challenges and opportunities remain to be addressed in this area. This special issue is open to all manuscripts presenting novel and strong contributions addressing software fairness, including (i) state-of-the-art methods, models, and tools (with evidence of use and study of practical impact) or work bridging the gap between practice and research, and (ii) empirical studies in the field addressing one or more human, technical, social, and economic issues of software fairness through qualitative and/or quantitative analyses.
Guest editors: Rodrigo Spínola, Ronnie de Souza Santos