Software Fairness
by Yuriy Brun, Alexandra Meliou
Abstract:
A goal of software engineering research is advancing software quality and the success of the software engineering process. However, while recent studies have demonstrated a new kind of defect in software related to its ability to operate in a fair and unbiased manner, software engineering has not yet wholeheartedly tackled these new kinds of defects, thus leaving software vulnerable. This paper outlines a vision for how software engineering research can help reduce fairness defects and represents a call to action by the software engineering research community to reify that vision. Modern software is riddled with examples of biased behavior, from automated translation injecting gender stereotypes, to vision systems failing to see faces of certain races, to the US criminal justice system relying on biased computational assessments of crime recidivism. While systems may learn bias from biased data, bias can also emerge from ambiguous or incomplete requirement specification, poor design, implementation bugs, and unintended component interactions. We argue that software fairness is analogous to software quality, and that numerous software engineering challenges in the areas of requirements, specification, design, testing, and verification need to be tackled to solve this problem.
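To make the notion of a "fairness defect" concrete, here is a purely illustrative sketch (not the paper's own tooling): a hypothetical Python check that varies only a protected attribute across otherwise identical test inputs and measures how much the system's decisions change. The decision procedure, test inputs, fictional group names, and 20% tolerance are all assumptions for illustration.

from typing import Callable, Dict, Sequence

def group_disparity(
    predict: Callable[[Dict], bool],    # hypothetical binary decision procedure under test
    inputs: Sequence[Dict],             # test inputs as attribute dictionaries
    attribute: str,                     # protected attribute to vary
    values: Sequence[str],              # values the protected attribute can take
) -> float:
    """Max difference in positive-decision rates when only the protected
    attribute is varied and everything else is held fixed."""
    rates = []
    for value in values:
        varied = [dict(x, **{attribute: value}) for x in inputs]
        rates.append(sum(predict(x) for x in varied) / len(varied))
    return max(rates) - min(rates)

if __name__ == "__main__":
    # A deliberately biased toy model; fictional group names avoid real-world labels.
    def toy_model(x: Dict) -> bool:
        return x["income"] > 50_000 and x["group"] != "green"

    tests = [{"income": i, "group": "purple"} for i in range(30_000, 90_000, 5_000)]
    disparity = group_disparity(toy_model, tests, "group", ["purple", "green"])
    if disparity > 0.2:  # assumed tolerance, purely for illustration
        print(f"fairness defect: decisions shift by {disparity:.0%} across groups")

Varying one attribute at a time while holding the rest fixed makes the check causal rather than merely correlational, which mirrors the abstract's point that bias can arise in the software itself, not only in its training data.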
Citation:
Yuriy Brun and Alexandra Meliou, Software Fairness, in Proceedings of the New Ideas and Emerging Results Track at the 26th ACM Joint European Software Engineering Conference and Symposium on the Foundations of Software Engineering (ESEC/FSE), 2018, pp. 754–759.
BibTeX:
@inproceedings{Brun18fse-nier,
  author = {Yuriy Brun and Alexandra Meliou},
  title =
  {\href{http://people.cs.umass.edu/brun/pubs/pubs/Brun18fse-nier.pdf}{Software Fairness}}, 
  booktitle = {Proceedings of the New Ideas and Emerging Results Track at the
  26th ACM Joint European Software Engineering Conference and Symposium
  on the Foundations of Software Engineering (ESEC/FSE)},
  venue = {ESEC/FSE NIER},
  address = {Lake Buena Vista, FL, USA},
  month = {November},
  date = {6--9},
  year = {2018},
  pages = {754--759},
  accept = {$\frac{14}{51} \approx 27\%$},

  doi = {10.1145/3236024.3264838},  
  note = {\href{https://doi.org/10.1145/3236024.3264838}{DOI: 10.1145/3236024.3264838}},

  abstract = {A goal of software engineering research is advancing software
  quality and the success of the software engineering process. However, while
  recent studies have demonstrated a new kind of defect in software related
  to its ability to operate in a fair and unbiased manner, software engineering
  has not yet wholeheartedly tackled these new kinds of defects, thus leaving
  software vulnerable. This paper outlines a vision for how software
  engineering research can help reduce fairness defects and represents a call
  to action by the software engineering research community to reify that
  vision. Modern software is riddled with examples of biased behavior, from
  automated translation injecting gender stereotypes, to vision systems
  failing to see faces of certain races, to the US criminal justice system
  relying on biased computational assessments of crime recidivism. While
  systems may learn bias from biased data, bias can also emerge from
  ambiguous or incomplete requirement specification, poor design,
  implementation bugs, and unintended component interactions. We argue that
  software fairness is analogous to software quality, and that numerous
  software engineering challenges in the areas of requirements,
  specification, design, testing, and verification need to be tackled to
  solve this problem.}, 

  fundedBy = {NSF CCF-1453474, NSF IIS-1453543, NSF CNS-1744471, NSF CCF-1763423},
}