When Do Computers Discriminate? Toward Informing Users About Algorithmic Discrimination
In this collaborative project with the University of Maryland, ICSI researchers are tackling the challenge of explaining what constitutes unacceptable algorithmic discrimination. Getting the answer to this question right is key to unlocking the potential of automated decision systems without eroding people's ability to get a fair deal and advance in society. For society to answer these moral and ethical questions, however, non-technical people -- from legal and policy experts to ordinary laypeople -- must be able to understand subtle distinctions between mathematical concepts of fairness. In this project, researchers are developing and evaluating textual and graphical descriptions and vignettes that illustrate different nondiscrimination properties and the tradeoffs among them. For concreteness, this exploratory work focuses only on accuracy-like nondiscrimination properties and only on the criminal justice context, such as algorithms used in bail and sentencing decisions. The work proceeds through iterative, qualitative, person-centered design involving both non-computer-science subject-matter experts in law and social science and laypeople, who help develop and preliminarily evaluate the researchers' explanations. Both populations are involved because the understanding and opinions of each are critical to decision-making in a democratic society, and explanations must therefore be usable by both.
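To give a flavor of the kind of "accuracy-like" properties at issue, the sketch below compares two commonly discussed fairness criteria -- equal false positive rates across groups versus equal positive predictive values across groups -- on a small toy dataset. This is only an illustration under assumptions made here: the specific metrics, group labels, threshold of "flagging," and data are hypothetical and are not the project's own definitions or materials.

# Illustrative sketch only: two "accuracy-like" nondiscrimination properties
# (false-positive-rate parity vs. predictive parity) computed on toy data.
# Group labels and records below are hypothetical, not project data.
from dataclasses import dataclass

@dataclass
class Outcome:
    group: str       # protected-group label, e.g., "A" or "B"
    predicted: bool  # algorithm's flag (e.g., predicted to reoffend)
    actual: bool     # observed outcome

def false_positive_rate(records):
    """Share of people without the outcome who were wrongly flagged."""
    negatives = [r for r in records if not r.actual]
    if not negatives:
        return float("nan")
    return sum(r.predicted for r in negatives) / len(negatives)

def positive_predictive_value(records):
    """Share of flagged people who actually had the outcome."""
    flagged = [r for r in records if r.predicted]
    if not flagged:
        return float("nan")
    return sum(r.actual for r in flagged) / len(flagged)

def by_group(records, metric):
    groups = sorted({r.group for r in records})
    return {g: metric([r for r in records if r.group == g]) for g in groups}

if __name__ == "__main__":
    # With different base rates across groups, these two properties generally
    # cannot both hold at once -- the kind of tradeoff the project aims to
    # explain to non-technical audiences.
    data = [
        Outcome("A", True, True), Outcome("A", True, False), Outcome("A", False, False),
        Outcome("A", False, True), Outcome("A", True, True), Outcome("A", False, False),
        Outcome("B", True, True), Outcome("B", False, False), Outcome("B", True, False),
        Outcome("B", True, False), Outcome("B", False, False), Outcome("B", False, True),
    ]
    print("False positive rate by group:", by_group(data, false_positive_rate))
    print("Positive predictive value by group:", by_group(data, positive_predictive_value))

On this toy data the two groups differ on both measures, showing how a system can look fair under one accuracy-like criterion while looking unfair under another; conveying that distinction to legal experts and laypeople is the communication challenge the project addresses.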
The definitions the researchers represent, and how they represent them, will be shaped by a parallel effort to systematize the space of nondiscrimination properties. That systematization will inform the qualitative design work; concurrently, interviews with legal and ethical experts will feed back into the systematization in a process of iterative refinement. The end product will be a description of how various nondiscrimination definitions differ along the axes that the researchers' empirical studies find most important.
Funding provided by NSF