The more scientific software is used in decision support, the more serious the effects of unintended bias become, up to and including an outright threat to civil rights.
|Film name|Coded Bias|
|---|---|
|Presenter(s)|Joy Buolamwini, Cathy O’Neil, Safiya Umoja Noble, and others|
The 90-minute film, Coded Bias, discusses the social justice implications and impacts of bias in software. The film is a good resource for anyone in HPC/CSE looking to add to their Inclusion toolbox. An Inclusion toolbox contains strategies, practical tools, policies, and procedures to help one design and implement an inclusive program in their domain.
While the film focuses primarily on big-data and AI software, the issues it raises are equally applicable to any software used in decision support, including, for example, scientific computing software. Some of the key takeaways are:
- We should beware of the tendency to believe, sometimes falsely, that automated algorithms are objective.
- Just as the FDA regulates and oversees the safety, effectiveness and quality of vaccines, there needs to be an FDA for algorithms.
- Algorithmic injustice poses a significant threat to civil rights.
The film focuses on the work of Joy Buolamwini, founder of the Algorithmic Justice League, in analyzing bias in facial recognition software. She demonstrates that various products' ability even to detect a face correlates strongly with facial skin tone: the darker the subject's skin, the less likely the software is to detect a face at all.
The film includes interviews with several other industry experts on bias in software. It gives several examples of bias in AI and shows how that bias negatively impacts the lives of real people. It also describes cases where big-data algorithms have demonstrated an ability to abruptly influence the real-world behavior of millions of individuals. Buolamwini warns that algorithmic injustice poses a grave threat to civil rights.