Translation, Tracks & Data: an Algorithmic Bias Effort in Practice

Abstract

Potential negative outcomes of machine learning and algorithmic bias have gained deserved attention. However, there are still relatively few standard processes for assessing and addressing algorithmic biases in industry practice. Practical tools that integrate into engineers’ workflows are needed. As a case study, we present two efforts to create tools that teams can use in practice to address algorithmic bias. Both aim to increase understanding of data, models, and outcome measurement decisions. We describe the development of 1) a prototype checklist based on existing literature frameworks; and 2) dashboarding for quantitatively assessing outcomes at scale. We share both technical and organizational lessons learned, covering perceptions of checklists, data challenges, and interpretation pitfalls.

Related

May 2021 | ICWSM

Representation of Music Creators on Wikipedia, Differences in Gender and Genre

Alice Wang, Aasish Pappu, Henriette Cramer

May 2021 | CHI

Towards Fairness in Practice: A Practitioner-Oriented Rubric for Evaluating Fair ML Toolkits

Brianna Richardson, Jean Garcia-Gathright, Samuel F. Way, Jennifer Thom, Henriette Cramer

October 2020 | ISMIR - International Society for Music Information Retrieval Conference

Artist gender representation in music streaming

Avriel Epps-Darling, Romain Takeo Bouyer, Henriette Cramer