Mostra: A Flexible Balancing Framework to Trade-off User, Artist and Platform Objectives for Music Sequencing
Emanuele Bugliarello, Rishabh Mehrotra, James Kirk, Mounia Lalmas
Potential negative outcomes of machine learning and algorithmic bias have gained deserved attention. However, there are still relatively few standard processes for assessing and addressing algorithmic biases in industry practice. Practical tools that integrate into engineers' workflows are needed. As a case study, we present two efforts to build tools that help practitioner teams address algorithmic bias. Both aim to increase understanding of data, models, and outcome measurement decisions. We describe the development of (1) a prototype checklist based on frameworks from the existing literature, and (2) dashboards for quantitatively assessing outcomes at scale. We share technical and organizational lessons learned about checklist perceptions, data challenges, and interpretation pitfalls.