In 2016, Buolamwini debuted a short documentary video, The Coded Gaze: Unmasking Algorithmic Bias, at the Museum of Fine Arts, Boston. In it, she questioned the implications of facial recognition software that failed to recognize her face until she put on a white mask. What systems of bias produced this software? What biases does this widely used software perpetuate?
“Through a combination of art, research, policy guidance and media advocacy, the Algorithmic Justice League is leading a cultural movement towards equitable and accountable AI. This requires us to look at how AI systems are developed and to actively prevent the harmful use of AI systems. We aim to empower communities and galvanize decision makers to take action that mitigates the harms and biases of AI.” —AJL.org
With this clear and compelling statement, the AJL introduces its call to action based on two fundamental principles: Equitable AI and Accountable AI. Equitable AI offers agency and control to people who interact with AI; affirms consent for all interactions with AI systems; and prohibits unjust use of AI by government systems. Accountable AI demonstrates meaningful transparency, continuous oversight, and redress for harm caused by the use of AI.
The AJL produces exhibitions, documentaries, spoken-word performances and events, and public talks and panels. Its many publications advocate for change and for meaningful oversight and regulation of artificial intelligence.
Joy Buolamwini, The Coded Gaze: Unmasking Algorithmic Bias (excerpts), 2016, video, Courtesy the Artist
View The Coded Gaze