
Coded Bias

Coded Bias follows MIT Media Lab researcher Joy Buolamwini's startling discovery that many facial recognition technologies fail more often on darker-skinned faces, and her subsequent investigation into widespread bias in artificial intelligence.

Modern society sits at the intersection of two crucial questions: What does it mean when artificial intelligence increasingly governs our liberties? And what are the consequences for the people AI is biased against? When MIT Media Lab researcher Joy Buolamwini discovers that most facial-recognition software fails to accurately identify darker-skinned faces and the faces of women, she launches an investigation into widespread bias in algorithms. As it turns out, artificial intelligence is not neutral, and women are leading the charge to ensure our civil rights are protected.
