Independent Lens: Coded Bias

Mon, 03/22/2021 - 10:00pm - 11:00pm

MIT researcher Joy Buolamwini in Coded Bias

Credit: Producer Steve Acevedo

While conducting research on facial recognition technology at the MIT Media Lab, computer scientist Joy Buolamwini made the startling discovery that the algorithms could not accurately detect dark-skinned faces or the faces of women.

In an increasingly data-driven, automated world, the question of how to protect individuals’ civil liberties in the face of artificial intelligence looms larger by the day. Directed by award-winning filmmaker Shalini Kantayya, Coded Bias follows MIT Media Lab researcher Joy Buolamwini, along with data scientists, mathematicians, and watchdog groups from different parts of the world, as they fight to expose the discrimination within the facial recognition algorithms now prevalent across all spheres of daily life. Independent Lens: Coded Bias premieres Monday, March 22, 2021 at 10 p.m. on WXXI-TV.

Buolamwini's discovery led to the harrowing realization that the very machine learning algorithms intended to avoid prejudice are only as unbiased as the humans and historical data programming them. Coded Bias documents the dramatic journey that follows, from discovery to exposure to activism, as Buolamwini goes public with her findings and works to build a movement toward accountability and transparency, even testifying before Congress to push for the first-ever legislation governing facial recognition in the United States.

Around the world, artificial intelligence has already permeated every facet of public and private life, automating decisions about who gets hired, who gets health insurance, and how long a prison term should be, all while purporting to deliver analyses and insights free from human prejudice. In addition to following Buolamwini's journey, Kantayya travels to London, where police are piloting the use of facial recognition technology; Houston, Texas, where teachers are evaluated via algorithms; and Hangzhou, China, which is quickly becoming a model for city-wide surveillance. In each of these places, she profiles data scientists, mathematicians, ethicists, and everyday individuals affected by these disruptive technologies, all of whom are fighting to shed light on the impact of biased A.I. on civil rights and democracy and to call for greater accountability.
