Coded Bias
2020

Coolidge Corner Theatre, Brookline, MA

moderated by

Van Jones

CNN commentator

with

Joy Buolamwini

Film subject, CODED BIAS; Founder, Algorithmic Justice League; Research assistant, MIT Media Lab

and

Safiya Umoja Noble

Author, "Algorithms of Oppression"; Associate Professor of Information Studies and African American Studies, UCLA; Cofounder, UCLA Center for Critical Internet Inquiry

and

Clare Garvie

Senior associate, Georgetown Law Center on Privacy and Technology

Coded Bias — Beyond Minority Report: A look at artificial intelligence and law enforcement

During this special nationwide, virtual Science on Screen event, CNN commentator Van Jones moderates a discussion with CODED BIAS director Shalini Kantayya; Algorithmic Justice League founder and CODED BIAS subject Joy Buolamwini; Algorithms of Oppression author Safiya Umoja Noble; and Clare Garvie, lead author of Georgetown Law's The Perpetual Line-Up report, about the biases underpinning facial recognition systems currently in use around the world and their implications for law enforcement and freedom. With Kade Crockford, director of the ACLU's Technology for Liberty Program, and Alvaro Bedoya, founding director of the Center on Privacy and Technology at Georgetown Law.


Film Synopsis

An exploration of the implications of MIT Media Lab researcher Joy Buolamwini's startling discovery that racial bias is written into the code of facial recognition algorithms.

Modern society sits at the intersection of two crucial questions: What does it mean when artificial intelligence (AI) increasingly governs our liberties? And what are the consequences for the people AI is biased against? When MIT Media Lab researcher Joy Buolamwini discovers that most facial-recognition software does not accurately identify darker-skinned faces and the faces of women, she delves into an investigation of widespread bias in algorithms. As it turns out, AI is not neutral, and women are leading the charge to ensure our civil rights are protected.

Image courtesy of Shalini Kantayya

About the Speakers

Joy Buolamwini is a poet of code who uses art and research to illuminate the social implications of artificial intelligence. She founded the Algorithmic Justice League to create a world with more equitable and accountable technology. Her TED Featured Talk on algorithmic bias has over 1 million views. Her MIT thesis methodology uncovered large racial and gender bias in AI services from companies like Microsoft, IBM, and Amazon. Her research has been covered in over 40 countries, and as a renowned international speaker she has championed the need for algorithmic justice at the World Economic Forum and the United Nations. She serves on the Global Tech Panel convened by the vice president of the European Commission to advise world leaders and technology executives on ways to reduce the harms of AI. In late 2018, in partnership with the Georgetown Law Center on Privacy and Technology, Joy launched the Safe Face Pledge, the first agreement of its kind to prohibit the lethal application of facial analysis and recognition technology.

As a creative science communicator, she has written op-eds on the impact of artificial intelligence for publications like TIME Magazine and The New York Times. In her quest to tell stories that make daughters of diasporas dream and sons of privilege pause, her spoken-word visual audit "AI, Ain't I A Woman?," which shows AI failures on the faces of iconic women like Oprah Winfrey, Michelle Obama, and Serena Williams, and her short film The Coded Gaze have been part of exhibitions ranging from the Museum of Fine Arts, Boston, to the Barbican Centre in the UK. A Rhodes Scholar and Fulbright Fellow, Joy has been named to notable lists including the Bloomberg 50, Tech Review 35 under 35, BBC 100 Women, Forbes Top 50 Women in Tech (youngest), and Forbes 30 under 30. She holds two master's degrees, from Oxford University and MIT, and a bachelor's degree in computer science from the Georgia Institute of Technology. A former pole vaulter, she still holds sentimental Olympic aspirations. Fortune Magazine named her to its 2019 list of the world's greatest leaders, describing her as "the conscience of the A.I. Revolution."

Dr. Safiya Umoja Noble is an Associate Professor at the University of California, Los Angeles (UCLA) in the Department of Information Studies, where she serves as Co-Founder and Co-Director of the UCLA Center for Critical Internet Inquiry (C2i2). She also holds appointments in African American Studies and Gender Studies. She is a Research Associate at the Oxford Internet Institute at the University of Oxford and has been appointed as a Commissioner on the Oxford Commission on AI & Good Governance (OxCAIGG). She is a board member of the Cyber Civil Rights Initiative, which serves those vulnerable to online harassment, and serves on the advisory board of the NYU Center for Critical Race and Digital Studies. She is the author of a best-selling book on racist and sexist algorithmic bias in commercial search engines, Algorithms of Oppression: How Search Engines Reinforce Racism (NYU Press), which has been widely reviewed in scholarly and popular publications.

Dr. Noble is the recipient of a Hellman Fellowship and the UCLA Early Career Award. Her academic research focuses on the design of digital media platforms on the internet and their impact on society. Her work is both sociological and interdisciplinary, marking the ways that digital media impacts and intersects with issues of race, gender, culture, and technology. She is regularly quoted for her expertise on issues of algorithmic discrimination and technology bias by national and international press including The Guardian, the BBC, CNN International, USA Today, Wired, Time, Full Frontal with Samantha Bee, The New York Times, and a host of local news and podcasts. Her popular writing includes critiques on the loss of public goods to Big Tech companies, as featured in Noema magazine.

Safiya is the co-editor of two volumes: The Intersectional Internet: Race, Sex, Culture and Class Online and Emotions, Technology & Design. She currently serves as an Associate Editor for the Journal of Critical Library and Information Studies and is co-editor of the Commentary & Criticism section of the Journal of Feminist Media Studies. She is a member of several academic journal and advisory boards, and holds a Ph.D. and M.S. in Library & Information Science from the University of Illinois at Urbana-Champaign and a B.A. in Sociology from California State University, Fresno, where she was awarded the Distinguished Alumni Award for 2018. She was recently named to the "Top 25 Doers, Dreamers, and Drivers of 2019" by Government Technology magazine.

In 2020, she received the Distinguished Alumna Award from the iSchool Alumni Association (ISAA) and the inaugural Diversity and Inclusion Award from the Illinois Alumni Association at the University of Illinois at Urbana-Champaign.

Clare Garvie joined the Georgetown Law Center on Privacy and Technology as a Law Fellow after graduating from Georgetown Law in 2015, and now serves as a Senior Associate. She was lead author on three of the Center's reports on face recognition: The Perpetual Line-Up: Unregulated Police Face Recognition in America (2016), Garbage In, Garbage Out: Face Recognition on Flawed Data (2019), and America Under Watch: Face Surveillance in the United States (2019). In 2019 she also testified before the House Oversight Committee about police use of face recognition. Her commentary has appeared in The New York Times, The Washington Post, and The Wall Street Journal, and she serves as an expert resource to both Democrats and Republicans in Congress and state legislatures. Her current research focuses on the use of face recognition-derived evidence in criminal cases and the ways activists, public defenders, and policymakers can ensure the technology is kept under control. Previously, she worked in human rights and international criminal law with the International Center for Transitional Justice (ICTJ). She received her J.D. from Georgetown Law and her B.A. from Barnard College in political science, human rights, and psychology.