Hopkins Center for the Arts at Dartmouth College, Hanover, NH
Eugene Santos
Professor of Engineering, Dartmouth College
and Jacqueline Wernimont
Distinguished Chair, Digital Humanities and Social Engagement, Dartmouth College; Associate Professor, Women's, Gender, and Sexuality Studies; Co-director, Humanities, Arts, Science, and Technology Alliance and Collaboratory
and Christina Lu
Software engineer, DeepMind
moderated by Marcos Stafne
Executive Director, Montshire Museum of Science
Coded Bias [ONLINE]: AI bias and its consequences
Program Description
Computer engineers Christina Lu and Eugene Santos discuss bias in artificial intelligence with tech historian and digital media expert Jacqueline Wernimont.
Film Synopsis
An exploration of the implications of MIT Media Lab researcher Joy Buolamwini's startling discovery that racial bias is written into the code of facial recognition algorithms.
Modern society sits at the intersection of two crucial questions: What does it mean when artificial intelligence (AI) increasingly governs our liberties? And what are the consequences for the people AI is biased against? When MIT Media Lab researcher Joy Buolamwini discovers that most facial-recognition software does not accurately identify darker-skinned faces and the faces of women, she delves into an investigation of widespread bias in algorithms. As it turns out, AI is not neutral, and women are leading the charge to ensure our civil rights are protected.
Image courtesy of Shalini Kantayya