Filmmaker Shalini Kantayya’s Coded Bias premiered at the 2020 Sundance Film Festival to critical acclaim. Centring the voices of women leading the charge to ensure our civil rights are protected, the film set out to answer two key questions: what is the impact of Artificial Intelligence’s increasing role in governing our liberties? And what are the consequences for people stuck in the crosshairs due to their race, colour, and gender?

A filmmaker, William J. Fulbright Scholar, and TED Fellow, Kantayya joined Sarah Kaplan to discuss her lifelong fascination with technology and why she sought to explore how disruptive technology impacts issues of equality.

Is intelligence without ethics really intelligence at all?

Coded Bias highlights how the seemingly impartial world of technology is shaped by racism and privilege and how invasive technologies often get deployed against the vulnerable. As an example, the film follows a group of Black apartment residents in Brooklyn challenging the use of facial recognition technology in their building.

According to Kantayya, automated systems act as gatekeepers to opportunities across numerous sectors of society. Advances in democracy, such as fair housing, equal employment, civil rights and civil liberties, are all being rapidly transformed by AI. Governments and policy makers cannot keep up with the speed of technology’s progression, nor are they vetting systems for bias, accuracy, unintended consequences or harm. Kantayya encourages the creation of a governing body with strict vetting practices, similar to the FDA, to regulate tech before it is deployed at scale.

Feminism is an underestimated force in tech

The film uncovers a resistance movement led by female data scientists and grassroots organizations championing civil liberties. Feminist leaders recognize tech’s inclusion crisis and argue that more women and people of colour need to be at the table.

What does personal accountability look like?

  • Knowledge is power. Predatory actors prey on people who do not understand how AI and algorithms work. Kantayya encouraged everyone to take a deep dive into learning what AI can and cannot do. Legislative change is driven by public understanding and will.
  • Consider taking local action. Community initiatives can impact national and global policies. Questioning the use of facial recognition at your local police department or school can be a great starting point.
  • At a managerial level, work against algorithmic bias by hiring more inclusively.