Ethics isn’t Enough - Designing Equitable Technology: Growing Digital Ethics in Practice

Released Thursday, 3rd December 2020

For this episode, we’re joined by Mutale Nkonde, founder and CEO of AI for the People, a non-profit focused on the social justice implications of the use of AI in public life. Mutale is currently a Practitioner Fellow at the Digital Civil Society Lab at the Stanford Center on Philanthropy and Civil Society, and a Fellow at the Berkman Klein Center for Internet & Society at Harvard Law School. She is also a member of the TikTok Content Advisory Council and a key constituent of the UN 3C Roundtable for AI. Before this, she was an AI policy advisor and part of a team that introduced the Algorithmic Accountability Act and the DEEP FAKES Accountability Act to the US House of Representatives.

Mutale encouraged listeners to move away from the concept of ‘ethics’, which can be too slippery to design policy around, and to focus instead on reducing harm to the most vulnerable groups and reducing inequality overall. By designing for the consumer with the least systemic power, we can all benefit. Mutale’s prior research includes a racial literacy framework designed to empower technologists to reduce the harms of racism through three foundations: acknowledging that structural racism has a bearing on technological outcomes, developing the emotional intelligence to have these conversations rather than avoiding them, and committing to take action for harm reduction.

A powerful example of recognizing a problem and moving quickly to address it happened the week we recorded this podcast: Twitter users noticed that the algorithm that crops images in the timeline was repeatedly prioritizing white faces. Community pressure grew, and Twitter eventually committed to reviewing the system. Although the company had tested it for racial and gender bias before deployment, clearly more work was needed. Mutale highlighted this as an example of people affected by algorithmic bias managing to create accountability where there would otherwise have been none. But many technology systems create invisible, unequal outcomes, and there is still more work to do to ensure technologists recognize the harms they might cause and put mechanisms in place to reduce them. One of the most powerful mechanisms is involving the stakeholders who might be disadvantaged by a technological choice in the design of the systems that affect them.

As well as the racial literacy framework, Mutale pointed to two other major levers for improving technological outcomes for all: the first is introducing better technology legislation, with stronger penalties for violating consumer protection and human rights law; the second is developing more creative visions of what technology could do for us, in terms of both benefits and possible harms. Many technologies that are regular parts of our lives today started not as research experiments or commercial projects, but as imagined pieces of science fiction eventually brought to life by passionate and committed engineers. Mutale recommended the work of Octavia Butler as a great starting point for those looking for more imaginative and egalitarian visions of the future.

You can find out more about Mutale’s work at https://www.mutale.tech/, and her work for Data & Society on racial literacy, including a comprehensive framework for harm reduction, at https://racialliteracy.tech/.
