Katherine Maher on Tools for Combating Disinformation
Released Thursday, 19th December 2019

Truth and facts are not the same thing. To find truth, we must combine the best available information (what we know to be facts) with our own lens. This is especially challenging on social media: posts may be presented as if they are truthful, but truth comes with some subjectivity. This is where fact-checking tools, like the encyclopedic knowledge of Wikipedia, are essential — provided they present neutral information, offer unbiased results, follow a transparent process and remain open to editing.

In this episode of Big Tech, co-hosts David Skok and Taylor Owen speak with Katherine Maher, the Wikimedia Foundation’s chief executive officer and executive director. Maher joined Wikimedia in 2016 just prior to the election of President Trump. She is acutely aware of Wikipedia’s emerging role as a fact checking tool.

Transparent revision history is a cornerstone of Wikipedia. Anyone using the tool can access the edit log and provide their own edits to correct information. Some technologies subvert this process: connected devices such as smart speakers, for example, pull Wikipedia's information to provide answers to users without allowing feedback or sharing a record of edits.

Other factors influence Wikipedia's transparency and neutrality as well. Maher's team is looking to address biases that exist within Wikipedia's database; she acknowledges that there is a gender imbalance, both among Wikipedia editors (80 percent of edits are made by men) and in the low percentage of articles about women and minorities. Maher explains: "We know, for example, that an article about a woman is four times more likely to mention her marital status than an article about a man. If you're doing [AI] training of semantic pairing and you start to associate someone's marital status with their gender, insofar as there's a sort of higher correlation of value to someone being married or divorced being a woman, that propagates that bias out into all of the products that are then going to go ahead and use that algorithm or that dataset in the future." Wikipedia will need to work to solve these issues if it wishes to remain a trusted source for facts.
