Expanding the reach of tech for good ft. Dr. Phil Budden and Lacey Kesler

Released Tuesday, 1st February 2022

Innovation is about making the future more advanced, but what advancements are we making to help today’s tech be more accessible? Not everyone has the same digital reach, so more needs to be done to make tech, training, and development opportunities as equitable as possible. Luckily, there are experts ahead of the game. Listen as MIT lecturer Dr. Phil Budden and Webflow community education manager Lacey Kesler talk about the importance of low-code/no-code alternatives, visual development, changing our approach to education, creating more inclusive apps, and more.

 

Key Takeaways:

[2:43] Why does bias happen in IT systems? When there isn’t enough diversity in the data sets a model is trained on, that bias carries through the life of the algorithm. Elizabeth shares a facial recognition example: the data ends up being sold on the market, a customer uses it, and decides whether or not someone is deemed a threat based on biased information. If law enforcement agencies then use that data, they can over-police already over-policed communities and cause a systemic problem, all because of that data.

[5:35] All areas of our lives are impacted by algorithms, from traffic patterns to predictions about who should get a loan, at what interest rate, whether they get health insurance, and what type of coverage someone is granted.

[6:13] Joe shares two scenarios for how humans will interact with machines over the coming decades. In the first, humans are replaced by machines. In the second, and most likely, scenario, humans collaborate with machines to create better solutions and higher productivity.

[8:00] Human supervision remains extremely relevant when using information technology and AI. Joe shares examples from MIT’s Kevin Slavin, such as flash crashes caused by program trading.

[10:54] Responsibility in AI is shared between technical and non-technical teams. Building ethical technology doesn’t eliminate the possibility of unethical results, and we need more resources dedicated to areas like AI ethics and governance within our companies, especially large ones operating at the scale of nation states.

[16:27] Elizabeth discusses best practices for bringing ethics into more computer science courses so students gain a critical perspective early on.

[18:09] Companies that don’t consider themselves to be in the tech business will need to catch up fast and take on that responsibility themselves before the government has to step in. Hopefully, more companies will begin to take a more serious look at the ethical components of the tech they rely on. Elizabeth discusses long-wave theory, which describes how long it takes for each technological revolution to fully take hold.

[23:27] Will we end up in a Terminator-style Skynet scenario? Quite possibly, says Joe, but we have to figure out where humans are going to be in the loop and understand what our algorithms are doing and how they’re training other algorithms.

 

Quotes:

  • [2:25] “Data is what they’re calling the new oil, and there’s a race to how much data a company can consume.” - Elizabeth
  • [5:39] “All the technologies that make sense of more data in less time and more intricate ways are fueling some of the most exciting and polarizing advancements.” - Joe
  • [7:57] “The best performance sometimes is through a joint human and machine.” - Joe
  • [14:10] “If you look at human behavior, you have a wide spectrum of possibilities, ranging from Mother Teresa to say a dictator that kills millions of people. The way the technology gets employed, and that is not the technology's fault.” - Joe
  • [18:48] “For those companies who are not able to quickly adapt to this digital moment that we are having, I don’t think they will be around for long. That’s where we are, where we are going to stay, and where the jobs are going to be.” - Elizabeth
  • [25:18] “We have to put ethics at the forefront of all of our business. Whether you think you work in tech or not.” - Joe

 

Continue on your journey:

pega.com/podcast

 

