Deep Fakes - What are the big issues?

Released Friday, 16th April 2021
Technology continues to evolve faster than our laws can keep up. Over the last few years there has been growing concern about the artificial intelligence technology behind so-called 'deep fakes'.

'Deep fake' is the term used to describe taking someone's face and placing it in any image or video of the creator's choosing. All that is needed is a picture of the individual's face, and some of these videos can look surprisingly real.

We have seen this in free apps such as 'Reface', which can be downloaded to your phone in seconds. Such apps allow you to put your face on your favourite film character, or animate a still picture of you so that you appear to sing or move. For many this is harmless fun: becoming a character in your favourite music video or film. We have all seen such videos on friends' or family members' social media accounts, where they have turned themselves into a Christmas character or an A-list celebrity.

However, there is a much more sinister and disturbing side to this technology. Anyone can create such videos using your image without your consent and upload them to the internet immediately. Concerns about deep fakes include fabricated political statements, blackmail and fraud: Barack Obama's voice has been used in such videos, as has Donald Trump's face.

Statistics featured in a Huffington Post article note that such videos are increasingly pornographic in nature, with 96% of deep fake clips superimposing someone's face onto that of a porn actor. In 2019 the number of such videos online doubled within a year; of the roughly 85,000 circulating online, 90% were non-consensual pornography featuring women, and many depicted extreme acts of sexual violence.

As noted above, victims have included celebrities, politicians and ordinary individuals. Taylor Swift, Maisie Williams, Emma Watson, Michelle Obama, Meghan Markle, Boris Johnson and Mark Zuckerberg are just a few of those who have been targeted by deep fake technology.

To take just one example, in 2017 a Reddit user posted deep fake videos of Maisie Williams and Taylor Swift having sex; within eight weeks the forum hosting them had 90,000 subscribers.

Clearly the impact on victims can be significant, both emotionally and, where their career is affected, financially. There can be huge embarrassment if the material is widely shared among friends, family and/or work colleagues, especially if viewers do not realise the imagery is fake.

As with revenge porn, there are a number of concerns, including identifying who posted the content, proving that the victim did not consent to it, and having it removed from the site (or, often, multiple sites).

Many platforms, including Pornhub, Facebook and Twitter, have tried to ban deep fakes following public pressure. Two new pieces of legislation, the EU's Digital Services Act and the UK's proposed Online Harms Bill, will hold platforms responsible for the content they host. However, this offers little direct support to the victim.

In the UK, you can be prosecuted for harassment for making and distributing such images or videos. In May 2018, a 25-year-old man was jailed for 16 weeks and ordered to pay £5,000 in compensation after photoshopping pictures of a female intern onto pornographic images and posting them to porn websites.

A campaign called #MyImageMyChoice has been launched, calling for legal change worldwide and pushing for a global, human-rights-based solution in which governments create world-leading intimate image abuse laws. The focus is on the violation of privacy, with a requirement for consent before such imagery can be placed online.

At the start of March, the UK Law Commission published a consultation paper including testimonies from #MyImageMyChoice. It is therefore only a matter of time before we see what changes, regulations and further protective measures are put in place.

If you believe you are a victim of deep fakes, contact the website administrators and request that the content is removed without delay. You may also wish to report the matter to the police so that potential harassment charges against the perpetrator can be investigated.

We encourage anyone who has concerns about sexual abuse to get in touch. You can contact Alan Collins or Danielle Vincent.
