Americanswers! How algorithms shape us online, Russian interference in 2024, and should the US legalise fentanyl?

Released Monday, 25th March 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:00

This is the BBC. Ryan

0:12

Reynolds here for Mint Mobile. With the price

0:14

of just about everything going up during inflation,

0:16

we thought we'd bring our prices down. So

0:19

to help us, we brought in a reverse auctioneer, which is

0:22

apparently a thing. Mint Mobile Unlimited Premium Wireless. Ready to get

0:24

30, ready to get 30, ready to get 20, 20, 20,

0:26

ready to get 20, 20, ready to

0:29

get 15, 15, 15, 15, just 15

0:31

bucks a month. Sold! Give it

0:33

a try at mintmobile.com/switch. $45

0:36

up front for 3 months plus taxes and fees. Promotion for new

0:38

customers for a limited time. On Unlimited, speeds slow above 40GB per month.

0:40

mintmobile.com. BBC

0:45

Sounds: music, radio, podcasts. Hi

0:47

everybody. It is Sarah here

0:50

with the Monday edition of Americast,

0:53

Americanswers, because it's when we

0:55

respond to your questions. I'm in New York

0:57

City for yet another date

0:59

in the Trump legal calendar and

1:01

we've got with us Miles Taylor in Washington

1:04

Good morning. And Katie

1:06

Hill is in Los Angeles. Good

1:09

morning, everyone. And it's me

1:11

Marianna in the worldwide headquarters in London and

1:13

it's actually my first time ever on the

1:15

Monday episode so it's brilliant to meet you

1:18

Miles and Katie. And we've

1:20

got lots of questions and particularly about

1:22

actually our last episode of Americast, where

1:24

me and Justin chatted to Don Lemon.

1:26

He's the former CNN journalist and it

1:28

was all about his interview with Elon

1:31

Musk which ended in quite an eventful

1:33

way. And maybe unsurprisingly,

1:35

we've had quite a lot of messages from people

1:37

who are feeling a bit pessimistic about social media.

1:39

Welcome to my world. We've

1:42

got this question which is about the kind of

1:44

difference between social media and the traditional media from

1:47

Steve in Rochester in the UK. Hi,

1:49

Americast team, I always enjoy the show

1:51

and was fascinated by the conversation about

1:53

the Don Lemon and Elon Musk interview

1:56

Marianna and Justin were talking about the fear that

1:58

because of the rise of social media

2:00

platforms, people are increasingly

2:02

only hearing voices and views that

2:04

reinforce their existing beliefs. While

2:07

I agree with that, hasn't it always been like that

2:09

to a degree? I'm referring to

2:11

the print media. Surely people

2:13

generally choose a publication because it

2:15

panders to their existing prejudices. That's

2:17

a very good question, Steve, and it's

2:19

something I get asked quite a lot

2:21

because disinformation and polarisation, they're not new

2:23

things, they definitely predate social media. I

2:25

think what's interesting though is actually looking

2:27

at your use of the verb to

2:29

choose, which I think is what's a

2:31

bit different about social media and that's

2:33

because algorithms, the computer-generated systems that

2:35

exist, they recommend you stuff that they think

2:37

you might like but you don't necessarily have

2:39

to seek out yourself. You don't have to

2:42

go to the shop and say oh I

2:44

quite fancy having a read of this. It's

2:46

content that's being actively promoted into your

2:48

feed and so for me that's perhaps

2:51

the crucial difference between existing prejudices and

2:53

bias and all of that kind of

2:55

stuff that's always existed. I think the

2:57

problem is that algorithms kind of further

3:00

entrench those biases to the point where

3:02

people are so siloed it's impossible for

3:04

them to even encounter content that doesn't

3:07

already agree with what they say

3:09

and it's even harder to kind of challenge it and have

3:11

a conversation about it.
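To make that mechanism concrete, here is a minimal, hypothetical sketch in Python of the kind of engagement-driven feed ranking Marianna is describing: content resembling what a user has already engaged with gets boosted, so existing preferences are fed back and entrenched. Every name, weight, and number below is invented for illustration; real recommender systems are vastly more complex.

from collections import Counter

def rank_feed(posts, user_history):
    # Toy engagement-based recommender: score each candidate post by how
    # closely its topics match what this user has already engaged with.
    seen_topics = Counter(t for post in user_history for t in post["topics"])

    def score(post):
        # Overlap with past engagement dominates the ranking...
        familiarity = sum(seen_topics[t] for t in post["topics"])
        # ...with a smaller boost for raw popularity (likes, shares).
        return 20 * familiarity + post["engagement"]

    # Highest-scoring posts go to the top of the feed.
    return sorted(posts, key=score, reverse=True)

# A user who has only ever engaged with one viewpoint keeps seeing it,
# even when the other side's post is more popular overall.
history = [{"topics": ["candidate_a"], "engagement": 5}] * 3
candidates = [
    {"id": 1, "topics": ["candidate_a"], "engagement": 10},
    {"id": 2, "topics": ["candidate_b"], "engagement": 50},
]
print([p["id"] for p in rank_feed(candidates, history)])  # -> [1, 2]

Note that even this toy version is never told the user's politics; optimising for predicted engagement alone is enough to narrow what gets seen, which is the entrenchment being described.

I mean, the other good thing in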

3:13

some ways about a newspaper or publication is you might go

3:15

and pick it up, you might be reading it at

3:17

home or in a cafe, you might start to

3:19

have a conversation with other people who don't necessarily

3:22

entirely agree with you. Online you tend to encounter

3:24

people who often have

3:26

similar views to you or agree with you, because

3:28

you're existing in the same

3:30

social media spaces. So I think

3:33

that's a big part of the problem. I don't know

3:35

what you guys think about

3:37

the difference between papers and the media

3:39

20 years ago and social media polarisation

3:41

now. Yes, Miles and

3:43

Katie, and Marianna is asking us that

3:45

because 20 years ago she was not

3:48

consuming social media or broadcast

3:50

news or newspapers. Actually, Sarah, 20 years ago...

3:53

Almost exactly 20 years ago I was first

3:56

using the internet, on the Barbie website. So

3:58

there we go. Well,

4:00

things have come full circle for you, Marianna. It

4:06

must have been a killer year for you last

4:08

year with Barbie coming out. I know. Me

4:10

and Sarah love to talk about Barbie. I mean,

4:12

I would just say this and I'd be curious

4:14

what Katie has to say. I think

4:17

of it like shards of glass now:

4:19

the media ecosystem

4:22

used to be a couple of panes of glass

4:24

in a window. You could look through the pane

4:26

that you wanted, but it's shattered now and the

4:28

shards are all over the place and they're

4:30

disconnected from each other. And that's

4:32

the thing: you can go

4:34

into your own micro-ecosystem of

4:37

podcasters who share a viewpoint, websites

4:39

with a viewpoint, blogs with a

4:41

viewpoint, and they all stay within

4:43

that ecosystem. And you never have

4:45

to be exposed to other things.

4:47

I mean, even with legacy media

4:49

where certain papers had a particular

4:51

bent, you were still exposed to

4:53

stories and opinions that didn't perfectly

4:55

align. So I do think

4:58

we are going into that period of different

5:00

realities and different realities

5:02

translates into social

5:04

and political divisions that I think are much

5:06

sharper than we've seen in the past. So

5:09

not to destroy the analogy, but those shards

5:11

of glass are pretty sharp for our civic

5:13

life. I think

5:16

it's a great metaphor, first of all, the shards

5:18

of glass, especially because

5:20

they are extremely sharp right now. The

5:23

suppression of information and the fact that

5:25

everything comes down to how sensational

5:27

it is, how many views you can get or

5:29

how many likes and so on and so forth,

5:32

how much traffic it generates. And

5:34

I have particular concerns about it for young people

5:36

because as we've seen, you know, with,

5:39

I mean, look, the TikTok debate is its

5:41

own thing. But

5:43

across the board, young people are just not

5:45

exposed to vetted media, right? It's just whatever

5:47

happens to show up on their feeds and

5:50

it's pretty scary. We've

5:52

got a great question from Richard

5:54

in Northern Ireland who asks, certain

5:56

US politicians seem happy to disseminate

5:58

malicious AI content from any

6:00

source if it appears to discredit their

6:03

opponents and benefit themselves. Is there any

6:05

direct evidence that the major US political

6:07

parties are actively using these AI tools

6:10

to generate their own beneficial fake content?

6:12

If so, do they run any risk of

6:14

prosecution? So I actually help run

6:17

a think tank in Washington DC called

6:19

the Future US and we've been meeting

6:21

a lot with federal officials to

6:23

talk about the anticipated explosion of

6:26

deepfakes in the 2024 election

6:29

and there was a meeting that we

6:31

had, I won't say with what agency

6:33

because I don't want to call them

6:35

out, but we had a briefing where

6:38

we presented this presentation we've been using

6:40

a lot about the explosion of deepfakes

6:42

and we talked about a specific controversy

6:44

that happened in the United States of

6:46

a local teen who put out a

6:48

deepfake of someone streaking into high school.

6:51

And you know we went through

6:53

the details and you know the police arresting

6:56

the student and were the police in the

6:58

wrong or you know was the student in

7:00

the wrong and we get to the end

7:03

of the presentation and, after these officials had

7:05

expressed opinions about the case, we reveal

7:07

to them that the entirety

7:09

of the case was made up. The student

7:12

was fake, the headlines were fake, the controversy

7:14

was fake and we

7:16

said to those officials look we created

7:19

this story in 10 or 15 minutes

7:21

and we managed to fool you, the

7:23

people who are responsible for election security.

7:25

If you can be fooled, Americans

7:28

are going to be fooled and so

7:30

it's something we've been working with them

7:33

a lot on. There is legislation being

7:35

proposed right now in Congress to make

7:37

it a federal crime to

7:40

portray someone who's running

7:42

for elected office using

7:44

AI without their permission either

7:47

to fundraise or to affect the

7:49

vote. Now it's unlikely I think

7:51

that that legislation gets passed before

7:54

election day but there has been a flurry

7:56

of bills starting to pop up to deal

7:58

with this issue. And

8:01

I worry that something going wrong close

8:03

to the election is what's actually going

8:05

to give those needed reforms the

8:08

lift to get through Congress. Yeah,

8:10

and one of the things we've been thinking

8:12

about actually, because I imagine that there will

8:14

be lots more questions about AI and about

8:17

social media and algorithms and everything else, is

8:19

that I can, on Mondays at least,

8:21

on the Monday episode, become your designated conspiracy

8:23

theory agony aunt. If

8:25

there's something you want to ask us, whether it's about social

8:28

media or whether it's about maybe someone you

8:30

know who's fallen quite deep into conspiracy theories and

8:32

you want to know what to do about it,

8:34

you're worried about the election coming up and conversations

8:36

that you want to have, do reach out to

8:39

us and I'll attempt to provide you some

8:41

advice, if I can. Well, that's

8:43

going to keep you busy, Marianna. I'm looking forward to the

8:45

conspiracy theory clinic that you're going to be running. And

8:48

in the meantime, we have had

8:51

some really thoughtful engagement with an episode we

8:53

recorded a couple of weeks ago about the

8:55

fentanyl epidemic when we had the journalist Ben

8:57

Westhoff on as a guest. And Lisa's been

8:59

in touch with a question. She's

9:01

recommending to listeners in America another

9:03

Radio 4 programme called A Reckoning with

9:05

Drugs in Oregon, which of course is

9:07

where they've had a very progressive drugs

9:09

policy. And she's got this question. Given

9:12

the legalisation of cannabis in some states,

9:15

even though it can be

9:17

more carcinogenic than tobacco, causes psychosis, and increases

9:19

the chances of developing schizophrenia, why is

9:21

none of that talked about when the

9:24

subject of legalising cannabis is raised? Now,

9:27

Katie, on the wider topic of drugs, I know

9:29

you've got experience, professional and

9:32

personal, about this. Can you tell us

9:34

a little bit about that? My

9:37

whole career has been working in homeless services,

9:39

aside from that brief stint in politics, and

9:41

obviously substance abuse has a lot to do

9:43

with homelessness, although not as much as people

9:45

sometimes play it out as. But

9:47

it hit my own family very

9:50

personally. My brother had struggled with

9:52

addiction for several years. He

9:54

got clean, he went to rehab, we sent

9:56

him, you know, and it seemed

9:59

like things were really on the right track. But

10:01

then about two months after everything happened with

10:03

me and I resigned from Congress, my

10:07

mom was in the hospital having brain

10:09

surgery. He was staying with me and

10:11

long story short, he overdosed

10:14

downstairs on the couch and I

10:16

found him, did CPR. And

10:18

it turns out, of course, you don't find

10:20

out until later what happened, but he had

10:22

done some cocaine and the cocaine

10:25

was laced with fentanyl, which is becoming more and

10:27

more of a common problem. That was in January

10:29

of 2020, and situations like that

10:33

have become the norm. I

10:35

mean, it's happening to families across

10:38

the country, at staggering rates.

10:41

So obviously it's very personal to me. I

10:44

think that it is not an issue

10:46

that we have good answers for, period.

10:48

So you can see in the attempts

10:51

at legislation, whether it's in Oregon or

10:53

elsewhere, we don't know how

10:55

to solve it. And even as somebody who I would,

10:57

I mean, you know, I'm often considered

10:59

an expert in substance abuse issues and certainly

11:02

the organization I work for is, we

11:04

really focus on harm reduction, on trying to

11:06

prevent people from dying because that's right now

11:09

the biggest priority. We can't necessarily figure out

11:11

how to solve for it otherwise, but we

11:13

certainly know enough to help prevent those kinds

11:15

of deaths. That's a really,

11:18

really sad story Katie. I'm sorry about

11:20

that. But of course, as you say, it

11:23

is now the case that so

11:25

many families in America have that

11:27

experience that so many people know somebody

11:29

quite close to them who's either struggled

11:31

with opioid addiction or has

11:33

a family member or a close friend die.

11:36

I would say that Katie's story

11:38

unfortunately is all too

11:40

common, whether it's

11:42

illegal drug abuse or prescription drug abuse in

11:44

the United States and there

11:47

really is an awareness issue here.

11:49

Even cannabis, which is,

11:51

you know, considered not nearly

11:53

as hard of a drug and

11:56

is being of course widely legalized

11:58

across the United States, we're learning

12:00

more and more every year about some of

12:02

the downsides. I mean, one of the things,

12:05

there was another study recently that just showed

12:07

a tight correlation between cannabis

12:09

use and early stage dementia. So,

12:11

you know, and I will say

12:14

personally, I'm in favor of relaxing

12:16

federal laws around substances in the

12:18

United States. I think it's a

12:20

personal responsibility issue. But

12:22

if we're going to entrust people to

12:24

use things responsibly, they

12:27

really need to be aware

12:29

of the implications. And

12:31

what I worry about with cannabis

12:33

legalization is that people think that

12:36

because something is legalized, that means

12:38

it just inherently is safe. But

12:41

we know that not to be true. I mean, look at alcohol. We

12:45

know alcohol, when people binge drink,

12:47

is not safe. So

12:49

just because it's legal doesn't mean it can

12:52

be used safely. And that's something I think that we're

12:54

going to spend a lot of time working on

12:56

here in the United States once full

12:59

legalization has happened, which seems likely eventually

13:01

at the federal level on cannabis, there's going

13:03

to be a lot of public education to

13:05

do. And we can see from alcohol, it's

13:07

taken decades to get people to understand the

13:10

health ramifications. Two things when

13:12

we're talking about cannabis. One is that

13:14

if it's regulated properly and the enforcement

13:16

occurs properly, then the shops

13:19

that sell it, that are licensed to sell it,

13:21

you know, there's a certain amount of clarity

13:24

that there's not going to be other

13:26

substances in it. Right. But

13:28

if there's an underground market, if there's a black market, which

13:30

has not been completely eradicated by any means,

13:33

then you don't know. You don't have nearly those

13:35

kinds of guarantees. So

13:37

I'm also very much in favor of regulating

13:40

and legalizing it. And

13:43

frankly, I think we should be looking at that further

13:45

for other substances as well, not to

13:47

suggest that it's healthy by any means, but

13:49

because we have the ability to actually

13:51

ensure some degree of safety. Also,

13:55

there's been a real

13:57

limitation on us studying the

13:59

negative effects of cannabis because the

14:01

federal regulations have it classified

14:04

as a Schedule I drug, which means that there

14:06

have been policies and regulations on research

14:08

into the health harms of cannabis. And

14:10

I think in, you know,

14:13

as we march towards a broader

14:15

acceptance of it and eventually at the

14:17

federal level, the full decriminalization, hopefully

14:20

those restrictions will be lifted and we can

14:22

have better research and, in the same way

14:24

that we do with tobacco, have those kinds of

14:26

warnings so that people are informed when they

14:29

decide to use substances. Thanks so

14:31

much for sharing that, Katie. It's really moving to

14:33

hear that about your brother. And I'm sure, you

14:35

know, some people listening will be able to really

14:38

relate to what you've said there. I'd really

14:40

recommend people go back and listen to that fentanyl episode

14:42

because I also think it really helps you understand what's

14:44

going on right now and how many people are being

14:46

harmed and affected by what's unfolding in

14:48

the States in particular. On to another set

14:50

of questions on a different topic now. And

14:52

something I get asked a lot is about

14:54

foreign influence operations and the way they could

14:56

affect the election in 2024. And on that

15:00

note, we have this question from Caroline.

15:02

I'm currently reading the book, A Very

15:04

Stable Genius. And I've

15:06

been struck by how much of

15:08

the Trump presidency was dominated by

15:10

investigations into Russian interference in the

15:13

2016 election. What

15:16

measures have been put in place since to

15:18

prevent such interference and how much

15:20

will it be a real or a perceived issue

15:22

in 2024? Thanks. Miles,

15:25

do you want to jump in first? Yeah, I'm

15:27

happy to jump into this. I spent a

15:29

lot of time in the trenches on this

15:31

issue. When the Russians first

15:34

started to interfere in US elections in 2016,

15:36

I was on the receiving end

15:38

of the classified briefings from the Obama

15:41

administration as there was a flurry of

15:43

activity to try to figure out what

15:45

to do. We had never experienced that

15:48

level of interference in the modern age.

15:51

The good news is in the wake of

15:53

Russia's interference in 2016, a whole range

15:56

of things were implemented to increase security across

15:59

the United States. So

16:01

I then went over to the Department of Homeland

16:03

Security and we stood up a range of election

16:05

task forces to do everything

16:07

from sharing intelligence more quickly

16:10

with different agencies and releasing information

16:12

to the public about interference all

16:15

the way down to monitoring the

16:17

cyber networks of election districts around

16:19

the country. In fact, more than

16:21

99% of

16:24

voting districts around the country now

16:26

have essentially federal tripwire monitoring so

16:28

that if foreign adversaries are trying

16:30

to meddle in those networks or do

16:34

something like mess with vote tallies, we

16:36

can see it as it's happening in

16:38

real time. Really just extensive work.
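As a loose illustration of what "tripwire"-style monitoring means in practice, here is a minimal, hypothetical Python sketch: record a cryptographic fingerprint of the files you care about, then alert the moment anything changes unexpectedly. This is emphatically not the federal system Miles is describing, whose details are not given here; the filename and alert function below are invented, and only the general integrity-checking idea is real.

import hashlib
from pathlib import Path

def baseline(paths):
    # Record a SHA-256 fingerprint of each monitored file.
    return {p: hashlib.sha256(Path(p).read_bytes()).hexdigest() for p in paths}

def check(paths, known_good):
    # Return the files whose contents no longer match the baseline.
    tampered = []
    for p in paths:
        digest = hashlib.sha256(Path(p).read_bytes()).hexdigest()
        if digest != known_good.get(p):
            tampered.append(p)  # the tripwire fires
    return tampered

# Hypothetical usage: snapshot at certification time, re-check on a schedule.
files = ["precinct_12_tally.csv"]      # invented filename, for illustration
# known_good = baseline(files)
# if check(files, known_good):
#     alert_election_officials()       # invented function: send the real alert

Real deployments layer network sensors and intrusion detection on top, but the principle is the same: record the known-good state, and notice the moment it drifts.

Here's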

16:41

the catch. The thing that we did

16:43

not anticipate in 2020 is that the

16:45

biggest threat to our elections wouldn't be

16:47

the Russians. It would come from the

16:49

inside. It would come from inside the

16:51

House and it would be the president

16:53

himself. That is something that the

16:55

system did not contemplate. Fast

16:58

forward to 2024 and putting domestic

17:00

politics aside and whatever Donald Trump may or

17:02

may not do to question

17:04

the integrity of the votes, I

17:06

do think we are better positioned

17:08

than we were before against foreign

17:10

interference from Russia or other governments.

17:12

However, the thing that we talked

17:15

about earlier today about artificial intelligence

17:17

and deepfakes in our democracy,

17:19

that is the big question mark.

17:21

I travel all around the country

17:23

meeting with federal, state, and local

17:25

officials about election security. This is

17:27

the thing they are most panicked

17:29

about and they feel most

17:31

unprepared to deal with: they

17:34

don't know what to do when

17:36

the deepfakes start coming, whether

17:38

they are interactive robocalls pretending

17:40

to be polling places, telling people

17:42

don't show up because there's attacks

17:44

and there's physical security threats, or

17:46

whether they are images, for instance,

17:48

of people allegedly destroying

17:50

ballots that could lead to controversies

17:53

over something that didn't actually happen.

17:56

These sort of episodes are what

17:58

election officials are fearing. And right

18:00

now, federal agencies are really scrambling

18:02

to figure out how they're going to deal with

18:04

it. One of the things

18:06

that we have to remember is that

18:09

their whole effort, right,

18:11

at interfering in elections has

18:13

been to just spread misinformation

18:15

and disinformation. And there is no,

18:18

in my opinion, there have been no

18:20

real regulatory decisions or

18:23

changes that can make that

18:25

less possible, especially when you talk about how

18:27

X is now, you know, it's

18:29

the entire conversation that happened with Don Lemon, right? We

18:32

don't have an actor in place that is

18:35

going to do any form of moderation. And

18:37

that moderation is exactly what we have

18:39

to rely on to reduce the spread

18:41

of misinformation and disinformation. So

18:44

whether it's AI, whether it's generated images,

18:47

just like Miles was suggesting, or if

18:49

it's just the boosting of false information,

18:51

it's going to be an issue. And

18:54

again, like I said, I'm very concerned

18:56

about it for young people as well.

18:59

What I think we're looking at heading

19:01

into this election is a combination

19:03

of what you described in both 2016 and

19:05

2020, which is foreign

19:07

influence operations in their 2016 form,

19:10

you know, bot networks that are automated.

19:13

And if we're honest with ourselves, quite easy

19:15

to spot now and less influential.
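Why are automated bot networks "quite easy to spot"? Largely because automation leaves statistical fingerprints. A crude, hypothetical scoring heuristic might look like the Python sketch below; the thresholds are made up for illustration, and real platform detection uses far richer signals.

def bot_score(account):
    # Each automation fingerprint adds to the score; thresholds are invented.
    score = 0
    if account["posts_per_day"] > 100:         # superhuman posting rate
        score += 2
    if account["account_age_days"] < 30:       # freshly created account
        score += 1
    if account["duplicate_post_ratio"] > 0.8:  # mostly copy-pasted content
        score += 2
    if not account["has_profile_photo"]:       # default-avatar account
        score += 1
    return score

suspect = {"posts_per_day": 400, "account_age_days": 7,
           "duplicate_post_ratio": 0.95, "has_profile_photo": False}
print(bot_score(suspect))  # 6 -> flag for review

None of those fingerprints apply to a genuine account run by a true believer, which is the harder problem turned to next.

But what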

19:17

is effective is using real people, real people

19:19

who believe this stuff, perhaps even start these

19:21

conspiracy theories or spread this content, or who

19:23

at least are vulnerable to sharing this kind

19:25

of content if it's, you know, sent to

19:27

them via message and someone says, oh, you

19:29

might want to post this meme, this looks

19:31

great. Or you might want to post this

19:33

audio clip, have you heard it? It's those real

19:35

people that I think could become key vectors of

19:38

information. We have to think about this kind of

19:40

influencer economy now on social media. You know, there

19:42

are people who are very good at building very

19:44

active online audiences, mainly who agree with what they're

19:47

saying, but sometimes who disagree, too. And it means

19:49

that they are the perfect kinds of people. If

19:51

you were running a foreign influence operation or just

19:53

an influence operation anywhere, you'd want to tap those

19:56

people up and encourage them to share content because

19:58

it becomes a lot

20:00

harder to figure out where that content has come

20:02

from. It's not necessarily, you know, Bob 737 who's

20:04

posting lots of things that make you think, oh,

20:07

hang on, I don't think this is a real

20:09

person. It's someone who's convincing and can persuade an

20:11

audience. And I think for me, that's definitely what

20:13

I'm sort of looking out for. And it's really

20:15

hard to look out for, both for everyone else and for ourselves,

20:17

because real people don't tend to want to admit

20:20

that they've perhaps shared stuff that was

20:22

sent to them by different parties. And

20:24

of course, it all comes at a

20:27

time when confidence in American elections is

20:29

quite low. We know really large numbers

20:31

of the electorate actually believe Donald Trump's

20:33

completely false claim that the 2020 election

20:35

was stolen and that he won more

20:37

votes than Joe Biden did. So,

20:40

you know, you're trying to tell people

20:42

already, don't believe all of the stuff that

20:44

even the main candidates tell you, but also be

20:46

aware to look out for lies

20:48

or AI generated propaganda that's coming at you

20:50

through your social media feed. Just, you know,

20:53

this is going to be the first presidential

20:55

election that's been held since we

20:57

had one in which the result was disputed

20:59

by one of the two candidates. And so

21:02

it's being held at a strange time anyway.

21:04

And if there's going to be a lot

21:06

of misinformation coming either from inside America or

21:09

from foreign actors, it's a

21:11

dangerous time in terms of people's confidence in

21:13

the vote. Unfortunately, that is all we've

21:15

got time for today. But particularly if you're

21:17

an Americast listener based here in the UK, you

21:19

can actually hear our takeover of Five Live

21:22

this coming Wednesday with Naga Munchetty, which

21:24

is very exciting, which means you can call

21:26

in live to ask us a question. You

21:28

don't just have to send us a voicemail

21:30

or something in writing and you can reach

21:32

Five Live by text on 85058. You

21:35

can call or WhatsApp on 08085909693 or

21:40

you can tag them on social media via

21:42

at BBC Five Live. And of

21:44

course, we'll be back next Monday with more

21:47

Americanswers. So do please get in touch

21:49

with those questions. More numbers coming, I'm afraid: you

21:51

can WhatsApp Americast on plus 44 330

21:54

123 9480. Email us, Americast at BBC, and

22:01

we're hashtag Americast on social media and

22:03

discord. Remember you'll always hear

22:05

Americast first and in full as a podcast on

22:08

BBC Sounds. But until then, we'll

22:10

see you later. Bye-bye. Bye

22:13

friends. Bye-bye. Americast,

22:17

from BBC News. Thanks

22:19

for listening to Americast, from BBC News.

22:22

You can subscribe to this podcast on

22:24

the free BBC Sounds app, which is

22:26

now available worldwide. Selling

22:35

a little or a lot?

22:38

Shopify helps

22:40

you do your thing, however you cha-ching.

22:42

Shopify is the global commerce platform that

22:45

helps you sell at every stage of

22:47

your business. From the launch your online

22:49

shop stage to the first real-life store

22:51

stage, all the way to the did

22:54

we just hit a million orders stage,

22:56

Shopify is there to help you grow.

22:58

Shopify helps you turn browsers into buyers

23:00

with the Internet's best converting checkout: 36%

23:03

better on average compared

23:05

to other leading commerce platforms. Because

23:07

businesses that grow, grow with Shopify.

23:09

Get a $1 per

23:11

month trial period at shopify.com

23:15

slash work.


From The Podcast

Americast

The authoritative twice-weekly US news and politics podcast from BBC News, Americast investigates the social and cultural issues that define America today. Is Joe Biden too old to win another go in the White House? What does Donald Trump's latest criminal charge mean for the Republican campaign? And why have issues such as LGBT rights, global warming and the war effort in Ukraine become so divisive across the US political spectrum? From foreign policy to pop culture, Americast keeps you up to date and in the know about the stories that matter, with on-the-ground insights from right across the US.

Americast is hosted by trusted journalists including the BBC's North America editor Sarah Smith, North America correspondent Anthony Zurcher, presenter Justin Webb, and disinformation and social media correspondent Marianna Spring. Joined by special guests each week, such as former chief medical adviser to the president Dr Anthony Fauci, former FBI director James Comey, CNN anchor and author Jake Tapper, Succession actress J Smith-Cameron, and Suruthi Bala and Hannah Maguire from podcast RedHanded, they look at America through an international lens, trying to make sense of the increasingly polarised political debate.

Each week on Americast, Marianna Spring also brings listeners the latest update on BBC Undercover Voters, the award-winning investigation into the content that is recommended to US voters on social media. Marianna has created undercover voters – multiple social media accounts belonging to different characters who sit across the US political divide. By tracking the content that is pushed at each of them, this investigation will cover a turbulent time for US politics, with speculation over a Trump bid for the presidency and Biden facing domestic and international challenges.

GET IN TOUCH:
  • Join our online community: https://discord.gg/qSrxqNcmRB
  • Send us a message or voice note via WhatsApp to +44 330 123 9480
  • Email [email protected]
  • Or use #Americast

Find out more about our award-winning "undercover voters" here: bbc.in/3lFddSF.
