What's Going On with the US TikTok Ban?

Released Wednesday, 20th March 2024

Episode Transcript

0:05

Hey, this is Annie and Samantha and welcome to

0:07

Stuff Mom Never Told You,

0:07

a production of iHeartRadio, and

0:18

today we are once again thrilled to be joined

0:20

by the amazing, the astounding, the

0:23

astonishing Bridget Todd.

0:26

Welcome back, Bridget.

0:27

We love... Thank you for having

0:29

me. I am excited to be back.

0:32

Yes, we are as always

0:34

glad to have you back. We

0:36

were talking about this a little off mic before.

0:39

But happy belated birthday.

0:40

Thank you?

0:42

Yes, yes, Did

0:44

you do anything fun or how

0:46

have you been in general?

0:48

I have been good. If my voice sounds a

0:50

little bit raspy, it's because my friends and I did

0:52

birthday karaoke over the

0:54

weekend and there was lots

0:56

of screaming and singing. And

0:58

I sometimes get this way where I

1:01

don't do a lot of karaoke, but I love karaoke and

1:03

I'm that person who, after a few drinks

1:06

insists on singing a ballad. And

1:08

my friends got lucky because the ballad that I

1:10

wanted to do they did not have. So yeah,

1:14

friends one, Bridget zero.

1:17

Wait, so what is your go to karaoke song?

1:19

This says a lot about you.

1:20

Oh God, I think so. It depends.

1:23

For a ballad, I wanted to do an

1:25

Amy Winehouse song. I wanted to do 'Me

1:27

and Mr Jones,' which is a

1:30

favorite of mine, which they didn't have, which

1:32

I feel like is like practically a hate crime

1:34

to not have a robust Amy Winehouse

1:37

selection at karaoke.

1:38

Yeah, how dare you?

1:40

What are your go to songs?

1:43

I am a nineties like,

1:45

I'm the gen X millennial person who has

1:47

to hit my nineties hits so

1:50

that it will include TLC Destiny's

1:53

Child if they have Wilson

1:55

Phillips, which I there was a TikTok video about

1:57

if you want to get

2:00

Gen X slash millennials to stop arguing

2:02

or stop in their tracks, do a Wilson

2:04

Phillips 'Hold On.' And I'm like, well, that's rude

2:07

and right, but it's okay. I

2:12

didn't need that because I'm also very

2:14

aware of as much as I would love to do a Whitney Houston

2:16

track, and we usually do like dance with somebody,

2:18

'I Wanna Dance with Somebody.' I know her

2:21

range is way out of my league.

2:23

Although past host

2:25

Caroline, she does an amazing rendition

2:28

of that song.

2:29

I believe it, actually I can see it.

2:31

Yeah, yeah, but yeah, but I'm I'm that girl that

2:33

I love all of those. Then you gotta

2:36

throw some Fiona Apple in there. You know it

2:38

then some Backstreet Boys or, for me,

2:40

NSYNC. I feel like those are my go-tos.

2:44

Sorry, I'm Backstreet

2:46

Boys. Yeah, I know, I know.

2:49

Yeah, I'm kind of embarrassed.

2:51

My mind's different.

2:54

I love it.

2:55

I have a very varied

2:58

range of karaoke songs. I like to do the Mulan

3:02

man song. That's a good

3:04

one.

3:05

I will say. That's like all the producers,

3:08

like our producers, producer Andrew, producer

3:11

Dylan, like, they will get together and

3:14

Ramsey will get together and do a whole rendition

3:17

of that song.

3:18

It was quite fabulous. They do a great job.

3:21

I think at a certain point,

3:24

I'm not very good at karaoke. I love it,

3:26

but I'm not very good. And at a certain

3:28

point I started thinking of like, what are songs

3:30

I know people will sing, and those

3:33

are the songs that I request Now, So I do like the

3:35

Pokemon song, which I loved

3:39

Pokémon, but, like, people

3:41

love it and I know the lyrics are easy enough.

3:44

My big one is I do Ludacris's 'Get Back.'

3:48

Of course from Georgia.

3:49

People like are like you have to do

3:52

it before karaoke is over,

3:54

like that's the one I'm known for.

3:57

But then, I just like so many

3:59

songs I want to sing, like sad songs

4:01

or they're long songs, and

4:04

I don't think those are good karaoke songs.

4:06

They're not good karaoke songs. But

4:08

true friends will let you do a ballad or two

4:10

like that's like, yes, I agree

4:13

that good fun karaoke to watch

4:15

is karaoke where it's a silly, fun

4:17

song, fast paced, everybody

4:19

is singing along. But I don't know, like

4:21

you should be able to have your little moment where

4:24

you demand silence so you can sing

4:26

Amy Winehouse.

4:28

Yes. 'I'm No

4:30

Good.' Amy Winehouse is one of my favorites

4:32

as well.

4:33

That's a good one. That's a really good long one.

4:36

Sometimes a country song is good.

4:38

My partner and I had practiced. We wanted

4:40

to practice.

4:43

We promised we wanted to do a

4:45

duet because it was actually technically both of

4:47

our birthdays we have the same birthday, and so

4:50

right we were gonna do Love Shack.

4:52

We listened to Love Shack ten times.

4:55

We had come up with like dance

4:57

moves and I don't know if you know that song,

4:59

but there are like different vocal

5:02

parts and we had really practiced, and then they didn't have it

5:04

after all that.

5:05

What okay?

5:06

First of all, hello University of Georgia.

5:09

B-52's, dude. Georgia.

5:11

Those are my people. Like, why are you saying that? Or are

5:13

you?

5:14

How dare you? Why would I not?

5:17

But yeah, those are challenging songs because

5:19

their vocals are pretty intense, and the

5:21

fact that they didn't have it there and you practice, all

5:23

right, can we just... do you have... like, you should just

5:25

do it online. Do it for

5:28

a TikTok video because there's lots

5:30

of karaoke apps out

5:33

there, or if you're like Annie, you can buy your own

5:37

machine.

5:38

Yes, I do. And it has lights.

5:40

It has, like... it's really embarrassing,

5:42

but it has like the button you press and it goes.

5:48

House.

5:49

When we come to Washington, D.C.

5:51

Oh, Yeah, we're

5:53

gonna do it.

5:54

Is it big?

5:55

No?

5:56

No?

5:58

Yeah. Annie can

6:00

see it, but you can't see it, listeners.

6:02

But it's like a smallish

6:04

box.

6:05

Yeah, okay, we're doing this.

6:07

I'm so excited to get this done. She's going to carry

6:09

this on a plane. But if

6:11

we're doing it for TikTok, we're gonna need to do that very quickly

6:14

because... ah.

6:15

Oh yeah, this is a great

6:18

segue, masterful segue. Samantha

6:21

did it, and that is I mean, I feel like,

6:23

I think we might have talked about this before, but I feel

6:25

like we have to get back into it because

6:27

there's so much going on, and that is what's happening

6:29

with a potential maybe

6:32

sort of TikTok ban in the United

6:34

States.

6:34

Have you all been following this? Oh, I'm

6:36

glad you brought this up because I was actually going to do this as a

6:39

Monday mini in this conversation

6:41

because this is going to affect a lot of marginalized

6:44

people and we have to talk about

6:46

it because this is also

6:49

this is very controversial. But I'm

6:51

also confused because I don't know where to

6:53

side because there's so many people on the

6:55

spectrum for and against that I'm

6:57

surprised by.

6:59

Yeah, this is one of those issues where

7:01

you have very strange bedfellows,

7:03

Like it's bipartisan legislation and

7:06

so, which, you know, for any

7:08

kind of topic, to have folks

7:11

on either side of the aisle agreeing

7:13

right unanimously is a little bit unusual.

7:16

Some of the people who are coming out against it, in

7:18

my opinion, are not necessarily

7:21

who you would imagine. So I agree,

7:23

And I guess all of that is to say, I

7:25

had a very difficult time figuring

7:28

out where I stood on this legislation. Wasn't

7:30

until pretty recently that I was like, Okay, I feel like I

7:32

have my opinion made up, but it took me

7:34

a while to get there, exactly because of

7:36

what you described, Sam.

7:38

Right, Yeah, I

7:40

feel like we should

7:42

we should get into it, because again I don't really use TikTok,

7:45

but I did follow it because I

7:47

was curious about what would happen, Like, oh,

7:50

if this happens, yeah, then

7:52

what's what's next.

7:54

It's going to set a big precedent. It talks about

7:56

who has power over what, who

7:58

is actually getting the information that

8:01

people are saying they're getting, and for

8:03

me, for me and you, bridget who are

8:05

actually on TikTok. Are we seeing

8:08

an algorithm that is biased, that

8:10

TikTok is trying to get us on

8:12

their side? Like, there's so many questions here.

8:14

These are all great questions, so let's

8:16

get into it. So basically, we're

8:19

calling it a TikTok ban, which is not like

8:22

incorrect, but it's not technically

8:24

accurate. So this legislation that we've seen

8:26

moving through the House in

8:28

the last few weeks is called Protecting

8:30

Americans from Foreign Adversary Controlled

8:33

Applications Act, and it wouldn't technically

8:36

ban TikTok specifically. Rather,

8:38

this legislation, if passed, would block

8:40

social media apps that are quote

8:43

controlled by a foreign adversary

8:45

in the United States. Right now, TikTok

8:48

is owned by the Chinese based parent company

8:50

ByteDance. So when it comes to TikTok,

8:52

what that would mean is that ByteDance would be

8:55

forced to sell TikTok to an American

8:57

owner in order to operate in the

8:59

US. If this legislation were to become law.

9:02

That is kind of a tricky thing.

9:04

TikTok CEO Shou Zi Chew

9:07

has repeatedly said, we're not going to do

9:09

that, We're not interested in selling. Will never happen,

9:12

And if this legislation

9:14

were to become law, they would have to

9:16

do this in about six months, which is not

9:18

a ton of time. I'm certain this would

9:21

be a very complicated transfer

9:23

of power. You know, we're talking about potentially

9:25

like a multi billion dollar deal here,

9:28

and so forcing TikTok

9:30

to do that in six months is

9:33

it's like a really tall order, so

9:35

much so that it is essentially

9:37

a TikTok ban, even if it's not a

9:39

ban in a technical sense.

9:52

So I recently watched videos from

9:54

several different TikTok creators,

9:56

as well as the press conference that

9:58

the Chinese leaders did. I

10:01

don't know if you saw this, essentially saying it's

10:03

actually written in the contract that Byte

10:05

Dance cannot, by law,

10:07

sell away from

10:09

China, like they that was always a part

10:12

of the contract, that's always been there, and

10:14

to the point that the government,

10:16

our government knew that. So

10:19

it makes me... and everybody's accusing. So this

10:21

is an actual ban without them saying 'ban,'

10:23

trying not to be the bad guys. Exactly.

10:26

So I keep seeing people being like, well, it's not a ban,

10:28

it's not a ban. Okay. Maybe

10:30

it doesn't use the word ban, maybe it's not

10:32

a ban in the most, like, hyper-technical

10:35

sense, but functionally

10:37

this is a ban, right? Like, I completely

10:40

agree with you. And even that

10:42

alone is one of these things that kind of

10:44

makes me wonder about how this legislation came

10:46

together. You know the

10:49

fact that they are going out

10:51

of their way to have a carve out

10:53

to say it's not really a ban when

10:56

functionally it basically is.

10:58

Right, literally saying these are

11:00

your options to sell or be

11:03

banned, and then the actual

11:05

company being like, well, no, it is in our contract that

11:07

we can't sell.

11:09

So and you knew this.

11:10

You were putting us between, you know, a rock

11:13

and a hard place in that situation of like, no, we're just

11:15

going to be out. And the Chinese know

11:17

that the US,

11:19

as many users as there are, are not the only users

11:22

exactly.

11:24

So that actually brings up like a pretty

11:26

clear next question of whether

11:29

or not there is someone

11:31

who would buy this, right? Like, TikTok.

11:34

It would probably be a multi-billion-

11:36

dollar sale to acquire

11:38

TikTok, and there's not really a lot of

11:40

people who have emerged who are

11:43

you know, in a position to buy this.

11:46

However, former Trump administration

11:48

Treasury Secretary Steve Mnuchin said

11:51

last week that he's actually putting together

11:53

an investor group to buy

11:55

TikTok. So, yeah, a

11:57

former Trump administration official

12:00

could own it. Now, to put that into a

12:02

little bit of context. That would mean that Elon

12:04

Musk owns Twitter, Mnuchin

12:07

would own TikTok, and that

12:09

would really be a huge change

12:11

in our social media landscape. In

12:13

my opinion, I don't think that would be

12:15

a change for the better necessarily.

12:18

Right, we already know what's happening with misinformation

12:20

on Facebook and who dictates

12:23

all of that, which is very

12:26

concerning in itself. So that means

12:29

all of... I mean, I will say,

12:31

Someone said, and this is a whole different conversation

12:34

that Reddit is becoming the new safe space.

12:36

Oh that's so interesting because I've actually

12:38

found myself. I was like a Reddit power

12:41

user back in the day, and then

12:44

I stopped using it so much. I started using

12:46

Twitter and Instagram more. Now

12:48

I'm back on Reddit, and there's something about

12:50

it because like, for me, I

12:53

really need to have a reliable

12:55

text based social media platform

12:57

as much as I like TikTok, you

13:00

know, I need to. I need to. I'm I'm a words

13:02

person, Like I like to read words. I like

13:04

to write words. Being on

13:06

camera on video, which is a different thing.

13:09

I like watching other people be on video and

13:11

be on camera. But I'm a words

13:13

person and I think, like so,

13:15

Reddit has kind of re emerged as

13:17

a place where I'm spending more and more of my time

13:19

and it's actually feeling pretty good.

13:21

Yeah, and I don't think you're the only one. Like I said,

13:23

people are calling that kind of the safe space because if

13:25

they're actually well monitored and

13:28

you're going to uh, you know, the subreddit

13:30

that is specifically for you, that it has been

13:33

well moderated in

13:35

a fair and, like, objective manner,

13:38

and people are really, like, trusting that

13:40

system more now because they're

13:42

worried about TikTok. But

13:44

back to like we need to talk about what exactly

13:47

is this legislation

13:49

that they're trying to pull.

13:50

So it passed the House last week

13:53

and now it goes to the Senate. President

13:55

Biden has already said that if it passes

13:57

the Senate, he will sign it into

14:00

law. White House Press Secretary Karine

14:02

Jean-Pierre said that the Biden administration is

14:04

really looking to the Senate to act quickly on

14:07

this legislation, and so like it could

14:09

be something that moves fast. However,

14:11

I will say that as quickly

14:14

as this legislation passed the House,

14:16

passing the Senate is looking like it

14:18

might be a little bit more complicated. A

14:20

bipartisan group of senators is already

14:23

sort of signaling that they would not necessarily

14:25

be down for voting for a TikTok ban. For

14:27

instance, Senator Rand Paul described

14:30

the bill as a quote draconian

14:32

measure that stifles free expression, tramples

14:34

constitutional rights, and disrupts the economic

14:37

pursuits of millions of Americans. Senator

14:39

Ed Markey, a Democrat from Massachusetts,

14:41

argued that the privacy issues that the bill is

14:44

meant to be, you know, addressing, exist

14:46

far beyond TikTok. So it does kind

14:48

of seem like it might not be

14:50

the kind of thing that is quickly and unanimously

14:53

passed in the Senate. There might be some

14:56

pushback there.

14:58

Yeah, from what I understood, Chuck Schumer,

15:01

who is the leader in the Senate,

15:03

was saying that he is in no hurry

15:06

to even look at this bill because

15:09

they have been kind of inundated with

15:11

a lot of opposition. Again, and this is one of those

15:13

moments when, you said Rand Paul... Rand Paul and

15:15

Marjorie Taylor Greene all opposing this makes me

15:17

think that I'm on the wrong side

15:20

if I sort of agree with them.

15:22

Yeah, I mean listen, so

15:25

you're exactly right. Chuck Schumer

15:28

didn't really have anything to say about the legislation

15:30

and the way that he said it... I forget how he said it exactly,

15:32

but like to me, it signaled I

15:35

am not taking a position on this, and I'm in no

15:37

hurry to make this a priority. That's what it

15:39

looks like to me. You're so right that

15:43

the people who are for and against

15:46

it is so strange, because usually it's like whatever

15:48

Marjorie Taylor Greene is for, I am probably against

15:50

and vice versa. Right, But you know, even

15:52

I think last week former President

15:55

Trump came out against

15:57

a TikTok ban, which is a reversal

15:59

of previous positions, right, So

16:01

it's like it is very difficult to

16:03

figure out where different elected officials

16:06

stand on this, and it's

16:08

difficult to use that as a barometer for

16:10

where, as individuals, folks

16:13

listening should stand.

16:14

I think there's so many conversations because we've already

16:16

talked about the original ban

16:19

that we were talking about a long time ago, with Shou

16:21

Chew coming in front of the Senate testifying,

16:24

and then the xenophobic comments that happened

16:26

time after time and time after time, and

16:29

this is a whole different conversation that I have to have

16:31

with myself as an Asian person because

16:34

we have talked, we haven't talked much about the

16:36

fact that there's a lot

16:38

of misinformation and there's a

16:40

lot of kind of conspiracy theories

16:42

that echo the US's xenophobic

16:45

tactics from the past, and that

16:47

communism is not good. Don't

16:49

get me wrong in all of this, but the

16:52

level of vilifying

16:55

and condemning of the

16:58

places like China and North Korea again

17:00

not good. But a part of that conversation

17:02

is because they're not Christian enough. That

17:05

has been a part of that too. That makes me wonder

17:07

how much of this is

17:09

a colonized idea of

17:11

what they expected China and

17:14

these other countries to become and when they

17:16

are in power, because they are not Christian

17:18

nations? Is

17:20

that part of what they see as evil

17:23

about these places then being successful?

17:25

I mean, you really can't have a conversation

17:28

about the way this legislation has been moving

17:31

without talking about xenophobia, and

17:33

I wish that wasn't the case, but it

17:35

is American

17:37

elected officials who are making

17:40

that the case. Like I wish we could have this conversation

17:42

about the merits of this legislation. However,

17:45

when you have elected officials

17:48

in hearings making blatantly

17:51

xenophobic and anti Asian comments

17:55

to TikTok's CEO,

17:57

you have to talk about it like it is. It is just inherently

18:00

hurt the conversation. It makes it so that

18:02

you cannot talk about the legislation

18:04

on its own without talking about the sort

18:06

of like rhetoric that you're describing here,

18:09

and it really does us a disservice.

18:11

I think, because the American people

18:13

deserve to see the

18:15

merits of legislation you

18:17

know discussed. I think

18:20

that any person who is a CEO of the company

18:22

as big and as influential as TikTok should

18:24

be comfortable being grilled in front of elected

18:26

officials, and the American public should have a sense

18:29

of, like, what they're thinking about. However,

18:32

when you retreat

18:34

into these harmful anti

18:37

Asian xenophobic stereotypes

18:40

and claims and theories, you

18:42

are doing everybody a disservice because we don't get

18:44

that right. We're talking about the xenophobia, and

18:46

we have to talk about the xenophobia because you brought

18:48

it up, you injected it into the conversation, does

18:50

that make sense?

18:51

The entire time. The entire time. And calling them

18:53

the enemy state. Again, don't get me

18:55

wrong, I know that the history is complicated.

18:58

The Chinese government has done a lot of bad things.

19:00

The US government has done a lot of bad things. And

19:02

then this is like the back and forth of

19:04

like what is happening? But then also

19:07

it kind of comes down to the fact that again

19:09

maybe because I do enjoy TikTok and I

19:12

do go through and like love

19:14

my dog videos, I love my cute little animals.

19:16

I saw otters, they're adorable, I want

19:18

one. But, like, all these things, and

19:20

making it seem like if they shut

19:23

down TikTok, it's like

19:25

that is shutting down the voice of

19:27

the people. And it's kind of interesting

19:29

to see what TikTok is doing and

19:31

trying to respond to the

19:34

government's, like, us-against-them, like,

19:37

seemingly. Totally.

19:38

So some of y'all might have gotten like

19:41

big alerts when you tried to open TikTok the last

19:43

few days, urging you to call your member

19:45

of Congress and tell them that you do not want

19:47

a TikTok ban. I actually didn't get one. Maybe

19:50

maybe they know that I'm in DC, and like there's

19:52

not a member of Congress that I could call. I could call Eleanor

19:54

Holmes Norton, but she doesn't get a vote, So like there's

19:57

really... maybe they're like, we're not gonna waste our

19:59

time targeting Bridget with this ad.

20:01

But the alert reads: Stop

20:03

a TikTok shutdown. Congress is planning

20:05

a total ban of TikTok. Speak up now before

20:07

your government strips one hundred and seventy

20:09

million Americans of their constitutional right to

20:11

free expression. This will damage millions of

20:13

businesses, destroy the livelihoods of countless

20:16

creators across the country, and deny artists

20:18

an audience. Let Congress know what TikTok

20:21

means to you. Over the weekend, they also did

20:23

a push notification that folks might have gotten to their

20:25

phones pretty much saying the same thing.

20:28

It honestly does sound like this is working. Congressional

20:30

staffers say that their phones were like

20:32

ringing off the hook. One house GOP

20:35

staffer told Politico that it was so, so bad:

20:37

Our phones have not stopped ringing. They're teenagers

20:40

and old people, saying they spend their whole day

20:42

on the app and we cannot take it away.

20:44

You know, one creator was talking

20:46

about the fact that they actually said there was a lot of millennial

20:48

women, a lot of millennial women

20:51

calling in. They were kind of surprised how

20:53

many millennial women, and how many middle-

20:55

aged and millennial women were calling in talking

20:57

about this was going to affect their business.

21:00

Yeah. Actually, Essence just put

21:02

out a really really interesting article

21:04

about how many specifically black

21:07

women entrepreneurs

21:09

and creators were really using TikTok

21:11

to promote their businesses

21:14

and, like, find a foothold in ways

21:16

that they were unable to do on other platforms,

21:18

and how much a TikTok ban was

21:20

concerning to them and their livelihoods economically,

21:23

and so that doesn't surprise me at

21:25

all. I also wanted to make one other comment

21:27

about that, which is that I think that TikTok

21:30

has a reputation of being like a young

21:32

person's app, and certainly it has a

21:34

lot of young users. However, the average

21:36

TikTok user's age is thirty,

21:39

right, and so that's like I would call a thirty

21:41

year old young, but like a thirty

21:44

year old votes, you know, like these

21:46

are people who, you know, we're

21:48

not talking about a children's

21:51

app. And I think that perception is really

21:53

not correct, and I think that's the perception that people

21:56

have. Right, which

21:58

is interesting, because I

22:00

know the original idea of TikTok was

22:02

younger users, much younger. It was

22:04

like the dance app, which I

22:07

like dancing. I want to see people dance.

22:09

I can't do those moves, but I want to see them.

22:12

But it was it was interesting, and I know that

22:14

that's the big conversation of like who is

22:17

using it, who this is affecting, and who is able

22:19

to get on it. Of course we've already had that conversation

22:21

with like how quickly that's changing to and how

22:23

quickly they will shadowban many

22:26

marginalized communities. But that's

22:28

neither here nor there. Totally.

22:31

And so just one other thing about these

22:33

push notifications and stuff. I

22:35

have seen a lot of people say that TikTok

22:38

urging users to take political action

22:41

is evidence enough that the app is bad.

22:43

That they're like demanding that teenagers

22:46

call their congress person or whatever, and that like,

22:48

you shouldn't have an app telling young

22:50

people what to

22:51

do or whatever. But this is

22:53

so common for tech companies to do.

22:55

Both Uber and Lyft have sent

22:57

out similar push notifications asking

22:59

for users to contact congress or do certain

23:01

things. And so TikTok is not the only

23:04

tech company or app that has done

23:06

this kind of thing, and people trying

23:08

to act like this push notification action

23:11

was completely beyond the pale. It's

23:13

certainly not. It happens all the

23:15

time. Right, which I feel

23:18

like if I remember correctly Twitter, different

23:20

Twitter accounts would have the like you

23:22

know, you could have the one link and they would

23:24

often do like at least like petitions. I

23:27

remember during a lot of the Black Lives Matter

23:29

stuff and many of the murders

23:31

of the black community, they would have links

23:33

and petitions that would happen

23:36

through Twitter. So none of these things are

23:38

new. I'm wondering again,

23:40

and I know this is that bigger conversation of

23:42

like, maybe I'm just too caught up in some conspiracy

23:45

theories. But some of the conversations

23:47

that have happened with the pro Palestinian

23:50

movement and what's happening there

23:53

and many consider genocide, I consider

23:55

a genocide happening there, that TikTok

23:59

has been one of the biggest platforms

24:02

to actually have conversations about

24:05

that and through that because they do have

24:07

like the Linktree that has petitions

24:10

and alerts and all of these things like that

24:12

has happened, and I know there's a conversation

24:14

about whether or not that's part of this motivation.

24:17

Totally. I have so much to say about that.

24:19

I will say first, I think

24:21

a lot of the conversation around how

24:24

conversations about what's happening in

24:26

Palestine are unfolding

24:29

on TikTok. I think a lot of the conversation about

24:31

that, to me feels a little bit

24:33

ageist and condescending, Like it's

24:35

this idea that like, oh, well, if young people

24:38

are you know, if they feel

24:40

a certain way about what's happening

24:43

in Palestine, certainly... certainly

24:45

that is a problem, and it is TikTok's

24:48

fault, as opposed to saying like, well, maybe young

24:50

people are smart and with it and like

24:52

looked at the information available to them and came

24:55

to a conclusion, and you

24:57

know, treating that with the respect

24:59

that it deserves. I see a lot of older

25:02

elected officials sort of handwringing

25:05

and being like, oh, well, the only

25:07

way that our kids could come to this conclusion

25:10

is that if this app is like tricking them

25:12

into it, and so there have been a lot of

25:14

interesting conversations about how

25:17

that fits into this

25:20

legislation to potentially ban TikTok,

25:22

this concern about where young

25:24

people are finding themselves when it comes to conversations

25:26

about Palestine.

25:38

I guess this brings us to a question. Then I feel

25:40

like the answer is kind of clear.

25:42

We've kind of talked about it earlier. A part

25:45

of it's clear. Why is this happening

25:47

in the first place, Bridget?

25:48

It's a great question. So if you take

25:50

these lawmakers at their word, if you take the

25:53

name of the legislation at its word, lawmakers

25:55

say that TikTok is a national security

25:57

risk because it's owned by China, and that the

25:59

app could be compelled by

26:01

the Chinese government to manipulate

26:04

public opinion because you know,

26:06

they are you know, TikTok does have this like super

26:08

powerful algorithm that they can use to

26:11

boost certain content, suppress certain content,

26:13

that they could collect mass data on

26:16

Americans or even spy on

26:18

individual Americans. And so that's

26:20

sort of like what lawmakers

26:23

are saying they are trying to prevent with this legislation.

26:26

Okay, but is

26:28

that actually the thing?

26:31

So I should say here, this is a little

26:33

above my pay grade. But as

26:35

somebody who has done a lot of looking into this and had

26:37

many, many, many interviews on it, this

26:40

threat, my understanding, this threat

26:42

is just purely hypothetical. You

26:44

know, The Intercept has published a piece about

26:47

how when pushed on these

26:49

threats and like, when asked, you know, what is

26:51

the threat? Where's the smoking gun? The FBI

26:54

uses phrases like oh, China could do X,

26:56

Y, Z if they wanted to, China

26:58

could potentially do right,

27:01

it seems to me that there is not a smoking

27:04

gun or like hard evidence

27:06

that this is happening or there's

27:08

a plan for this to happen. Now, you might

27:11

be listening and saying like, well, why

27:13

should we have to wait until there's a smoking gun? Why

27:15

should we have to wait until there's evidence that this is happening?

27:17

And that's a fair point, But as of right

27:20

now, they would be passing

27:22

this law to deal with a hypothetical

27:25

or potential threat that currently there

27:27

is just not evidence that's going on.

27:29

Right.

27:30

Another creator that I was

27:32

looking at has actually posted I

27:35

think a recent list showing that the

27:37

likelihood of information being

27:40

farmed, and Facebook and Amazon

27:42

and Instagram were top three while

27:44

TikTok was actually number twelve.

27:47

Yes, so I'm so glad that you brought

27:49

that up. The fact of the matter

27:51

is that we have a social media

27:53

and digital landscape that is just rife

27:56

with bad actors

27:58

and corporate interests and operations

28:00

who are doing bad things, in some

28:02

cases breaking the law to make

28:05

money off of us and to exploit us. That

28:07

is just a fact. I don't like

28:09

it, nobody should like it, but it is what it is.

28:12

And so this legislation to me, really

28:15

seems like it's cracking

28:17

down on TikTok for having

28:19

the same problems and same issues

28:21

that Facebook, Google, Twitter

28:24

all have, for exploiting us in the same

28:27

potentially exploiting us in the same ways that every other

28:29

social media platform has, but regulating

28:32

them because it's China. And this

28:34

is my opinion, I think that this

28:37

is really more about the

28:39

United States wanting to remain the

28:41

top dog when it comes to big tech, right,

28:44

Like I think I saw some joke that

28:46

was like, oh, like China

28:49

wants my data, don't they know that's reserved

28:52

for Mark Zuckerberg. I think that's exactly

28:54

what's going on here. I think that lawmakers

28:56

have figured out that by you

28:59

know, being tough on TikTok, they can

29:01

one you know, act

29:04

tough on taking down China, and

29:06

two make it seem like they

29:08

are sort of being tough on big tech

29:11

while actually doing nothing when it

29:13

comes to regulating big tech. Because just

29:15

the fact of the matter is big tech

29:17

companies enjoy a pretty

29:19

cozy relationship financially with

29:21

a lot of elected officials, and so if they were actually

29:24

to talk tough on American big

29:26

tech, that conversation might get

29:28

a little bit thorny. But by regulating

29:31

TikTok in this way, they kind of get the

29:33

points and the appearance of being like

29:35

we're taking on big tech, while really doing

29:37

nothing about taking on big tech.

29:39

I mean, we know that they let some laws

29:43

drop, some policies drop in order to

29:45

protect tech companies and their liability

29:47

way back when, and have been doing so

29:50

for a minute. But I think it's interesting

29:52

and I'm sure you've seen this Jeff Jackson,

29:55

Representative Jeff Jackson, who actually

29:57

really made his headway on TikTok,

30:00

Like his videos, his like common sense

30:02

takes seemingly in Congress

30:05

really like had him going like really

30:08

high in the polls, and everybody... people

30:10

were talking about him being president, him being so great

30:12

for like being common sense and all these

30:14

things, and then he got called

30:17

out for voting yes to the TikTok

30:19

ban, which made, like,

30:21

the people that followed him mad at

30:24

him, and then him coming back on saying

30:27

that, you know, I don't think it's going to be banned. It's not gonna be banned.

30:29

It's fine, it's fine. No, I'm just a little concerned because you know,

30:31

what's in my ear. What's in my ear? And

30:33

then on top of that, people showing why

30:36

some of these possibly

30:38

like the conspiracy theories about why, and the

30:40

different people who are paying

30:43

the lobbyists who are coming through who are

30:46

actually pushing this narrative really

30:48

hard.

30:49

I mean, some of these aren't

30:51

conspiracy theories, right, Like we

30:53

know that Facebook or Meta

30:56

has pumped so much money into

30:58

anti TikTok like

31:00

negative PR, right? Facebook

31:03

worked with a company called

31:05

Targeted Victory to get like negative

31:07

op eds about TikTok placed in

31:10

newspapers so that elected officials

31:12

would see them like this is a documented

31:14

thing, right, Like I don't think anybody at Facebook would

31:17

deny this is happening. It's like a

31:19

matter of public fact, and so we

31:22

do got to ask some questions about why,

31:25

because certainly a TikTok ban would

31:27

be great for Facebook and great for

31:29

Mark Zuckerberg, and they're certainly spending

31:31

a lot of money to

31:34

push a certain negative public perception

31:36

of TikTok with elected officials.

31:38

Like I remember when there was a

31:40

hearing last year about

31:43

TikTok, and Facebook

31:45

had paid Targeted Victory to

31:48

put op eds in papers about

31:50

like how young people had really

31:52

unfortunately died or hurt themselves

31:55

doing so called TikTok challenges, however,

31:58

obscuring the reality that some

32:00

of those TikTok challenges actually originated

32:02

on Facebook. Right. And so it's like, rather than

32:04

having the conversation about, well, what

32:06

platforms were actually responsible for this harm,

32:09

and like, like, let's talk about this harm

32:11

to kids, which we should be doing, it's

32:13

just a way of making TikTok the boogeyman

32:16

and letting Facebook off the hook for

32:18

harms that they also participate

32:21

in.

32:21

Right, And kind

32:24

of related back to our show, when we were talking about TikTok

32:26

Shop, there's a conversation that Amazon

32:28

has jumped on board because they have lost sales due

32:32

to TikTok Shop. So that's a whole

32:34

different conversation. Again, so many

32:36

things that you're like, why are you doing this?

32:39

Yeah, money, I mean everybody, everybody

32:42

has

32:42

a stake

32:44

in this a little bit. Like, Amazon

32:48

and Jeff Bezos certainly have a financial

32:50

stake in what's going to happen with

32:52

TikTok because TikTok is really

32:55

aggressively pushing TikTok shop as

32:58

a way to compete with online retailers

33:00

like Amazon, and so Amazon

33:02

obviously has a financial stake in what happens

33:04

with TikTok. And so we just got to be more honest,

33:07

I think, and have a bit more of an honest assessment

33:09

of the major players and why

33:12

they might be taking the positions that they're taking.

33:14

Right, but let's say all this is

33:16

just hearsay and conspiracies, as I'm

33:19

saying, even though we've talked about how it's not. Is

33:22

TikTok trying to balance

33:24

out the fears?

33:25

So they are. TikTok's attempt

33:28

to balance out these fears is what they're

33:30

calling Project Texas. So TikTok

33:32

has committed to spend one point five billion dollars

33:34

on this plan, which would essentially enact

33:37

a firewall between TikTok and

33:39

employees of its Chinese based parent

33:41

company, ByteDance. It would sort of set

33:43

up this United States based firm called

33:46

Oracle as a kind of a watchdog

33:48

to scrutinize TikTok's source code

33:50

and I guess like act like a third party

33:52

monitor for potential security risks. So basically

33:54

they're like, Okay, if you are so worried

33:56

about it, we will set up this firewall

33:59

and we will put an American tech firm

34:01

in charge of being a watchdog to make

34:03

sure everything is on the up and up. Will that

34:05

actually be enough to satisfy these lawmakers? It

34:08

sounds like the answer is no so far,

34:10

Like, I have not actually seen a lot of people talking

34:12

about Project Texas as a thing that is

34:15

actually going to ameliorate the concerns

34:17

that these lawmakers are saying that they have.

34:20

I'm trying to imagine if they chose the name Project

34:22

Texas to try to... wow.

34:25

I'm sure like,

34:29

oh my gosh, what conveys like

34:32

patriotism and like Americanism

34:34

and like oh United States, Project

34:37

Texas, Freedom Texas.

34:40

Now this is really interesting to me because I had

34:42

to take a moment. I was reading up about this

34:46

because again I was just curious about

34:48

what would happen, and I

34:51

was reminded of you

34:53

know, when I was in China. You know, Twitter's

34:55

banned, Facebook is banned, Instagram's

34:57

banned, Google's not allowed, Google

35:00

left voluntarily.

35:01

But then it's kind of like no.

35:04

But everybody used a VPN to just

35:06

like use it anyway. So I

35:08

was just thinking about, like, what would

35:11

that be, what was

35:13

that going to look like. It's

35:15

just, it's a really fascinating, kind

35:17

of terrifying thing. But I

35:21

guess the last time you were here, we

35:23

were talking about this, and

35:26

you know, we were talking about still

35:28

the same things of which

35:31

I know we'll get to later about you know, marginalized

35:33

communities being affected, because again in China, that

35:35

was a huge blow that they couldn't use these things.

35:38

But also you

35:40

know, having

35:43

these spaces is really important. But

35:46

they do have their issues, and last time you were

35:48

here, you were talking about like TikTok and what is

35:50

going on there with their issues.

35:52

Totally, So someone listening to

35:54

this might be like, wow, Bridget is like

35:57

being paid by TikTok or something, although,

35:59

as we discussed, I am wearing a TikTok sweatshirt

36:01

right now that I got as a free swag

36:04

from an event, only because it's the comfiest sweatshirt

36:06

I've ever worn. However, like I

36:08

will be the first person to tell you

36:11

all the different ways that I think that TikTok

36:13

is messing up, I don't want to make

36:16

it seem like I think TikTok is great. It

36:18

definitely has very big issues

36:20

that deserve to be talked about, that it deserves

36:22

to be held accountable for,

36:24

you know, just like any social media

36:26

platform in twenty twenty four, there are

36:28

some very valid concerns around TikTok

36:31

and security and other things too, And

36:33

anybody who tells you that there isn't just does

36:35

not know the facts. So we talked

36:37

about how lawmakers are

36:39

concerned that TikTok could like

36:41

spy on Americans. They actually

36:44

have kind of done that already, you know. There were reports

36:46

that TikTok accessed journalists' information

36:48

to identify which employees were leaking

36:51

information about the platform. TikTok

36:53

basically admitted to doing this, according to an internal

36:55

email. They were asked about this, and they were

36:57

like, well, it's not really spying.

37:00

and I mean, like, it does sound like spying

37:02

to me, you know, I get what they're kind of saying

37:04

that it's not like they were spying on

37:08

random American citizens for no reason.

37:10

They were spying on specifically journalists

37:13

to answer a specific question. To

37:15

me, spying is spying, So like, it's not like they have

37:17

a squeaky clean record on that, which

37:20

I think people should just be aware of. Also,

37:22

there's like plenty of other non

37:25

spying reasons that your average

37:27

person who uses TikTok or like has

37:29

a child or a sibling or something that uses TikTok

37:32

should be concerned about, right? Like

37:34

issues when it comes to moderation. Like

37:36

Sam was talking about how TikTok

37:38

impacts mental health, one of the things

37:40

I wish was talked about more is just, like, how

37:43

addictive these platforms are, how

37:46

they are intentionally made

37:49

to keep you scrolling for longer and longer

37:51

and longer, even when they know this

37:54

is actually harming them, this is actually making

37:56

them feel bad. Like, they know this,

37:58

right? This is not news

38:01

to them, and yet they still design

38:03

their platforms to be as addictive as possible.

38:05

These are questions I wish we had answers to right,

38:07

things like algorithmic suppression or algorithmic

38:10

manipulation, like are they showing you some

38:13

things and not showing you other things? And why?

38:15

Of course, the spread of things like mis and disinformation

38:18

and pushing users into extremist content.

38:20

But as much as these

38:22

are things that we should be talking about, none

38:24

of them are specific to just TikTok,

38:27

And so I have a problem

38:29

with this idea that the reason

38:31

the only reason that we should care about these issues

38:34

is when the platform is

38:36

owned by a Chinese based parent company.

38:38

I don't understand why we can't

38:40

then say like, okay, well, if these are problems, let's

38:43

actually address the problem, and let's actually call

38:45

out the different corporations for doing

38:47

the exact same thing, because it's not just

38:49

TikTok, and it makes it seem like they're making TikTok

38:52

into this singular boogeyman, when

38:54

anybody who has used Facebook or Twitter

38:56

or Instagram knows that is not the

38:59

case.

39:09

I mean, Twitter in itself is

39:12

a cesspool of white

39:14

supremacist, neo-Nazi movements.

39:17

Yeah, yeah, I was thinking

39:19

about this too, because I was

39:21

like, you know, it's

39:25

it's really if we lose

39:27

TikTok, we've already

39:29

kind of lost Twitter, Like we're losing these spaces.

39:32

And they were important. Like,

39:36

there is such a use and a

39:38

value in them, and it

39:40

is sometimes I feel like

39:42

when people don't use them or they don't understand

39:45

them, they dismiss them. They're just like, but

39:48

there is so much happening and so much

39:50

being shared that is important on them. We

39:53

already lost Twitter.

39:54

Yeah, I mean, like I

39:56

think that context is really important of

39:59

what will it be like

40:02

if there is no TikTok and Twitter

40:04

is functionally not usable,

40:07

you know, for particularly for marginalized

40:09

people, right, Like, you know, I

40:14

feel like I cannot use Twitter

40:16

without seeing transphobia,

40:20

racism, horrible conspiracy

40:22

theories, harassment, and so it's not a platform that

40:24

I can really tolerate spending a lot of time on anymore

40:27

so for marginalized people

40:29

who do not want to be subjected

40:31

to that kind of nonsense. Where

40:33

if we lose TikTok, where's the organizing

40:36

happening? Right again, to

40:38

add even a little more context, folks

40:40

might have seen a couple of weeks ago that Meta

40:42

made an announcement that they were going to be deprioritizing

40:45

content about political and social issues,

40:47

right, and so if they are making

40:49

that choice, it's going to be that much harder

40:52

for political and social issues,

40:54

which Meta has a track record of really defining

40:57

those issues as issues pertaining to women,

41:00

queer folks, trans folks, black folks, Right,

41:02

those are the issues when they say political

41:04

issues and social issues from their

41:06

own behavior, like that is what they mean. They

41:08

have a whole track record of suppressing

41:12

content about abortion access

41:15

and Palestinian

41:17

activism and black activism and things

41:20

that pertain to marginalized communities. So

41:22

that being the case, where are

41:24

we supposed to do our organizing? Where are we supposed

41:26

to build power online? Particularly

41:28

given that building power online

41:31

has been so important to traditionally marginalized

41:33

communities, It's really a deep question

41:35

that I'm quite concerned

41:36

about. Right. And honestly, platforms

41:39

like TikTok really help with brand

41:42

new creators, Like this is where you can

41:44

see like there's not

41:46

nepotism working. It's literally people who

41:48

go viral for whatever reason, good

41:51

or bad, but essentially they're

41:53

just making it on their own. Sure, the

41:56

algorithm has everything to do with it, but we're able

41:58

to see more and more content from like

42:01

everyday people, like every day people

42:03

who get to do great content, and

42:05

people can pick and choose who they want to follow and

42:07

who they don't want to follow, and it's actually

42:10

something that they do on their own. And I know YouTube

42:12

once upon a time was that kind of same idea as

42:15

well. But with this level of like needing

42:17

to be ad sponsored on

42:19

YouTube, that kind of has gone away. You already

42:21

need a connection, whether it's through an agency,

42:24

whether it's through something to actually make

42:26

big again on YouTube, whereas TikTok

42:28

has kind of become that new YouTube where you

42:30

can be discovered as a brand new thing.

42:33

Yeah. In that article I mentioned in Essence

42:35

about the black women entrepreneurs who

42:37

were really concerned about a TikTok ban, that

42:39

was something they really made clear is that on

42:42

YouTube, in order to find success,

42:44

you kind of have to, like, already

42:46

have some sort of name or connection. It

42:48

is very difficult to sort of start from zero and

42:50

start building up. I'm hearing that YouTube

42:52

shorts, this sort of like new TikTok

42:55

competitor on YouTube might be a little

42:57

easier. So like that's a platform that

42:59

folks might want to check out. But it

43:01

just goes to show that, like it's

43:05

just so hard to use

43:07

social media to build a platform

43:09

for yourself. And with all of

43:11

these changes, and with the different

43:14

platforms changing so rapidly, not to

43:16

mention the overall landscape changing so rapidly,

43:19

that is certainly going to hurt marginalized

43:21

people the most, people who don't necessarily

43:23

have access to like traditional

43:25

media outlets and like legacy media and

43:28

things like that. And so it's just really

43:30

clear to me that we are potentially

43:32

gearing up for a digital media landscape

43:34

where it is that much harder for

43:37

underrepresented voices to be heard

43:40

equitably and in a way that's truly representative.

43:43

Right, So, with all of that and

43:46

all the conversations that I'm throwing out,

43:48

with all the things that I've seen on my algorithm,

43:51

can you kind of lay out there

43:53

what you think is really happening right now?

43:56

So, as I said, I think that lawmakers

43:58

have really sensed that, like, they

44:00

need to be cracking down on TikTok. It will help them

44:03

kind of look tough on China, will also kind

44:05

of help them look like they're doing something to rein

44:07

in big tech, even though they're not. I

44:09

think that the United States really wants to remain

44:11

a major player when it comes to big

44:14

tech, but also the like massively

44:16

lucrative data collection industry.

44:20

I think that like this is really

44:22

about that concern

44:24

to me, because you know,

44:27

if you really wanted better data privacy,

44:29

which we should have, it does not begin

44:32

and end with banning TikTok,

44:34

especially when all of our data is

44:37

essentially for sale to whoever wants it, which

44:39

if you're an American, I'm sorry to report

44:41

that it's just true. Anybody out there who wants

44:43

to buy your address, your

44:46

phone number, really any piece of information

44:48

about you can do that, and they can do it legally,

44:50

And so that's to me, that's something

44:53

that seems bad, and that

44:55

includes countries like China.

44:57

Only a few weeks ago, back in February,

45:00

President Biden signed an executive order that

45:02

ostensibly was meant to limit the sale

45:04

of some of our data to quote countries

45:07

of concern. But this order

45:09

really just puts some limitations on

45:11

the sales of American data. As Wired points

45:13

out, US data brokers

45:16

need only take a few steps to ensure

45:18

that overseas customers follow certain

45:21

security requirements during the transfer, many

45:23

of which were already required by law.

45:25

So I really just have a hard time understanding why

45:27

TikTok is such a threat given

45:30

that China would not even need TikTok

45:32

to get our data. As long as they follow certain guidelines,

45:35

they can legally buy it the same as anybody else.

45:40

It is just literally like, okay, so you didn't

45:42

give us money for it, that's why we

45:44

want to ban it. Yes,

45:46

we couldn't make a profit out of it. Which is hilarious

45:49

because I think I've received in the last

45:51

three months alone at least four different letters

45:54

telling me that my data has been

45:56

breached by all these different companies.

45:59

My healthcare told me I got a breach.

46:01

My insurance told me I got a breach. I

46:04

want to say, my, uh,

46:06

one of my credit cards got a breach. My bank that

46:09

has my mortgage got a breach. So I'm like, uh,

46:12

everybody's got my data. According to

46:14

my information, they all have it.

46:16

Yeah, that is just the state of being

46:18

an American. It's like every

46:21

like you wake up, buy a coffee

46:23

and then you had a notification five seconds later. That's like, oh,

46:25

we got it, we got you, whatever... whatever

46:27

information that you have, we have. Right.

46:29

And what's so even funnier to me is

46:32

instead of people actually... instead of the governments

46:34

changing or going after these people hardcore,

46:36

I'm sure, like, small fish get

46:38

actual jail time or whatever, whatnot.

46:41

Instead, all of these agencies

46:44

and corporations are changing their terms of agreement

46:46

to tell me that I can't sue them for their

46:48

bad breaches.

46:50

Yeah, if anybody uses Roku you

46:52

might have seen recently, you had to agree

46:55

to some new terms and conditions, and

46:57

those terms and conditions were like, you can't

46:59

sue if we do something wrong to you.

47:01

If we breach your information, you

47:04

can't sue us, we're keeping all your money. If

47:06

you want to... if you want to watch this episode

47:08

of Gilmore Girls, you have to give away your

47:11

ability to sue us.

47:13

I mean, like, what's so funny is every

47:16

single individual network

47:18

I subscribe to, like Hulu, has sent that out, and I

47:20

was like, what the hell is this? And

47:22

the other part to all of those like for Roku,

47:25

the only way you can opt out is by sending them

47:27

a physical letter.

47:29

Doesn't that really tell you everything

47:32

about how these companies are showing

47:34

up in this day and age that like, yes, and

47:37

I guess like to close, like that's

47:39

sort of I just think that

47:42

that dynamic, the dynamic that says that American

47:45

tech companies can do whatever they

47:47

want to you, take whatever they want from you, exploit

47:50

you in any way they want, make money from you and

47:52

your kids however they see fit if it

47:54

makes them richer, and there's not a damn thing

47:56

you can do about it. I think that that is

47:58

really what we're up against here.

48:00

I think that people are sick of it. And

48:03

I think that this whole song and dance about TikTok

48:06

really highlights that it's not

48:08

just about this one app. It is about an entire

48:10

rotten ecosystem where we are being exploited

48:13

to make somebody else richer. And I think people

48:15

have really come to see that, like that is what we need

48:17

to be talking about. That is what we need to be really

48:20

deeply rethinking and reassessing.

48:22

Yeah, And I think one of the things I love about

48:25

when you come on Bridget and your

48:27

show is that there are

48:29

so many things that are

48:33

so gatekept in the technology

48:35

world, where so many

48:37

things have been invented or crafted

48:40

by women and marginalized communities,

48:43

and then are, like, overtaken or bought

48:46

by people, and

48:48

it is a way of keeping us out

48:50

of these spaces.

48:51

Like you can.

48:53

There are just numerous examples of times

48:55

where it's like, well, Okay, I can't use this anymore

48:57

because I'll run into all this hateful

48:59

stuff. I can't use this anymore because

49:02

I'll run into all this hateful stuff. So it

49:04

is a really important issue in

49:06

terms of intersectionality

49:09

and feminism, and as we're moving

49:11

into elections, oh my

49:13

god, like I'm just so concerned

49:16

about a lot of it.

49:18

But it is like.

49:20

If we again did

49:23

lose TikTok, it would be another space where that's

49:26

gone, can't show up anymore.

49:28

Yeah, And you know,

49:31

women, people of color, trans

49:33

folks, queer folks, we

49:36

are technology, like we are

49:38

what makes technology what it is. We

49:41

have been at the forefront of all

49:43

of this, of computers becoming

49:45

commonplace, of cell phones becoming what

49:47

they are, of smartphones, iPhones, all

49:49

of it. We have been at the forefront

49:52

of that since the very beginning. And

49:54

so I guess I just want to

49:56

challenge anybody listening whether

49:58

you're for a TikTok ban or against

50:01

it, never used TikTok, never would use

50:03

it. Really see this

50:06

domain and these conversations

50:08

and these spaces as yours

50:11

to be part of, yours to take up space in, and yours

50:13

to show up in, yours to care about. I

50:16

think a big part of that sort of gatekeeping

50:18

that you spoke about, Annie, is this idea that

50:21

we A don't

50:23

belong in these spaces, which is BS. We do, we

50:25

invented them, You're welcome, or B that

50:28

we can't be involved in them

50:30

because we're not smart enough, we don't have the background,

50:32

whatever, whatever. I don't have any kind of hard tech

50:34

degree or anything. You don't need a

50:36

hard tech degree to know when you're being

50:38

taken advantage of by a tech leader. You

50:41

don't need a degree in it to know when a digital

50:43

experience does not feel good,

50:45

It is not validating, and it is exploitative.

50:47

Right. You are the expert of your

50:50

technological experience, and don't let any of

50:52

these tech bro losers tell

50:54

you otherwise. You are

50:56

smarter than Elon Musk, whoever's listening

50:58

to this. You are smarter than Musk. I guarantee

51:01

it. I don't know who you are, but I can guarantee

51:03

you Elon Musk is stupider

51:05

than you are. For sure.

51:06

I mean he's bankrupted two companies.

51:08

Have you? How many companies

51:10

have you bankrupted? Probably none.

51:12

Right. Samantha

51:19

sent me a TikTok recently about

51:21

the study about how male gamers

51:23

who are bad are the ones that are real pissed

51:26

off, right? And we talked about this,

51:28

Bridget in our episode we did on

51:30

gatekeeping and online gaming.

51:33

I was like, yeah, see, you

51:35

guys just suck.

51:37

Most of the trolls who were on

51:39

the gaming network were the ones who were actually really

51:41

bad at the game. Those are the only trolls that it revealed.

51:44

Yes, I'm good, I'm a

51:46

good gamer.

51:47

Don't come in my space.

51:51

You suck and you're taking it out on Annie. Stop

51:54

it.

51:56

Yes, well, we always love

51:58

having you, Bridget. Thank you so much. I'm

52:00

sure we'll be following up on this. We'll

52:02

see what happens. But in the meantime,

52:04

where can the good listeners find you?

52:06

You can check out my podcast, There Are No Girls

52:08

on the Internet. You can find me on Instagram

52:11

at Bridget Marie in DC, on Blue

52:13

Sky, at Bridget Todd, on threads

52:16

at I think my thread's name is the same as

52:18

my Instagram name. I think I have to double check if that's true,

52:20

but you can try that.

52:22

Uh.

52:22

And you can find me on TikTok at Bridget Todd

52:24

makes pods.

52:26

Yes, and you should do that if you haven't

52:28

already, listeners, and check

52:30

out all Bridget has done on this show for

52:32

sure. Yes, thank you for

52:34

being here and listeners. If you would

52:37

like to contact us, you can. Our email

52:39

is Stephanie moms Stuff at iHeartMedia dot com.

52:41

You can find us on Twitter at mom Stuff podcast

52:43

or on Instagram and TikTok at Stuff Mom

52:45

Never Told You.

52:46

We have a tea public store, and we have a book you can get wherever

52:49

you get your books. Thanks as always too,

52:51

our super producer Christina, our executive producer Maya,

52:53

and our contributor Joey. Thank you and

52:55

thanks to you for listening. Stuff Mom Never Told You is

52:57

a production of iHeartRadio. For more podcasts from iHeartRadio,

53:00

you can check out the iHeartRadio app, Apple Podcasts, or

53:02

wherever you listen to your favorite shows.
