Understanding TikTok – Should it Stay, or Should it Go?

Released Tuesday, 19th March 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:03

Now really.

0:06

Really now

0:09

really. Hello, and welcome to Really No Really with Jason

0:11

Alexander and Peter Tilden, who ask, isn't

0:13

it really time you subscribed to our show? Time?

0:16

As in TikTok. And amazingly, on

0:18

this episode, we discuss TikTok,

0:20

the video hosting platform owned by the Chinese

0:22

company ByteDance, and whether or not it should

0:24

be banned in the United States, as many in

0:26

Congress are trying to do.

0:28

We're joined by.

0:28

One of the foremost experts on the topic.

0:30

Bruce Schneier.

0:31

We'll discuss how spying, manipulation, and

0:34

misinformation are at the heart of the

0:36

to ban or not to ban issue, and whether

0:38

TikTok is really any different from meta

0:40

Google, or X.

0:41

We'll also cover.

0:42

How surveillance as a business model is actually

0:44

one hundred percent legal, how social

0:46

media's ability to manipulate you is

0:49

much greater than you think, why data

0:51

privacy is a national security issue

0:53

that isn't treated like one, and lastly,

0:56

cougar hemorrhoids.

0:57

You heard me, folks, cougar hemorrhoids.

1:00

Really, no, really, here's

1:02

Jason and Peter.

1:03

This episode

1:05

is about TikTok, which

1:07

I know nothing about it. I saw that the majority

1:10

of Americans say TikTok is a threat to national

1:12

security. And then you know me, I read a lot in prep and

1:14

then there's articles that say.

1:15

Oh, that makes me laugh. TikTok

1:18

is the threat to security. Yeah, and then

1:20

I read articles that say it's not. And

1:22

then meanwhile, India has banned it two different

1:24

times. Then they had the chairman of

1:26

TikTok who just testified at the congressional

1:29

hearing. Our observation on the congressional hearing:

1:32

First of all, you have congress

1:34

people who are interviewing who

1:37

are borderline deceased and

1:41

like they're a step away. Their families have already

1:43

made plans. They're that old, they've been there that long.

1:45

Yeah, they have no idea about social media, none,

1:47

not none. So this guy's there. You do the line

1:50

of questioning that we love the most about the

1:52

chairman of TikTok testifying.

1:56

But.

1:57

Mister Chairman, are you a member of the Chinese Communist

1:59

Party?

2:00

No, I'm from Singapore.

2:02

But are you a member?

2:03

Are you now or have you?

2:05

Where? And where do you currently

2:07

reside?

2:08

And if i'm if I'm the CEO of

2:10

TikTok, I'm looking

2:12

at him and here's the bubble that's over

2:14

my head. You knew

2:17

I was coming, right? You have staff. Couldn't you

2:19

Google me? Yeah? The other question, this is why

2:21

we wanted an expert on. This guy is a big time expert

2:23

coming on. So they're worried

2:25

that TikTok's going to take our information. Can't

2:28

TikTok just call Facebook and say you got it?

2:30

Already got it?

2:32

And I'm going to ask our guy,

2:34

Bruce Schneier. But now

2:36

I'm not on TikTok, but I am on Instagram.

2:39

I'm on Instagram and our show is on all the platforms.

2:42

When we created an account for the show, what

2:45

information did we have to surrender to

2:47

them? What do they know that the Chinese

2:49

government is going to feast on?

2:51

What? What? What did I give up? Can I tell you I put

2:53

in my social Security number? Did I give

2:56

them my children's stock?

2:58

What?

2:58

What did I give them?

2:59

You're so full of it. Because

3:01

let's get our expert on. Bruce Schneier,

3:04

who is called a security guru. This guy

3:06

has testified in front of Congress. He's

3:08

written tons of articles. He can tell us the

3:11

really, no really we're looking for today. Really, if

3:13

it's so dangerous, it's so horrible,

3:16

why am I reading articles that say it's not dangerous,

3:18

not horrible.

3:19

Is TikTok dangerous or not

3:21

really nice?

3:22

Tell us already we want to really

3:26

already, don't yank us around. Bruce Schneier

3:28

has joined us. He is a

3:30

security expert. He's written a multitude

3:32

of books. He's appeared in front of Congress

3:34

about hacking, about tech. He's

3:37

a fellow at the Berkman Klein Center

3:39

for Internet & Society at Harvard University.

3:41

Lectures at Harvard. Woo!

3:43

My kid went to Yale, so you know, woo!

3:46

I'll go.

3:46

I'll give a woo to Harvard, even though, yeah.

3:49

You don't know it's another So,

3:52

mister Schneier, we want to have you on

3:54

because this TikTok thing is

3:57

confusing. You hear that most Americans

3:59

think that it is an evil, horrible

4:01

thing. I know that the Trump administration

4:04

tried to ban it but couldn't. So welcome

4:07

to the show. And what do we need

4:09

to know about? What do we need to know about

4:11

TikTok? That will clear

4:14

up this question for the people listening.

4:17

Probably not going to clear it up, but we could certainly

4:20

talk about it. Okay, so let's

4:22

start generally, all

4:24

of these social media companies make

4:27

their money spying on you. TikTok,

4:31

Facebook, Twitter, Instagram,

4:36

every last one of them. Right,

4:38

they spy on you, They know a lot

4:40

about you, and they try to manipulate

4:43

you. Right, they try to sell you stuff. That's

4:45

their business model. So in

4:47

that way, TikTok is no different

4:50

than anybody else. On

4:52

the other hand, TikTok is

4:54

owned by the Chinese, which

4:57

makes it different. Is

4:59

it a threat to national security? I

5:02

don't know. Is Facebook a threat to national

5:04

security? Is there a difference

5:06

that matters? Maybe

5:10

it's actually complicated.

5:12

But what do they... Here's

5:15

the thing that, Peter, I don't know.

5:15

We're talking about, what do they know about

5:18

me?

5:18

Like, I'm not on TikTok, but

5:20

when I when I sign up for social media,

5:24

I have to give them an email address.

5:25

I have to give them very little information about

5:27

me.

5:28

But okay, Google, Google, Google

5:30

knows what kind of porn you like. Okay,

5:33

that's how you search for it, right. Google

5:35

knows everything you hope, everything you

5:37

fear, everything you searched. And these things

5:39

are very intimate. You

5:42

got a Google phone, it knows where you are at all times,

5:44

knows who you're with, because you've all got a phone.

5:47

These are vast

5:49

amounts of information about us. Facebook

5:52

famously knows your relationships before you announce

5:54

it. They know if you voted, they know who

5:56

you voted for, They know your sexual

5:58

orientation, all from the things

6:01

you do and post. These are very

6:03

intimate portraits of us through

6:05

our actions. And TikTok, they

6:08

know a lot about us, and

6:11

the amount is kind of staggering. There was a

6:13

Consumer Reports piece a couple of weeks ago,

6:16

about all the information Facebook buys

6:18

from other companies to compile

6:21

the dossiers. Assume TikTok

6:23

is the same thing. They all do this and

6:25

the Internet is built on spying.

6:28

This is the most effective surveillance device you've

6:30

ever seen.

6:32

So are we concerned about

6:35

let's say that it's a Chinese government is

6:38

in fact spying on this

6:40

in some way? Are we concerned about

6:42

the metadata that we are giving

6:44

to these people, or are we concerned literally that

6:48

if they want to manipulate Jason Alexander,

6:50

They're going to go into my profile, figure

6:52

out what I'm vulnerable about and manipulate

6:55

that.

6:55

What's our bigger concern?

6:57

Let's do them separately. So let's

6:59

let's talk about the spying first. Now,

7:03

the Chinese government does a lot of spying, and

7:05

they were behind the Equifax data breach. Remember

7:07

that one? In twenty seventeen, one hundred and forty seven

7:10

million credit records of Americans

7:12

were stolen by the Chinese government. They

7:14

were behind the Office of Personnel Management. I

7:17

think that was a year later. This is the

7:19

profiles of everybody with security clearance. They

7:22

hacked Marriott Hotels. They stole the

7:25

database of who stays and where. If you sort

7:27

of look at this, they kind

7:29

of want to know who comes into their country. Is he a US government

7:31

employee. If you have all

7:33

of this data, you could

7:35

know a lot about individuals. And

7:37

you can imagine US government wants

7:40

this data. Chinese government, a lot

7:42

of governments want this kind of data. So yes,

7:44

you could easily imagine China

7:47

collecting data on Americans through

7:49

TikTok. You could also imagine them buying

7:52

it from Facebook. Like, Facebook sells all this

7:54

stuff, so they don't need their own social network.

7:57

So that's sort of data surveillance. So

8:00

yeah, I think there's an issue here. Is

8:02

it really different than Facebook or Twitter everybody

8:04

else? Probably not?

8:05

But wait, but wait, let me let me before you go

8:07

into the other. The issue was that

8:09

the Chinese government, if you have a Chinese company,

8:12

which this is a Chinese company,

8:14

can decide to do whatever

8:16

they want, and then the Chinese government goes, we want

8:18

your info, and they have to give it to them.

8:20

So of course, the same

8:23

way when the US government asks

8:26

Google or Apple or

8:28

Facebook for data, the

8:30

companies give it to the US government. And

8:33

you know, we know something about how

8:35

these companies respond to data requests

8:38

from countries around the planet, right, you know

8:40

they do. We

8:42

don't know to what extent

8:44

these US companies respond to orders

8:48

by the Chinese government for data,

8:50

but you're right, a Chinese company will

8:53

respond quicker or effectively give

8:55

it all, right? They're more under the control

8:57

of the Chinese government. So that's definitely a concern.

9:00

It's a matter of degree, not kind. So

9:02

that's surveillance. Let's move to the other

9:04

half. Manipulation, all right, we

9:07

know that US companies engage

9:10

in some manipulation. Facebook

9:12

ran an experiment in mood alteration

9:15

where they showed that if they showed

9:18

you happy posts by your friends,

9:20

you'd be happier; if they showed you sad posts,

9:22

you'd be sad. They showed that

9:25

they were able to affect whether

9:27

you voted or not by showing

9:29

you posts of people you knew

9:31

who also voted, or not showing you those.

9:34

We don't know to what extent the US

9:36

companies manipulate, right? Like, you know

9:39

Twitter makes sure you see

9:41

Musk's posts and the posts that he likes.

9:45

So yes, you could easily imagine

9:49

TikTok doing the same thing. And even

9:51

without malicious intent, it is clear

9:54

that the algorithm sends people down

9:56

rabbit holes of depression

9:58

or you know, anxiety,

10:01

or maybe even blissfulness

10:03

and happiness. I guess sad is

10:05

better than happy for engagement. So

10:07

we know that, like everybody else,

10:10

TikTok does that. That is

10:12

that is the nature of social media. The question

10:14

here is could they do it

10:16

at the behest of their government? And the answer

10:19

is yes. We kind

10:21

of know the answer because the TikTok

10:23

available in China is

10:26

very different than the TikTok available in the United States.

10:28

The Chinese government actually has

10:30

restrictions on the amount of

10:33

this kind of manipulation TikTok

10:35

can do on its own citizens, but

10:38

in the US it's a free for all. TikTok

10:40

can behave just like Facebook, Twitter and everybody else.

10:42

Wow, that's a really, no really

10:44

right there. Is it much of a difference from...

10:48

You know, I don't know, because there's not much

10:50

experimentation. I believe it is. I believe

10:52

there are rules on

10:55

on the kind of content TikTok

10:58

can show Chinese users. They have limits on

11:00

engagement, so they don't want people addicted to it,

11:02

you know, But that's not just TikTok,

11:04

that's all social media. But of course

11:06

it's in China, so you've got you know, the Western

11:09

social media doesn't penetrate.

11:10

But Bruce, you're talking about usually

11:12

young users, correct? TikTok's

11:15

user base skews younger than

11:15

others. Seventy percent of young

11:18

Americans use TikTok. It's a crazy

11:21

number.

11:22

That's that's insane. So what about misinformation

11:24

pushing out generated

11:27

misinformation that the Chinese

11:29

government wants, used to create some kind of chaos

11:31

over here and division.

11:33

They surely can. They're probably better off using Facebook

11:35

like everybody else does. I mean, that's what that's what

11:37

Russia does. So likely

11:39

they'll use Facebook because I think

11:42

you know, more older people are

11:44

on Facebook. But sure they can use TikTok,

11:47

and are they just like everybody else. So

11:49

again, there's no difference here: disinformation on

11:51

Twitter, disinformation on Facebook, on

11:53

TikTok. The question which

11:55

makes this interesting is, is there a

11:57

difference? You know, when you hear

12:00

calls to ban TikTok, we talk about that because that's

12:02

kind of a nutty idea on its face,

12:04

they're saying there's a qualitative difference between

12:07

TikTok and Facebook. I'm

12:10

not convinced there is. Well, there's a

12:12

difference of degree probably, but

12:14

not a difference in kind.

12:15

But are Facebook, Instagram,

12:18

the other major

12:21

social media companies, are they American-based

12:23

companies.

12:24

Yes, So the big difference is the

12:26

Chinese.

12:27

government owns TikTok. It's our

12:30

spying is good, their spying no good.

12:32

Well, but the question is how much spying

12:34

can the Chinese government do using

12:37

the American companies?

12:38

Right?

12:38

They can't, Right, they can't

12:40

get the same amount of bulk data because

12:42

they don't have that direct government line. But

12:45

you know, we know they try to hack into these

12:47

companies all the time. Google

12:50

has been fending off China for like a couple

12:52

of decades now, usually

12:54

going after the Gmail accounts of Taiwanese

12:56

dissidents, so you

12:58

know, are they in? We don't know.

13:01

So let me ask you about this because I've read that the

13:03

algorithm is different for TikTok,

13:06

that the algorithm there is

13:08

about minutes and hours of time

13:11

spent, whereas the others we're talking about are about

13:13

clicks and engagement. Is that

13:15

a nuance that makes a difference?

13:18

It probably does, you

13:20

know. So all of these Western

13:24

social media sites, and we'll add TikTok

13:26

in the United States, right, they make

13:28

their money showing you ads. So

13:31

their goal is to keep you on their platform as

13:33

long as possible, which

13:36

tends to mean extreme content,

13:38

things that piss you off, things

13:40

that are outrageous, and

13:42

you know, cute puppy videos, right, So

13:46

all of those things are shown on

13:49

all these platforms to keep you engaged.

13:52

Now, it's interesting the difference. Apple's

13:55

the one company whose

13:57

business model doesn't rely on spying on you.

14:00

Right.

14:00

Apple makes money selling you overpriced electronics.

14:03

So Apple systems

14:05

tend not to have that addictive

14:08

quality because the company doesn't

14:10

need addiction to make money. Wow,

14:13

The other companies do.

14:15

Right.

14:16

The more you use Google, the more money

14:18

they make. The more you use Facebook,

14:20

and the more money they make as

14:22

opposed to Apple, which is singularly different.

14:25

So given all this, what

14:28

as a security tech guru,

14:31

what are you called upon most to do?

14:34

What do you do for these companies?

14:35

And my second part of that question is what can

14:38

what can the consumer? What can I do

14:41

while I'm on my social media to

14:43

push back at this?

14:44

If anything?

14:45

You're gonna hate that answer.

15:04

Uh? I mean a lot of what I do is security

15:06

consulting, you know, how to keep networks

15:08

safe. To answer your question, when you ask,

15:11

like, as a generic user, what can

15:13

I do? And the answer is almost

15:15

nothing, right, because

15:17

your data is not in your hands. Think about what your

15:19

data is. Your emails

15:22

on Gmail, your photos are

15:24

on, I don't know, some photo site. Your

15:26

documents are in the clouds somewhere else. You

15:29

know, your calendar is here. Your

15:32

data is no longer under your control now,

15:36

So all these security measures, you've

15:39

got to trust the companies. And

15:42

that's a terrible story because

15:44

the companies are like pretty untrustworthy.

15:47

But at this point, the

15:49

advice I can give you is very much around

15:51

the edges, because your data

15:54

is elsewhere controlled by other

15:56

companies with their rules. And

15:58

you call up, oh, I don't know, you

16:01

know, Flickr, and say, you know, I want to know what

16:03

security measures my photographs are being

16:05

protected by and they say, no, we

16:08

won't tell you. Your choice is

16:10

to trust them or not. So what can you do?

16:13

Don't have a cell phone, don't have

16:15

an email address, don't have a credit card,

16:17

like, and that's kind of stupid advice

16:21

for the first quarter of the twenty-first century.

16:23

So I can't let me ask

16:25

you this. Can I Can

16:28

I game my feed? Can I

16:30

refuse to click on stuff that.

16:31

I know they want me to click on?

16:33

Can I can I not be sucked

16:35

in by the enticements and thereby

16:37

change what they send me?

16:40

I mean you can, like, if you stop clicking on

16:42

those cute puppy photos, they're going to stop sending

16:44

you cute puppy photos. But honestly, they're

16:47

cute puppy photos. They are hard not to click on.

16:49

But I'm thinking more about I'm thinking

16:51

more about political stories or

16:53

cultural stories where you

16:56

know it's being fed to me because they think they understand

16:58

my bias.

17:01

And you know, maybe so.

17:02

One of the things I teach this in a

17:04

policy school here at Harvard, and

17:07

I had students last week download

17:10

their Facebook and Google data. Both of those

17:12

companies allow you to download

17:14

their data about you. Do

17:17

it. It's actually interesting

17:20

what they have on you, how

17:22

they've categorized you, what

17:24

they've gotten right, what they've gotten wrong. Now,

17:26

in the United States you don't have any ability to correct that.

17:29

In Europe you have some ability to

17:31

have them delete your data and correct your data.

17:33

Not in the US. You have no recourse.

17:34

I downloaded mine. I see

17:36

egregious things said. There's

17:39

no way to correct, no way to contact them.

17:41

So you see posts on

17:43

Facebook you wrote but never sent.

17:46

Wait, well, hold on, hold on a second.

17:51

Posts you never posted?

17:54

Yes, if you wrote something and said,

17:56

you know, I don't want to send that, delete

17:58

it. They've got a copy.

18:00

We need you to testify in front of Congress

18:02

again. Damn it.

18:04

You know, you can see

18:06

it yourself. It's it's I get it. It's

18:09

there. They have an idea of where you

18:11

live and where you work based on sort

18:14

of where you are during the day and night, and

18:17

of course you all have one. They know who you sleep with, like

18:19

are you are you able to?

18:20

Uh?

18:21

Is that something you can explain, like in a few

18:23

sentences about how somebody goes to download

18:25

their data that way?

18:27

Uh?

18:27

You know what I tell the students is go do it

18:30

because it's not that hard. I mean, type

18:32

into Google, download your Facebook data, gives

18:34

you instructions.

18:35

It's not. You know what's so funny? If

18:37

you got me. The next question I was going to ask, it would

18:39

be that how do you down? How

18:41

do you download?

18:42

Because I'm now seriously, I'm I

18:45

I signed up for Facebook.

18:46

I have never used it.

18:48

So I always tell anybody if you saw something

18:50

of me on Facebook, it ain't.

18:51

But they know you. Facebook collects data

18:53

about you even if you're not on Facebook.

18:55

Well, then may I just say them?

19:02

So when you testified in front of Congress,

19:04

were you completely frustrated

19:07

when when you finished by

19:09

the questions they asked? So, I'll

19:11

give you a secret: unless you're high

19:13

profile like a Zuckerberg, I

19:15

mean, at the congressional hearings you

19:17

watch on television, most of the members

19:20

aren't there. Their staffers are there.

19:22

Wait that.

19:25

I knew that, But that's this is

19:27

a good thing.

19:28

The staffers are who you want to talk to. They're

19:31

the ones who know stuff and get stuff

19:33

done. You know, I tend to know.

19:35

I'm speaking to the staffers. So yeah, the members

19:37

will ask questions. There'll be questions,

19:39

the staffers wrote. Often they want

19:41

to hear the answers. I'm good at giving answers

19:44

in non-techie terms, so I'm likable.

19:47

You know.

19:47

I'll give them homework. They laugh.

19:49

But I want to hang myself since we started this

19:52

whole thing.

19:52

Yeah, but let's talk about

19:54

answers. Let's

19:57

talk about the notion of

20:00

of what we might need to do, say,

20:02

to ban TikTok.

20:03

Right, I mean I was going to ask

20:05

you, how would you change the laws here?

20:07

Yeah?

20:08

Okay, So there are several ways

20:10

to do this. So the first one

20:13

is you could ban

20:16

the app stores from selling

20:18

the app, or from making the app available,

20:20

So in the Apple and Google app stores, the app

20:23

is no longer available. This is

20:25

like vaguely effective. It won't affect

20:28

the seventy percent of young people who have already installed

20:30

the app. It's on their phone. Anybody

20:33

who jailbreaks their Apple phone

20:35

or uses an alternate app store for Google, will

20:37

still download it. So you can still do

20:40

it. Presumably the kids will exchange

20:42

tricks on how to do this. It'll be vaguely

20:45

effective. It doesn't

20:48

affect the website at all. You

20:50

can always go to TikTok through the website.

20:53

So not a great idea. Let's do

20:55

another one. Congress can

20:57

ban US companies from

21:00

doing business with ByteDance. ByteDance is

21:02

the company that owns TikTok. And this is what Donald

21:05

Trump tried in twenty twenty and

21:07

the courts blocked it. All

21:10

right, So what

21:13

this does is it keeps

21:15

TikTok from being on American infrastructure.

21:17

It would have some effect. It would definitely cut

21:19

into TikTok's business. US

21:22

advertisers couldn't pay money,

21:24

to put ads on TikTok. If it's actually

21:26

subsidized by the Chinese, it wouldn't make

21:28

that much of a difference. Maybe a little slower

21:30

to load because the servers will be in Europe

21:33

or in Canada instead of in the

21:35

United States. Okay, so

21:37

that's two. The third thing

21:39

they can do is actually

21:43

try to ban the

21:45

companies from being on the

21:47

internet in the United States. And this would

21:49

be incredibly difficult. This

21:52

is building a censorship regime like

21:54

China. We do not have the ability

21:57

to enact that sort of ban, So

22:01

our options tend to be further

22:04

back in the financial realm, either

22:06

prohibiting the app stores from having it,

22:08

or prohibiting US companies from doing business with

22:11

ByteDance. I'm

22:13

not convinced it will be that effective. And

22:15

they're a big deal. I mean, they

22:17

really are. You know, we do

22:19

that for things that are actually

22:22

freaking dangerous.

22:26

So they can't prove, they can't prove unequivocally

22:29

what's going on. So, bingo, you're likely

22:32

you're.

22:32

Being spied on by the Chinese?

22:34

I mean, duh, if

22:36

we had a functional Congress and

22:38

they were savvy enough in this area, is

22:42

there any kind of laws that we could

22:44

create that would diminish

22:47

the ability of these companies to legally do what they're

22:49

doing?

22:50

And yes, I mean right now, surveillance

22:52

as a business model is legal without

22:55

any regulation. You

22:57

know, Google can sell what kind

22:59

of porn you like to whoever wants to buy

23:01

it. So yes,

23:04

and Europe passed some of these laws. The US

23:06

doesn't. So when I think

23:08

of the problem and the solution, I

23:11

go back up right. I want

23:15

serious data privacy laws that

23:18

protect us from all the companies,

23:21

not just TikTok, but the rest of them as well.

23:23

Yeah, you know, a good regulation

23:26

that protects us from all the companies, protects

23:28

us from the Chinese. Now i'd like

23:30

to be able to say that. Look, you know, surveillance

23:33

as a business model is now a national security

23:35

concern.

23:36

So they're collecting all this information on

23:38

us. And these companies want to grow and grow and grow.

23:40

Where are they going to go next for revenue

23:43

model? Since they have access?

23:46

Is there something that's coming that we're not prepared

23:48

for that you see from the cyber area that we

23:50

haven't even dealt with yet.

23:53

So they've already telegraphed, they've

23:55

already said pretty much all of them that

23:57

their future is AI, that

24:00

AI being able to analyze

24:04

better and manipulate better. Now,

24:06

I don't know how much of this is real, how much of this is a red

24:08

herring, but that is what they

24:10

are saying. And they're looking

24:13

at the same things you are, right like, we're saturated.

24:15

What do we do? They see AI

24:17

as a way to goose their

24:18

revenue. Revenue stream, because that's what it's about.

24:22

Well, thanks, it's been a pleasure having you on. I now have

24:24

and I'm going to go hang myself.

24:28

I like this optimistic podcast.

24:29

This is great, but have you changed

24:32

your vibe? Do you feel pretty good every day? Do you get up

24:34

and go to Harvard whistling, skipping on the way to

24:36

class or do you have.

24:37

Honestly, you have to. I

24:40

mean yes, I mean there is a lot of bad

24:42

stuff in there, but you know, I

24:45

tend to be long-term optimistic. This is not gonna

24:47

be the thing that will, you know, destroy civilization.

24:49

We've gone through a lot worse, and honestly,

24:51

everybody makes their own decisions. I

24:54

don't use Facebook, but I freaking

24:56

play Pokemon Go.

24:57

That's a spying game, you

25:00

know.

25:00

So I'm not telling you

25:02

to not do these things. Don't think

25:05

of that. We all make our own trade

25:07

offs. Now, you want to watch the

25:09

cute puppy videos, go to town. It makes

25:11

you happy.

25:13

You like the puppy thing, don't you

25:15

You know, I don't know.

25:16

I don't actually watch it's

25:18

what's coming up. It's coming up.

25:20

Well, I don't actually watch my own porn. I don't

25:22

do it. I have puppies

25:26

playing with my IP.

25:27

All right, this is this is the way to a sharing.

25:30

You just made Jason uncomfortable.

25:32

Finally, fantastic, I got.

25:35

Well, thanks for coming on, man. You're

25:37

amazing. Thank you very

25:39

much for coming on. Appreciate it.

25:41

See you on the app. Okay,

25:54

listen, if you had to do it, could you unplug from

25:56

all of it?

25:58

You couldn't do it.

25:58

No. What would I do, talk to you all day? I want

26:01

to see stuff that's interesting.

26:02

Well, you couldn't talk to me. I'm like, I'm away myself.

26:05

Well, what we did... get this, we had it. But

26:08

you can't. You can't opt out. You can't opt out

26:11

there.

26:11

Honestly, there are days where I wish every

26:14

satellite would just explode so that we

26:16

could go back a little

26:17

bit. To no communication systems?

26:19

It wasn't that there were no communication

26:21

systems. Two things.

26:25

First of all, there was more privacy. Second of all,

26:27

falsehoods didn't

26:30

propagate.

26:30

By the way, there was no more privacy. My

26:33

grandma knew everything that was going on

26:35

with everybody. I used to say to her,

26:37

did you read Grandpa's diary? You looked at his stuff?

26:39

She said, Well, am I going to go to bed every night with a stranger?

26:42

She said, you listen, you follow me up and followed you

26:44

over. I want to know exactly what, David.

26:47

What do you got, David?

26:50

Oh Man.

26:52

What don't I have?

26:53

Right, what don't you have?

26:55

Let's see?

26:56

Can I add to the pessimism

26:59

or I don't know.

27:01

We've had enough of that. TikTok.

27:04

Have you guys viewed TikTok?

27:06

I do not.

27:07

I twerk

27:09

on TikTok A lot. Go ahead.

27:11

You know what twerking is.

27:13

So that is something.

27:17

Well, condescending

27:20

means talking down to you.

27:22

Let me give you a little view of some of the things

27:25

that you're missing.

27:27

Okay, so here we go, leave you on things

27:29

we are missing. Yeah.

27:31

So the number one video

27:34

of all time on TikTok has

27:36

two point one billion views. It's

27:39

from a guy named Zach King,

27:42

and it's called Zach King's Magic Ride.

27:44

I can't believe you haven't seen this. He

27:46

appears. He's like on a suburban street

27:49

and sort of a Harry Potter get

27:52

up and he's like literally flying.

27:55

I've seen him.

27:56

I've seen it. And it's a mirror

27:58

and he shows you the mirror and the thing.

28:00

Yes, yes, yes, so that

28:02

that's it. You watched

28:04

it.

28:04

I don't watch it on TikTok. I didn't

28:06

watch it on TikTok. Oh it came

28:08

to me instead. Oh, Instagram.

28:11

Well, there you go. You're lucky.

28:13

Luckily for you, Instagram has

28:15

completely ripped off the TikTok look

28:17

at you.

28:18

Another one.

28:19

This one's actually a sort of sweet.

28:21

It's from a woman

28:24

named Nyadollie, uh,

28:26

and it's

28:27

called Nyadollie's Beauty Tutorial.

28:29

Look in this one.

28:31

But she's a very dark skinned African

28:33

American woman doing her makeup, and

28:36

she's very sweet, and it just sort of

28:38

shows you, you know, a little bit like

28:40

it's not all the typical, you know,

28:42

stuff that you actually would assume

28:44

was on TikTok.

28:45

So that was nice.

28:47

Some of the other things that I looked at that

28:50

are nonsense. The world's

28:52

largest elephant toothpaste

28:54

experiment, So that one has two hundred

28:56

and sixty-six million views, because I have no idea

28:59

what that's about. Another one,

29:01

this is the number two video on the

29:03

on the app.

29:04

It's Welcome to the Sister's

29:07

Christmas Party by James Charles

29:09

that has one point seven

29:12

billion views. And this one,

29:14

and I know you're gonna download the app because

29:16

it's so exciting.

29:17

It's called Bella Poarch's M

29:20

to the B.

29:23

Uh Huh, which has seven hundred and thirty

29:25

one million views, so.

29:27

And you have no idea what happens in the M

29:29

to the B. No, I couldn't be

29:31

bothered, You couldn't. I thought.

29:33

The one that would have gone through the roof is the

29:36

the iguana who's having the acid trip?

29:39

Well, whose floor is that? Not my floor,

29:41

not my floor, and not my problem. Remember that one?

29:44

That that thing, Oh

29:46

that was one of the funny.

29:47

You know what, this is scaring me more than the security

29:49

brief. How about that? You don't know that we

29:52

spend time looking at this stuff that the iguana

29:55

iguana, Well.

29:55

It's not a talking one. It's an animated

29:58

one.

29:59

Animated one.

30:00

Apparently the story behind it was a guy dropped

30:03

acid and they put him in a closet and recorded

30:05

him and then they animated this lizard

30:08

whatever. But

30:11

that that moves society forward.

30:13

That kind of Yes, it makes us better,

30:16

yes, better people, and it also gets a Chinese

30:18

to go. I don't know what.

30:19

I don't know what.

30:19

I don't know what we do.

30:22

You know what, we put up a lot of stuff that

30:24

confuses them. That's how we mess with their head.

30:27

We game it with that, and I only

30:29

watched certain kinds of things. Can I yes,

30:32

right, cougar. Yeah,

30:34

what's that? Cougar

30:36

hemorrhoids? With these all day

30:38

long?

30:38

Yes?

30:39

Yes, wow, give me another?

30:41

Okay, okay, all right, all

30:44

right, hold on second.

30:45

Do something with a train and hold on

30:49

dancing. No.

30:50

I was going to go right to Madonna Each,

30:52

and I got to give me a minute,

30:54

right, but Madonna

31:01

Each the world's largest wawfut,

31:08

we're going lee there. We only got

31:10

it. Thank you very much, David.

31:14

I feel like we haven't moved the ball at

31:16

all. The question was: is TikTok

31:19

safe or not? Well?

31:21

The question was is China profiting?

31:24

Yes?

31:25

Are they spying?

31:26

Yes? Are we yes?

31:27

Is everybody spying?

31:29

Yes?

31:30

Have we been compromised?

31:31

Yes?

31:32

Are we in trouble?

31:33

Yes?

31:33

Is anyone going to help us?

31:35

No?

31:38

You are our future

31:41

people who

31:44

have not resisted.

31:44

Yes, that's

31:46

another episode of Really No Really comes to a

31:48

close. I know you're wondering how many global

31:50

users do the top ranked social media sites

31:53

actually have that answer in a moment, But first

31:55

let's thank our guest, Bruce Schneier.

31:57

You can follow Bruce.

31:58

On x where he is at Schneier. On

32:00

Facebook, he is Bruce Schneier and he has a

32:02

blog at Schneier dot com. Find

32:05

all pertinent links in our show notes, our

32:08

little show hangs out on Instagram, TikTok,

32:11

YouTube, and threads at really No Really

32:13

podcast And of course you can share

32:15

your thoughts and feedback with us online at

32:17

reallynoreally dot com. If you have

32:19

a really, some amazing fact or

32:21

story that boggles your mind, share

32:24

it with us and if we use it, we will send

32:26

you a little gift. Nothing life changing,

32:29

obviously, but it's the thought that counts. Check

32:31

out our full episodes on YouTube, hit that subscribe

32:34

button and tap that bell so you're updated when

32:36

we release new videos and episodes, which

32:38

we do each Tuesday. So listen

32:40

and follow us on the iHeartRadio app,

32:42

Apple podcasts or wherever you get

32:44

your podcasts.

32:46

Now, what are the.

32:47

Top social media apps and how many users

32:49

do they have?

32:50

Well, lower down the list are some you may recognize.

32:52

Threads has one hundred million users,

32:54

Quora has three hundred million, LinkedIn

32:57

has four hundred and twenty four million

32:59

users. X formerly Twitter, ranks

33:01

in at number thirteen with five hundred

33:03

and fifty million users, and the top

33:05

ten, ranked from number ten to number one: Sina Weibo

33:08

five hundred and ninety-nine million, Kuaishou

33:10

six hundred and seventy-three million,

33:13

Snapchat seven hundred and fifty million,

33:15

Telegram eight hundred mil, Little

33:17

Old TikTok at one point two two billion

33:20

users, WeChat one point three

33:22

two billion, Instagram two point

33:24

oh four billion users, YouTube

33:26

coming in at two point four nine billion.

33:29

WhatsApp is number two with two point seven

33:31

eight billion and blowing them away at

33:33

number one is you guessed it? Facebook

33:36

with three point oh five billion users

33:38

per month worldwide and maybe

33:41

two hundred of them follow me. No,

33:43

really.

33:48

Really No Really is a production of iHeartRadio and Blase

33:51

Entertainment
