Episode Transcript
0:01
Guess what day it is? Oh,
0:04
it's French Friday, it's French
0:06
Friday, so grab your fries
0:08
and say hooray. David French
0:10
is here to play on
0:12
French Friday. It's French Friday.
0:17
David French, welcome back. Well,
0:20
thanks for having you back, Skye. Always
0:22
enjoy talking to you. Somebody
0:24
had asked me if we were gonna talk
0:27
about Dune Part Two this week, and I
0:29
know you've talked about it with, I think,
0:31
Jonah on a Dispatch podcast, and we will
0:33
do another nerd fest, but it can't be
0:36
this time. We've
0:38
both seen it, we both loved it from what I've
0:40
seen you say on social media, so sorry
0:43
to disappoint those who really want a deep dive
0:45
on Dune Two. And plus, aren't we reserving that
0:48
for a nine-part special series? Yeah,
0:50
we really should, and you
0:52
just agreed to do a live event with us in Nashville
0:54
in May. I did. Maybe
0:57
we'll get some nerdy questions at that. If
0:59
people haven't seen it. I'd be disappointed if
1:01
we didn't, yeah. I know. Go to holypost.com/events,
1:03
and you can sign up for
1:05
that one, or we have some others going on with other
1:07
guests in other parts of the country. Okay, David,
1:11
you wrote a piece earlier this week in
1:13
the New York Times about TikTok, because TikTok
1:16
is making waves in Washington.
1:19
So the big controversy is that
1:21
the House of Representatives has passed
1:23
a surprisingly bipartisan bill that
1:26
would either ban TikTok in the
1:28
United States or mandate that the
1:31
Chinese owners of TikTok sell its
1:33
American version to American owners so
1:35
that it wouldn't be foreign-owned anymore.
1:39
That bill now goes to the Senate, but let's begin with this.
1:41
Why are people freaking
1:43
out or seeing TikTok as a
1:46
national security risk? What
1:48
is the threat of this social media
1:50
app? Right, so I'll give you
1:52
the case that
1:54
smart folks are making against TikTok, and then
1:56
there's a dumb case against TikTok. All right.
1:59
Yeah, so. The case that a
2:01
lot of smart folks are making against TikTok is they say,
2:03
okay, let's just
2:05
leave aside for the moment the
2:07
content of the app. Let's
2:10
not talk about the content of the
2:12
app. Let's not talk about whether having
2:14
a lot of dances or people dancing
2:18
and explaining geopolitics or
2:20
all of the various kinds of TikTok crazes and
2:22
trends are good for our youth. Let's
2:25
just put that to the side, whether it's good for our youth, much less
2:27
adults. Good for youth or adults, let's put all of
2:29
the content stuff to one side and just look at
2:31
this. The People's
2:34
Republic of China has ultimate control
2:36
over an app that 170 million people
2:40
use that vacuums up an enormous
2:42
amount of our public, I mean
2:44
of our private information and
2:47
then has the ability to then use
2:49
that same app to broadcast straight back
2:51
out to us through push notifications, through
2:54
algorithmic changes that an actual
2:56
geopolitical enemy of the United States is
2:58
vacuuming up our personal information and has
3:00
the ability, if it so chooses, to
3:03
speak directly and immediately into the homes
3:05
of 170 million Americans. The
3:09
argument there is it doesn't matter what the content is.
3:11
This could be cute puppy
3:14
Tok instead of TikTok. It
3:16
could be Bible-verse-of-the-day Tok.
3:19
It could be anything, but
3:21
by granting the People's Republic of
3:23
China that level of access to
3:25
our personal data and by granting
3:27
the People's Republic of China that
3:29
level of ability to immediately inject
3:32
itself into our public debate creates
3:34
a national security problem for the United
3:36
States. Therefore what this
3:38
bill says, it's not banning TikTok because
3:41
of its content. It's saying you can
3:43
have TikTok, you can have TikTok
3:46
exactly as it is just not
3:48
owned by a Chinese company, which
3:50
is ultimately subordinate to the Chinese
3:52
state. And just
3:55
if you think that it might be unusual to
3:57
ban an app because it's owned by a foreign power, well,
4:00
that's what China does
4:02
too. It outlawed American social media,
4:04
right? And so, I mean, the
4:06
argument for banning TikTok, or
4:09
for this legislation, is it's not
4:11
about the dancing teenagers and Gen
4:13
Zers. It's about who runs it, who
4:15
ultimately possesses the data, and who
4:17
ultimately controls the content. It's a
4:19
who, not a what.
4:22
Does that make sense? So, to stay on
4:24
the who-not-what side
4:26
of this, there have been some defenders of
4:29
TikTok who say, well, wait a minute, all
4:31
the content that we are seeing on there is
4:34
coming from American users, at least the
4:36
majority of it, so who cares? Like,
4:38
if Americans are creating the content
4:40
and Americans are consuming the content,
4:42
then let Americans have the
4:44
content. There have been some arguments against
4:47
that, saying that, well, the algorithm,
4:49
which is actually controlled in China, is
4:51
determining which content gets seen, which
4:53
content gets played most. Some have
4:56
made the argument that people who
4:58
get their news content primarily
5:01
through TikTok have a
5:03
disproportionate sympathy for Palestinians in
5:05
Gaza rather than Israel, or
5:07
anti-Semitism in the US, or
5:09
anti-Israel content. So there's all
5:12
these concerns going back and forth about
5:14
the nature of the content. Should we
5:16
be concerned about the content at all
5:18
in this conversation? Or is it simply,
5:21
China owns this app, and that's bad? So
5:25
yeah, in addition, there are some content arguments.
5:27
So, for example, there's a Chinese version of
5:30
TikTok that feeds Chinese kids, like,
5:32
a bunch of science experiments, and
5:35
the American TikTok is inane by
5:37
comparison. And so there are a number who
5:39
say that China's using it to
5:41
basically make America
5:43
stupid. Right, yeah, but we
5:45
do that perfectly well without China. Like,
5:48
that is, you know, it's
5:50
not like Instagram is like
5:52
a constant lecture course at
5:54
MIT, I mean, come on here,
5:56
you know. So TikTok is not
5:58
materially different from American-owned social media.
6:00
There's no question about that. And so
6:02
that's why the content argument A,
6:05
doesn't really work based
6:08
on current content. And
6:10
B, would have a constitutional problem.
6:13
Because if you're saying we're banning
6:15
this because of the content that
6:17
Americans create in
6:19
America on a Chinese
6:21
app, well, then you're beginning to engage
6:25
in viewpoint discrimination against Americans
6:28
engaging in speech in America, but
6:30
on a Chinese owned app. And
6:32
so that's why that you have
6:34
to make this determination without regard
6:37
to the actual content on TikTok right now.
6:39
And you would have to argue
6:42
that there is a reason for
6:44
this action that is independent of
6:46
the content. Now that's not
6:48
to say that the content would never
6:51
matter. So for example, if you're talking
6:53
about a national security emergency, imagine
6:56
China launches an attack on Taiwan,
6:59
and then immediately is able to use a
7:01
lot of the location data and the information
7:03
data that people have given
7:06
to ByteDance through belonging
7:08
to TikTok to take
7:11
perhaps direct action against children
7:13
of American public officials, for
7:15
example, or immediately
7:18
use the algorithmic control to completely
7:20
flood the zone to 170 million
7:23
Americans of Chinese propaganda
7:25
about Taiwan. Now
7:28
that's not currently happening. So that's not
7:30
why this action is taking place because
7:32
of that content. But again, to put
7:35
it in another context,
7:38
imagine if on the onset of World War II,
7:40
170 million Americans
7:43
were then immediately available
7:46
for Hitler to provide
7:48
direct unmediated
7:51
propaganda access. Hitler
7:53
had direct unmediated propaganda access to 170 million
7:56
Americans. That
7:58
would be disastrous. That
8:00
would be terrible. It's terrible for national
8:02
security. So it's not about what TikTok,
8:05
the content on TikTok now,
8:08
although some people, when I said there's sort
8:11
of a not smart argument about TikTok, that's
8:13
the not smart argument about TikTok. Cause if
8:15
you're gonna single out TikTok,
8:17
well, what about Instagram reels?
8:19
What about Snapchat? I mean, it just goes
8:21
on. Okay, the issue of either
8:24
the foreign governments or US government
8:26
utilizing social media platforms for propaganda
8:28
purposes. We're gonna come back to
8:30
that later in our conversation. But
8:34
going back to this legislation right now on TikTok,
8:36
a lot of people were surprised that it
8:38
got bipartisan support in the house, given how
8:40
divisive our politics is these days. And
8:43
it's really interesting who was against
8:45
this TikTok ban because
8:47
you had people
8:50
you would expect on the far right, those
8:54
representatives who leveraged social media
8:57
extensively like Marjorie Taylor Greene, Matt
8:59
Gaetz, some of the other real
9:01
distasteful figures on the far right.
9:03
But you also had figures on
9:05
the far left, like AOC, who
9:07
were against this bill. So your
9:09
thoughts on why
9:12
opposition to this bill seemed
9:14
to find really odd support on the extremes of
9:17
both parties. Yeah, it's a
9:19
great question. But it makes
9:22
sense for both extremes and it makes sense
9:24
under their own terms. So
9:27
for example, the right edge extreme,
9:29
as soon as Trump came out against it, which I'm
9:32
sure we'll talk about. As soon as Trump came out
9:34
against it, obviously there's gonna be
9:36
some members of the Republican Party who are just
9:38
gonna fall in line immediately, even
9:41
though opposition to China is kind of
9:43
a cornerstone principle of MAGA. That
9:46
in fact, this is how MAGA defends
9:48
itself against arguments about isolationism. They say,
9:50
well, look, Europe can handle Russia.
9:54
Let Europe handle Russia, that's not our
9:56
job. Only we can handle China and
9:58
China is the real threat. So
10:01
for years, the MAGA line was, you
10:03
know, we're not isolationist. I mean, America
10:06
first isn't the synonym for isolationist, but
10:08
China's the real threat from trade to
10:10
military power to economic power, you name
10:12
it. And so that's
10:15
why when Trump reversed, not
10:17
everyone went along with him, but some
10:19
people definitely did. And
10:21
then also you have to realize that
10:24
MAGA content does pretty well on TikTok.
10:27
So there are some Republicans who said,
10:29
look, this is against our interests. MAGA
10:31
content does well here. This
10:34
is, TikTok doesn't censor MAGA the
10:36
way, say, American tech
10:38
companies might censor MAGA. So
10:40
it's very against our interest to
10:43
censor TikTok. Now, on
10:45
the far left, some of this is
10:47
a holdover from the idea that the
10:50
anti-China emphasis is racist and
10:52
not really ideological or
10:54
national security based. In other words, that
10:57
singling out a Chinese app
10:59
is anti-Chinese in an ethnic
11:01
or racial sense, versus anti-Chinese
11:05
Communist Party. I
11:08
would assume if there was a TikTok app
11:10
that had access to 170 million Americans that
11:12
was being put
11:14
out by Putin's Russia, we would
11:17
probably be just as concerned about that, if not
11:19
more concerned. Well, that's the answer back
11:21
to, say, an AOC or that far left
11:23
argument is no, you're just wrong. You're
11:26
not a racist. Race
11:28
has nothing to do with this. This is about
11:30
a geopolitical foe owning a social media app. And
11:32
then the other thing is Gen
11:35
Z is all over TikTok.
11:38
And Gen Z is a big
11:40
Democratic voting base. And so some
11:43
of the Democratic reluctance to deal with
11:45
TikTok is just basically, hey,
11:47
look, we get it. We understand that
11:49
there are dangers here,
11:52
but this is really unpopular.
11:54
This is not something that
11:57
our core constituency wants. So
12:00
they're going to look for ways to sort
12:02
of deal with the national security issue while
12:04
not alienating a core constituency. I
12:07
wrote about this and I said, look, TikTok's
12:09
popular and Congress doesn't like to ban popular
12:11
things. And so there's real concern
12:13
that it's slowing down in the Senate. The ban
12:16
is slowing down in the Senate, both
12:18
because a key democratic
12:20
constituency is all over TikTok and
12:23
because on the Republican side, Donald
12:25
Trump doesn't want it banned. And
12:27
so, yeah, we'll see. We'll
12:30
talk about the Trump thing in a second.
12:32
What I don't quite get, though, is, I mean,
12:34
obviously TikTok is an enormously valuable company just
12:36
because of its huge audience and its revenue
12:39
potential. And so with
12:42
the possibility that this bill might force
12:44
China to sell TikTok, there are already
12:46
suitors lining up to buy this. Yes.
12:49
Right. There are private equity firms
12:51
here in the US. The Saudis already said they want a
12:54
stab at buying this app. I
12:56
don't know if that would be better or worse. Then you'd have
12:58
a different foreign power owning it. But someone's
13:00
going to buy it. Someone's going to relaunch
13:02
it. If it's
13:05
off of the US market for any amount of time,
13:07
it's going to be incredibly short. So
13:10
Gen Z is concerned that they're not going to
13:12
be able to do dance moves anymore or waste
13:14
hours scrolling on TikTok. At
13:16
the very worst, it's going to be a short-lived
13:19
disappearance of this app before it's relaunched. So it's
13:21
not going away. I can't see that being a
13:25
lasting concern of anyone in Congress who
13:27
says, well, we've got to just bypass
13:29
the national security fears because Gen Z
13:31
has to have this addiction fed.
13:36
Gen Z needs its crack. Right. Okay.
13:41
And it's fascinating because back in 2020,
13:44
Trump tried to ban TikTok when
13:46
he was still in office. But
13:49
recently, he has completely done a 180
13:51
and now says we need
13:53
to not ban TikTok. Instead, we
13:56
should ban Facebook. They're
13:59
the real... the real enemy. Yeah.
14:02
Right. All right. So
14:04
let's analyze this. What
14:07
are the hypotheses as to why Trump
14:09
has done a 180, especially given his
14:11
anti-China rhetoric, which has been a cornerstone
14:13
of his MAGA platform since 2015? Why
14:17
is he now kind of
14:19
chief defender of TikTok and Chinese
14:21
ownership of TikTok? Yeah.
14:24
Yeah. Look, there's a
14:26
number of viable theories right now, but
14:28
all the viable theories depend on an
14:31
underlying reality about Trump. And that underlying
14:33
reality is there's no fixed principle here
14:36
other than what is it that helps Donald Trump? Right.
14:40
So even if you go back to 2020, I had a
14:42
very thoughtful reader write in to me,
14:44
and they said, even that
14:47
2020 Trump effort to ban
14:49
TikTok, that wasn't
14:51
necessarily motivated by national
14:53
security idealism because it's occurring after a
14:56
lot of Korean pop
14:58
fans, Korean K-pop
15:01
fans flooded TikTok with
15:03
this sort of idea to vacuum
15:05
up tickets, remember this, to a
15:08
Donald Trump political rally? Yeah. Yeah.
15:12
And there were all these empty seats because like
15:14
somehow, and I can't remember how, there was a
15:16
grudge between K-pop fans and Trump. There's been so
15:18
many grudges over the years. And
15:20
so the argument was that, well, wait, that first initial
15:22
effort to ban was nothing more than a temper tantrum
15:25
itself. I'm not
15:27
as convinced by that because there
15:29
was a very consistent theme in
15:32
MAGA world for
15:34
all four years that while Trump would express
15:36
personal admiration for Xi, the Chinese
15:40
leader, there
15:42
was always an anti-China
15:46
strand through Trump foreign policy.
15:48
Right. Yeah. Yeah. Including
15:52
trade. Absolutely. And so
15:54
the Trump TikTok ban effort in 2020 was par
15:56
for the course for MAGA. I
15:58
did not see that as necessarily just
16:00
purely motivated by Trump's own
16:02
personal quest for
16:04
vengeance. But 2024, TikTok
16:07
is more powerful than it was in 2020. TikTok
16:09
is more influential than it was in 2020. And
16:12
now Trump has flipped around
16:14
and there's multiple theories. One theory
16:16
is, hey, Trump has
16:18
about $500 million in debt right now
16:21
that he is strapped
16:24
for cash and
16:26
that he's then much more vulnerable to
16:28
people who have lots of cash coming
16:30
in and saying, hey,
16:32
I can help you with that New York
16:35
judgment, with those attorney's fees, with all of
16:37
this, but you know, I scratch
16:39
your back, you scratch mine. And
16:42
there's been, you know, a major,
16:44
major mega donor who has, who
16:47
has absolutely, you know, been sort
16:49
of flirting with Trump and who
16:52
is a big time
16:54
investor in TikTok. And so
16:56
that's one, that's
16:58
one theory. I think the,
17:01
Sorry, didn't even former Vice President Mike
17:04
Pence open up and
17:06
talk about his theory that Trump has changed
17:08
his view on TikTok because of this donor,
17:10
naming him, identifying him as
17:12
Jeff Yass? Yes. Yes. Yes.
17:16
Yes, that's right. And so that
17:18
is a, that's a theory and it's
17:21
got a lot, there's
17:23
a lot to be said for it, but
17:25
there's also another theory that's got a lot
17:27
to be said for it that also matches
17:29
Trump's words. And that
17:31
is actually the real enemy
17:33
for Trump and Trumpism is
17:36
always the domestic enemy. That
17:39
in other words, they're much more concerned
17:41
and have, get much more anger at
17:44
their American opponents than their foreign
17:46
enemies. And how would
17:49
we think that Trump might have this motivation?
17:51
Well, this is what he said in truth
17:53
social in all caps.
17:56
TikTok is less of a danger to the
17:58
USA than Meta (Facebook), which is a
18:00
true enemy of the people. One
18:05
explanation is, it has nothing to do with the
18:07
money and has everything to do with the
18:10
grievance that is
18:12
just steeped within Trumpism. And
18:15
that is, they just dislike Mark Zuckerberg more
18:17
than they dislike Xi and the People's Republic
18:20
of China. And if in this
18:22
zero-sum game, if TikTok loses, Meta might
18:24
gain and we just can't have that. The
18:28
odds are it's actually some sort of
18:30
combination of these factors. And
18:32
for those who've forgotten back
18:34
in 2021 after the attack on the Capitol, Meta,
18:38
Mark Zuckerberg and Facebook banned Trump
18:40
from posting, as did Twitter at
18:42
the time before it was owned
18:44
by Elon Musk. And so
18:46
Trump has had a grudge against these
18:48
American owned social media companies for kicking
18:51
him off the network for spreading misinformation
18:53
and lies that led to the January
18:55
6th attack. So
18:58
the enemy of my enemy is sort of
19:00
my friend, or maybe it's the other way
19:03
around, where Trump sees TikTok as a competitor
19:05
for Facebook. So why would he want
19:07
to do something that would be beneficial for Facebook when
19:09
he doesn't like them? It's
19:11
petty, it's small minded, it's self-interested,
19:13
and those are values that we have
19:15
seen time and again motivate
19:19
Trump. So it makes sense. Where
19:23
do you think this is going? What's in
19:25
the Senate now? Does
19:27
the Senate have a bipartisan kind of
19:30
kumbaya moment like the House did
19:32
in rallying against TikTok for the
19:35
sake of national security? Or does
19:37
Trump kind of weasel
19:40
his way in there again like he did with the immigration
19:42
bill and convince enough senators
19:44
to side with him and China, and
19:47
scuttle the whole thing? So
19:51
it's been pretty obvious from the conversation so
19:53
far that I support this bill. So
19:57
when I say there's a danger to this bill,
19:59
I'm coming from the standpoint of I'm supporting
20:01
this. And I think there's two dangers at once.
20:04
So danger number one is that a
20:06
lot of Democrats are just, as I said earlier,
20:09
worried about the popularity of the app with Gen
20:11
Z. And there's a lot of concern in the
20:13
Democratic world that we might lose
20:15
this election because we're losing support with Gen
20:17
Z. And so, you know,
20:20
there might be a subtle message that says, not
20:23
now. Not in an election
20:25
year, not right now. Joe
20:27
Biden has said he would sign the bill. Biden
20:30
has said he'd sign it. Trump has said
20:32
he opposes it. So this is another example,
20:34
as with Russia and Ukraine, where you have
20:37
Biden taking the
20:39
harder line against our foreign foes and
20:41
Trump isn't. But
20:43
it's the question, how much does Biden
20:45
really want to sign it? He's
20:48
willing to sign it. How much is he wanting
20:50
to sign it? So is there a circumstance where Chuck
20:52
Schumer could kind of do him a political solid
20:55
and not exactly raise
20:58
too much of a ruckus in trying to
21:00
push this through, you know, using the procedural
21:02
maneuvers that can kind of cause
21:04
it to die out temporarily, like assign
21:06
it to committee, have the committee slow
21:09
roll it, those kinds of things. I
21:11
think that's one danger sort of from
21:13
the left side of the aisle. I
21:15
think the danger from the right side of the aisle is
21:17
that Trump's just drawn a line
21:19
in the sand here and he's got a lot of
21:22
reliable folks in the Senate who are just going to
21:24
do what he asks. It's not the
21:26
same as the House, but there are
21:28
some folks in the Senate. And my concern
21:30
for the bill is the combination of, okay,
21:34
in theory on the Democratic side, okay, in theory
21:36
we're for this, but it's politically difficult, with,
21:39
on the Republican side, in reality we're really
21:41
for it, but Trump makes it difficult, could
21:45
combine to squelch
21:47
the bill, at least for now. That's
21:50
an interesting thought. I wonder if there
21:53
were a suitor or a buyer approved
21:55
in advance, would that
21:57
win more Democratic votes? To know, all
21:59
right, it's not going to go away.
22:02
No one's going to lose their TikTok accounts or
22:04
service. It's just going to transfer ownership from China
22:06
to this US company, and we're
22:10
not going to lose young voters over this. Would that be
22:12
enough to get them over the hump? I don't know. But
22:16
it seems obvious that no one's in a hurry to
22:18
tackle this issue. Maybe after the election, if
22:20
things go a certain way, they'll solve it.
22:22
Before we leave this TikTok-specific thing, you
22:24
brought up the interesting
22:26
dynamic that TikTok in a way proved
22:29
their critics' worries
22:32
and concerns when TikTok called
22:35
upon their most popular
22:37
influencers on their platform to
22:40
flood congressional leaders with phone
22:42
calls and letters and lobbying efforts to
22:44
get them not to pass this bill.
22:46
And you
22:49
wrote, the resulting flood of angry calls
22:51
demonstrated exactly how TikTok can trigger a
22:53
public response and gave the
22:55
lie to the idea that the app did not
22:57
have a clear and... essentially, I...
23:01
I can't read it through my own highlighter,
23:03
it's so dark... political influence. Anyway, TikTok's
23:06
saying, no, we don't have a disproportionate political influence, and
23:08
then they just showed how
23:10
disproportionate their political influence was by trying to stop
23:12
this bill. Oh, yeah. I mean,
23:15
Tom Tillis today put up on Twitter
23:17
a tape recording of a death threat that
23:19
he got, obviously from a young lady, a
23:22
young girl. And so, you
23:24
know, it wasn't a
23:26
true death threat in the sense that you hear it
23:28
and you think, oh no, somebody's coming after him. But
23:30
here's somebody who sounds like they're in middle school or
23:32
early high school, just laughingly
23:34
talking about how they're gonna shoot a senator,
23:36
right? And it's absurd and
23:38
there were people who would call in and saying they
23:40
were going to commit suicide Because
23:43
of the TikTok bill and
23:45
it infuriated a lot of members of
23:47
Congress. It was the worst thing that
23:50
TikTok could have done. And again, for
23:52
somebody like me who's saying this is
23:54
the potential of the app, I was
23:56
imagining you have a Chinese attack
23:58
on Taiwan, and all of a sudden it's, contact
24:00
your congressman to stop nuclear war,
24:02
to prevent America from defending Taiwan.
24:06
That kind of actual push
24:08
into American politics by an actively
24:10
hostile power is really, really
24:13
dangerous and they proved it. They proved that
24:16
it was dangerous. So
24:18
I'm going to be fascinated to see
24:21
what happens. I'm actually encouraged that it
24:23
made it out of the house with
24:25
a bipartisan majority because it does show
24:27
there are still some limited circumstances where
24:29
obvious foreign problems can unite us. But
24:32
with everything going on in the Senate,
24:35
the emerging Trump opposition, and worries about
24:37
the 24 election, I'm
24:40
really afraid we're not going to do the
24:42
right thing from a national security perspective because
24:44
of short-term political considerations. This
24:52
episode of The Holy Post is sponsored by
24:54
Hiya Health. Do you have
24:56
kids? Do you care about their
24:58
health? Of course you do. Most
25:00
kids' vitamins are basically candy in
25:02
disguise filled with two teaspoons of
25:04
sugar, unhealthy chemicals and other gummy
25:06
junk kids really shouldn't be eating.
25:08
That's why Hiya was created, the
25:11
pediatrician-approved super-powered chewable vitamin for kids.
25:13
Zero sugar, zero gummy junk, yet
25:15
they taste great and are perfect
25:17
for picky eaters. Hiya fills in
25:19
the most common gaps in modern
25:21
children's diets to provide the full-body
25:23
nourishment our kids need with a
25:25
yummy taste they love. And we've
25:27
worked out a special deal with
25:29
Hiya for their best-selling children's vitamin.
25:32
Receive 50% off your first order.
25:34
To claim this deal, you need
25:36
to go to hiyahealth.com/holypost. This deal
25:38
is not available on their regular
25:40
website. Go to hiyahealth.com/holypost
25:42
and get your
25:44
kids the full-body
25:47
nourishment they need
25:49
to grow into
25:51
healthy adults. And thanks
25:53
to Hiya Health for sponsoring this episode.
25:55
Today's episode is sponsored by Sundays. This
25:58
is Phil. I have a
26:00
dog. You have a dog. We love
26:02
our dogs and we gotta feed them
26:04
something. Fresh food with human-grade ingredients is
26:06
a better way to treat our dogs
26:08
than that old bag of... whatever that
26:11
stuff is. Sawdust and cow bones, I
26:13
have no idea. But fresh pet food
26:15
is expensive and inconvenient. And that's where
26:17
Sundays comes in. No, not the day. The
26:19
new dog food company that makes air-dried
26:21
dog food from a short list of
26:23
human-grade ingredients. It's healthy, with beef, chicken,
26:25
and digestive aids like pumpkin and ginger.
26:27
It's convenient. Unlike other fresh dog foods,
26:30
it's zero prep, zero mess, and
26:32
zero stress. Sundays is shelf-stable
26:34
and ships right to your door. And
26:36
it's affordable, costing 40% less
26:38
than other healthy dog food brands because
26:40
they don't waste money shipping frozen packages.
26:43
We've got a special offer for our
26:45
dog-loving Holy Posters. Get 35% off
26:48
your first order of Sundays.
26:50
Go to sundaysfordogs.com/holypost. Or
26:52
use the code HolyPost at
26:54
checkout. That's sundaysfordogs.com/holypost. Upgrade
26:56
your pup to Sundays and
26:58
feel good about the food
27:01
you feed your dog. And
27:03
thanks to Sundays for sponsoring
27:05
this episode. All
27:07
right, let's transition from that social media
27:10
story to an even broader social media
27:12
story. And that is, in the last
27:14
number of weeks, the Supreme Court has
27:16
been addressing or hearing oral arguments on
27:19
a number of different attempts
27:21
to regulate US
27:23
social media platforms. There's
27:27
a lot to unpack here, but let's start
27:29
with the Florida
27:32
and Texas bills. So these
27:34
were laws that were passed
27:36
in Florida and Texas that were intended to
27:39
stop social media platforms from
27:43
censoring or regulating the content on
27:45
their own platforms. My
27:47
understanding was conservatives in Texas and Florida
27:49
were saying that places like Facebook
27:53
or Twitter, maybe
27:55
before Elon Musk, other platforms were
27:59
inappropriately removing or
28:01
censoring content that was conservative
28:05
in its outlook. And they
28:07
passed bills saying, you can't do this. That
28:10
is now before the Supreme Court, but
28:12
there's also these bills coming out of
28:14
Missouri and Louisiana. Is it... cases
28:17
coming out of... yeah, cases, multiple cases, and
28:19
they're coming out a little bit differently. They're
28:21
upset that the
28:23
Biden administration has voiced
28:26
concern to social media companies about
28:28
certain content that's on their platforms,
28:30
urging them to remove it, saying
28:33
that that kind of influence
28:36
coming from the White House or the administration
28:38
is also a problem.
28:41
So the Supreme Court is
28:43
now finding itself at the cross-section of
28:46
fairly new, at least new as the
28:48
Supreme Court would see it, digital platforms, and
28:52
having to uphold freedom of speech. So
28:55
there's a lot of different layers to
28:57
these cases. But however the court
28:59
rules on them could have a massive
29:01
influence on the way we communicate
29:03
as a society and who has access to what
29:06
platforms. Yeah, so I'm gonna throw all this at
29:08
you, David, and you tell me what piece you
29:10
want to tackle first in
29:12
trying to peel this onion. Well, you kind
29:15
of have to tackle them together, because they're two
29:17
sides of the same coin. So moving
29:20
through 2020, there
29:23
were a lot of Republicans who
29:25
sort of reached peak fury against
29:27
Big Tech on multiple
29:29
grounds. So ground number one
29:31
was they believed that Big Tech was
29:33
on Biden's
29:36
side, not Trump's side, and
29:38
that Big Tech was, you
29:41
know, whether it was through donations
29:43
that, you know, Big Tech, that
29:45
Facebook, etc., were making regarding ballot
29:48
access, to try to improve ballot access around
29:50
the country, or the way in which they
29:52
moderated, for example, when
29:54
the story from the
29:56
New York Post broke about the Hunter Biden
29:59
laptop, how Twitter immediately
30:01
throttled discussions of the
30:03
Hunter Biden laptop, lifted the
30:05
throttle later, but in the short term
30:07
throttled discussions, or
30:09
banned it from the platform. So
30:12
there was this real sense, and then especially,
30:15
and all this got, was
30:17
massively amplified by concerns over
30:19
COVID and how Big Tech was
30:21
taking aim at what it perceived to be COVID
30:24
disinformation. Even when sometimes
30:26
it wasn't
30:28
disinformation at all. Such as the lab
30:30
leak theory for COVID, Big
30:33
Tech took some steps to try to restrict the
30:35
spread of the lab leak theory. And
30:38
now the lab leak theory may well be the
30:40
best available theory we have for COVID's origins. So
30:43
there were some legitimate beefs that
30:45
people had. I mean, suppressing the
30:47
laptop story was wrong. Suppressing
30:51
lab leak discussion was wrong. Both
30:54
were suppressed under the rubric of
30:56
so-called misinformation, and they just were
30:58
wrong. So there was a
31:00
legitimate beef that a lot of Republicans had
31:03
against Big Tech. So yeah. To
31:05
pause for a second though, the lab leak story,
31:07
my understanding was it got
31:10
suppressed on social media because at
31:12
the time it was seen as
31:14
inflaming anti-Asian rhetoric
31:17
and racism. And
31:19
so the concern wasn't strictly lab
31:22
leak is misinformation, therefore we're going to remove
31:24
it. It's lab leak is
31:26
a dog whistle for anti-Asian racism.
31:28
That's why we're going to remove it. Regardless,
31:31
these social media platforms have their own
31:33
set of values that they then use
31:35
to filter what content they want on
31:39
their platform. And Texas and
31:41
Florida are saying, no, no, that's not fair.
31:43
You need to treat these as open forums
31:45
for free speech. And just because
31:47
you don't like someone's speech, simply
31:50
owning the platform doesn't mean you get to remove
31:52
it. Right. Yeah. And
31:55
so there were two responses to this 2020
31:57
spasm of anger. Number one
31:59
is Florida and Texas saying, hey,
32:01
we're going to control how you moderate your platform.
32:04
So Texas's control took the form
32:07
of essentially saying, look, here's the criteria
32:09
that you have to use. It's
32:11
got to be viewpoint neutral. And there's a specific way you
32:13
can moderate. Florida was different in
32:15
that one of the things it did was
32:18
give privileged access to candidates for
32:20
office. For example, you can't remove
32:23
these candidates for office. And
32:25
the bills are detailed and have their own
32:27
distinctions. But that was step number one. And
32:29
then step number two was
32:31
to confront the social media
32:33
companies over their censorship in 2020, 2021.
32:38
And that's where the Twitter
32:41
files come in. So the Twitter files is
32:43
a series of files
32:45
after Elon Musk bought Twitter.
32:47
He gave some selected journalists
32:49
access to internal documents. And
32:52
so these internal documents
32:54
showed that members of the
32:56
Biden campaign, members of the
32:58
Trump administration, members of the
33:01
federal government were constantly jawboning
33:03
the social media companies saying,
33:05
hey, you should take this down or hey,
33:07
you should take that down or this is
33:10
misinformation and that is misinformation or that violates
33:12
the terms of your service. And
33:14
so what a lot of MAGA
33:16
thought when the Twitter files came through
33:18
was, aha, we have the smoking gun
33:21
of
33:24
deep state and
33:26
Big Tech collusion. And
33:29
so lawsuits were filed
33:31
to try to stop that. So
33:34
Texas and Florida passed laws trying
33:36
to regulate moderation. And
33:38
then the lawsuits are filed to try
33:41
to stop government officials from interacting with
33:43
social media. Okay. And just
33:45
from an outsider layperson point of view,
33:47
forget the law for a minute, forget
33:49
constitutional rights. It
33:51
just seems contradictory that
33:53
on the one hand, Texas and Florida are saying
33:56
we want to control what you put on your
33:58
platform and make sure that you do what we
34:00
want you to do. But then these other lawsuits are saying,
34:03
social media companies, you can't be influenced
34:05
by what the government tells you you
34:07
should and shouldn't put on your platforms.
34:09
That seems contradictory. It's a
34:11
problem. The two lawsuits are in tension with
34:13
each other in an interesting way because the
34:16
argument in the Missouri case
34:19
was that the federal
34:21
interference was a
34:23
constitutional violation. Right. Okay.
34:26
But then the NetChoice cases involving
34:28
Florida and Texas, they're trying to argue
34:31
that state interference isn't a constitutional violation.
34:33
Right. And these are
34:35
all coming from red conservative states
34:37
trying to say that social media
34:40
is biased towards progressivism. So
34:42
you have conservative actors here contradicting
34:44
their own arguments. One saying, don't be influenced by
34:46
what the state wants you to do. And the
34:48
other saying, yes, the state should have
34:51
control over what you do if it's a red state. Well,
34:53
yeah, but this goes to the New Right's view
34:56
of free speech, which is free speech for me and
34:58
not for thee. So
35:00
Texas and Florida, they're
35:02
stepping in to control? Yay,
35:04
because it's more free speech for me. The
35:09
federal government is stepping in in
35:12
a way that we don't like. Well, now the
35:15
Constitution applies. Right, because
35:17
it's a democratic administration that they're
35:19
upset with. Right, right. And
35:21
so I suspect they're
35:23
gonna lose both cases. Okay.
35:27
Yeah, so again, there's
35:29
always a danger in
35:31
judging the outcome of a case from
35:33
the oral argument alone;
35:37
you can usually, but
35:39
not always, judge the outcome is
35:41
the way I would put it. And
35:44
if you looked at the oral argument in the Texas
35:46
and Florida case and the argument in the Missouri case,
35:49
both of them, the justices
35:51
seemed extremely skeptical of
35:54
the right-wing position. Extremely
35:56
skeptical. A number of things I've read
35:58
or heard about this. From what
36:00
I understand, some of the justices themselves
36:03
during the oral arguments were able
36:05
to acknowledge that they may not be the
36:07
best people to understand and
36:09
analyze social media given
36:12
their ages. Is
36:14
that a fair critique here? Is
36:17
technology moving at such a speed that
36:20
an older generation occupying the seats
36:23
that will regulate it may
36:25
not fully understand the implications of their own rulings?
36:30
Okay, there are areas
36:32
in which
36:34
some degree of
36:36
technical expertise is indispensable to rendering
36:38
a sensible judgment. There
36:41
are limited areas like that.
36:44
Limited. As a general matter,
36:47
though, the free speech
36:49
issues on tech platforms are not super
36:51
novel at all.
36:53
There are offline analogies again
36:55
and again and again and
36:57
again for online speech controversies.
37:01
A lot of times, and this is something that the
37:04
new right also happens to be the
37:06
young right often, and
37:09
they have zero historical perspective on
37:11
these kinds of disputes and debates.
37:15
They think that stuff that happens that
37:17
they've seen online is new
37:19
and novel and remarkable when
37:22
there are offline analogs to all
37:24
of this that have existed forever.
37:27
For example, let's look at
37:30
the jaw boning issue. The jaw boning issue
37:32
is the issue that says, hey,
37:34
you can't tell Facebook we
37:36
would prefer you to take X, Y, or Z posts
37:39
down. You just can't do that. You
37:41
mean the federal government? Yeah, you federal
37:43
government, you can't reach out to Facebook and
37:45
say, take this post down.
37:48
Oh, really? For generations,
37:54
government officials have been calling reporters,
37:56
screaming at them, cursing at them,
37:59
telling them, demanding that they not
38:01
publish something. And we just
38:03
recorded an episode of Advisory Opinions where
38:05
Sarah gives this great anecdote about
38:08
liberally carpet-bombing a conversation
38:10
with F-bombs with some
38:13
reporters to try to tell
38:15
them they're wrong and they shouldn't publish what
38:17
they were expected
38:19
to publish. It's
38:21
a great story, you need to listen to the podcast.
38:24
It's very funny because Jeff Sessions walks
38:26
in at the tail end of her carpet
38:28
bombing of F-bombs, and it's a very funny
38:30
story. But this
38:33
has happened, I mean how many times in
38:35
American history have government officials sat down with
38:38
say editors from the New York Times and
38:40
said don't publish that story that contains classified
38:42
information. Right. And they're asking,
38:45
they're not telling, okay. Because
38:47
the difference would be if the government official
38:49
says if you publish this, we
38:52
will audit your taxes. You're
38:54
prosecuted, right. You're gonna be prosecuted or
38:57
we're gonna take away some, you
38:59
know your freedom of the press, whatever. If
39:02
there's some coercion, then
39:04
the government's in trouble. But to simply
39:06
ask is
39:09
not a problem. It's normal. It's very
39:11
normal. Yeah and so there's a
39:13
difference between convincing and coercing. And if you
39:15
think well, government officials shouldn't be in the
39:17
business of convincing, in
39:19
actuality we do even elect people
39:22
to make arguments in public. So
39:24
my understanding is from the so-called Twitter
39:26
files, the Biden administration had gone
39:29
to Twitter and other social media actors and
39:31
said, you need to take
39:33
down content on your platform that is
39:35
misinformation about the vaccines or it's misinformation
39:37
about COVID or misinformation
39:40
about the stolen 2020 election because
39:42
these pose a public health risk
39:44
or a public security risk. Right.
39:47
And so, but they were
39:49
ultimately powerless. The Biden administration was ultimately
39:51
powerless to force
39:53
any of the media companies to do that. But
39:55
they asked. And
39:57
so not only were they
39:59
powerless, they didn't even
40:02
really have the ability to say, if you
40:05
don't do what we say, then we
40:07
will change the law to come after
40:09
you. They could say, if
40:11
you don't do what we say, we can try
40:13
to change the law, but they didn't
40:16
have the ability to actually change the
40:18
law. And also, you're dealing with, if
40:21
anyone's saying, oh, these poor social media
40:23
companies just being browbeaten by the government.
40:26
These are the most powerful, some of the most
40:28
powerful corporate entities in the world who have unlimited
40:31
resources to hire the very best legal
40:33
team money can buy to take
40:36
on the federal government. These are
40:38
not easily intimidated corporate entities. Just like
40:40
my employer, the New York Times,
40:42
is not a corporate entity easily intimidated by the
40:44
government. We
40:47
have an enormous amount of power
40:50
and autonomy under our American
40:52
constitutional system. And so does
40:55
Facebook, so does all the others.
40:57
It's not the deep state picking on some
41:00
dude in his basement. I'd
41:02
like to believe that's true of the Holy Post, too.
41:06
If we are an authority. That's right.
41:08
If we got a call from the deep
41:10
state representative threatening us, we would stand our
41:13
ground. Let's
41:19
talk about the other side of this, the Texas
41:21
and Florida attempts. What
41:24
about the Supreme Court stance
41:26
on state
41:29
actors wanting to guarantee certain
41:31
content is allowed on platforms
41:33
that those platforms don't want
41:35
to publish? Does
41:37
that have a snowball's chance before the Supreme
41:39
Court? Based
41:42
on the oral
41:44
argument, highly unlikely.
41:49
So let
41:51
me put it this way. If
41:53
you're a listener and you're familiar with the
41:55
303 Creative Case or the Masterpiece Cake Shop
41:58
Case, these are cases where
42:00
a state government,
42:02
in those cases, Colorado, was telling
42:05
a private corporation, a bakery, and
42:07
a graphic design company, there
42:10
is speech or expression you
42:12
have to engage in. In
42:15
other words, for Masterpiece Cake Shop is
42:17
you have to custom design a
42:20
cake for a gay wedding. For
42:22
303 Creative, it was you have to design
42:25
websites for gay weddings. The
42:28
Supreme Court in 303 Creative, in no
42:31
uncertain terms, says you cannot coerce a
42:34
private corporation to engage
42:36
in speech. You
42:39
can't coerce a private corporation
42:41
necessarily to host speech. And
42:44
so this is a classic
42:46
compelled speech argument or
42:49
case that, hey, a private
42:51
corporation does not have to advance
42:53
an idea that it disagrees
42:55
with. The right
42:57
cheered the outcome of 303 Creative: how
42:59
dare the government try to tell a
43:01
graphic designer or a
43:04
bakery shop owner
43:06
what expression they should advance?
43:09
Similarly, to take a big
43:11
company, Hobby Lobby, multi-billion dollar
43:15
craft store, they
43:17
said we have a religious
43:19
liberty interest in not
43:22
complying with the Obamacare contraception
43:24
mandate. Here again,
43:26
a private corporation cannot be
43:28
compelled in that circumstance to
43:30
engage in conduct
43:33
that violated its religious conduct. Conscious.
43:36
Conscience. We're both having
43:39
pronunciation issues. And
43:41
so in that circumstance, the private company
43:44
was in charge of its own
43:46
speech and expression. And
43:49
conservatives cheered. Here's
43:52
Facebook, here's Twitter, private companies engaging in
43:54
their own speech and expression as to
43:57
which voices they're going to highlight or
43:59
deplatform. They made a
44:01
different choice from Hobby Lobby or 303 Creative or
44:03
Masterpiece Cake
44:05
Shop. They chose their own
44:08
policies and their own procedures and conservatives
44:10
didn't like it. And
44:12
so then a lot of the new right decided,
44:14
well, the state,
44:17
actually the state can compel private citizens
44:19
to host speech. Okay.
44:22
So let me push that analogy
44:24
maybe too far. I
44:27
understand that the government cannot force
44:30
Jack Phillips, what was his name? The baker. Jack
44:33
Phillips, yeah. You cannot force Jack Phillips to
44:35
bake a cake that communicates a message
44:37
he does not want to communicate. What
44:41
if Masterpiece Cake Shop was a
44:44
business that invited the public
44:46
in where they could
44:48
learn how to make cakes themselves and
44:50
they can make cakes using
44:53
their ingredients, using their ovens,
44:55
using their stuff and Jack
44:58
Smith wants to be able
45:00
to tell a patron in his shop,
45:03
hey, you can't make that cake that
45:05
celebrates same-sex marriages. Is
45:07
that still the same? Because that seems more
45:10
like what Facebook and Twitter are doing is
45:12
they're inviting the public onto their platform to
45:14
communicate messages and then telling
45:16
them but you can't communicate that message.
45:19
So it's not Twitter or Facebook communicating
45:21
the message, it's just their platform being
45:23
used to communicate someone else's message. Well
45:25
they would say no, it's actually both
45:27
of us. So you're
45:30
communicating a message that is on
45:32
the Facebook website, it is
45:34
a Facebook-branded site, and
45:36
if you look at the terms of service
45:38
of these places, what you're agreeing to when you
45:40
create a Twitter post, and
45:42
I haven't looked at the most recent Twitter terms
45:44
of service, but there was a time when you created a
45:46
Twitter post, it was no longer
45:49
your post, it was Twitter's
45:51
post. Twitter owns that post, it can
45:53
do with it what it wants and
45:55
so what you
45:57
are being given is an opportunity unlike
46:00
any opportunity we've had in world history, which
46:02
is essentially instant theoretical access
46:05
to the general public, we've
46:07
never really had that before. But
46:10
within boundaries, and those boundaries are set by
46:12
the owner of the space. So
46:15
for example, Gab, which claims to
46:17
be free speech absolutist, says, anything
46:20
you post that's not illegal, you can keep
46:22
up. Well, Facebook has
46:24
a different philosophy of how it wants
46:26
its space to be used. So it
46:30
will sometimes privilege pictures
46:32
of your food over
46:35
your posts about politics. Why? Because
46:38
they found that Facebook as a community thrives when it's
46:40
more about connection between
46:42
people and less about fights over
46:44
issues. So they choose to orient
46:46
it in that way. Where Twitter chose for
46:48
a time to orient its algorithm in one way
46:50
that amplified a certain kind of content. Elon
46:53
Musk comes in, changes it. Now
46:56
Twitter amplifies a bunch of white nationalists. So
47:00
these platforms are participating
47:03
with the consumer. It's
47:06
not just like a blank
47:08
bulletin board that I post whatever I
47:10
want to post. So the question seems
47:12
to be, when do these companies that
47:15
give people unprecedented
47:17
access to the public square, when
47:19
do they become common
47:21
carriers? In other words, you
47:23
can't just broadcast whatever you want on television
47:26
because the government regulates what is obscene
47:29
speech and what's allowed and what is
47:31
not allowed. Because it's seen as a
47:33
public carrier, the
47:36
broadcast networks can't just put
47:38
anything they want up. So
47:40
the government steps in and regulates how those platforms
47:43
are allowed to be used. When
47:45
does a Facebook or Twitter essentially become a
47:47
common carrier where the government can step in
47:50
and regulate their content rather
47:52
than if they regulate themselves? Well
47:54
when it comes to public posting of thoughts or
47:56
ideas, never. When it comes
47:59
to private delivery of messages? Maybe.
48:01
So the difference is, so
48:04
when it comes to broadcast, we have to
48:06
put that in its own category because broadcast,
48:08
all broadcast airwaves are owned by the government.
48:11
Okay. So when you're talking about
48:14
ABC, NBC, CBS, when you're talking about AM
48:16
or FM radio, they are broadcasting their signal
48:19
on a signal that the government owns.
48:21
So the government, because it owns the
48:24
signal, has a lot more
48:26
authority. So for example, the decency rules
48:28
around the FCC and all of this,
48:30
there's more authority that exists because these
48:32
are government airwaves. But
48:35
a common carrier is something more like, say,
48:37
an AT&T or like
48:39
a FedEx. And usually what that means is you
48:42
have had companies, private companies
48:45
that create, in essence, services
48:48
that have an analog to
48:50
government services and often have
48:52
utilized government resources to create
48:54
their services. And
48:56
so for example, AT&T, back
48:59
before we were all on cell
49:01
phones, the amount of public-private
49:03
participation to create the national
49:06
network of AT&T landlines was
49:09
off the charts, just
49:11
off the charts. And so in
49:13
that circumstance with the common carrier, you're
49:15
talking about, in many cases, with common
49:17
carriers, public-private partnerships,
49:19
where a private entity
49:21
delivers a service that
49:24
has analogs to government services.
49:27
Delivery of the mail, for example: that's
49:30
the government service FedEx is analogous
49:32
to. And
49:34
so in those circumstances, the
49:37
government essentially is saying, look, you just
49:39
transmit the package. You don't open the
49:41
package, read it, find out if it
49:44
says something you like, and
49:46
then send it. Or you don't look at who sent it
49:48
and say, well, I don't like that person. I'm not going
49:50
to mail that package. That's what a
49:52
common carrier is talking about.
49:55
When we're talking about Facebook, and
49:57
we're talking about Twitter, what we're talking about are little
50:00
bitty op-eds. So we're
50:03
not talking about point-to-point communication, that's Facebook
50:06
messenger, you know, that's a direct message,
50:08
maybe the direct messages might be sort
50:10
of more common carrier-like. But
50:12
what we're talking about is essentially like little bitty
50:14
letters to the editor, little bitty op-eds. And
50:17
there's no such thing as a common carrier
50:19
of American op-eds. There never
50:22
was. There's no analogy to a
50:24
government service, you know, what,
50:26
you know, before Facebook, it's not like
50:29
I could write my letter to the editor and
50:31
get it published in the New York Times or
50:33
the Lexington Herald Leader where
50:35
I grew up. There's no such thing
50:37
as a common carrier of political commentary.
50:40
So let's go back to the
50:42
TikTok conversation we had earlier and the national
50:44
security risk of
50:46
the Chinese
50:48
government having control over messaging
50:51
that could instantaneously go out to 170 million Americans
50:53
or data on 170 million Americans that could be
50:55
weaponized in
50:58
some way. We all understand
51:00
that that's probably not a good idea for an
51:02
adversary, a foreign adversary to have that kind of
51:04
power. But a
51:06
very similar power exists in
51:08
the hands of Mark Zuckerberg
51:10
or Elon Musk. Now they're
51:12
American companies, but they could
51:14
still do enormous damage to
51:17
American health, American stability,
51:20
all kinds of infrastructure in America
51:22
if their platforms, if they choose
51:25
to use their platforms to spread
51:27
lies, misinformation or to get highly
51:29
partisan in their outlooks.
51:33
So at what point
51:35
can the government ever step in and
51:37
say, you know what, yeah, this is
51:39
a private company, but you
51:41
have so much power and so much
51:43
influence that
51:47
we have to regulate you in some
51:49
way even
51:51
though it appears to be an infringement
51:53
on First Amendment rights, the
51:55
risk is just too great. Is
51:58
that ever possible? If
52:00
you're talking about monopoly power,
52:02
the remedy to that is antitrust
52:04
law, not government
52:07
coercion of speech. That's the
52:11
way that we have traditionally dealt with
52:13
the negative effects of monopoly power. Monopoly
52:16
power in the public square looks different than
52:18
monopoly power when it comes to, say, provision
52:20
of telephone services. The
52:22
antitrust case against Bell
52:25
back in the day, or
52:27
antitrust actions brought against Microsoft when Microsoft
52:29
was king of the world. There's
52:32
antitrust actions right now against Google. When
52:36
you have monopoly power, monopoly
52:38
power has negative consequences that
52:41
are addressed by breaking the
52:43
monopoly. That's
52:45
the traditional way that we deal with the
52:47
negative effects of monopoly power is by breaking
52:49
the monopoly. That's
52:53
the way: if a
52:56
private corporation gets monopoly-level
52:58
power over public
53:00
discourse, you
53:03
break the monopoly. That's
53:06
the question at issue, for example,
53:08
in lawsuits currently against
53:10
Google. The issue there
53:12
on monopoly is not, is Google saying things we
53:14
don't like? Or
53:16
any other company, Facebook, is it saying things we don't
53:18
like? It's whether Facebook has sort of
53:21
cleared the field of all
53:24
competition, whether, in other words,
53:26
nobody has any opportunity really to meaningfully
53:28
disagree with it or present and create options.
53:31
That's how we deal with it. We deal with it not
53:33
by saying to a private entity that we're
53:36
going to now dictate how you use your
53:38
space. We do it to a private entity
53:40
by saying we're going to break
53:42
up the monopoly. Now,
53:44
some libertarians don't even like
53:46
antitrust law. There's
53:48
too much state interference. I
53:51
think antitrust law is a
53:53
solid concept. I
53:56
endorse and appreciate antitrust law properly applied, but
53:58
that's how we deal with it. To deal
54:00
with monopolies in a way that
54:02
still respects the First Amendment,
54:04
we don't override the human conscience and
54:07
say, speak things you don't want to
54:09
speak. What we do is we say,
54:12
we're going to split apart aspects of this
54:14
business, so that you now don't control
54:16
all aspects of it. Now the part that you control,
54:20
you still control. But we
54:22
break monopolies rather than commandeer
54:24
monopolies, if that makes sense.
54:27
Yeah, obviously I
54:29
agree with that. I just wonder
54:31
if it
54:34
still works as effectively in the
54:36
digital reality we are in. So
54:38
I don't know how large, for
54:40
example, Meta's access to
54:42
the American population is. But
54:45
let's say Mark Zuckerberg decides to just
54:47
go rogue with AI and floods
54:50
the zone with a
54:52
conspiracy theory that, I
54:55
don't know, come up with something
54:57
ridiculous. Like, you know, Joe
54:59
Biden is actually a cyborg
55:02
who's being controlled by, you
55:04
know, the New World Order, something like
55:07
that. And it absolutely takes over, or
55:09
something even worse, like that
55:11
would trigger violence. And like you
55:13
could just do an enormous amount of damage. And
55:16
I don't think anybody's making the case right now
55:18
that meta has monopoly control over
55:20
social media. It doesn't. There are competitors.
55:22
But it still has such a massive
55:24
influence that it could do incredible
55:27
damage very quickly, all under
55:30
the guise of the First Amendment, freedom
55:32
of speech. We can regulate what we want on
55:34
here, what we don't. And to just
55:37
think that we are powerless to do anything about
55:39
it, and I get it, it could be weaponized
55:41
the other way. Because the last thing you want
55:43
is somebody in government who's
55:45
controlling the other side of it. So all
55:48
this just raises the question, are
55:51
there limits on our constitutional law
55:54
and our freedoms that just are not prepared
55:57
to deal with the tools we now have
55:59
at our disposal for communication? Again,
56:01
I've heard that
56:04
argument a lot, but
56:06
it
56:08
also
56:11
has a history problem as
56:13
well. If you
56:15
look at the list
56:17
of top social media
56:20
sites, from Facebook down to
56:22
Pinterest, there is
56:24
more market competition in social
56:26
media sites than there was back when the news
56:28
was the big three. Yeah, I
56:30
agree. That's what I'm saying. I don't think any
56:33
of these have a monopoly. Right, right. But even
56:35
at the size they are at, they could do
56:38
massive damage if they chose to. As
56:40
has been the case for every powerful
56:42
entity in American history throughout all of
56:44
American history. We have had
56:46
lots of different periods
56:49
of time in American history where different groups,
56:51
different corporate entities, different individuals have
56:54
a lot more power than
56:56
the average individual. It's interesting,
56:58
that was actually the argument that Herbert Marcuse used
57:00
in the 1960s to begin to
57:03
argue for speech codes
57:05
and for dramatic interventions into
57:07
the marketplace of ideas was
57:09
the disparity of power. It
57:12
is very funny to hear a lot of folks
57:14
in the New Right make Marcusean arguments that
57:18
free speech can only be a marketplace of
57:20
ideas if the government comes in and decreases
57:22
the power of the powerful and increases the
57:25
power of the powerless. That's
57:27
sort of the Marcusean solution. This
57:29
is the 1960s, okay? And
57:32
so here we are in the
57:35
2020s and a lot of our discourse is
57:37
taking place in different spaces than it did in the 1960s.
57:40
But the power imbalances, there
57:43
were dramatic power imbalances in the 1960s and
57:45
in many ways
57:47
much worse, Skye, much worse. Because
57:49
if I'm just a dude who
57:51
thinks the earth is flat in
57:54
1965, you know, how do I get
57:57
my message out? Right. You
58:00
know, I had very little opportunity. Exactly.
58:03
In the year 2024, I have a
58:05
lot of opportunity to get my message
58:07
out. Right. And yet people are
58:09
saying at the same time that we are now living
58:11
under deep state censorship, and
58:13
it just doesn't add up. I'm not, I
58:16
mean, obviously I'm a very strong proponent of
58:18
the First Amendment. I just, it's
58:21
hard for me to imagine January 6, 2021
58:24
happening in a world without social media. I
58:28
don't know. I
58:33
think of the level
58:35
of QAnon-influenced conspiracy theorists
58:37
who were there and who carried
58:39
all this rhetoric forward.
58:41
Yes, of course, President Trump
58:43
and his minions spread this
58:46
through old-model media, whether it was
58:48
Fox News or other places, but the
58:50
speed with which it took off, and the
58:53
momentum and number of voices that were added to
58:55
it through social media. I mean,
58:58
that's exactly why he got banned
59:00
from Facebook and Twitter, right? He
59:02
leveraged those platforms better
59:04
than anyone to spread this and within
59:06
the matter of days, had
59:08
thousands of people at the Capitol
59:10
ready to murder people. So
59:13
that's the difference. I
59:15
agree with you on speed. I do not
59:17
agree with you on existence of the attack.
59:19
Obviously, we had a rebellion in this country
59:22
prior to social media. It was called the
59:24
Civil War. Many. We
59:27
had Shays' Rebellion. We had Whiskey Rebellion. We
59:29
had an effort to do a January 6
59:31
style blockage of Lincoln ascending
59:33
to the presidency in 1860 and 1861. It
59:38
just was overshadowed in its prominence
59:40
by the cannonade against Fort Sumter
59:42
months later. But we've
59:44
been through this. Heck,
59:46
when you talk about conspiracy theories, Skye, the
59:50
Texas Secession Declaration in 1861 contains
59:53
a ludicrous conspiracy theory about
59:55
anti-slavery Northerners poisoning people in
59:57
the South. And,
1:00:00
yeah, it's just all over
1:00:02
our country. Yeah, it's an
1:00:04
argument from silence, I suppose, but how
1:00:06
much worse would those things have been
1:00:08
had they had a tool like social
1:00:11
media to spread and
1:00:13
amplify their conspiracy? And
1:00:15
it's a little bit like this is a completely
1:00:18
different can of worms, but there's been a number
1:00:20
of reports talking about the war in Ukraine and
1:00:23
how dependent the Ukrainians are on
1:00:25
Starlink for their military resistance against
1:00:28
Russia, right? And Starlink is the
1:00:31
network of satellites that are run by
1:00:33
Elon Musk. And he could single-handedly just turn
1:00:35
off those satellites and completely cripple the
1:00:37
Ukrainian army. And a lot of people are
1:00:40
saying, this is an unsafe
1:00:42
amount of power to be wielded by
1:00:44
one man whose loyalties
1:00:46
could shift, right? And
1:00:48
we recognize the danger of a
1:00:51
non-state actor having that kind
1:00:53
of control over state security.
1:00:56
And similarly, we
1:00:59
are relying on the benevolence or the
1:01:02
relative patriotism of a Mark Zuckerberg
1:01:04
and an Elon Musk and a
1:01:06
handful of other individuals that run
1:01:08
these large companies that if
1:01:11
that trust deteriorates, if their loyalty is
1:01:13
gone, they would be just
1:01:15
as dangerous as the Communist Party of
1:01:17
China running TikTok. But
1:01:20
we have no way of regulating that because
1:01:22
they're seen as private entities with free speech
1:01:24
over hundreds of millions of people and what
1:01:27
they see through their algorithms. That just kind
1:01:29
of weirds me out. And
1:01:31
I get it. I mean, I don't know the solution here. I'm
1:01:34
just saying I think that's messed up.
1:01:36
I mean, I get it. The
1:01:38
first thing you have to understand about a free speech
1:01:41
society is it's not utopian. Oh,
1:01:43
absolutely. Yeah. I mean, we have to
1:01:45
be willing to defend ideas and speech that we hate. I
1:01:47
agree. And also, we're going to be
1:01:49
in a situation where very powerful people we don't
1:01:52
like have the exact same free speech
1:01:54
rights as you and I do and utilize
1:01:57
their power and their platform. In
1:02:00
many ways, you know, the interesting thing
1:02:02
to me is I look at the present tech
1:02:04
environment and I still think it's better from
1:02:07
that standpoint than it was 30 to 40 years ago when
1:02:11
it was the CEOs of ABC, NBC,
1:02:13
CBS who had giant
1:02:16
amounts of control over
1:02:19
what we can see, just immense
1:02:21
amounts of control. And in many
1:02:23
ways, compared to that cabal of a
1:02:25
few, the cabal that has control now
1:02:28
is a larger number, and
1:02:30
the access to the marketplace, even to
1:02:32
get around the cabal, is greater;
1:02:34
the marketplace of ideas is a lot
1:02:36
more open and there are a lot more ways to
1:02:38
go around the cabal than there used to be.
1:02:41
And so it's funny because again,
1:02:43
I just think a lot of this is
1:02:46
a matter of historical perspective. If I am
1:02:48
a dissenter from the present political moment or
1:02:50
I'm a dissenter from the even
1:02:52
the big tech zeitgeist, I
1:02:54
have lots of ways to get my message out,
1:02:56
lots of ways. But here's the
1:02:58
thing, Skye, here's the thing, Skye,
1:03:00
that is distorting the debate a little bit. And
1:03:03
I wrote about this. One
1:03:05
of the reasons why the right is so
1:03:07
focused on big tech is
1:03:10
because it's one of the last places
1:03:12
people are actually reading them. So if
1:03:15
you look at right wing websites around
1:03:17
the United States, they have by and
1:03:19
large collapsed in readership, collapsed.
1:03:23
All news sites have been down to some
1:03:25
extent. It's a lot of
1:03:27
exhaustion, a lot of fatigue, but a
1:03:29
lot of these right wing sites are in a total
1:03:31
state of audience freefall. So
1:03:34
they can no longer attract readers to
1:03:36
the platforms they built. So
1:03:39
the last way they can access the public is
1:03:41
through the platforms they did not build, Twitter,
1:03:44
Facebook, etc. And
1:03:46
so it's the market failure
1:03:49
of their own products that
1:03:52
is making them more focused on making
1:03:54
sure that they have an avenue through
1:03:57
these other products that they did
1:04:00
not create. And that
1:04:02
is distorting the debate. That's one of the
1:04:04
reasons why the right is so focused on
1:04:06
social media because any given
1:04:09
say personality at the Daily Wire
1:04:11
or at you name it, some
1:04:15
of these other sites, Gateway
1:04:18
Pundit, that
1:04:20
their actual audience is in a state of collapse
1:04:22
of people who go to their site to read
1:04:25
it. So how do they reach
1:04:27
eyeballs through these other sites? And it's
1:04:29
really fascinating and that does make a difference.
1:04:31
I mean, I know in my own work,
1:04:34
it's exactly the reverse. 95 to
1:04:36
98 to 99% of my readers do
1:04:41
not come through social media. Hmm,
1:04:43
they don't. The average piece that
1:04:45
I write has about a 2% link
1:04:48
click-through rate on social media. So I
1:04:50
mean, 2% of the readers come through
1:04:52
social media. 98% do not.
1:04:55
So for me, social
1:04:57
media is a side show. My
1:04:59
main show are the columns that
1:05:02
I write or the podcasts that I produce. It
1:05:04
is not the social media posts. But if you're
1:05:07
in a position where far
1:05:09
more people read you on social media than
1:05:11
ever read your work on your own site,
1:05:13
then all of a sudden social media becomes
1:05:16
that much more important to you and then
1:05:18
you're going to really agitate to maintain your
1:05:20
place in it because that's your last link
1:05:22
to the public. Okay, so as
1:05:24
we wrap up, when should we
1:05:26
expect rulings from the Supreme Court on these cases? And
1:05:29
like you said, you think they're both going to get
1:05:31
shot down. What are
1:05:33
you looking for in those rulings that might – sometimes
1:05:36
they're not just a clear up or down. They're
1:05:38
nuanced in their rulings. Is there anything you're looking for
1:05:40
in particular that we should pay attention to? So
1:05:43
I think with NetChoice, that's the case
1:05:45
with Florida and Texas, you're likely to
1:05:47
see the Supreme Court just basically reaffirm
1:05:49
existing precedent, which is, look,
1:05:52
you just can't make a private company host speech it
1:05:55
doesn't want to host. See 303
1:05:57
Creative, see
1:05:59
Miami Herald, which is a case about, you
1:06:02
know, publishing competing op-eds and things like that.
1:06:04
So there's a long line of cases that
1:06:06
say, you know, you don't have to publish
1:06:09
work you don't agree with. Also
1:06:13
with the Missouri
1:06:15
case, you cannot consider the Missouri
1:06:17
case without the New York National
1:06:19
Rifle Association case, which
1:06:22
is very similar in that it is
1:06:24
a case where the NRA
1:06:26
is claiming that state officials were
1:06:29
trying to convince people not to do
1:06:31
business with the NRA to punish
1:06:33
the NRA for its viewpoint. And
1:06:36
I think what you're going to see between the Missouri
1:06:38
case and the NRA
1:06:40
case, the Supreme Court is going to say
1:06:42
in the Missouri situation, that's what convincing looks
1:06:44
like. In other words, that
1:06:46
is maybe not all of
1:06:48
these elements, but the vast majority of what people
1:06:51
are objecting to in the Missouri case are
1:06:53
just government officials trying to make
1:06:56
a case. But if
1:06:58
you go to the NRA case, that's what coercion looks
1:07:00
like because there was
1:07:02
a regulator who issued guidance
1:07:04
letters urging companies not
1:07:06
to do business with the NRA. That's
1:07:09
going too far. So I
1:07:11
think they're going to set up these two cases
1:07:13
to say if you want to know the difference
1:07:16
between convincing and coercing, the
1:07:18
convincing one is this big tech one, the
1:07:20
coercing one is the NRA case. That
1:07:23
take may not age well, Skye, because we'll have
1:07:25
to see, but by the end
1:07:28
of June or beginning of July, we'll get all this.
1:07:30
We'll know what that answer is. Well, maybe we'll revisit
1:07:32
these if we need to when the ruling comes out.
1:07:34
But David, thank you for your expertise on this and
1:07:36
your analysis. We just
1:07:38
live in this interesting time. Social media really
1:07:40
emerged in the late aughts. And
1:07:43
we're now 15 plus years into
1:07:45
this experiment. We're seeing early on
1:07:47
there was this utopian expectation
1:07:49
that social media was going to democratize the
1:07:51
world. And it was leading the revolutions in
1:07:53
Middle Eastern countries with autocracies and on and on.
1:07:57
And since 2015-16, we've kind of seen the dark side of it.
1:08:00
with the way it can be abused.
1:08:02
There's a whole other article that we didn't get
1:08:04
to by Jonathan Haidt in the Atlantic called End
1:08:07
the Phone-Based Childhood Now, looking at all
1:08:09
the skyrocketing rates of mental health
1:08:12
problems and the horrors that
1:08:14
social media is causing in other realms. We're just beginning
1:08:16
to come to terms, I think, as a society
1:08:18
with the good and bad implications of the tech. And
1:08:21
the law has to catch up with some of it. But
1:08:24
how do we do that without losing some core
1:08:27
American values around free speech and
1:08:30
independence and limited government
1:08:33
without completely surrendering
1:08:35
to the malformation these platforms cause in individual health or social cohesion?
1:08:37
for individual health or social cohesion?
1:08:40
So there's a lot to discuss, and I'm sure
1:08:42
it's not over. Yeah. But
1:08:45
gosh, there's a lot here. We
1:08:47
could do a whole podcast easily on
1:08:49
social media and kids and smartphones.
1:08:51
To what extent
1:08:53
does the state have the authority to just
1:08:56
flat out ban smartphones? To
1:08:58
what extent, I mean, or ban social media
1:09:00
for children? To what extent does the state
1:09:02
have the ability to ban smartphones in school?
1:09:05
Lots. Yeah. Lots.
1:09:08
Should it? What about access
1:09:11
to adult content on private
1:09:15
platforms? How much can the government make
1:09:17
sure that that's age restricted? And
1:09:20
that's a whole different discussion. And
1:09:24
yeah, it's all fascinating. Yeah.
1:09:27
Well, anyway, thank you for the legal analysis
1:09:29
on all this and some of the political
1:09:31
analysis. And look forward
1:09:33
to talking to you again soon and being
1:09:35
together in May for the Holy Post Live
1:09:37
event. So again, all of you who
1:09:39
want to be a part of that, go to holypost.com/events
1:09:41
and check it out. David, until next time, have
1:09:44
a good one. Yeah. Thanks,
1:09:46
Skye. This was fun.
1:09:48
French Friday is a production of
1:09:50
Holy Post Media featuring David French
1:09:52
and me, Skye Jethani, music
1:09:55
and theme song by Phil Vischer. This
1:09:57
show is made possible by Holy Post patrons.
1:10:00
To find out how you can become
1:10:02
a Holy Post patron and to find
1:10:04
more common good Christian content, go to
1:10:06
holypost.com.