Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
We hold these truths to be self evident. That all
0:02
men are created equal.
0:03
As a member of Congress, I get
0:05
to have a lot of really interesting people in the office.
0:07
Experts on what they're talking about. This is
0:09
the podcast for insights into the issues.
0:11
China, bio terrorism, Medicare for
0:13
all. In-depth discussions, breaking
0:16
it down into simple terms.
0:18
We hold these truths. We hold these
0:20
truths. With Dan Crenshaw. We hold our lives.
0:23
Americans
0:24
are watching China go through what seems like
0:26
an unprecedented level of protest in
0:28
China as Chinese officials are using every
0:30
tool that they have to both spy on their citizens
0:33
and block the free flow of information. At
0:35
the same time, they are exporting the ability to
0:37
track and collect data on American citizens
0:40
through apps like TikTok and companies
0:42
like Huawei and ZTE. The Federal
0:44
Communications Commission is on the front lines
0:46
working to protect American citizens from these
0:49
threats, and Commissioner Carr joins
0:51
us again to discuss all
0:53
of this. Brendan, thanks for being back
0:55
on. So last
0:57
time I saw you, we were climbing this big cell
0:59
tower in my district.
1:01
That's like a thing you do. Yeah. So good to
1:03
see you again. Great to be down on the ground
1:05
with you. Whenever I can get out of DC -- I try
1:07
to spend as little time in DC as possible in this job.
1:10
I like to get up in the air with tower crews.
1:12
They're doing the tough work to keep us all connected, particularly
1:14
in the pandemic. Most people just assume their
1:16
phone's gonna work, their TV's gonna turn on,
1:18
they're just gonna fire up, but it's these men and women
1:20
climbing these towers. So it's great to get up two or
1:22
three hundred feet with you. That was pretty impressive.
1:24
I know -- Yeah. -- people may not be able to tell from a podcast,
1:26
but I have a bureaucratic physique. So I'm
1:28
I'm scrawny. I'm bald. And for me
1:30
to get a chance to climb a tower with
1:32
a former Navy SEAL was something. So I had to train
1:35
for it, did some running. Actually, doing
1:37
a pull up I pulled a muscle in my shoulder --
1:39
I did the pull ups to make sure that
1:41
I could keep up with you on that tower -- but had a lot
1:43
of fun. It's not easy on the
1:45
legs, right, to get up there. Yeah.
1:47
That was fun. I've never climbed a tower before.
1:49
It was cool. Quite the view
1:51
from up there. Yep. So real
1:53
quick, really briefly, before we get into TikTok
1:56
and the
1:58
intricacies on that and a few other topics.
1:59
Again, let's remind
2:02
the audience: what is the FCC? What do you
2:04
guys do exactly? Obviously, you oversee
2:06
things like cell towers. Otherwise, you wouldn't be climbing
2:08
them all the time. So give us a quick synopsis
2:10
for the American people. What is this organization? Yes.
2:12
The FCC is about thirteen or fourteen hundred
2:14
people. It's headed by five commissioners.
2:16
I'm one of them. Three are of the president's
2:19
party, two are in the minority. So I'm working
2:21
my way slowly into the minority of the
2:23
FCC. And the way to think about it is sort of a
2:25
mini Congress for tech and telecom. We
2:27
vote on things that can enable the four
2:29
g to five g transition. We're helping
2:31
to authorize this new generation of low earth orbit
2:33
satellites. That's the Elon Musk one, if you wanna ask about it.
2:35
We do that as well. Then we administer this program
2:38
called the federal Universal Service Fund, which is a nine
2:40
billion dollar a year fund to try to support
2:42
rural Internet builds, where there's no private sector
2:44
business case. And any company that has a license,
2:46
so AT&T or T-Mobile, that wants to do a merger
2:49
transaction, they have to come before the FCC.
2:51
And then we do some stuff with section two thirty, which
2:53
is very famous. It's actually section two thirty of
2:55
the Communications Act. Right. So I think we have a role
2:57
there to play. Right. Was
2:59
that a topic of last time we talked, I think? It was
3:01
a big focus last time. Yeah. So it's in the Communications
3:03
Act, and I think we should step in there and
3:05
put some pro speech guardrails in place. We've
3:07
yet to do that, but that's one area where I think we could
3:09
be more active. Yeah. Well, we have a
3:11
ton of legislation. There's like fifty bills
3:13
in E and C just kinda ready to go, not all
3:15
of which I even think are great, but it's a
3:17
good start in my mind. Mine in particular simply
3:19
adds a little
3:21
section to two thirty that protects
3:24
political speech, basically, uses the
3:26
same language as
3:28
you would use in a nondiscrimination law.
3:31
That's so important. Texas as a state
3:33
passed a similar law similar to
3:35
what your bill would do as well. Non discrimination against
3:37
political viewpoints. It was appealed. It was
3:39
recently upheld on appeal by the fifth circuit.
3:41
Mhmm. In that case, I assume it's potentially
3:44
headed to the supreme court. So I think we're gonna see a lot
3:46
of activity here, and support in Congress,
3:48
on this concept of nondiscrimination for
3:50
participating in what's effectively a digital
3:52
-- Right. -- town square. Right. Right. And
3:56
Wow.
3:56
Well well, now I kinda wanna get into the
3:58
Elon Musk thing because we're already
3:59
moving in that direction. But just for the
4:02
audience, we are gonna talk about banning TikTok,
4:04
okay, like, and the ramifications of
4:06
that. But since we're already on this free speech path,
4:08
let's stay there. What
4:10
do you think? I mean, Musk
4:13
I'm not sure what Elon Musk thinks about it because he was trying
4:15
to get out of that deal. I
4:17
would have -- I never would have urged him to buy
4:19
Twitter, I think. I don't think Twitter is
4:21
that successful. And I have
4:23
the same questions he has always had. Right? Like, how
4:25
many bots are on this? How many real people
4:27
are really on Twitter? Is this really worth
4:29
it? But maybe he'll make it worth it.
4:31
I don't know. He's obviously moving
4:33
to more of a subscription model
4:36
that seems smart if you're trying to make money,
4:38
but is that gonna make people use it
4:40
less? I mean, how many people use it anyway?
4:42
What's your first reaction
4:44
to the whole thing? We've been
4:46
watching Elon just
4:48
go, like, head first into this. Yeah. The
4:50
most interesting thing to me with Elon
4:52
Musk's purchase of Twitter is it's really
4:54
further exposed a free speech divide
4:57
in this country. You know, it wasn't very
4:59
long ago that free speech
5:01
and diversity of views was a very
5:03
progressive viewpoint. Exactly.
5:05
If you go back, the modern
5:08
day op ed launched on the pages of the
5:10
New York Times in nineteen seventy because
5:12
then editor John Oakes
5:14
said that diversity of opinion is the
5:16
lifeblood of democracy. The moment we insist that
5:18
everyone think the same way we do, our
5:20
democratic way of life is in jeopardy. He wanted
5:22
views in the New York Times that were different
5:24
than what you would hear, divergent from his own
5:26
views and from the New York Times editors. And now,
5:28
you know, flash forward fifty years, the New York Times is
5:30
actually at the front end of enforcing orthodoxy,
5:32
and they don't want opinion in
5:34
their pages that's different from them. So I think
5:36
that's really been exposed by Elon
5:38
Musk. You have a lot of people that want more censorship,
5:41
more control. And this entire idea
5:43
that somehow free speech is a
5:45
threat to democracy is just, you know,
5:47
antithetical. The White House just this
5:49
week saying that they were keeping an eye on Twitter --
5:51
Yeah. -- keeping an eye on Twitter. Yeah. Not
5:53
really clear on what, exactly. Like, what is so
5:55
dangerous about Musk? This guy has created
5:57
SpaceX, and you guys loved all
5:59
of this shit. I
5:59
mean, it's kinda baffling.
6:02
But it's also interesting to watch
6:04
how Elon handles this. And
6:07
he's like a more controlled Trump. Right? He
6:10
says everything he thinks
6:12
on Twitter. And it's
6:14
fascinating. You can really get a feel for his
6:16
brain and how he's moving through
6:18
this. And he's trying to figure it
6:20
out. Like, he's so transparent about
6:22
how he's trying to figure out the next steps.
6:24
He's asking questions. Like, Should we do this?
6:26
Should we do that? And it
6:28
it does bring up something that I've
6:30
always noticed when we talk
6:32
about censorship which is there has to be
6:34
some kind of management of content on a
6:36
social media platform. Otherwise, it becomes
6:38
unusable. And Elon Musk gets
6:40
it. I've seen him deal with this. Right? Like,
6:42
he's not gonna ban
6:44
speech, but he still will ban
6:46
speech that, as he
6:48
said, is, like, negative or -- I
6:50
can't remember the other word he used, but negative speech --
6:53
and I said, well, we're right back to
6:55
square one here now, because how do you
6:57
define that? And now look, I think
6:59
the way he would define it is probably
7:01
better for conservatives, because I
7:03
don't think he would define it as political speech. I think he
7:05
would define it -- I think he wants it to be a nice
7:07
place. I think if I'm understanding how
7:09
his mind is working, he just wants it
7:11
to be a place where, like, you actually just
7:13
get information and you talk about
7:15
your kids and,
7:16
you know, there's, like, news sites
7:19
there and counter news to
7:21
that news. I mean, but just not a place where
7:23
you're screaming at each other and trying to destroy each
7:25
other. I think that's his utopia
7:27
as far as I can tell, I don't know how
7:29
you get there because people are assholes.
7:31
Right. This is my podcast, so I can say whatever I
7:33
want. You can swear too if you want. No, sir.
7:35
There's no censoring
7:38
profanity on the podcast. So, yeah, let's
7:40
let it rip. But you guys censor
7:42
on cable news. Podcasts? Yeah. Right.
7:44
Right. You know what? That's fine. I'm
7:46
okay with that. You shouldn't be saying whatever you
7:48
want. Kids watch that. Well, I think the most
7:50
important thing is basically what you're getting
7:52
at is a first amendment standard, which is how do
7:54
we protect core political speech from
7:56
discrimination, from censorship at the end
7:58
of the day. If we can't
7:59
talk about every
8:01
single potential perspective on
8:03
a main public problem,
8:06
we are weaker as a country at solving it. Why
8:08
can't we discuss the potential origins of
8:10
COVID-nineteen? Why can't we discuss the efficacy
8:13
of masks? You know, or ivermectin.
8:15
Right? Do vaccines help keep you from dying or going to the hospital,
8:17
or do they also stop transmission? We need to be
8:19
able to discuss all of that, and we couldn't because
8:21
you had, you know, the centralization of control on
8:23
social media. And I think the path forward to your
8:25
point has to be something like the first amendment
8:27
standard, which is: we have defined
8:30
core political speech. We know what it is. We deal
8:32
with it in all sorts of contexts. It's that
8:34
type of speech that should be protected from censorship.
8:36
In my view, I think the digital
8:38
town square and promoting speech is so
8:40
important that we shouldn't simply rely on the
8:42
hopefully benevolent censorship or
8:45
content moderation even of Elon Musk. I
8:47
think that's why your legislation and others
8:49
is so important, to step in and say, look, if
8:51
you're expressing a political view, I don't care if it's
8:53
Elon Musk or someone else owning a platform, they
8:55
shouldn't be able to come in and
8:57
shut that down. But that's really the goal. That allows
8:59
you to censor, you know, incitement of
9:01
violence, things that aren't protected by the first amendment,
9:03
and child porn, and you don't have to have that stuff
9:05
on these websites, but you can still protect core
9:07
political speech. Right? It will
9:09
forever be a thorny issue that you have
9:11
to attack with a scalpel, not
9:13
a hammer. And I do think a lot of our
9:15
legislation gets at that point because you don't wanna
9:17
destroy the Internet either. And
9:19
we just have to decide as a country, once
9:21
these platforms are big enough,
9:23
are they
9:23
the town square? And I think
9:25
our general view on the conservative side is
9:27
yes. Yes. They're somewhat like a utility.
9:29
They're somewhat like a town square. They've
9:31
become so big that they can't be seen as
9:33
just another mom and pop shop trying
9:35
to, you know, it's not the cake baker. Right. It's
9:38
not the cake baker. Just baking whatever cake
9:40
they want. Okay. And it's
9:42
just different and it's unprecedented. And
9:44
our founders, they could foresee a lot
9:46
of things. They couldn't foresee this. I mean, I
9:48
think
9:48
so too. You're exactly right. You know, one analogy
9:51
that I'd look to, which is a rough analogy, but is
9:53
cable. In the nineteen nineties, Congress decided
9:55
that being on cable was
9:57
vital to the economic well-being
10:00
of broadcast TV. And so
10:02
Congress said, look, you local cable
10:04
company, cable
10:06
operator, you don't have carte blanche to
10:08
decide which broadcast TV signals to put
10:10
on your system and which not, because it's too important
10:12
to be there. So Congress stepped in with some guardrails
10:14
and regulated that. It went to the supreme court -- Yeah. -- and
10:16
was upheld. And I think that's sort of the path forward, which
10:18
is, whatever the merit of being on cable
10:20
in the nineties, it's that much more important to be on
10:22
social media -- Mhmm. -- today, and they shouldn't
10:24
have carte blanche to pick
10:26
and choose which speech to carry or not.
10:28
Child porn, fine. Incitement of violence, it's fine. But if it's
10:30
expressing -- You mean, you mean, not fine. Yeah. Yeah.
10:32
Fine to remove. Just to
10:34
clarify your position on that.
10:36
Right. It's
10:40
in the news. It's in the conservative news a lot
10:42
lately. Yeah. Yeah. I know we have a right
10:44
to stop those things.
10:46
Okay. But so we like
10:48
free speech. Right? We want these
10:50
platforms to have free speech. But TikTok? Screw
10:52
TikTok. Okay. So let's talk TikTok.
10:54
Why should we ban TikTok? And I
10:57
I think people are generally following this
10:59
issue. They know that the the reason, like Trump
11:01
wanted to ban it, the reason we're also
11:03
skeptical of TikTok is because
11:05
It's a backdoor for China. And there's
11:09
a few roads I wanna go down. Well, let's just
11:11
start with your letter on this. This is the
11:13
first -- this is your strongest opposition
11:15
to TikTok that you specifically
11:17
have stated. So run us through your arguments
11:20
about why we're at the point now where it
11:22
it you can't even negotiate with TikTok
11:24
anymore. Because some background is needed -- well,
11:26
actually, you tell us the background, because you're
11:28
probably -- you'll explain it better than I can -- about
11:30
the negotiations with making it a
11:32
US parent company, where that's gone, why
11:34
you think that's no longer a viable option, why
11:36
we should just get rid of it and make
11:38
poor teenagers, I don't
11:40
know, live in despair in
11:42
just a dark, dark world where they might have to
11:44
actually talk to each other and climb trees and shit
11:46
and do whatever else as
11:48
teenagers. Well, there's no question. TikTok is
11:50
immensely popular in
11:52
America with young people in this country. A lot of
11:54
people look at it. They say, well, it's just a fun platform
11:57
for sharing videos and dance memes. And
11:59
the reality is that's just the sheep's clothing. Underneath,
12:01
it operates as a very sophisticated
12:04
surveillance technology. It's not the video that you're uploading
12:06
that matters. What happens is once you have TikTok,
12:08
they reserve the right through their terms of
12:10
service to collect everything from search and
12:12
browsing history to keystroke
12:14
patterns. They say they reserve the rights to get
12:16
biometrics, including faceprints and
12:18
voiceprints. And for years, TikTok was asked,
12:20
is this secure? Is it being accessed by
12:22
Beijing? And for years, TikTok
12:24
officials just gaslighted all
12:26
of us in Washington, D. C. and said,
12:28
don't worry. This data is not
12:30
stored inside China. And over
12:32
this past summer is really when things took a
12:34
turn against TikTok, where there was internal
12:37
TikTok communications that revealed everything
12:39
is seen in China. So all this data
12:41
that's being collected is being
12:43
accessed by persons in China. Because they're saying, like, for
12:45
instance, they're saying the facial recognition
12:47
we have filters. Of course, we need facial recognition because
12:49
we need to put ears on your head when you
12:51
want the ears on your head because then you want to
12:53
send it to your friends. But you're saying
12:56
that facial recognition data now
12:58
shows up in China. So if
13:00
you ever went to China, they'd be like, that's that
13:02
person. Yeah. Once the data goes back to Beijing,
13:05
there's all sorts of nefarious purposes it
13:07
can be put to: blackmail, espionage,
13:09
all that. And TikTok was
13:11
asked here in Congress recently, you know, is
13:13
this data being sent to
13:15
the Chinese government? They said, no. It's fine.
13:17
They were asked, is it being sent to the
13:20
CCP? They said, no. It's fine. Then there's
13:22
a third question which is, is it being sent
13:24
back to people that are themselves members of the
13:26
CCP? And TikTok said that they would prefer not
13:28
to answer that. Because it's being sent back to
13:30
the -- What's the company? ByteDance? -- ByteDance.
13:33
And, you know, as we understand it,
13:36
basically, all companies in China
13:38
-- I'm not saying all --
13:38
are
13:40
in some way affiliated with the CCP
13:43
because if they weren't, they wouldn't be
13:45
companies. They would be out of
13:47
business. So is that accurate? Yeah. They --
13:49
almost all of them have these CCP committees.
13:51
They are embedded with CCP members.
13:53
Even US companies that have offices
13:55
in Beijing. It's the same thing there. And at any moment,
13:57
the CCP could be, like, give us your data.
14:00
Right. Oh, no, we don't have to give you -- oh,
14:02
no. This is China. You do have to give us your data.
14:04
I mean, so -- And there's lots of other
14:06
threats as well. So China has said that they wanna
14:08
dominate the world in artificial intelligence
14:10
or AI, by twenty thirty, and they're gonna use it
14:12
for malign authoritarian purposes. They're
14:14
already using AI to track and monitor and
14:16
surveil their own people, and they wanna use that
14:18
technology abroad. So every time that you swipe
14:20
on TikTok, every time you enter a search term
14:22
on TikTok, that is going back to
14:24
Beijing. It's feeding. It's training. It's improving
14:27
their AI, which, even
14:29
outside of TikTok, is gonna be used down the
14:31
road for nefarious purposes. That's an interesting
14:33
argument that I hadn't heard before because on the
14:35
one hand, a bunch of sixteen year olds
14:37
are like, I
14:37
don't care if they know who I am. Like,
14:39
I I don't care. Like, I have a Facebook
14:42
profile. I I everybody knows who I am on social
14:44
media. I don't care. Actually, that's the whole
14:46
point of my TikTok following, so
14:48
people know who I am. And
14:50
so it's hard to make that argument. We're making a
14:52
different argument, which I find
14:54
more compelling. Because even --
14:56
because even if, like, if I was a normal citizen
14:58
too, like, I wouldn't care. I'd be like,
15:00
okay. Tell me what's happening. Okay. China knows
15:02
who I am. So what?
15:04
You know what? I have nothing to hide. Like, I
15:06
I don't care. I I
15:08
do have things to hide now because I'm kinda anti
15:11
China, and I'm a lawmaker. So I
15:13
do care. But a normal citizen probably doesn't care.
15:15
But you're saying -- wait, wait. The argument you're making
15:17
is interesting and worth touching
15:19
upon a little bit. It's
15:21
helping them, like that
15:23
movie recently. I think it was the
15:25
end of Westworld. Right? It was --
15:27
it was Westworld. It was like the final season of
15:29
Westworld, which got really weird. I really liked Westworld, and
15:31
then it was like, I don't know. It's just
15:33
some weird futuristic dystopia. But it
15:35
was it was but the entire point of it
15:37
was they were building AI. They were
15:39
building the most I don't know
15:41
if you've watched all the seasons. Okay. So are you --
15:43
do you know what Westworld is? A little bit. A little
15:45
bit. Okay. So Westworld, it's a
15:47
cool concept at first. Basically, they build this
15:49
amusement park where
15:52
it's the Wild West, but they're all robots,
15:54
but they're super smart robots, and they're
15:56
and you can go there, and you can have a shootout. You can do
15:58
whatever the hell you want because they're robots; it doesn't matter.
16:00
But it gets darker
16:02
and darker as the seasons go on. And
16:04
then the robots, like, they become self aware, they try
16:06
to escape. And what you realize
16:08
is that it's been this, like, entire
16:10
experiment to create the best
16:12
AI possible, where you just
16:14
know everything about everyone. You can
16:16
predict their movements. You predict what they're
16:18
gonna be, what they're gonna do, everything. That's
16:21
kinda what's happening here. This is
16:23
Westworld. Oh, yes. So so
16:25
TikTok. TikTok is Westworld, and then
16:27
eventually, the robots themselves, like,
16:29
kinda destroy the brain. And
16:32
removing TikTok from America is
16:34
kinda like destroying the brain
16:36
of Westworld.
16:37
Right. And so there's three or four -- I'm
16:40
so proud of myself for that analogy. By the way, I
16:42
just came up with it right now. I'm gonna use it. Yeah. I'm
16:44
really proud of myself. There's sort of three or four distinct
16:46
threats. I mean, the one we talked about was, you know, once data
16:48
goes back, espionage; the second we
16:50
talked about is, you know, training, feeding, and
16:52
improving their AI. And there's a number
16:54
of other ones as well. There was a recent
16:56
report that had ByteDance
16:58
officials in Beijing using
17:00
TikTok to surveil the
17:02
specific locations of --
17:04
well, the location of specific Americans.
17:07
Who those Americans were -- Mhmm. -- was not
17:09
disclosed. So -- that's right. They
17:11
say it's employees or something that they're worried
17:13
about them going to strip clubs or something, but --
17:15
Right. -- the group that was doing the tracking
17:17
usually does that. This report was that these
17:19
were not TikTok employees that they
17:22
were surveilling. So that's worse. But the
17:24
other area as well
17:26
is their ability to use it
17:28
for foreign influence campaigns, including the content that they're
17:30
sending to our children. Yeah. So obviously,
17:33
in China, TikTok doesn't exist, but there's
17:35
a sister app. That's what's crazy. That's what I wanted to
17:37
bring up. It doesn't exist. Right. It doesn't
17:39
exist. There's a sister app called Douyin, which
17:41
operates very similarly, except what
17:43
it shows kids inside China
17:45
is museum exhibits, science experiments --
17:47
Right. -- and learning material. Yeah.
17:49
Learning materials. Here in the
17:51
US, TikTok is showing kids things like
17:54
the blackout challenge, which kids are
17:56
doing, including underage kids that are on
17:58
TikTok. And it's terrible, but they
18:01
are suffocating themselves and in many cases
18:03
killing themselves. There's been lawsuits against
18:05
TikTok that have said TikTok intentionally
18:07
served the blackout challenge
18:09
to US kids, including a girl that was ten years
18:11
old who killed herself because
18:13
of this blackout
18:15
challenge. I might have that wrong, but even the courts said
18:17
even if those allegations are true, TikTok
18:19
wouldn't be liable in court
18:21
because of section two thirty -- Right. -- and for
18:23
other reasons. But to me when you put all these four
18:25
or five things together, including the
18:28
content they're serving our kids and the distinction between
18:30
what they're serving their own kids, this
18:32
is why I've said this is China's digital
18:35
fentanyl that they're feeding us. Good way
18:37
of putting it. It
18:37
is digital fentanyl. Because they also do the real
18:40
fentanyl. They're also the
18:42
real assholes. So yeah. Yeah.
18:44
And that's worth bringing up again. I mean,
18:46
the fact that they don't
18:48
have the same version in their own country. And look,
18:50
that's because they wanna censor their country and
18:52
all that. There's
18:55
deeper underlying principles behind
18:57
that, but they obviously see
18:59
harm in this sort of, like,
19:01
unfettered, young --
19:03
just this
19:05
ecosystem where the
19:07
most toxic things possible get posted.
19:09
And, you know, it's not a
19:11
mature environment.
19:12
There's addiction problems that
19:14
we have with our kids. It causes other
19:16
kinds of social or psychological problems that
19:19
we're just beginning to understand
19:21
because social media is still relatively new to the
19:23
human brain. Yeah. I'm still
19:25
I think the jury's still out on
19:27
whether social media in general is a net benefit
19:29
for humanity. Maybe
19:31
it's a small net benefit, but there there's some real problems
19:34
that our brains can't figure out. Yeah. And so this comes
19:36
down to, what do we do about it? And where things
19:38
stand right now is the Biden administration's
19:40
treasury department has been
19:42
reviewing TikTok for a long
19:44
time at this point. And there was a New York
19:46
Times article a couple weeks ago
19:48
that reported that the treasury department, through a
19:50
process called CFIUS, which is the Committee on Foreign
19:52
Investment -- Mhmm. -- has a preliminary
19:54
deal in place to allow TikTok
19:56
to continue to operate. But that
19:58
same report said that the number two at
20:00
DOJ, Lisa Monaco, was concerned that that deal
20:02
isn't tough enough. Mhmm. And subsequent to that report,
20:04
you also had FBI director Chris Wray testify
20:07
here in Congress and say that the
20:09
FBI has serious concerns with
20:11
TikTok. So my guess is that the Biden side
20:13
is really turning. I think at the time,
20:15
everybody was mad at Trump. Trump was just
20:17
being, I don't know, whatever, because everything
20:19
Trump said, the left was against -- Right. -- you
20:21
know, so Well, exactly. You've got a Democrat
20:23
senator, Mark Warner, chair of the Senate intel
20:25
committee, who, you know, gets very
20:27
serious briefings almost every
20:29
single day. And he has said that TikTok, in his words,
20:31
scares the dickens out of him. So this is
20:33
bipartisan. And I think the Biden administration is
20:35
really struggling with what to do with TikTok.
20:37
Given Chris Wray's statement from the FBI,
20:39
given Mark Warner's statement, apparently the
20:41
number two at DOJ, Lisa Monaco, I think
20:43
it's been very difficult for them to justify allowing TikTok to
20:45
continue to operate, but yet that's where things are right
20:47
now: there's a deal in place that would allow them to
20:49
do it. The crux of that deal is
20:51
something called Project Texas, which would rely on
20:54
Oracle servers and moving a lot of
20:56
TikTok's operations there. The reason why I'm still
20:58
concerned about that is a couple reasons. One,
21:00
There was a recent report that again got leaked
21:02
internal communications from TikTok that
21:04
showed a TikTok official in LA
21:06
asking another TikTok official outside of
21:08
business hours, off location, detailed
21:11
questions about the location and other information
21:13
about those Oracle servers that would
21:15
host the TikTok content, which to
21:17
me is crazy. That they are surveilling. Right.
21:19
They wanna know how to get into the Oracle
21:22
servers if that should happen. Right. So it hasn't happened yet --
21:24
we're -- So they're negotiating to potentially
21:26
do this, to house TikTok on these Oracle servers. If
21:28
TikTok is in fact casing
21:31
or surveilling, then we should say time out,
21:33
game over, we're not going down that road. And
21:35
then finally, another leaked internal communication
21:37
said that on Project
21:39
Texas, even if we were to
21:41
complete that process, an internal TikTok
21:43
DC official said at the end of the day it remains to be seen if product and
21:45
engineering, meaning in Beijing, could still get access to
21:47
the data because at the end of the day, these are
21:50
their tools built in China. So I think this
21:52
comes down to a basic IT test, you
21:54
understand? Are there software backdoors
21:56
that, you know, it would take our
21:58
engineers decades to go through to
21:59
figure out? Because that's --
22:01
I mean, the coding is difficult. Which is why
22:03
I said, I just don't see a path forward where we can be sufficiently confident.
22:05
Is it called Project Texas because the Oracle servers are
22:07
in Texas? It may be that that's what it is. Yeah. Yeah.
22:09
It could be. Or just that Texas is the best.
22:11
Right. Right. Exactly. We just call it Texas, the
22:14
gold standard. Yeah. Yeah.
22:16
That's
22:16
interesting. So but aren't they already
22:19
using Oracle servers, right? Like, it's already their cloud
22:21
service. But this would be
22:23
a deeper -- Right. Right now -- This
22:25
would have
22:25
US control over it. Yeah. Right now, Oracle hosts some
22:27
of their content, but Oracle has
22:29
no visibility currently really into the way it works.
22:32
This new version of using
22:34
Oracle would put greater
22:36
protections around the data, in theory.
22:38
So we haven't made that full transition yet. And is there a
22:40
conversation? I mean, you know, we heard about
22:42
divesting to a US company, away
22:44
from the Chinese parent company.
22:47
So
22:47
let's, I don't know,
22:49
let's imagine Facebook
22:51
bought
22:51
TikTok. I mean,
22:53
they've already copied it, like, with
22:55
Reels. So let's imagine
22:57
they did that. Would that be better? Would
22:59
that be okay? Or would the tools still
23:02
I
23:02
think it'd be better. If you could break the
23:05
corporate ties between TikTok and the algorithm and the
23:07
CCP in Beijing, then, you
23:09
know, I've got a lot of constructive social media as you do,
23:11
but I wouldn't have the the national Right. But we
23:13
control that. Well, we'd control that. That's
23:15
it. You know, we'd have at least the authority. Right.
23:17
And I don't see a path forward where
23:19
TikTok and Beijing would ultimately agree to that. I just
23:21
think that the surveillance value, the espionage
23:23
value of the application, even if they aren't allowed to operate in the
23:25
US, is too high to turn the keys over
23:27
to that algorithm to any US based
23:30
company. This might be beyond your scope, but I've always
23:32
been curious.
23:34
Do you think Beijing artificially bumps
23:36
the engagement numbers on TikTok? It seems to
23:38
me that it's likely the case. I
23:41
mean, just the way people get followers, the way people
23:43
get engagement, in a way they
23:45
just don't on other social media.
23:47
I'm sure. I mean, if you look
23:47
at, obviously, where we started, with some
23:50
of the protests taking place right now
23:52
in Beijing. There's no question that, you
23:54
know, CCP authorities are are controlling what
23:56
is displayed on social media there and
23:59
downplaying evidence of those protests --
24:01
Well, that's for sure. -- and inserting other
24:03
content to distract. Well, I just mean when
24:05
you see, like, these TikTok stars with millions of
24:07
followers and hundreds of thousands of likes.
24:09
I've heard before that that's artificially
24:12
boosted to make you like the app better. Do you have any
24:14
insight into that? I don't. But, no, it sounds like
24:16
it makes a lot of sense. It seems
24:18
like a typical kind of bad
24:20
business practice
24:22
by the Chinese. Maybe we
24:24
hit on these Chinese protesters for
24:26
a minute. You were just in Taiwan. Right.
24:29
So, just,
24:31
give us some takeaways. I don't know what you were talking about
24:33
with the Taiwanese there. I'm assuming it's
24:37
telecom stuff. So what
24:39
came up? Well, Taiwan is just an
24:41
amazing story. I mean, it is a
24:43
beacon of freedom and democracy that
24:45
sits less than one hundred miles from
24:47
the shores of communist China.
24:49
And obviously, you know, China
24:51
is engaged in very serious saber rattling right now when it comes to
24:53
Taiwan. And I think when you look at our own strategic
24:56
interests, in my view, it's very
24:58
clear with Taiwan. Taiwan right now,
25:00
is the producer of ninety percent
25:02
of advanced chips. So everything from smartphones
25:04
to tablets, computers, everything really
25:07
depends on, you know, a
25:09
free Taiwan continuing to be
25:11
there. And also, we look at shipping through
25:13
the Taiwan Strait in that region. It's something like almost
25:15
a quarter of global shipping goes there. So in terms of sort
25:17
of vital US security interests. It's
25:19
clear. What I was there for was meeting with
25:21
my counterparts focused principally on
25:24
cybersecurity and network resiliency.
25:26
It's no secret that after Speaker Pelosi's visit to
25:28
Taiwan a couple months ago, there's
25:30
a serious ramp up in cyber
25:32
activity and cyber intrusions flowing
25:34
out of the CCP, and
25:36
it's more coordinated than ever before. So I was meeting
25:38
with my counterparts there, the Ministry of Digital
25:41
Affairs, basically walking through some best
25:43
practices and continuing to
25:45
collaborate on cyber issues and cyber
25:47
security issues and make sure we're learning from each
25:49
other. That's good.
25:50
Glad to hear it. I
25:52
mean, I had an interesting meeting
25:54
with some Taiwanese. And the question I always ask
25:57
is, are you gonna be like Ukrainians? You
25:59
know, will you fight to the
26:01
death? Because that informs
26:03
our decisions on how much support, you know,
26:05
we want to give. And their answer is
26:07
always yes. Their answer is never no. It's,
26:09
we're not letting go of our
26:11
hard-won freedom. And that's
26:13
that's reassuring, but they do need some
26:15
weapons, I think. Yeah. I think that's right. I
26:16
mean, the way I view it is, you know,
26:19
Xi is looking at Taiwan, and he's
26:21
making a very simple mathematical
26:23
calculation, where he's measuring China's
26:26
capacity and he's measuring it against
26:28
Taiwan's capacity plus the capacity of
26:30
our allies, and capacity meaning fighting
26:32
will, weapons, as you put in the
26:34
formula. Yeah. And
26:36
once he views China's capacity
26:38
as one percent greater
26:40
than the capacity of those others, then he makes a
26:42
move. And you're already seeing what he's doing to prepare
26:44
for a Ukraine type situation. They're banning,
26:47
you know, leaders in the
26:49
CCP from owning property and other
26:51
assets outside of China, so they're not subject to
26:53
sanctions and other things. And and, you know,
26:55
Taiwan has not historically, recently
26:57
at least, been a very militaristic society.
26:59
They have about I think nine
27:01
months or so of mandatory service now, but
27:04
they're looking at increasing that mandatory service. They've
27:06
figured it out real fast because the Ukrainians
27:08
had very strong, like, civil
27:11
defense, you know, like, culture,
27:13
and, like, kinda like Texans. I mean
27:15
-- Right. -- because I was, like, you'll never invade Texas.
27:17
And I didn't quite realize the extent that the Ukrainians
27:19
had that as well until I went there, and and and
27:21
we talked about this. A main reason
27:24
Kyiv wasn't taken is because there was just a bunch of
27:26
civilians, like, alright, here's my gun.
27:28
Right. They just went for it. Yeah. The Taiwanese need to
27:30
go full Texas, and
27:32
quickly. Look, if I saw significant risk of China
27:34
making a move, you know, where I live, I
27:36
would have the sandbags out already, the machine guns
27:38
ready. Right. And I think that's sort of the the
27:40
level of activity you see getting ramped
27:43
up inside Taiwan. Yeah.
27:45
Okay. But this conversation is about
27:47
telecoms. So let's talk, if we
27:49
can move also to the
27:51
general conversation about other Chinese technology,
27:55
Huawei, ZTE -- Yep. --
27:57
we hear a lot better
27:59
rhetoric coming out of
27:59
places like Europe, right, where
28:02
I think it was a British,
28:04
I don't know who it was. I just heard it on the
28:06
radio, but it was a British guy. He
28:08
had an accent, and he seemed
28:10
like he had some authority. So I
28:13
assumed he was from the government, something like that.
28:15
The accent always seems like that. Yeah, he
28:17
might be nobody. But
28:19
it was basically saying, look, this this whole
28:21
this whole cooperation with China thing, this whole,
28:23
like, peaceful thing, it's over. Like, it's
28:25
all over. We're not doing it. We're
28:27
not saying they're enemies, but we're definitely
28:29
not friends. It's over. And that seems to
28:31
be the broad consensus. It seems to
28:33
be that Europe has woken up to the Chinese
28:35
threat as well, even though we as
28:38
Americans have been beating the drum on this for a while, so
28:40
where do we stand on that? Are we
28:42
in a better place? What other countries do we have to worry about?
28:44
I'm worried about South America. Where
28:46
are we? Yeah. You're right. You know, the
28:48
US really led the way on taking an
28:50
appropriately strong, tough approach to Huawei. We did
28:52
this at the FCC a couple years ago. And at first,
28:55
we didn't get much traction globally
28:57
on this issue, but COVID nineteen really helped
28:59
us turn the corner. People looked at
29:01
China, they realized that they're putting out a lot of
29:03
misinformation, disinformation, and they're
29:05
very concerned about it. And so Europe has now caught
29:07
up to the US in terms of taking a tough
29:10
stance on Huawei. That's great. To your
29:12
point, there's other parts of
29:14
the world where it's sort
29:16
of over, to some extent, in terms of how
29:18
deeply embedded Huawei and the CCP are. You
29:20
look at Africa, for instance. You look at South
29:22
America. As you pointed out, I was in
29:24
Nairobi, Kenya about two hours outside of
29:26
Nairobi down this dusty road,
29:28
small town. There's just Huawei
29:30
billboards everywhere. So so -- Mhmm. -- there's a real
29:32
risk when it comes to Africa and South America, but
29:34
but Europe's in a good situation. In
29:36
the US, we continue to take tough action. In fact, just in the past
29:38
couple of weeks at the FCC,
29:40
we finally formally banned
29:43
Huawei and ZTE gear from
29:45
getting approved by the
29:47
FCC for use in the US, and we
29:49
took similar action on certain
29:51
surveillance technology: Hikvision,
29:53
Hytera, and Dahua,
29:55
their surveillance cameras. Now a lot of
29:57
people say, well, didn't the FCC ban Huawei
29:59
already? And I say, sort of. What we did a couple
30:01
years ago was stop allowing
30:03
federal subsidies to be used to purchase
30:05
Huawei gear. And that went a long way
30:07
because, really, Huawei was selling into these
30:09
smaller rural telcos that needed
30:11
subsidies to buy. So by ending those subsidies two
30:13
or three years ago, people felt
30:15
like we banned Huawei, but there was a loophole
30:17
which was the same companies could buy the
30:19
same gear without federal subsidies and
30:21
put that same gear at the same point in
30:23
their network. And so we finally took action at
30:25
the FCC and said, look,
30:27
There's no device that can be used in this country that doesn't get an
30:29
FCC seal of approval, a microphone,
30:31
a laptop. And we've said we are no longer
30:33
gonna approve going forward Huawei,
30:36
ZTE gear, and that stops everybody
30:38
in the US from using it for any purpose.
30:40
And the question like, you know,
30:42
Mexicans or Colombians would have is, okay, fine. You
30:44
don't want me to buy Huawei? What what should
30:46
I buy? Yeah. And and do we have a great
30:48
answer for that? You know, we're seeing some increased
30:50
competition in that space. There's this new trend towards
30:52
this technology called O-RAN, which basically
30:54
is looking to replace the performance of Huawei
30:56
and ZTE in these expensive boxes
30:59
with cheap boxes but high
31:01
performance software. And so we
31:03
are seeing some increased competition and
31:05
choice beyond Huawei. No, it makes
31:07
sense. So that can be made by
31:09
anybody because it's software. So you don't need to be a fifteen
30:11
billion dollar a year hardware company to
30:13
compete. And there's a range of smaller,
31:15
including US providers that are very good
31:17
on software, that can now compete in this
31:19
space. Oh, that's interesting. I love
31:21
to see when there's this leapfrog
31:23
effect in technology, where we just say, you
31:25
know what? Let's not compete on that technology, let's just bypass it. Right?
31:27
And that's usually the best thing for Americans,
31:29
because we're usually the best at that. And
31:31
these threats are so important. A lot of people sort of step back and say,
31:33
what is the real threat with
31:35
Huawei, with ZTE? And I was, you know, up at Malmstrom Air
31:37
Force Base in Great Falls, Montana. I
31:39
mean, this is maybe about a hundred or two
31:41
hundred miles from the border with
31:44
Canada. I mean, just nothing up there. It's wheat
31:46
fields, big sky country, except,
31:48
you know, dotted throughout that
31:50
area are ICBM missile
31:52
silos. And there's, you know, men and women that are sitting
31:54
there underground ready to push the button at any
31:56
moment. And again, there's nothing up there except
31:59
dotted throughout all of that as well were
32:02
cell towers running high powered
32:04
Huawei and other gear that was
32:06
overprovisioned for the needs of
32:08
providing -- Okay. -- cell service to, you
32:10
know, the small, sparsely
32:12
populated areas up there. So that's an
32:14
example of where, you know, Huawei was just
32:16
sitting in suspicious places in this
32:19
country. Yeah. I mean, I I you know, we
32:21
see some cases where the
32:23
US government doesn't allow the Chinese to buy
32:25
property near military bases,
32:27
but there's a big push to
32:29
let them buy even less. Yeah. It's
32:32
a phenomenon. You look now, and switching back to TikTok
32:34
a little bit. You see a lot of sort of government
32:36
officials now, particularly on on the
32:38
left side of the aisle that are on TikTok,
32:41
because they believe it's a way of sort of
32:43
reaching young voters. But it's ironic because
32:45
these same people, they don't actually put
32:47
TikTok on their phones. They take these
32:49
videos and they send it to other people or
32:51
they use burner phones. Mhmm. But to me, that's gotta
32:53
be the biggest admission of a problem. You're not
32:55
willing, you know, to put TikTok on
32:57
your own phone? That's a problem. I thought about the same thing because
32:59
I'm like, I gotta reach these people. But
33:01
I also am like, I don't have time to create even
33:03
more content than we already create. You
33:06
know, there's reasons for that. But, yeah, even if we did do it, I would definitely
33:08
put it on a burner phone. And it
33:10
does kinda make you think,
33:14
too, if you're using
33:16
it. But you are, you know, as a public figure,
33:18
in a difficult position because you do wanna reach people. You
33:20
do realize, you know, a quarter of young people say they
33:22
get their news from TikTok. It's
33:24
amazing. They haven't transitioned. I mean,
33:26
it seems like Instagram tried to basically compete
33:28
with TikTok using Reels.
33:30
I don't know how competitive it's been. I mean,
33:32
obviously, I think people just use both, but --
33:35
Yep. -- you know, I don't I don't think
33:37
it's it's so complicated for the to some extent, for the Biden
33:39
administration because on the one hand, you've got economic
33:41
considerations where you have economic agencies that
33:43
always tend to wanna be not tough
33:45
on China. You've got the national
33:47
security agencies and FBI that wanna be tough, but now you've got this third
33:49
element of this political consideration where
33:51
some people are saying you need to be on TikTok
33:54
to reach young voters. I think that debate right now, economic,
33:56
national security, political, is taking place in
33:58
the Biden administration. I just hope that, you know,
34:00
they they end up in the right spot here
34:02
and taking appropriately tough stance
34:04
on on on TikTok. Some people say that
34:06
the window for the administration to act is post election
34:10
before Republicans in the
34:12
House get gavels, and we'll see if they land it in that window or not. It could also
34:14
just kick on from here for another number of months.
34:16
I don't know why. Again, it should be
34:18
bipartisan. So,
34:21
what's our progress with getting companies like Apple
34:23
and Google to cooperate on here? I mean, you you
34:25
pushed for them to take it off
34:27
their App Store entirely. This is one
34:29
thing I did. You mentioned this earlier. I wrote a letter to Apple a while back and said,
34:31
look, you you may not be an expert on national security.
34:33
I'm not asking you
34:36
to make a decision on that and boot TikTok for that reason per se. But what you are an
34:38
expert on is your terms of service. And Apple
34:40
has a history where, when an application
34:44
is sending data to places or pinging servers that
34:46
haven't been disclosed to the user, they
34:48
kick that app off the App Store. And
34:50
I said, obviously, these revelations about TikTok
34:54
and these data flows back to Beijing should be enough for you to
34:56
apply your terms of service and boot them
34:58
on that basis. But plainly, Apple
35:00
has not taken that action. Although,
35:03
At least according to Elon Musk, they are
35:05
suggesting in some form or another they may
35:07
take action against Twitter, which to
35:09
me is just is highly ironic.
35:11
Seriously. If you're withholding advertising
35:14
dollars or support from
35:16
Twitter while keeping or expanding
35:18
it on TikTok, then you're certainly
35:20
sending a signal about brand values. I
35:22
just don't think it's the one that you
35:24
intend. I mean, what
35:24
do you think about Apple's business model in
35:26
general? They're probably the
35:28
company most reliant on China. It's
35:30
concerning. And before I sent Apple the letter on TikTok, I actually sent them another
35:32
one which had to do with the censorship
35:35
that they're engaging in with their app store
35:37
in China. There's an application called
35:40
Voice of America, which is actually -- Yeah. -- an operation run
35:43
by the US Agency for Global Media.
35:45
It's funded by Congress, but it's an
35:47
independent journalistic endeavor. Right. Anyways,
35:50
Apple has removed the VOA
35:52
app from the Chinese app store
35:54
-- Right. -- at the request of the
35:57
CCP. And the reason why, at least according to my understanding,
35:59
is because the VOA app comes with
36:02
built in VPN
36:04
and security protections that China doesn't want
36:07
people to to have. Similarly, you see right
36:09
now these, you know, again, unprecedented protests
36:11
taking place in China and
36:13
yet Apple has ended
36:15
your ability to broadly share
36:18
photographs with each other via AirDrop. And so I
36:20
think it's it's pretty clear to me that, you
36:22
know, Apple is certainly a cog
36:24
in the censorship wheel
36:26
of the Communist Party. And I just don't think
36:28
it's a good thing. And they will give you all sorts of flowery
36:30
rhetoric: we stand for human rights -- Yeah. -- that
36:32
we defend democracy, global
36:34
peace through global trade. And I think
36:36
it's just that their rhetoric founders
36:39
on their own conduct in China
36:41
and their willingness to engage -- Yeah.
36:43
-- in censorship at the behest of the CCP. Yeah.
36:45
they're a very green company, but not in the environmental sense.
36:47
Yeah. You know, let's just say they make money
36:49
off of it. But it's
36:50
disgusting. I mean, the censorship
36:53
thing is really, that's,
36:56
a pretty... morally speaking, that's
36:58
reprehensible. Yeah. So you're not
37:00
allowing people who are fighting for
37:02
their lives at this point. I believe these protests are
37:04
sparked because, you know, they've been locking people in their apartments for
37:06
this insane COVID policy, which, you
37:08
know, I don't mind the Chinese
37:11
kind of screwing themselves, but it's
37:14
screwing up our supply chains. I mean, it's just been a mess
37:16
for the world. And and
37:18
look, the Chinese people are innocent, and
37:20
they don't deserve that kind of
37:24
treatment. But you know, they should rebel, and that's what they're doing
37:26
now. And for Apple to
37:28
be complicit in
37:30
basically taking down the rebellion. Because what do
37:32
we say about all
37:34
of these, whether it's Cuba or Venezuela or Iran or China or
37:36
wherever or Hong Kong when that was happening.
37:38
You know, what can we do?
37:40
Right? Are we gonna send in
37:42
the... we're not gonna send
37:44
in anything, really. But
37:46
what we can do is help them communicate.
37:48
And, you know, at the very least,
37:48
if we do nothing else, at least let them use what they have.
37:52
And the fact that Apple is not
37:55
even letting them use AirDrop,
37:57
that is crazy. It really
37:59
is. You're right. You know, when you look around the world
38:01
today at freedom movements, whether it's
38:03
in Cuba or in
38:06
Venezuela, as you noted, the very first thing that happens is people
38:08
pull up their smartphones and they start recording
38:10
videos, they start sharing images.
38:12
And the very first thing that authoritarian
38:14
regimes attempt to do is to shut that down, filtering messaging
38:17
apps. Mhmm. And and so the the most
38:19
important thing that we can do is to
38:22
support you know, the free flow of
38:24
communications over those modern technologies. The idea that Apple is stepping in
38:26
and blocking that is
38:28
just is really disturbing.
38:31
And I think you see a lot of US companies that
38:33
are doing the opposite. Obviously, StarLink, Elon
38:36
Musk's satellite operation, delivering
38:38
satellite service to Ukraine to help them communicate freely.
38:40
You know, that's the model of what we
38:42
should be attempting to do here. Oh, I said I wanted to
38:44
talk about that. Can you give me your sense on it, and
38:47
that'll be the last topic we hit. I just heard
38:49
votes get called. But, you
38:51
know, StarLink, it
38:53
seems pretty awesome. Kinda depends on
38:56
where you are in the world. It depends on a few things.
38:58
I mean, help the audience out. I think
39:00
people kinda know what it is. Yeah. But what does it
39:02
take to get it? Like, can I just have a
39:04
subscription to StarLink? What's going on
39:06
with StarLink? Well, it's a really
39:08
phenomenal new technology. We've obviously had satellite
39:10
delivered broadband for years, but it's been very,
39:12
very slow. And the breakthrough with
39:14
StarLink is it's, you know,
39:16
legitimate high speed Internet
39:18
service. And we've been authorizing it at the
39:20
FCC, but it's great for sort of
39:22
rural communities that otherwise can't get it. And I think it's an important
39:24
tool in our arsenal when it comes
39:26
to addressing these authoritarian regimes
39:28
that, you know, rather than sending in troops
39:31
or bombs, you enable broadband, and
39:33
free communication over that
39:36
technology accelerates the downfall
39:38
of regimes that govern without the consent
39:40
of their people. So I think it's a great technology
39:42
and we need to continue to stand behind it at the
39:44
FCC. And frankly, I think the US government needs to
39:46
have in its arsenal this
39:48
ability to strategically stand
39:50
up communication services in these authoritarian countries.
39:52
So what does it take? You need a server
39:54
on the ground that speaks to the
39:56
satellites, and and Elon can just, like,
39:58
direct the satellites to a certain region. Yeah. There's a couple things
40:01
you need, but in the main for the user, you
40:03
have a basically a small pizza
40:05
box sized dish. And anywhere you can
40:07
put that and plug that into a power source, you
40:09
can go directly from that dish to the
40:11
satellite and then effectively that dish is a
40:13
hotspot that can power your smartphone or laptop or
40:15
-- Yeah. -- other technology. Got it. I've talked
40:17
about the maritime industry using it, and their
40:20
bills for Internet go from, like, thousands of dollars a month
40:22
on, you know, bigger ships to, like, sixty bucks a month. Exactly.
40:24
It's crazy. And
40:25
this is profitable for Elon? Well,
40:27
see, at this point, I think their goal is just
40:29
not to bankrupt themselves, and
40:32
we'll see if it ends up being profitable. You know,
40:34
we had actually awarded them some federal subsidies at the FCC, about eight
40:36
hundred million in exchange for a commitment
40:38
for them to bring this service to
40:42
thousands of locations across the country
40:44
and the FCC under the new
40:46
Democrat leadership revoked that
40:48
award that we had given them. The
40:50
one good use of money we're spending on infrastructure. That's the thing. So you have these
40:52
billions of dollars for infrastructure. If these
40:54
same people are ever gonna get service again, my
40:57
back of the envelope math is it's gonna cost us somewhere
41:00
north of three billion dollars in
41:02
federal subsidies to do it when we had an eight hundred
41:04
million dollar agreement with
41:06
StarLink to provide them high speed
41:08
service. Let's just do things as
41:10
stupidly as possible. Another thing I learned today
41:12
about the infrastructure bill is that it actually
41:14
banned this, like, new technology of
41:16
autonomous electric trains. Because we're talking about, you know,
41:18
the railway strike is the big
41:20
topic right now, and somebody brought it up. Yeah. In the infrastructure bill, you can't
41:22
build them. No. That's Elon Musk also, by the
41:24
way. Right. And, you know, you can't
41:26
build those because there'd be no union workers
41:28
working on them. because they're
41:30
autonomous and they're good for the environment. But
41:32
good job Democrats. You're you're a
41:34
bunch of idiots. So
41:36
okay. I guess
41:37
my last question on StarLink
41:40
is, is this the,
41:42
and I think we've sort of talked about this, but is
41:44
this the real answer for rural
41:46
broadband because there's, you know, the other
41:48
conversation is always about, you
41:50
know, laying fiber down, you
41:52
know, hundreds of miles to service
41:55
like thirty people. It's just incredibly cost inefficient. So
41:58
is this really the only answer? Is
41:59
it going to be this low orbit
42:02
satellite kind of broadband? I think StarLink and
42:04
similar low earth
42:06
orbit technologies are the answer for that last, you know, one to
42:08
two percent of the country. Because, you know, there's such
42:10
a, to your point, long tail where, to
42:12
reach those last couple of communities with
42:15
fiber, it's expensive. I think the lion's share of all connections
42:17
here are gonna be fiber, and we've got billions of dollars
42:19
in infrastructure to potentially do
42:22
it. But this administration
42:24
is basically saying we want fiber and
42:26
only fiber to exist. And if you
42:28
do that, we're gonna run out of the hundreds of
42:30
billions of dollars and we're gonna leave these people
42:32
stuck on the wrong side of the digital divide. The reality is we have the technology right now, whether it's StarLink
42:34
or this new sort of fixed wireless technology
42:38
which is similar but terrestrial to bring families across the
42:40
digital divide right now. So what you're saying is
42:42
we only want fiber, which is a great technology,
42:45
Yeah. But you're really saying it's for the cities. Right. For cities
42:47
and suburbs. Right. But if you're saying only
42:50
fiber, what you're saying to those families is just stay waiting
42:52
on the wrong side of the divide
42:54
years longer than necessary. When we get it to you, if we get it
42:56
to you, it'll be really good. But that's why we need
42:58
to put back on the table, all the different
43:02
technologies, StarLink, fixed wireless, and fiber, and get the right mix
43:04
based on the geography and the cost. But right now,
43:06
this administration has a really
43:08
heavy thumb on the scale
43:10
for fiber, and I
43:12
think that's a that's a mistake
43:13
and it's gonna leave people behind. Yeah. That's
43:15
unfortunate.
43:15
An optimistic place to
43:18
leave it. Alright. Hey,
43:20
Thanks so much for being on. We had a lot of different
43:22
topics that I think will be very
43:24
educational. Good for the audience. Appreciate you coming
43:26
on again. Thanks so much. Enjoyed it.