Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements may have changed.
0:00
It's time for TWiT, This Week in
0:02
Tech. We have two
0:04
of my favorite people, two geniuses:
0:08
Cory Doctorow and Alex Kantrowitz. It's
0:11
pro-Big Tech against anti-Big Tech.
0:13
It's gonna be a great conversation. We'll talk
0:15
about Google killing yet
0:17
another service, Mark Zuckerberg
0:20
fighting in the Octagon, Elon
0:23
Musk's robot, and
0:26
the plan, Peter Thiel's plan,
0:28
to buy into Britain's
0:30
National Health Service. That and a whole lot
0:32
more coming up, plus the big scam
0:34
in podcast advertising. It's
0:37
all ahead on TWiT.
0:41
Podcasts you love, from
0:43
people you trust. This
0:46
is TWiT. This
0:53
is TWiT: This Week in Tech. Episode
0:55
eight hundred ninety-five, recorded
0:58
Sunday, October second, twenty twenty-
1:00
two: Eastern Blocs.
1:03
This episode of this week in tech is brought to
1:05
you by Nureva. Tired
1:07
of the complexity and cost of traditional pro
1:10
AV solutions for large spaces,
1:12
Nureva has simplified everything about
1:14
meetings and classroom audio. You get
1:16
great audio and plug and play systems
1:19
that are easy to install and manage
1:21
and cost a fraction of in-ceiling systems.
1:23
Visit nureva dot com slash
1:26
twit. And by Eight
1:28
Sleep. Good sleep is the ultimate game
1:30
changer, and the Pod is the
1:32
ultimate sleep machine. Go to
1:34
eightsleep dot com slash twit to check out
1:36
the Pod and save a hundred fifty dollars
1:38
at checkout. Eight Sleep currently ships within
1:40
the US, Canada, the UK, select
1:43
countries in the EU and Australia.
1:46
And by Podium. Join
1:48
more than one hundred thousand businesses that
1:51
already use Podium to streamline their
1:53
customer interactions. See how Podium
1:56
can grow your business, Watch a demo
1:58
today at podium dot com slash
2:00
twit. And by
2:04
Policygenius, making it
2:06
easy to compare quotes from top companies.
2:09
Policygenius can help you make sure
2:11
you're not paying a cent more than you have to
2:13
for the coverage you need. Head to
2:15
policy genius dot com slash twit
2:17
to get your free life insurance quotes
2:19
and see how much you could save.
2:26
It's time for TWiT, This Week in Tech,
2:28
the show where we cover the week's tech news.
2:30
We have such a good panel today. I have
2:33
limited it to just two people. Agile Alex
2:35
Kantrowitz is here from the Big
2:37
Technology Substack newsletter and
2:39
the Big Technology podcast. Hello,
2:42
Alex. Good to see you.
2:43
Hey, Leo. Great to see you. Are you plugging
2:46
the Vancouver hockey team?
2:48
This is the Grizzlies. So this is a throwback.
2:51
So for listeners, I'm wearing this throwback
2:53
Vancouver Grizzly's sweatshirt.
2:55
It's pretty cool. It's by Mitchell and Ness, which is, I think,
2:57
my favorite clothing brand. Nice. And I'm not a
2:59
fashion guy, but I did see that the grizzlies
3:01
wore the throwback jerseys. Let's go,
3:04
Grizzlies. And are you, are you from
3:06
Canada? No. The closest
3:08
I've been to Canada is Washington State.
3:10
And of course, I've been to Vancouver, Washington
3:12
State, on the West Coast.
3:15
I do have some family in Toronto,
3:17
shout out Toronto. But
3:20
but no, I'm a New Yorker. Let's
3:22
go, let's go Mets. There's nothing weirder
3:24
than a New Yorker wearing a Vancouver Grizzlies
3:26
shirt, but, you know, I will go. I
3:28
have one promise to make to you today. What's that?
3:30
It's that I'm here to bring the weird. Let's go.
3:33
You're good. Also,
3:35
here, and I'm thrilled to have him: Cory Doctorow.
3:37
You know Cory very well, I'm sure. He's
3:40
got a new book. In fact, we did a Triangulation,
3:42
Cory and Rebecca Giblin,
3:45
his co-author, and I, on Thursday.
3:47
You might check that out on the TWiT Triangulation
3:49
feed or the TWiT Events feed. Really
3:52
fascinating conversation. Great to
3:54
see you, Cory. You are an authentic Canadian.
3:57
We walk among you. We are
3:59
like serial killers. We look just like everyone
4:01
else. He seems so normal. How
4:03
did we, how could we have known? Although, I
4:05
became an American citizen about ten weeks
4:07
ago. I know. I remember that. Congratulations.
4:11
Are you having regrets yet?
4:13
No, the opposite, actually. Like,
4:16
the worse things get, the more glad I am that
4:18
I have more rights -- Oh. -- that I would
4:20
otherwise not be entitled to as someone who
4:22
is merely a permanent resident. Now
4:24
your wife is American. Yes. No,
4:26
she's British. She's British. But she's also
4:28
American now. So she she is
4:31
an Anglo-American. My
4:33
daughter and I are Anglo-Canadian-Americans,
4:35
but through my father I'm entitled to Polish,
4:38
Azerbaijani, Belarusian, and Russian
4:40
citizenship. Great. So I'm I
4:42
might get some of those. Most of them
4:44
are countries that I don't intend to ever
4:46
go to. But, you know --
4:48
Yeah. -- every now and again, you wanna go someplace
4:50
that was like a normalized nation during the
4:52
Cold War and you got the
4:55
right passport for it. Yeah. It's nice
4:57
to have, like, you know, you
4:59
know, James Bond, a whole stack of passports
5:01
you can keep in your go bag,
5:03
just in case. Right. That's what
5:05
I'm down for. I mean, even if it takes the rest of
5:07
my life to get them and hypothetically, also,
5:10
I could get an Israeli passport. So, like, eight
5:12
in total. Wow. I would just
5:14
like to have a big stack with a rubber
5:16
band around it. Exactly. Have it in the safe
5:18
with ten thousand dollars in cash, and
5:20
then just leave it open every once in a while, in case
5:22
anybody's just wandering by.
5:25
Maybe a sidearm, and
5:27
you're set. You're ready to go. Someone
5:29
in the IRC, San Mateo, says Putin
5:31
might conscript me. Although, I'm a
5:33
fifty one year old man with two artificial hips
5:35
and cataracts. Doesn't matter. It doesn't matter. It
5:37
doesn't matter. They're scraping
5:39
the bottom of the barrel. He'll take you.
5:42
How are your hips, by the way? Oh, they're
5:44
getting better. Yeah. Good. Little by little. Last
5:46
time you were on, you showed us your
5:48
three d model of your hip bone. Yeah.
5:51
I should have brought them in.
5:53
I made a brass cane topper
5:55
-- Yeah. -- cast from my hip bone.
5:57
Yes. And then the bone itself is in a shadow
5:59
box. But they're all they're both out in our
6:01
bar in the backyard. Right. Lisa,
6:05
I think, was like, leave them there.
6:05
That's good. Google.
6:10
Google. Google. They're at it again.
6:13
Stadia. Three years old,
6:15
Google has announced they're
6:18
shutting it down, their streaming gaming service.
6:21
I guess, really, the only question is,
6:24
whoever thought Stadia would survive?
6:26
The good news is they are giving
6:28
you your money back. Why
6:33
did Google, Alex, even think it
6:35
could get into gaming? What was the
6:37
what was the process there? Well,
6:40
I think that what we're seeing now is this demarcation
6:42
between old Google and new Google. Yeah.
6:44
And old Google said, let's experiment with
6:46
everything we can. Throw money behind
6:49
it because why not? Right? And that's how you end
6:51
up having a thing like Stadia get funded in the
6:53
way that it got funded, and become a big
6:55
prong of Google's strategic vision
6:57
is because they were gonna make
6:59
every experimental project part of that
7:01
vision and eventually some of it would
7:03
stick. And that would be the, you
7:06
know, potential next big reinvention
7:08
for Google. Did Google get, you
7:11
know, financial responsibility?
7:14
I mean, did they start getting it? Did they grow up?
7:16
Or Well, that's what I think that's exactly what I
7:18
think we're seeing right now. And this the canceling
7:20
of Stadia is, like, maybe just the beginning. Right?
7:23
We had Sundar Pichai, CEO of Google
7:25
and Alphabet, at Code a few weeks ago saying
7:27
Google needed to get twenty percent more productive.
7:30
Yeah. And and I think that, like, rather
7:32
than look at this as, like, you know, Google's bet
7:34
on gaming, I think we can really see this as a signal
7:37
for where this company is going. and maybe
7:39
where all of big tech is going, which is
7:41
that those experimental projects that used to get
7:43
the money because,
7:44
again, why not? They're not
7:45
gonna get the money anymore.
7:47
And I think this is gonna do two things. One, it's
7:50
gonna bring needed focus to a lot of these companies.
7:52
But two, it's gonna open the door for
7:54
companies who would have otherwise been in these
7:56
incremental areas and gotten wiped out by Big
7:59
Tech to start competing
7:59
with Big Tech in a way that they haven't been before.
8:02
So I think this is, again, just a very
8:04
dynamic, interesting time for Big Tech
8:06
and Stadia is one example
8:08
of what we're seeing in terms of the change
8:10
that's just going to accelerate, I think, over the
8:12
next few months. Do you think, Cory, there's maybe
8:14
also trepidation about government regulation
8:16
that maybe Google's saying we should pull some
8:18
of the tentacles back in?
8:21
I mean, I don't think
8:24
that. So I think that
8:26
merger scrutiny is a lot easier to
8:28
imagine the state imposing than merger
8:31
review and unwinding. Unwinding
8:34
mergers is really hard. It's very expensive
8:36
and it takes a long time. I think it's probably a
8:38
necessary corrective. And
8:40
I I do have one weird trick that
8:42
we can talk about later for how I think we could
8:44
do a lot of it very quickly. But merger
8:47
scrutiny is far more likely. And since they didn't buy
8:49
Stadia, there I
8:51
don't think they would be worried about that. But I
8:53
did wanna say that, you know,
8:56
it's underappreciated the extent
8:58
to which Google bought its way to glory instead
9:00
of inventing its way to glory. You know,
9:03
this is a company that's had one and a half internal
9:05
successes. They made a great search engine and a pretty
9:07
good Hotmail clone. All of the other
9:10
things that they built internally crashed and burned,
9:12
and all of the successes that they have are things
9:14
they bought from someone else. And this is just the
9:16
latest example. Right? Google Video stank;
9:19
YouTube succeeded. Google
9:21
couldn't build a mobile OS, but Android
9:23
came along. And people will say, you know,
9:25
that Google Photos is
9:27
an internal success, and it's true, but it's an internal
9:30
success because it comes bundled on Android, which
9:32
Google bought from someone else. And they bought Picasa,
9:35
which is a lot of the back end -- Yeah. --
9:37
as well. I'm looking now
9:39
at Google's purchases starting in two thousand
9:41
one when they bought Deja News. I forgot
9:43
about that. Deja News. Yeah.
9:46
Remember Deja News? And Blogger, of course, was
9:48
their third big acquisition. They
9:50
bought AdSense. They didn't make that up. They
9:52
deployed their whole ad tech stack, their server
9:54
management, their mobile platform, their video platform,
9:57
you know, customer service HR software,
9:59
like
9:59
all of it, their docs platform,
10:02
all came from other companies. Yeah. And
10:04
what about Chrome? I mean, what about the browser?
10:06
Chrome, you're right. Chrome is a browser
10:09
they did build internally. You're right. That's an omission.
10:11
Thank you. I take it back. The guy who led
10:13
that project is, of course, the CEO today.
10:15
Sundar was -- Right. -- in charge. Two and a half successful
10:17
products. You're absolutely right. But
10:19
still. And I don't mean to be,
10:21
I don't mean to be flip here.
10:23
Right? But it's also true
10:26
of lots of other companies that they're buying their way
10:28
to glory. And the reason
10:30
I bring that up because you asked me about anti
10:32
trust is that historically, companies
10:35
were prohibited from both merging
10:37
with major competitors and also
10:39
buying nascent competitors that might be
10:41
on their way to becoming a threat. And,
10:44
you know, the the modern antitrust that
10:46
was practiced for the forty years of kind of
10:48
Reaganomics that seems to be coming to a
10:50
close was extraordinarily tolerant
10:53
of acquisitions as a growth strategy. But
10:56
as I say, I think that's coming to a close. In
10:58
the UK, the Competition and Markets Authority is
11:00
challenging, you know, even small acquisitions
11:02
like Facebook buying Giphy, which
11:05
I refuse to call "Jiffy." And
11:09
and, you know, I think that you're gonna see
11:11
more of this. And I think it's
11:13
a necessary corrective. I mean, the extent to
11:15
which VC's have become
11:18
effectively, like, corporate
11:20
recruiters who basically say,
11:22
alright, we're gonna put a little money into
11:24
a startup whose purpose
11:27
is to basically produce a postgraduate
11:29
portfolio piece that we're gonna pretend
11:32
is a product, just to prove that they can work
11:34
together as a team, and then a tech company
11:36
will buy them, throw away the product, and just
11:38
put them to work. And the VCs'
11:40
equity will just be like a finder's fee.
11:42
It's a grotesque and
11:45
wasteful way to conduct business
11:47
to say nothing of getting, you know, innovative
11:49
products into the market. I'm just looking
11:52
at this. The list of acquisitions
11:54
Google has made, the many great
11:57
companies that I've loved
11:59
when they were around that have
12:01
basically disappeared. Picasa is a
12:03
good example. Jaiku is a good example.
12:06
You know, you can go on and
12:08
on and on. They've been
12:10
they've basically been a graveyard. Feedburner,
12:15
GrandCentral, which became, I mean, FeedBurner's
12:17
still in there. Yeah. And so is
12:19
Google Voice, as GrandCentral. Although
12:22
part of the problem with Google killing stuff
12:24
is it makes people nervous about adopting
12:26
Google services. because -- Mhmm.
12:28
-- there's always this risk that Google's gonna
12:30
lose interest. So, Alex, you're
12:32
saying this is kind of a salubrious adjustment
12:35
of their financial, you know,
12:37
financial maturity, but it's also risky,
12:40
isn't it? I do think it's risky.
12:42
Yeah. I made this
12:44
point in Big Technology a
12:46
couple weeks ago talking about how,
12:48
you know, you might end up seeing stuff that Wall
12:50
Street likes a lot in the short term, things
12:53
like greater profitability. They
12:55
might even make some offensive moves to try
12:57
to take out some competitors. But in the long term,
13:01
I think focusing on profitability, especially when
13:03
it comes to big tech companies that are ten, twenty,
13:05
thirty, forty years old, leaves them
13:07
vulnerable to outside challenges in a way
13:09
that they weren't before. Google
13:12
is at least trying to make it up to people
13:14
who bought into Stadia. They're gonna refund
13:16
your fancy controller
13:19
and any games you bought, which is actually
13:21
a big deal because you had to buy the games to
13:23
play them on Stadia, but they are
13:25
not gonna refund your subscription fees.
13:27
And there are some people who are
13:29
a little miffed about Google pulling
13:32
the plug, like a YouTuber
13:34
named ItsColourTV, who
13:36
says he had devoted Five
13:39
thousand nine hundred seven hours
13:41
to building up a character in Red Dead Redemption
13:43
two, a character which will
13:46
be lost. That's two hundred and forty
13:48
six days, lost in
13:50
January when Google
13:52
pulls the plug because there's currently no way
13:54
to transfer that character. Yeah. To
13:57
your own copy of Red Dead Redemption.
13:59
I'm sorry about it. Like, for
14:02
folks who think that it's a good idea
14:04
to invest in products on other people's platforms
14:06
without thinking that there's a risk that the platform
14:09
might pull the rug. It's
14:11
absurd at this point. Every single
14:13
thing that people build on one of these tech platforms,
14:15
you have to understand that if
14:18
you know, for some reason, they decide
14:20
that, you know, they don't wanna support it anymore,
14:23
It's
14:23
done. And
14:23
do you have
14:25
a choice, is the problem?
14:28
Right? I mean, where do you go if you're not gonna
14:30
build on? At this point, you can't build
14:33
on anything. It's YouTube or
14:35
TikTok. My son has two point
14:37
one million followers on TikTok and is
14:39
starting to build a career as a TikTok creator,
14:43
but it's not like you could do that,
14:45
you know, on your own blog anymore.
14:47
Right? Yeah. Of course. It's different. And even
14:50
in other words, congratulations to him. Well
14:52
done. I appreciate it. You do
14:54
it with the understanding that it
14:56
might just be a moment in time. And
14:59
I think we all have to be okay with that when we're on
15:01
these other platforms because there's no other
15:03
way around it. Oh, by the way, I'd say
15:05
TikTok is, like, providing that opportunity
15:07
for your son to, you know, create that business.
15:09
That's a great opportunity. It wouldn't be there.
15:11
Yeah. So it goes both ways. But
15:13
all this stuff really needs to be viewed as, you know,
15:15
potentially temporary situations because
15:18
that could always shift over time.
15:20
And often it does. I
15:22
I agree with your cautionary note
15:24
there. I do think that
15:26
when we say there's no other way, you're right that
15:28
there is no other way right now, but it's not like there's
15:30
no conceivable other way. Right? Like
15:33
when RSS was designed,
15:35
at its core was this idea of
15:37
blocking lock-in and orphaning, and
15:39
so on. Yeah. So, you know, there's an XML
15:41
directive you can put in RSS
15:44
that says this feed is permanently moved to a different
15:46
address. So if you and your hosting company
15:48
part ways, you just send that directive out
15:50
the next time a podcatcher pulls down that XML
15:53
file, it goes, oh, I'm just gonna relocate
15:55
my bookmark for this to a different server.
15:58
and you can just take your audience with you.
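The relocation directive Cory is describing is, by common convention, the `<itunes:new-feed-url>` element in a podcast feed; he doesn't name the exact tag, so treat that element, the function, and the URLs below as illustrative assumptions. A minimal sketch of the podcatcher side:

```python
# Hypothetical sketch of a podcatcher honoring a permanent-move
# directive in an RSS feed. <itunes:new-feed-url> is the common
# convention for "this feed has permanently moved".
import xml.etree.ElementTree as ET

ITUNES_NS = "http://www.itunes.com/dtds/podcast-1.0.dtd"

def resolve_feed_url(feed_xml: str, current_url: str) -> str:
    """Return the URL this subscription should poll next time,
    relocating the bookmark if the feed declares a new home."""
    channel = ET.fromstring(feed_xml).find("channel")
    if channel is None:
        return current_url
    moved = channel.find(f"{{{ITUNES_NS}}}new-feed-url")
    if moved is not None and moved.text:
        return moved.text.strip()  # take the audience to the new server
    return current_url

feed = """<rss xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd">
  <channel>
    <title>Example Show</title>
    <itunes:new-feed-url>https://new.example.com/feed.xml</itunes:new-feed-url>
  </channel>
</rss>"""

print(resolve_feed_url(feed, "https://old.example.com/feed.xml"))
# → https://new.example.com/feed.xml
```

A feed without the element leaves the subscription untouched, which is the point Cory makes: the directive travels in the feed itself, not in any one hosting company's database.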
15:59
I
16:00
understand why YouTube hasn't built that,
16:03
but I don't understand
16:03
why we shouldn't want them
16:07
to build it. Right? You know, like, if
16:09
we're going to get a better deal for
16:11
creators from YouTube, one of the
16:13
ways that we'll do that is by YouTube having
16:15
a legitimate fear that if they give creators a
16:17
bad deal, that those creators will go elsewhere,
16:20
a separate issue to the one about
16:22
this Red Dead Redemption character. But again,
16:25
at least within a single game, it's
16:27
easy to see how those stats
16:30
could be moved over. I mean, this is just a small
16:32
database entry. But I think that
16:35
firms don't like doing this because
16:37
they like their App Store to be
16:39
like a sole portal into payments
16:42
and other sources of value. And
16:44
if they do create that interoperability, that
16:46
they there is that possibility that customers
16:49
will jump ship. Right? If they're taking a thirty percent
16:51
rate or fifteen percent rate or whatever it is,
16:54
Stadia was taking out of Red Dead Redemption's
16:56
publishers. The inability
16:59
to port a character is a feature and not a bug.
17:02
It's not a technical challenge. You
17:05
know, YouTube
17:07
could disclose to Red Dead Redemption
17:09
all of that material, and give
17:12
publishers the technical means to
17:14
move that character over from their file.
17:17
That's not hard. But they just choose not to.
17:19
Right. But don't you think that if you spend
17:21
the money creating this stuff, you should be able
17:23
to make the rules? Let me give an example, like brick
17:25
and mortar stores. Right? If you have a restaurant,
17:28
Yeah. And you shouldn't
17:30
be required to
17:33
store all the person's food preferences
17:35
so that they can then go to another
17:38
restaurant and basically mimic
17:40
exactly what you've been giving them without having
17:42
that restaurant having to work for it. It should be
17:44
something that starts from the ground up. So
17:47
I mean, the user should be able to have them, please.
17:49
Yeah. I mean, honestly, I know what my preferences
17:51
are, so I can bring them with me. Yeah. But
17:53
if somehow the restaurant was able
17:55
to lock up my preferences, that
17:57
wouldn't be a good situation. Would it? I
17:59
don't know. Of course. Yeah. Listen. I think that, like, if
18:02
you spend the effort, you run the business, you're spending
18:04
the time. Yeah. I'll never go to that restaurant
18:06
then, if they're gonna steal my preferences and
18:08
keep them for themselves. Let's
18:11
say you go to, let's
18:13
say you go to pizza restaurant number one.
18:15
Right? And you have, like, a you
18:17
know. And they name a pie for me.
18:19
The Leo pie, which is using
18:21
great tomatoes and pepperoni. And
18:24
that's all. And then I can't take that to another
18:26
restaurant because it's Oh, you shouldn't. No.
18:28
But the other restaurant can't. The
18:30
other restaurant should be able to make that.
18:33
But there shouldn't be a requirement for this
18:35
restaurant to, sort of, please let the other restaurant download
18:37
Leo's special. Yeah. Right. And then give
18:39
it to the competitor. That competitor can figure it out on their
18:41
own. RSS
18:44
is a good example because, I mean, this whole
18:46
network is dependent on RSS. That's why
18:48
podcasting works. But didn't
18:50
the platforms come along and say, yeah, we're gonna
18:52
kill RSS, creating
18:54
some sort of spurious allegation
18:58
that RSS isn't working or it's not right
19:00
or it's not good. They did the same thing with XMPP.
19:02
They don't want interoperability, so they
19:05
actively kill it. Don't they? And
19:08
let me be clear. I'm not saying necessarily
19:10
that I want YouTube to be forced to do this.
19:12
But in the absence of meaningful competition,
19:15
YouTube is neither being disciplined by
19:17
firms nor by regulators. And
19:19
that means that they can make their corporate preferences
19:22
carry an enormous amount of weight, and
19:24
that furthermore, the lack of competition,
19:27
which arises in part out of this
19:29
anti competitive vertical integration that I was just
19:31
talking about, where Google just buys the companies
19:33
that might later compete with it means
19:35
that it has enormous power over policy.
19:38
So while I absolutely support
19:42
YouTube's fair use claims, I was
19:44
a staunch supporter of YouTube when Viacom
19:46
was suing them for a billion dollars. I
19:48
think that it's not YouTube's
19:50
job to make sure Viacom's business
19:52
model is intact or to
19:54
respect their business model. I think what's sauce for the
19:56
goose is sauce for the gander. And if you
19:59
were to try and make a tool that
20:01
allowed a YouTube
20:02
20:03
broadcaster to take their audience with
20:05
them to a rival platform, which
20:08
is, I think, analogous to the
20:10
kinds of things that YouTube
20:12
did when they made a tool that allowed
20:14
people to take the video they liked and put it
20:16
on YouTube and only have to
20:18
respond to takedowns and so on. YouTube
20:21
would reduce you to, like, radioactive rubble.
20:23
They'd they'd say you violated their terms of service.
20:25
They'd say you violated the Computer Fraud and Abuse
20:27
Act. They'd say you violated the DMCA.
20:30
and they would put
20:33
you out of business for doing unto them what they
20:35
did unto others. I'm just
20:37
saying that they should
20:39
face the same competitive pressure. They gave
20:41
rise to something as innovative and great
20:43
as YouTube so that the next innovative great
20:45
thing can come along.
20:48
That seems fair. You
20:50
wouldn't be against that. Right, Alex? I mean,
20:53
You
20:56
don't have to answer. I mean, is there a
20:58
bill in the Senate to promote interoperability?
21:01
Is that the Access Act? It wouldn't
21:03
touch this. The Access
21:05
Act is about exposing APIs for social media
21:07
and a few other kinds of platforms, app stores
21:09
as well. And
21:12
then in the EU, there's the Digital Markets Act.
21:14
They embody the punch line
21:16
of that Irish joke. If you wanted to get there,
21:18
I wouldn't start from here. Right. Like,
21:21
you have this, you know, you have this
21:23
situation where these firms are very dominant
21:26
and where they do act as gatekeepers. Right?
21:29
Where they're, like, well,
21:31
Matt Mullenweg's post about why Tumblr doesn't
21:33
have porn anymore and why it never will was pretty
21:35
instructive here. He was like, you know, we
21:37
submit updates to Apple three
21:39
times a month. And at any
21:41
time, they might just arbitrarily decide that the
21:44
filters we had last month are no longer good
21:46
enough, and then we just go out of business.
21:48
And he's like, I don't know how Twitter gets away
21:50
with it. I don't know how Reddit gets away with it, but
21:52
we don't and we couldn't. Apple has
21:54
chosen to make exceptions
21:56
for some firms and not others. We only have
21:58
a hundred million users instead of,
22:00
you know, however many. Or we only have a million or ten
22:02
million users instead of a hundred million users.
22:05
So maybe that's why. But
22:07
It just puts Apple in the position
22:09
of picking winners and losers in the marketplace. And
22:12
I don't think the answer is to say Apple, you
22:14
must carry all apps no matter whether you feel
22:16
they're good or bad. But I think that
22:18
its customers should be allowed to
22:20
choose a different app store. Right? It is
22:22
after all, their phone; it belongs to them. You
22:24
know, as you say, if you add
22:26
the value to it, e.g., by opening your
22:28
wallet and buying it, then it should be
22:30
yours to use as you feel like. And,
22:34
you know, the fact that Apple doesn't
22:36
have to face meaningful competition from other
22:38
app stores for the hardware it's sold
22:41
means that it can act in this very high-handed and
22:43
opaque way. And it
22:45
does just, you know, it gets to structure
22:47
the entire mobile market, kind
22:49
of, or half of it, the other half
22:51
being structured by Google, but no one elected
22:53
them. And, you know, mostly what
22:55
they use to attain that structuring
22:58
is not technology, but the law. It's the fact
23:00
that if you were to try to unlock an Apple
23:02
phone and, you know, sell a dongle that
23:04
jailbroke your phone and let you choose another app store,
23:07
Apple would sue you. So they're happy to have the state
23:09
regulators step in and prevent
23:11
people from offering more
23:13
choice to people who own devices, who wanna
23:16
use their property in different ways, but they
23:18
abhor regulation when
23:20
someone steps in and tells them how to use their property.
23:22
I would actually prefer to just withdraw the
23:24
legal protections from Apple, not
23:27
impose new obligations on them.
23:29
You say the Access Act should be modified
23:31
to allow a right of private lawsuit
23:34
-- Yeah. -- which is what Texas
23:36
has done with their gun laws
23:38
and their abortion laws. Actually, no.
23:40
California did it with the gun laws, Texas
23:42
with the abortion laws. No, that's slightly
23:44
different because it's disinterested third parties. No.
23:46
This is just, like, you know, there
23:48
are some statutes that only a
23:51
public prosecutor can invoke and some
23:53
that the public can invoke. So imagine if
23:55
you went to the muffler shop and they wrecked your
23:57
car. And the only way you could sue
23:59
them is if you could get your local attorney general
24:02
to sue them. Right. That's what it's like when there's no private
24:04
right of action. So private right of action is if
24:06
you individually were harmed by
24:08
someone who violated a statute, you
24:10
can hire a lawyer to sue them. Statutes that
24:12
don't have a private right of action require a public
24:14
prosecutor to bring an action. And sometimes
24:16
that's appropriate, but I think, like with both privacy
24:19
and with the Access Act, a
24:21
private right of action makes sense.
24:23
Yeah. I think the fear is that
24:25
it would jam the courts with
24:27
a bunch of frivolous actions as well.
24:29
Although, you know, I don't see that in Texas. I
24:31
don't see it in California. So maybe. And you can
24:33
you can have something
24:36
like an anti-SLAPP act where -- Right. -- you can have early
24:38
motions to dismiss. You can also have fee
24:40
recovery, which discourages that
24:42
kind of thing. If it's loser
24:44
pays -- Right. -- people aren't gonna
24:46
bring spurious lawsuits because the other side will be
24:48
like, great, I'm just gonna hire a lawyer on contingency
24:51
to rack up giant billings for your frivolous
24:53
lawsuit. And at the end, I'll take it out of your hide. It's kinda
24:55
telling that we'd have to rewrite our tort system
24:57
in order for this to work. Well,
25:00
it's mostly a legal protection
25:02
and not a technical one. Right. Right.
25:05
As a technical matter, jailbreaking has
25:07
been of varying degrees of difficulty
25:09
at different times in Apple's
25:11
devices' history. But as
25:14
a legal matter, the difficulty has
25:16
stayed constant.
25:17
You
25:18
know, you have, like, checkm8,
25:20
which is a jailbreak
25:23
against all of the secure enclaves for
25:25
eight years' worth of iPhone models that cannot
25:27
be remediated because the secure enclave
25:29
is not field updateable because that's the whole
25:31
point. If you can modify the secure enclave,
25:34
then it's not secure. And
25:36
so, you know, hypothetically someone
25:38
could develop a jailbreak-based
25:40
third-party app store that leverages
25:43
checkm8 for anyone who bought an
25:45
iPhone over the first eight years of its existence
25:47
or, you know, year four through
25:49
twelve of its existence. But
25:51
they can't because Apple would come after you under
25:53
the DMCA. Is that
25:56
so? Matt Mullenweg, in fact, I wish
25:58
I'd asked him this because he was on our show a couple of weeks
26:00
ago, about the porn
26:02
thing. There was kind of a movement
26:04
on Tumblr, people saying, oh, Tumblr's bringing
26:06
porn back. Look at this. Look at this. Look at this. And Matt
26:08
had to write a blog post saying, no, no, it's not coming
26:10
back. It's never coming back. Is it
26:12
fair for Apple or for Matt to
26:15
blame Apple for this, Alex?
26:17
I mean, look, I
26:20
I'm not familiar with the controversy here.
26:22
I know that Tumblr used to be filled with porn, I guess.
26:24
Yeah. Verizon killed it. Right.
26:26
Matt said, basically, that's the old Tumblr.
26:29
Back in two thousand six, you could do that.
26:31
But nowadays, Apple would just knock us
26:33
out of the app store. And
26:36
that's forty percent of our sign-ups and
26:39
eighty-five percent of our page views from mobile.
26:41
We'd be out of business. Yeah.
26:44
I think But look, at
26:46
a certain point, you gotta let the companies that
26:48
are running these products make their own decisions.
26:50
Apple has a reason for not wanting to have Yeah.
26:52
But his point is that, like Twitter and
26:55
Reddit -- Mhmm. -- both have a considerable
26:57
amount of adult content. Well, I do think that
26:59
they're gonna wanna be consistent in the application
27:01
of the rules out there. He says they're too big for
27:03
Apple to block. So they decided to make
27:05
an example out of Tumblr. If
27:08
that's the case, which I think is right. I mean,
27:10
I think Matt knows better than anybody and I like Matt
27:12
and I trust him, that's a
27:14
really good example of Apple misusing its
27:17
market power. You know, I think the one way
27:19
that Apple can really do a better job is make sure
27:21
that it gets some of the scams out of the App
27:23
Store. You know, I don't know if,
27:25
you know, Apple's making the best use of
27:27
its time being, like, morality police on
27:29
apps like Tumblr, and if it's going to, it'd better
27:31
be consistent. But there's so many scam apps
27:33
in the App Store. These are well documented.
27:36
You know, they exist in the US and outside,
27:38
largely outside. And, you know,
27:40
if the company is trying to make us think that that
27:42
thirty percent fee is worth it, you
27:45
know, work on getting, you know, those scams out
27:47
of the app store first, and then we can, you know,
27:49
go the level down and talk about, you know, decency
27:51
on on the apps. Yeah. And thanks to Kosta
27:53
Eleftheriou, who
27:56
exposes this -- he's made this
27:58
his job ever since Apple blocked
28:02
his very useful
28:04
tool for writing text
28:06
on an Apple Watch while allowing
28:09
knockoffs through. He's made it his life's work.
28:13
to to find scams on the App
28:15
Store. Apparently, he
28:17
made a deal with Apple over
28:19
his apps being blocked,
28:22
so he's had to stop talking about that. But
28:24
Well, that's terrible. Yeah. So by the way,
28:26
can can I ask you a question? So I
28:28
think that there's been a very interesting arc
28:31
to these conversations that we've had about Big Tech.
28:33
The first one was a recognition of
28:36
the fact that these apps and companies have
28:38
just become too big, and it all happened
28:40
so fast. Right? Facebook went from,
28:42
like, five hundred million users to a billion
28:45
to a couple billion users in a blink
28:47
of an eye. Amazon went from, you know,
28:49
being a percentage of online retail
28:51
to being online retail, effectively, in
28:54
a moment's notice. Apple went from a one trillion
28:56
or, you know, a couple hundred billion dollar company
28:58
to a three trillion dollar company in a moment's notice.
29:00
So then everyone's like, okay. These companies are
29:02
the biggest companies in the world, and
29:04
they're squeezing out competition, which is definitely
29:07
true. We need to regulate them. And
29:09
and then we had this flood of ideas
29:11
and bills that have come and tried to
29:14
rein them in. I
29:15
I just wonder if they're if they're going too far.
29:17
Some of them seem like they make a lot of sense
29:19
to me. Right? Like, the idea that a platform
29:21
cannot privilege its own products by
29:24
using the data that it gets from companies
29:26
that have to go through them. That makes sense. But
29:28
the whole idea of cutting off acquisitions,
29:30
okay, most M and A fails. So
29:33
the idea that M and A is the only thing that's made
29:35
these companies successful, doesn't doesn't really
29:37
make sense to me. The idea
29:39
of data portability, like, I understand
29:42
the tenets of data portability, but people are on
29:44
Facebook, people are on Twitter, for the network. It's
29:46
not like you can just take your data and go somewhere
29:48
else and be okay. And all this is happening
29:50
in the context of a market that's punishing
29:53
these companies ruthlessly. I mean, Facebook's
29:55
down fifty seven percent this year in
29:57
the stock market and getting kneecapped by
29:59
TikTok.
29:59
So
30:00
I would just wonder how
30:03
much, how far, if we've gone too far
30:05
on, let's regulate
30:07
fair competition back into the market, without
30:10
remembering
30:10
that we have a market economy and
30:12
letting
30:13
the market do its work, as it seems to be doing
30:16
here. I mean, I
30:18
I agree. I think that it's important to
30:20
distinguish between the dynamics
30:22
that drive growth and the dynamics that maintain
30:24
growth. So growth, I think, is driven --
30:27
it's well understood -- by network effects. You
30:29
know, you joined Facebook because you wanted to talk
30:31
to people who are there. They joined Facebook because
30:33
they wanted to talk to you. You
30:35
made an app for the App Store because you wanted to
30:37
sell it to Apple customers. Apple customers
30:40
bought iPhones because they wanted to use your
30:42
app. And so there's this virtuous cycle that
30:44
that drives growth. But intrinsically, technology
30:46
has really low switching costs because
30:49
computers are universal. The only computer we
30:51
know how to make is the computer that can
30:53
run all the software we know how to write. It's the,
30:55
you know, Turing machine, the von Neumann
30:57
machine. And so, historically, you
31:00
know, when you had firms that had choke points in the
31:02
market the way Microsoft did in the early two thousands,
31:04
when it wouldn't maintain the
31:06
Mac Office product, and it became
31:08
harder and harder for CIOs to justify
31:11
having Macs in the office. I was a CIO back
31:13
then. We started putting Windows machines
31:15
on designers' desks so that they could access
31:17
Word files and Excel files and PowerPoint
31:19
files without corrupting them. Eventually,
31:21
we just put bigger graphics cards in them, threw away
31:23
their Macs, and bought them the Adobe suite
31:26
for Windows. And the way
31:28
Apple resolved that was not by
31:31
asking the the law to regulate Microsoft
31:34
nor was it by telling
31:37
people that, as Mac loyalists, they should
31:39
just hang in there. They reverse engineered
31:41
Microsoft products. They made Pages,
31:44
Keynote, and Numbers,
31:46
which reverse engineered the file formats of
31:48
Excel, PowerPoint, and Word.
31:51
And they made them feature compatible, and they
31:53
kept a team on that so that every time Microsoft
31:56
updated their file formats, Apple updated
31:58
their file formats in parallel
31:59
so that they could maintain compatibility.
32:02
That's a thing that has been ended.
32:05
Right? The
32:06
mechanisms under which we used to do that have now
32:09
been made illegal under tortious interference
32:11
with contract, under nondisclosure
32:13
and noncompete, under the Computer Fraud
32:15
and Abuse Act, Section 1201 of the Digital Millennium
32:17
Copyright Act, and so on. We've created
32:20
this like a number of laws that boil down
32:22
to felony contempt of business model.
32:24
And what that's done is it's made it
32:26
possible for firms that have attained dominance through
32:28
network effects but would historically have
32:30
faced the risk of losing
32:32
that dominance also through network effects because
32:34
the corollary of a service that gets
32:36
more valuable every time someone
32:39
joins is a service that gets
32:41
less valuable every time someone leaves. And
32:43
so you're you're prone to these like bank
32:45
runs on your users as we sort of see happening
32:47
with Facebook now -- Yeah. -- where people
32:50
are leaving and then
32:52
advertisers are leaving. So, you
32:54
know, restoring that interoperability, the right
32:57
to interoperate when Facebook extended
33:00
membership to non-EDU addresses, all
33:03
of the users that it had hoped to court were already
33:05
on MySpace. And rather than telling them,
33:07
you know, you should pick a day when all of you quit
33:09
MySpace and come to Facebook or you should
33:11
maintain two separate clients, it gave
33:13
them a bot, and you could load that
33:15
bot with your login and password for MySpace,
33:18
and it would go and scrape your waiting messages
33:20
and put them in your Facebook inbox, and then,
33:22
when you replied to them, push the replies into your MySpace
33:24
outbox. And, you know, if you try
33:26
to do that to Facebook today or if you try to
33:28
reverse engineer Apple's App Store, and
33:31
produce a feature compatible app store
33:33
today the way Apple did to Microsoft, they
33:35
would destroy you. And so you're
33:37
right, there's a market dynamic that drove
33:40
this growth, but it's not a market dynamic
33:42
that maintained the growth. The thing that maintained the
33:44
growth was the capture of regulation.
33:47
to prevent new firms from doing
33:49
to these firms what they did when they were new
33:51
firms. Well, I also
33:53
wonder Leo, you
33:55
can tell. You can rein me in, but No. No. No. This
33:57
is good. I'm I'm I'm and then I'm gonna bring up
34:00
some other congressional legislation
34:02
that is probably misguided, but go ahead.
34:05
If this is the case, then how do you explain
34:07
and by the way, you know, just just for the
34:09
I'm trying to learn here. So how do you explain
34:12
the notion that Figma, which effectively
34:14
does what Adobe does, but does it on the
34:16
browser, just sold to
34:18
Adobe for twenty billion dollars, in what was
34:21
absolutely a defensive move,
34:23
because Adobe knew that Figma
34:25
was gonna kick its butt if it let
34:27
it continue to to grow. And
34:30
another thing but by the way, I'm surprised,
34:32
but the consensus seems to be that that's gonna
34:34
be allowed to go through. When, to me,
34:36
Cory, this is exactly what
34:39
you're talking about. But but Figma's ability
34:41
to succeed is effectively also like
34:43
pretty impressive. Right? Yeah. And then,
34:45
like, with this acquisition,
34:47
Adobe is gonna become Figma, effectively. The
34:50
other thing that I wonder about is what happens when
34:52
we, you know, have these
34:54
conversations in the context
34:57
that, you know, nothing's gonna
34:59
change. But what happens when we move platforms? and
35:01
we go to, you know, augmented reality,
35:04
for instance. Like, the the
35:06
people who are building the operating systems in the
35:08
hardware for augmented reality right
35:10
now. You know, we don't we don't know who's gonna
35:12
win that. And that can just
35:15
like we move from desktop to mobile and
35:17
and you know, even, you know, downloaded
35:19
to to the cloud. You know, that
35:22
could also throw a few Well, this is a isn't this an
35:24
opportunity with met with the with the Metiverse
35:26
to say, let's do this differently. because otherwise,
35:28
you're gonna have Apple's metaverse. You're
35:30
gonna have a meta's metaverse. You're
35:33
gonna have maybe Microsoft's metaverse. and
35:35
you're gonna have to choose one or the other. One will
35:37
win. -- Or it'll be like Second
35:40
Life. -- And don't
35:42
laugh at me, but I think that the
35:44
metaverse is gonna be largely enterprise,
35:46
and there's this Facebook advertising that --
35:48
I think I've been right about this -- that's all about
35:50
enterprise
35:53
uses of the metaverse. I
35:55
really believe that the metaverse is gonna be enterprise,
35:57
not social. And if it's -- Well,
35:59
I think Zuck is hoping it will be more than
36:02
just enterprise. But well, exactly. But you
36:04
look at their advertising and they're starting to Well, they
36:06
want they want it all. And Microsoft
36:08
clearly made their decision. They said, yeah. The
36:10
HoloLens isn't gonna be a consumer product. Exactly.
36:12
So but let let's throw another competitor in.
36:15
And again, this is where I'm hoping no one laughs,
36:17
but Magic Leap. You know, their second
36:20
generation device is not bad, really.
36:22
And it's geared entirely towards
36:24
enterprises. Yeah. So -- Like, I think capital
36:26
overhang, though. I mean, I think a lot of money
36:28
is owed to investors, but -- Yeah. --
36:30
that leaves little openings for competitors.
36:33
Yeah. Sorry for that. I mean, I think
36:35
that the -- like, starting from the
36:37
end and working backwards, I think there's
36:40
a good case to be made that the metaverse
36:42
if it if it ever succeeds won't be
36:44
for entertainment, if for no other reason than
36:46
walking around with a brick on your face is an invitation
36:48
for someone to come up and kick you in the ass. So
36:51
I could see why it would only be used by people
36:53
who were, like, sitting comfortably. What about augmented
36:55
reality where you're I I maybe.
36:57
But I don't wanna get too caught up in that. I think
36:59
that the Figma story is really
37:02
interesting that that what you see is exactly
37:04
what I'm describing, where there
37:06
are elements of PSD --
37:09
which is the Photoshop document
37:11
file format -- that were not within
37:13
this containment vessel. And so
37:15
Figma was able to make a feature-
37:17
compatible Photoshop replacement
37:21
that could read and write your Photoshop files. That
37:23
was key. Right? Because people have a lot invested
37:26
in their existing Photoshop files. They
37:28
can't just abandon them. They need
37:30
to be able to open them and read them. PSD
37:32
was reverse engineered, and that's why you can read it
37:34
in GIMP and why you can read it in other
37:36
programs and so on. But,
37:39
you know, their response was
37:41
to use their access to the capital
37:43
markets to snuff out a competitor before
37:46
it could grow to become a significant and
37:48
meaningful business all on its own.
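As an illustration of the PSD reverse engineering mentioned above: the format opens with a small, documented big-endian header, which is part of why third-party tools like GIMP can read Photoshop files at all. Here's a minimal sketch in Python -- the field layout follows Adobe's published PSD spec, but the sample bytes are fabricated for illustration:

```python
import struct

def parse_psd_header(data: bytes) -> dict:
    """Parse the 26-byte header at the start of a .psd file.

    Layout (big-endian, per Adobe's published PSD format spec):
    4-byte signature, uint16 version, 6 reserved bytes,
    uint16 channels, uint32 height, uint32 width,
    uint16 depth, uint16 color mode.
    """
    sig, version, channels, height, width, depth, mode = struct.unpack(
        ">4sH6xHIIHH", data[:26]
    )
    if sig != b"8BPS":
        raise ValueError("not a PSD file")
    return {
        "version": version,   # 1 = PSD, 2 = PSB
        "channels": channels,
        "height": height,
        "width": width,
        "depth": depth,       # bits per channel
        "color_mode": mode,   # 3 = RGB
    }

# Hand-built header for a 640x480, 8-bit, 3-channel RGB image (illustrative only)
sample = struct.pack(">4sH6xHIIHH", b"8BPS", 1, 3, 480, 640, 8, 3)
print(parse_psd_header(sample))
```

Reading the rest of the file (layers, image data) is far more involved, but the point stands: a documented-enough binary layout is what makes feature-compatible replacements possible.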
37:50
And so, you know, we're
37:53
left with this kind of tautology, which
37:55
is that if the capital markets will give you enough
37:57
money to buy a company, you must be
37:59
the best person to run it. And the way that we
38:01
can tell you're the best person to run it is you have enough
38:03
money to buy it. But going back to the Google
38:06
graveyard, it's pretty clear
38:08
that, you know, there are lots of people whom the
38:10
capital markets will entrust with the money to
38:12
buy a nascent competitor who
38:14
either, you know, treat that as a predatory acquisition
38:17
and snuff it out, or who are just not
38:19
qualified to run it. We were making jokes about
38:21
the Flickr URL at the beginning
38:23
of the show before we went on the air. You
38:25
know, Flickr was the first service that had
38:27
mobile photos. Right? Years before
38:30
Instagram, Yahoo bought it, and
38:32
then it became a plaything
38:34
that was dueled over among
38:36
the various, you know, venal princelings of
38:38
the Yahoo empire, and it suffered
38:41
and fell into, like, you know,
38:43
neglect. And now it belongs
38:45
to SmugMug, who are gradually digging out
38:47
more than, you know, a decade of
38:49
technology debt, but it was, you know,
38:51
catastrophic to
38:54
be acquired. Yeah. But don't you
38:56
think that if a company gets acquired and then
38:58
ruined, that opens up the door for
39:00
another company to come and compete? So for instance,
39:03
you know, Figma is not the -- Adobe
39:03
isn't killing the cloud
39:05
design market. You know, there's
39:08
a lot of competition. Yeah. Exactly -- Penpot,
39:10
which is an open source version, just
39:12
raising, like, twenty million. Yeah. So exactly,
39:17
this is my point. So if Adobe kills Figma,
39:19
can't Penpot come in and start
39:21
to compete? Like, of course,
39:24
there is always the risk when you do
39:26
M and A that you're going to end up killing
39:28
innovation. But it's not the end of the story.
39:30
The story continues. If you mess something
39:32
up, you are gonna
39:34
open it up to competition. And if
39:36
there is a need in the market for this thing
39:38
to be built, it will be built. Well,
39:41
but you get things like -- so
39:43
there was a great efflorescence of RSS readers
39:45
at one point. And Google decided
39:47
to launch its own reader. And
39:49
that created what venture
39:51
capitalists call the kill zone around
39:53
RSS readers, and no one wanted to back them, because
39:55
there was something that was, you know,
39:57
cross-subsidized from another business at a price
39:59
that was so low -- free -- that
40:02
there was no way to compete with it. And
40:04
as a consequence, we saw, you know, a decade
40:06
of neglect, which would have been fine
40:08
if Google had not then killed Reader.
40:10
But readers never recovered. Right? We
40:12
still -- you know, that was the end of RSS,
40:15
effectively. It's now this kind of rump.
40:17
I was actually just in New York doing a book
40:19
event that Nilay Patel,
40:21
who's the editor-in-chief of The Verge,
40:25
was hosting. And The Verge's new
40:27
redesign is amazing. And one of
40:30
the cool things about it is it has a feed of
40:32
articles that people in The Verge's newsroom
40:34
think are cool that's there on the
40:36
front door of The Verge's new redesign.
40:39
And I said, where can I
40:41
get an RSS feed for that? And he was like,
40:43
we discussed it and thought nobody would
40:45
want the RSS. That's exactly right. What
40:47
they're doing is an RSS feed. Can
40:49
I can I tell a funny story, though? So
40:52
I think the idea
40:54
of a kill zone
40:56
is real and legit. I think there's a lot of truth
40:58
to what you said. Okay, maybe
41:01
I'm wrong here, but I do think
41:03
that the tech world
41:05
ended up building a replacement for
41:08
RSS, and that was Twitter, and
41:10
that the person who built Google reader
41:13
ended up going and working with Twitter
41:15
for a while. And he ended up, unfortunately --
41:17
and he regrets this -- building the retweet
41:19
button, which I think, and
41:22
he thinks, is a source of a lot
41:24
of the negative effects of social
41:26
media. So it's a mixed bag, but
41:28
it also -- the market did come in
41:30
and say, here is another way
41:33
that you can get, you know, your
41:35
stories via a feed, and that was Twitter.
41:37
Reddit also has -- I have to, by
41:39
the way -- I think Nilay's answer
41:41
to you, Cory, is disingenuous. I
41:44
don't think that -- you know, how much
41:46
does it cost to do an RSS feed? It's nothing.
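The claim that an RSS feed costs almost nothing to produce holds up: a valid RSS 2.0 feed is just a small XML document. A minimal sketch in Python using only the standard library -- the channel name and URLs here are made up for illustration:

```python
import xml.etree.ElementTree as ET

def build_rss(title: str, link: str, items: list) -> str:
    """Build a minimal RSS 2.0 document from a list of {'title', 'link'} dicts."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    # RSS 2.0 requires title, link, and description on the channel.
    ET.SubElement(channel, "title").text = title
    ET.SubElement(channel, "link").text = link
    ET.SubElement(channel, "description").text = title
    for item in items:
        el = ET.SubElement(channel, "item")
        ET.SubElement(el, "title").text = item["title"]
        ET.SubElement(el, "link").text = item["link"]
    return ET.tostring(rss, encoding="unicode")

# Hypothetical feed, purely for illustration
feed = build_rss(
    "Example Sidebar",
    "https://example.com/",
    [{"title": "First post", "link": "https://example.com/1"}],
)
print(feed)
```

A real feed would add dates, GUIDs, and descriptions per item, but this is the whole trick: for a CMS that already has the articles in a database, emitting this is a template away.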
41:48
You can -- Well, this is too far. I think he was telling
41:50
the truth. I think it was just basically,
41:53
like, they built a lot of features.
41:55
You know, they're not an engineering organization.
41:58
They had to prioritize it. But Nilay knows.
41:59
He knows better than that. And
42:02
then they're Yeah. Remember,
42:04
the Verge launched to be --
42:06
Right. -- a reference customer for its own
42:09
CMS. Right. And so, like,
42:11
I have no idea. Like, if we were talking about Drupal,
42:13
I'd say, yeah, for sure, you just, like, you
42:15
know, type in the obscure setting that
42:17
says make this thing be an RSS feed
42:19
now. Right? By the way,
42:22
when
42:25
Google killed Reader, there were so many
42:27
clones that came out. The market did try,
42:29
you know,
42:31
building a replacement. I remember, I
42:33
I'll be honest, I wasn't really into RSS
42:35
readers. And then Google killed Reader,
42:37
and I saw the outpouring of all this anguish
42:40
and said, oh, I gotta try that. And so
42:42
I downloaded or or set up an account
42:44
on something called The Old Reader. And
42:47
I gotta say, that was --
42:49
I still use RSS. I use Sumi dot
42:51
news. And by the way, I gotta point out
42:53
Sumi discovered an RSS feed at The
42:55
Verge. So it's not that
42:57
feed in the column there on the right, but it's
43:00
the all-posts feed from The Verge.
43:02
So -- Alright. -- they're still doing
43:04
RSS. Maybe Neil, I didn't know
43:06
that. No. No.
43:08
No. They put RSS for the main feed. For the main
43:10
feed. Not for that side feed. Yeah. For the not
43:13
for that side bar. That side bar is amazing.
43:15
It's just not convenient for me to keep
43:17
a browser open on some of my screen. It's just
43:19
Yeah.
43:20
You know?
43:22
But I don't think he was BSing
43:24
me. I think they just sat down and did a triage, and
43:26
they were like, nobody uses RSS anymore.
43:29
Why would we bother? And I tried to convince
43:31
them that RSS is, like, what
43:33
the people who read the news use to talk about the news
43:35
and amplify the news. Yes. I use it. Yep.
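The way a reader app can "discover" a feed, as described above, is standard RSS autodiscovery: pages advertise their feeds with a `<link rel="alternate">` tag in the HTML head. A sketch using Python's standard library -- the HTML snippet here is fabricated for illustration:

```python
from html.parser import HTMLParser

class FeedFinder(HTMLParser):
    """Collect hrefs of <link rel="alternate"> tags pointing at RSS/Atom feeds."""

    def __init__(self):
        super().__init__()
        self.feeds = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if (tag == "link"
                and a.get("rel") == "alternate"
                and a.get("type") in ("application/rss+xml",
                                      "application/atom+xml")):
            self.feeds.append(a.get("href"))

# Fabricated page head for illustration
html = """<html><head>
<link rel="alternate" type="application/rss+xml" href="/rss/index.xml">
<link rel="stylesheet" href="/style.css">
</head><body></body></html>"""

finder = FeedFinder()
finder.feed(html)
print(finder.feeds)
```

A reader fetches the page, runs something like this over it, and subscribes to whatever URL it finds; that's how a feed can be "discovered" even when the site never links to it visibly.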
43:37
And so, you know, it's like it's a we're
43:39
we're a small but proud people, you know?
43:43
And influential, I think. Influential.
43:45
Like Canadians. Yeah. Like Canadians.
43:49
He seemed like such a nice guy who would have
43:51
known. We're
43:53
gonna have to take a break. Hold on, Alex, because
43:55
otherwise, we'll be here
43:57
till eight or nine or ten. And on
43:59
your time, that's, like, three in the morning. So
44:02
Alex Kantrowitz is here, the Big
44:04
Technology podcast, bigtechnology dot substack
44:07
dot com. I didn't think of it this way, didn't
44:09
intend it this way, but you're
44:11
not here to defend big tech, so don't feel like
44:13
you have to. especially
44:15
when you've got Cory Doctorow, the author
44:18
of the latest, Chokepoint Capitalism,
44:20
which is absolutely an indictment of not
44:22
just big tech, but big business in
44:25
general. You point out in the book and I think
44:27
it's, by the way, a great read, highly recommended.
44:30
that
44:31
it's not just tech. It's
44:33
publishing. It's the
44:35
recording business. It's the live
44:37
concert business. It's every business that is out there.
44:39
In fact, we got a story we'll talk about
44:42
a little bit, about how the podcasting business
44:44
as well is becoming
44:46
big tech. I'm not a part of that, unfortunately.
44:49
More in just a bit. Our show today
44:51
brought to you by Nureva. Nureva
44:55
simplifies great audio for today's IT
44:57
pros. Yeah, it has been a tough spot.
44:59
We're now doing this hybrid thing. Right?
45:01
Hybrid working and learning means you
45:04
have to equip and support more spaces
45:06
with audio and video conferencing systems.
45:09
because you've got people in the office, you've got people
45:11
elsewhere. They are all
45:13
working together. and,
45:16
you know, your IT department still got to do network
45:18
security, which is a big issue these days. And, of
45:20
course, you're shifting to cloud based solutions. You
45:23
got infrastructure issues. Product
45:25
shortages and delays add that on top.
45:27
It's put so much strain in IT's resources,
45:30
people time, expertise, and budgets.
45:33
So here's a product customers
45:35
will love that requires minimal
45:38
effort from IT to deploy, manage, and scale,
45:40
with a bonus of requiring zero end
45:42
user training. And it
45:44
solves a big problem in hybrid
45:47
work. When it comes to audio
45:49
conferencing in larger spaces, it's very
45:51
common to be faced with complex
45:53
and expensive multi component systems.
45:55
You gotta bring in people to design them and
45:57
install them. And then they've gotta continue to
45:59
maintain them and update them and manage them. Nureva
46:02
is the opposite. Nureva
46:05
has changed that by
46:07
offering solutions that deliver
46:09
a high level of simplicity. With Nureva,
46:11
you put in what looks like a sound-
46:13
bar that gives you
46:16
full-room mic pickup. If you got a really
46:18
big room, put in a couple; if you got a regular-sized
46:20
room, put in one. It's a soundbar, it's
46:22
got speakers, it's got microphones, and it
46:24
uses Nureva's patented Microphone
46:27
Mist technology to, in effect, fill
46:30
the space with
46:31
microphones. And
46:32
so easy to install, you could do it. You don't even
46:35
have to call the IT department. Install the Nureva system
46:37
in less than thirty minutes,
46:39
maybe an hour if you've got a giant room.
46:41
it's simple. You don't have to configure
46:43
it. You don't have to tweak
46:46
it. It's so much
46:48
less expensive and so much easier than
46:51
those traditional systems that could take your room
46:53
offline for days. In fact, you could start
46:55
right now: get a Nureva system, install
46:57
it, be ready for your very next meeting.
47:00
With Nureva, you can monitor, manage, update,
47:02
and adjust all the systems, no
47:05
matter how many rooms, with a powerful cloud-
47:07
based platform, the Nureva Console. IT
47:10
doesn't even have to walk down the hall. Nureva
47:13
is very scalable. They can bring
47:15
their simplicity to large organizations too. Nureva
47:17
systems, N-U-R-E-V-A. They
47:20
cost a fraction of traditional systems, give you
47:22
a better result, and solve
47:24
this problem of people in the office and people
47:26
elsewhere trying to meet and work
47:28
together. Great audio is so important
47:30
to that. Check it out: nureva dot com
47:32
slash twit. N-U-R-E-V-A,
47:35
nureva dot com slash
47:38
twit. This is a solution -- I don't think people, I
47:40
mean, I guess if you've been watching our shows, you know
47:42
about it, but I I wish it were better known because
47:44
I still see conference
47:45
rooms so poorly equipped for audio
47:48
that could do so much better. Nureva
47:50
dot com slash twit. Thank you,
47:52
Nureva, for supporting this week
47:54
in tech, and thank you listeners
47:57
and viewers for supporting us by
47:59
going to that
47:59
special address, so they know you saw it here: nureva
48:02
dot
48:03
com
48:04
slash twit.
48:06
Did
48:11
Mark Zuckerberg fight in the UFC show at
48:13
all? It should have.
48:16
This was Saturday.
48:19
I don't know what happened. I mean, I'm hoping somebody
48:21
subscribed to this. UFC
48:23
closed a fight card at the Apex
48:25
facility it owns in Vegas to the press and the
48:27
public. They wouldn't
48:29
say initially who did it, but an
48:31
MMA insider, Ariel Helwani, said
48:34
a very good source close to the event had told
48:36
him it's something to do with Mark Zuckerberg.
48:39
Speculation is Mark rented out the
48:41
event, maybe just so he could
48:43
watch, maybe so
48:46
they could record it for the metaverse. Maybe
48:48
Mark's gonna get into the Octagon. I don't
48:50
know. Wouldn't put it past him.
48:52
Oh, it had already happened?
48:55
Yeah. So I'm asking, what happened?
48:57
It was just Mark Zuckerberg. I feel like it was Mark
48:59
Zuckerberg and his, you know, cronies,
49:01
watching. smoke in those mics.
49:04
And there's this amazing reaction
49:07
GIF of his wife, Priscilla, just
49:09
kind of losing her mind as the
49:11
fight goes on. But I think the more telling thing
49:13
is that there were all the UFC fighters who
49:16
were talking about how off-putting it was that
49:18
the entire fight could be bought out by one person.
49:21
And, yeah, I think that's spot
49:23
on. And I don't know --
49:25
you got enough money, couldn't
49:27
you -- Yeah. Of course. But just, A, the optics,
49:29
B, the act itself, I find, like, fairly
49:32
fairly wrong. And, you know, sports is a
49:34
game for the people, you know, I I
49:36
just, you know, I can't really
49:38
see any justification for
49:40
wanting to, you know, buy, you
49:42
know, the entire seating and then
49:44
lock the public out of it. If you
49:47
wanna go watch a fight, go watch your fight,
49:49
but don't buy it out. Has anyone else ever
49:51
done it? I've never seen it. How much do you think it costs
49:53
to buy out the entire Octagon? I
49:55
don't know. I guess you could probably do it for a million
49:57
dollars. They make most of their
49:59
money on the cable,
50:02
or the streams. Yeah. The pay-per-views.
50:04
I mean, it's like getting Kanye to
50:06
play at your corporate event or your kid's
50:08
bat mitzvah -- Yes. -- or, you know, quinceañera
50:10
or something. It's just such a
50:12
it's such an oligarch move. It
50:15
really is. Yeah. So, you
50:17
know, that I was told that
50:19
when the sultan of Brunei visited
50:21
Disneyland, he had thirteen tour
50:23
guides and the vice president of operations with
50:25
a retinue of over a hundred people and a flying
50:28
wedge ahead of them.
50:30
And they just went from ride to ride,
50:32
clearing parts of Disneyland as they went,
50:34
which was hilarious. It amounts to,
50:36
like, ten grand a day. It
50:39
you know what? It'd be worth it not to have to get
50:41
in line. That's all I'm saying. Absolutely. Yeah.
50:44
That's the best deal at Disneyland. Every
50:46
once in a while, you see videos on Insta
50:48
and and TikTok of celebrities
50:51
getting ushered to the front
50:53
of the line, you know. You just have to, you know,
50:55
It's good business. I
50:58
did like how David Beckham waited the
51:00
full -- what was it? Six? -- He waited for
51:02
the queen to go by. That's a blessing. Yeah.
51:04
God bless him. Ten hours. You know? Yeah.
51:06
I think a number of celebrities actually got
51:09
in line for that. That's respectful. I like that. That
51:11
is respectful. William
51:13
Gibson once
51:14
told me that he feels like he has just the
51:16
right amount of fame. You know, like,
51:19
what is the right amount of fame? People
51:21
enjoy his work and they tell him so
51:23
he can earn a living from it, but he it's not
51:25
like he can't eat dinner in a restaurant. Right?
51:28
I I once Angela
51:30
was visiting London and we went out for lunch
51:33
and, like, got interrupted, like, three times.
51:35
No fun. He is a moderately famous
51:37
person. But, you know, we were eating at
51:39
a restaurant in London. It's not like Penn and Teller
51:42
are really well known in the UK. And
51:44
nevertheless, like, he couldn't get through a forty minute
51:46
meal without being interrupted three times. So,
51:48
you know, I I feel like there's probably
51:51
there's definitely a threshold where the
51:53
fame gets pretty toxic. Steve
51:55
Martin once told me that he knew it
51:57
was over for him as a real person
51:59
when he attempted to ride the
52:02
subway -- this was many, many years
52:04
ago -- to go see a show in
52:06
Brooklyn or Queens or
52:08
somewhere. And he said, I can never
52:10
ride it. I can't do it. And
52:12
I've been to dinner with him. He rents, basically,
52:14
you take over a private room. You don't eat in public.
52:17
And you're still harassed by the chef, the
52:19
maître d'. You know, how do you talk
52:22
about this relationship you have with Steve? Oh, it's
52:24
well known. I've told people about this
52:26
before. He used to listen to my radio show -- I
52:28
don't think he does anymore. Mhmm. And he
52:30
DMed me about ten
52:32
years ago saying, you don't have to respond
52:34
to this, but I really like your radio show. I said, yeah,
52:36
I'm not gonna respond. Steve Martin, who cares?
52:39
No. I responded, and we kinda
52:41
struck up a friendship. What's that?
52:43
A friendship. Yeah. And it
52:46
is it he's the only person who's that
52:48
famous that I've ever spent time with. And it really
52:50
it's -- he can eat. I bet
52:52
now he can't. Now that Only Murders in the Building's big,
52:55
I bet you it's actually gotten worse for him again.
52:58
But he very famously stopped doing stadium
53:01
comedy because he said, it's not it's
53:03
not a show anymore. It's -- Mhmm. --
53:06
it's like the zoo. It's like, you know, he
53:08
talks about it in Born Standing Up, his
53:10
book. I think
53:12
he really didn't like that level of fame. But once
53:14
you get there, you can't there's no turning back.
53:17
Here's Mark, by the way, practicing.
53:20
And apparently, his
53:23
sparring partner made his debut
53:25
in the Octagon, and maybe that's why
53:27
he
53:27
bought it out. That way,
53:28
he could just buy a ticket to it. Okay.
53:30
Just buy one ticket. I can't see those.
53:33
But, you know, can I just say sorry. Go
53:35
ahead. If you're Mark Zuckerberg, you don't wanna be
53:37
harassed the whole fight. You know,
53:40
maybe you need that flying wedge of cronies,
53:42
and I think that Yeah. He should
53:44
just, you wanna go watch sports,
53:46
suck it up, and go watch it in person with
53:49
people. I mean, it's not like they don't have
53:51
booths. Right? Like they have sky
53:53
boxes, suites,
53:56
or private booths. Yeah. Yeah. I
53:58
mean, the champagne is just a solved
54:00
problem. Right? Like, this
54:02
is the like, a famously solved
54:04
problem. Right? Of of sports stadiums
54:07
having VIPs who wanna sit
54:09
in a fancy booth. Wear a mustache,
54:11
you know, a hat. Just a
54:13
little disguise. Or you can book the
54:15
private room. Yeah. It's Yeah.
54:18
That exact, I don't know. I mean, it says
54:20
it's this whole thing, but I
54:23
mean, who says, yes, that's a good idea?
54:26
In the end, you're getting bad advice
54:28
if you think that's a good idea to do this. Maybe not
54:30
for Mark. But remember, this is also the guy who,
54:32
on the fourth of July last year, posted
54:34
a picture of him wearing sunscreen
54:38
holding the American flag on a motorized
54:41
foil, you know, Leo, that I
54:43
have less of a problem with. If you wanna go make
54:45
an idiot of yourself in the ocean, By all
54:47
means, people do it every day. I
54:49
remember when he posted this that we had somebody
54:51
on and said, you know, you
54:53
don't if you're Mark Zuckerberg, you have a
54:56
phalanx of PR professionals guiding
54:59
you, protecting your image at all times.
55:02
How did that happen? Didn't they have
55:04
anybody saying, you know, like,
55:06
no one goes, hey, you might look a little out of touch
55:08
if you end up, you know, buying out the whole UFC
55:11
game. And I I love the fact that all the UFC
55:13
fighters called, you know, BS on it because
55:15
deservedly so. No.
55:18
That's okay. No problem with that. If you wanna
55:20
go fly the flag, make an idiot of
55:22
yourself in the ocean. Like, have
55:24
at it, there's a big ocean, but there's
55:26
only limited seats to the UFC. People
55:29
were offended by this though. I do remember
55:31
they were very offended. I'm not sure why. People are
55:33
offended. People are offended by everything.
55:35
So Yeah. It's
55:37
true. It's
55:41
not it's not dignified. It's not
55:43
dignified. Maybe that's that's all that's wrong
55:45
with that. Look,
55:47
is is it is it something I would do?
55:50
No. No. Whatever. Whatever.
55:52
We're taking sides at a sports
55:54
event on that one. You know what's
55:56
fun? We are now, thanks
55:59
to the ongoing action between
56:01
Twitter and Elon Musk in the Delaware
56:03
Court of Chancery, privy
56:05
to the fascinating texts
56:09
going back and forth between
56:12
Elon and other wealthy
56:14
individuals during
56:17
his attempt to purchase
56:19
Twitter. And
56:20
I like, I think it was the Atlantic's take
56:22
on it, but it just shows you these guys
56:24
are as stupid as
56:28
you might ever imagine. Elon
56:30
Musk's texts shatter
56:32
the myth of the
56:35
tech genius. This is Charlie Warzel, writing:
56:37
the world's richest man has some embarrassing
56:40
friends. number
56:42
of whom have been on this show, including Jason
56:44
Callahanis, volunteered
56:50
to run Twitter for
56:53
Elon. And then at at one point
56:57
said, I you know, let me ask my you
56:59
know, I'll ask around. remember he did this. He asked
57:01
around and said, hey, you may wanna invest, you know,
57:03
put some money into Musk's acquisition.
57:06
I can I can help.
57:08
Warzel
57:10
writes, few in Musk's phone appeared as
57:12
excitable as the angel investor Jason Calacanis,
57:14
who peppered his friend
57:16
with flattery and random ideas for
57:18
the service. In the span of thirty
57:21
minutes, Calacanis suggested
57:23
a five point plan for Twitter that would
57:25
introduce a membership tier, creator revenue splits,
57:27
algorithmic transparency, and changes to the
57:29
company's operation after pledging his
57:31
loyalty. You have my sword,
57:34
he texted Musk. Calacanis
57:36
pushed new ideas for weeks.
57:40
For weeks, imagine we
57:42
asked Justin Bieber, not
57:44
beaver, Bieber, to come back and let him
57:46
DM his fans. He could sell
57:48
a million in merchandise or tickets instantly. Would
57:50
be insane.
57:54
Finally, Musk says, sends
57:56
a message back.
57:59
Morgan
57:59
Stanley and Jared think you are using
58:02
our friendship not in a good way. This makes it
58:04
seem like I'm desperate. Please
58:06
stop. Which
58:09
poor old Jason: only
58:11
ever wanted to support you, Elon,
58:14
love you, man. And he said he'd
58:16
jump on a grenade for him. Yeah. So
58:20
I wrote about this today. I just put the link in the IRC.
58:23
I wrote about this today for my column on medium.
58:25
And I think that, you know, the
58:27
way to understand how this
58:29
works is that it's,
58:33
you know, to be an innovator is
58:35
not to have unique genius, it's to have
58:37
good timing. that
58:39
that, you know, if you scroll down a little there,
58:41
Leo, you'll see these two diagrams I
58:44
have about what it takes to invent the helicopter.
58:46
So, you know, for five hundred years, people
58:49
invented helicopters. Right? They they were
58:51
like, oh, I've seen a screw press and I've seen a maple
58:53
key. I've invented the helicopter, but
58:55
it wasn't until someone else had invented,
58:57
you know, the internal combustion engine
59:01
-- Yeah. -- that you could get a helicopter.
59:03
And, you know, Kevin Kelly calls this the adjacent
59:06
possible. And and this is why,
59:08
like, when it's railroading time you get railroads and
59:10
why, like, six people invented the radio within,
59:12
you know, a year of each other and so on.
59:14
an idea whose time has come. That's what
59:16
that means. Right? Yeah. Yeah. And so these
59:18
guys, you know, they had a they had a good
59:20
idea. They were not unique in having that good
59:23
idea. But what they did
59:25
have was access to the capital markets,
59:27
after edging out other people, after getting a little
59:29
bit of advantage that they could use
59:31
to, you know, buy other
59:33
people's good ideas to suppress
59:36
good ideas before they could take hold independently
59:38
by buying out rivals or by
59:40
using predatory pricing. all of that
59:42
other stuff. And so what you end up with is people
59:44
who are just sort of mediocre donkeys,
59:47
no better than you or me,
59:49
I'm not claiming to be better than any of these
59:51
people, but they're, you know, they're not better than
59:53
me either, and no one should put me in
59:55
charge of the lives of a hundred million Twitter users
59:57
or three billion Facebook users.
59:59
And and, you know, like the I think
1:00:01
we often focus too much on
1:00:04
whether these people have the right stuff for that
1:00:06
job and not enough on whether that job should
1:00:08
exist. We have, and I've talked about this
1:00:11
before. There's this great man hypothesis. And
1:00:14
because Elon is a billionaire or
1:00:16
Mark Cuban is a billionaire, we
1:00:19
ascribe to them, you know, they become
1:00:22
the Alexander the greats. They become the great
1:00:24
man. When,
1:00:25
in fact, maybe they didn't earn that
1:00:27
sobriquet. Maybe they're just
1:00:29
the right person at the right time. That's what you're saying.
1:00:32
It's it's the providential doctrine. Right?
1:00:34
If you are rich, you must be great. And if you are
1:00:36
not rich, you aren't great. Right. And
1:00:38
the way to tell whether someone is great is whether they're
1:00:41
rich. I think we've come to a time when
1:00:43
we've stopped worshiping billionaires, or we're
1:00:45
starting to stop worshiping billionaires. I hope. You
1:00:47
say that. I I wrote something
1:00:49
unflattering about Palantir yesterday,
1:00:51
and it turns out that Palantir is the latest
1:00:53
meme stock. And there's a whole bunch
1:00:55
of weird Peter Thiel fans, "He's a short
1:00:57
seller, bud," they're in my mentions.
1:00:59
Yeah. Who is paying you to write this? Oh,
1:01:01
my and so on and so on.
1:01:04
Palantir should absolutely run the NHS.
1:01:06
I've actually I did a thread, you
1:01:08
know, post a link in there. I did a thread: Palantir
1:01:11
should run the National Health Service of
1:01:13
Britain. Yeah. So that's the thing is Palantir is
1:01:15
so Palantir wanna they didn't
1:01:17
win. Palantir got a no bid twenty
1:01:19
six million pound contract to do work for
1:01:21
the NHS, and they're trying to leverage it to
1:01:24
this big three hundred and sixty odd million,
1:01:26
but twenty three and twenty million pound
1:01:28
contract for the NHS. And
1:01:31
they're pretty clear that no one is gonna no
1:01:34
one is gonna green light that because they're
1:01:36
Palantir. And so their new strategy
1:01:38
is they're buying all the companies that have
1:01:41
NHS contracts. So this is one
1:01:43
of this is kinda maybe a little like Amazon's
1:01:46
plan to buy -- Yeah. -- One Medical and
1:01:49
well, that's really interesting. So p so for
1:01:51
people who don't know, I'm sure most people know, Palantir
1:01:53
is basically a surveillance
1:01:56
system. Much like the palantír in
1:01:58
The Lord of the Rings,
1:02:01
an eye of Sauron that collects data
1:02:04
and then sells it to, who, governments, law
1:02:06
enforcement? Are they
1:02:08
a data broker? Is that, I mean, are they, or is
1:02:10
it more than that? They're an analytics
1:02:13
platform. They're I mean, they do a lot of, like,
1:02:15
turnkey, you know,
1:02:17
human rights abuses as a service. So,
1:02:20
like, if you wanna figure out how to do
1:02:22
how to do algorithmic racism with
1:02:24
refugees -- Yeah. --
1:02:27
you know, you can buy their service and feed
1:02:29
your data into it, and their, you know, phrenology
1:02:31
robot will tell you that all the bad refugees
1:02:33
are brown. It strikes me that
1:02:36
that letting a company like that,
1:02:38
not own, obviously, they
1:02:40
couldn't own the NHS, but participate in
1:02:42
any way with a service
1:02:44
that has the health records of millions
1:02:46
of Britons seems like a bad
1:02:48
idea. Terrible. Not least
1:02:50
because there's actually a really good,
1:02:53
sorry, go ahead. Now
1:02:55
you go ahead. I was just gonna say there's
1:02:57
a there's an amazing proposal in the offing,
1:02:59
Ben Goldacre, who's an evidence based medicine
1:03:01
specialist, and has done a bunch of important
1:03:04
interventions at the kind of
1:03:06
national level in the British healthcare
1:03:08
system, like the AllTrials register and stuff
1:03:10
that have absolutely revolutionized evidence
1:03:12
based medicine in the UK, did this
1:03:14
thing called the Goldacre Report, and
1:03:17
it's how you would build a research platform
1:03:19
that allows you to extract insights from
1:03:21
the collected health records of
1:03:24
NHS service users without violating
1:03:26
their privacy. And he's like, First, you build
1:03:28
an open platform that anyone can
1:03:30
audit, anyone can use, and anyone can implement,
1:03:33
and then you host that platform and
1:03:35
close it off from everyone else, with
1:03:37
the NHS data in it and you tell
1:03:39
researchers, like, send me a query
1:03:41
and I will run it on the platform and send
1:03:44
you back the results, but you can't ever
1:03:46
see the data. It's owned by the public, managed by
1:03:48
the public. No. No. No. It's really good. This is
1:03:50
a really good idea. This is
1:03:52
a good, like,
1:03:54
auditable, evidence-driven
1:03:56
public service. This reminds me
1:03:59
of IMDb, where you get everybody to create a
1:04:01
great database. But it's owned by the public.
1:04:03
Oh, okay. The NHS owns it. Alright. So it's run
1:04:05
as a public service. The NHS owns it. Okay. Yeah. And
1:04:07
they wouldn't get, like, Arthur Andersen to build
1:04:09
it for them or someone, you know, PricewaterhouseCoopers to build
1:04:11
it for them, they would have to make it open. They'd have to put the
1:04:13
code on GitHub. Right. Anyone can see this code.
1:04:15
Anyone can audit this code. But the service
1:04:18
itself, the only people
1:04:20
with a login for it would be the people who
1:04:22
ran research for the NHS. This is the
1:04:24
only way you can keep it anonymous, in
1:04:26
effect? Well, and
1:04:30
also productive. Right? So if you're a private
1:04:32
researcher at a university or pharma company,
1:04:34
anywhere else, and you're like, I wanna know,
1:04:37
what happens when these two interventions
1:04:39
are paired? What does the data tell us? You gotta give someone
1:04:41
this drug when they're doing this
1:04:43
PT physiotherapy -- It's huge value. -- something
1:04:45
else. Huge value. So you can then
1:04:47
get back useful data,
1:04:50
useful conclusions but you never
1:04:52
handle the data. So the Goldacre Report is
1:04:54
really good, and it's the
1:04:56
opposite of the Palantir approach, which
1:04:58
is like, you
1:05:01
give us your data, we'll apply our
1:05:03
magic, proprietary stuff that no one
1:05:05
is allowed to know about, and then we'll tell
1:05:07
you what's in your data. We'll tell you what you
1:05:09
know, we'll tell the LAPD where
1:05:11
to go and do stop and
1:05:13
frisks, which is, you know, like,
1:05:15
by an incredible coincidence that none of us
1:05:17
could have predicted neighborhoods where a lot of brown
1:05:19
people live. Yeah. Is
1:05:22
this what Amazon's doing, Alex?
1:05:24
Is this why Amazon's acquiring, you
1:05:26
know, One Medical and others? I
1:05:28
mean, I just wanna make a comment on the Palantir.
1:05:30
I think that we need to understand that this is
1:05:32
a consulting company, not a data
1:05:35
company. And, you know, they're
1:05:37
they're more in line with Deloitte than they are
1:05:39
in line with any, like, cloud services
1:05:41
provider. It's the analytics that they
1:05:44
sell. Well, they sell the consulting.
1:05:46
You know, they talk about the analytics. They're
1:05:48
sort of a dressed-up consulting company.
1:05:50
And my hot take is that we're
1:05:52
gonna end up seeing that. You know, they might be
1:05:55
a meme stock today, but
1:05:57
we will see the company end up, over
1:05:59
time, you know, being worth
1:06:01
what they should be worth, which is a lot less.
1:06:03
And I think their worth is largely inflated
1:06:05
through their government connections. Actually,
1:06:07
in the Elon texts, there was something really
1:06:09
interesting about Joe Lonsdale,
1:06:12
who is the Palantir cofounder, you know,
1:06:14
hanging out with, you know, a hundred
1:06:16
Republican congressmen. We all know about
1:06:18
Peter Thiel. So, you know, those political
1:06:21
connections help a lot. But over time,
1:06:23
I think we're gonna see them for what they are, which is
1:06:25
that they're dressed up in this, like,
1:06:27
you know, putting their special sauce
1:06:29
on the analytics. Oh, that's interesting. I
1:06:31
mean, at the end of the day. That's actually Yeah. I'm
1:06:33
not saying it's good. I I'm not saying that
1:06:35
it's useful insights, but I'm saying that, like,
1:06:37
I think that there are people who will sell
1:06:40
you confirmation bias as a service. Yeah. Right?
1:06:42
That's right. It's like, I
1:06:44
need some policy
1:06:46
based evidence, please. Right? So I know what
1:06:48
I wanna do. Please produce the insights
1:06:51
that will let me prove to my board or
1:06:53
to other people or the market or whatever that
1:06:55
what I wanna do is right. Yeah. Yeah. Yeah.
1:06:58
Well, that's interesting. I hadn't really thought about it.
1:07:00
I was giving Palantir all of this superpower
1:07:02
to see into our lives. And it's
1:07:04
maybe it's like Cambridge Analytica. They're
1:07:07
Yeah. It's exactly the analog that I
1:07:09
would name. Yeah. Yeah. And I also think
1:07:11
yeah. Anyway, I'll I'll leave it there.
1:07:14
Point well taken. Why
1:07:16
are they buying up these
1:07:18
suppliers to the NHS then? What's
1:07:22
their goal? Palantir?
1:07:24
Yeah. They're a looter. Right? They just
1:07:26
wanna suck up a bunch of public
1:07:28
private partnership money. Oh, okay.
1:07:32
Because, again, that's why Willie Sutton
1:07:34
robbed banks. Right? Because that's where the
1:07:36
-- That's where the money is. -- if you
1:07:38
don't have the software that's gonna take you there,
1:07:40
you need the partnerships because
1:07:42
you're a consulting company. Boy,
1:07:45
thank you for that. You know, I've misinterpreted,
1:07:47
really kind of given them a lot more credit than
1:07:50
they deserve all this time.
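The Goldacre-style "trusted research environment" Cory describes, where researchers submit queries and get back only results, never the raw records, can be sketched in a few lines. This is a toy illustration, not any real NHS system: the record fields, the class name, and the small-cell suppression threshold are all invented for the example.

```python
# Toy sketch of a trusted research environment: researchers never see the
# raw records. They submit a query; only an aggregate comes back, and
# counts small enough to identify individuals are suppressed.
# All field names and the min_cell_size=5 threshold are invented here.

class TrustedResearchEnvironment:
    def __init__(self, records, min_cell_size=5):
        self._records = records          # private: never returned to callers
        self.min_cell_size = min_cell_size

    def count_where(self, predicate):
        """Run a researcher's predicate inside the enclave and return only
        a count, withheld if it would describe too small a group."""
        n = sum(1 for r in self._records if predicate(r))
        if 0 < n < self.min_cell_size:
            return None                  # suppress small cells
        return n

# Example query: how many patients on drug A also received physiotherapy?
records = [
    {"drug": "A", "physio": True},
    {"drug": "A", "physio": True},
    # ... imagine millions of real records held inside the enclave
] * 6
tre = TrustedResearchEnvironment(records)
result = tre.count_where(lambda r: r["drug"] == "A" and r["physio"])
print(result)  # an aggregate count comes back; the records themselves never do
```

The design choice is the whole point: because only `count_where` is exposed and the record list is never handed out, the platform, not the researcher, touches the data, which is the opposite of the "give us your data and trust our proprietary magic" model discussed above.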
1:07:52
I just wanna shout out the people
1:07:54
in the chat saying I'm here to defend Big Tech.
1:07:56
But let's look at it reasonably.
1:07:59
Right? And the more reasonable our conversation,
1:08:01
the more progress we're gonna make. I think what I
1:08:03
really admire actually about Cory's
1:08:06
book is that it
1:08:08
doesn't paint a target on big tech's back.
1:08:10
It paints a target on companies that have gotten
1:08:13
too big, basically, with
1:08:15
predatory methods.
1:08:18
Yes? Is that fair? Yeah. I would
1:08:20
say that what it does is it
1:08:22
identifies the
1:08:25
source of the pain that artists are feeling.
1:08:27
Right? So we spent forty years
1:08:29
giving artists longer duration of
1:08:31
copyright, easier-to-enforce copyright,
1:08:34
stiffer penalties for infringement, and an
1:08:36
increased scope of copyright. The
1:08:38
entertainment industry has grown, and as
1:08:40
it's gotten bigger, the share of income
1:08:43
accruing to the workers who produce the materials
1:08:45
that produce those profits has
1:08:48
shrunk over the same time. And
1:08:50
the reason is that these firms have
1:08:53
between them created checkpoints where if
1:08:55
you wanna reach your audience, you
1:08:57
have to negotiate with them.
1:09:00
And typically, if there's another firm
1:09:02
you can negotiate with, it's another firm that has nearly
1:09:04
identical or actually identical terms.
1:09:06
and they've all converged on a set of negotiating
1:09:09
terms that say whatever copyright you've
1:09:11
been given, you have to hand over to us.
1:09:13
And so giving creators more copyright
1:09:15
won't help in the same way that giving
1:09:17
your bullied kid more lunch money won't help.
1:09:19
Not even if the bullies are like running a national
1:09:22
campaign saying, won't someone think of America's
1:09:24
hungry children, give them more lunch money.
1:09:26
Right? They're they're just gonna take whatever lunch money
1:09:28
you give them. And,
1:09:30
you know, that's the first half of the book. It's just
1:09:32
showing how this monopsonistic,
1:09:35
that's when there's a small number of buyers who
1:09:37
can control their sellers, dynamic
1:09:39
works and how it enables
1:09:42
tactics that range from simply
1:09:45
unethical to just illegal, but no
1:09:47
one can take them on even though they are illegal. We
1:09:49
we document multiple hundreds of millions of
1:09:51
dollars in wage theft from Audible
1:09:53
creators by Amazon. But
1:09:56
the second half of the book are interventions that
1:09:58
aren't just more copyright that
1:10:01
actually do widen out these choke points and,
1:10:03
you know, we talked about one on Triangulation that I
1:10:05
really like, which is that if
1:10:07
you audit your royalty statements, which, generally,
1:10:09
you're contractually allowed
1:10:12
to do, in
1:10:15
order to get the money that you find
1:10:17
is owed to you, the firm
1:10:19
will generally say either you have
1:10:21
to sue us or if you want a voluntary
1:10:24
settlement, you have to submit to nondisclosure.
1:10:26
And so we cite research from
1:10:29
firm that specializes in auditing, recording
1:10:31
industry contracts. They've done tens of thousands.
1:10:34
In all but one instance, over decades,
1:10:37
every accounting error they located was
1:10:39
in favor of the label, not the artist. This
1:10:41
is an amazing coincidence. As I always say, it's
1:10:44
the most incredible localized probability storm
1:10:46
you can imagine. There's no other possible
1:10:48
explanation for why honest errors
1:10:50
would just favor the company making the royalty
1:10:53
statement. But if you go and you find missing money
1:10:55
and we have a source that found a six figure
1:10:57
error in their favor when they audited their royalty
1:10:59
statement, you
1:11:01
have to agree not to tell anyone else who's
1:11:03
being ripped off in the same way where
1:11:05
they should go and look for the money that's being stolen
1:11:07
from them. And there's an actual fix for
1:11:10
this which is relatively straightforward. Because
1:11:12
the industry is so concentrated, all
1:11:15
of its contracts are consummated in New
1:11:17
York, California, and because of Amazon Washington
1:11:19
State, contracts being a matter of
1:11:21
state regulation, you could introduce
1:11:24
laws at the state level that say, as a matter
1:11:26
of public policy, nondisclosure is
1:11:28
not enforceable when it relates to material
1:11:30
omissions or errors in royalty statements
1:11:33
that negatively affect people who are owed royalties.
1:11:35
And then at the stroke of a pen, every
1:11:38
artist in the world, because all of their contracts
1:11:40
are governed by Washington, New York,
1:11:42
or California law, every
1:11:44
creator in the world would suddenly have money
1:11:46
flowing to them, more money than all of the copyright
1:11:49
extensions of the last forty years have ever provided.
1:11:51
And we just fill the back half of the book with this. We've
1:11:53
got, like, you know, twelve or fourteen
1:11:55
of these interventions that actually,
1:11:58
rather than giving artists the right to feel aggrieved
1:12:00
that their copyright is being violated, will
1:12:03
let them, like, buy groceries and put braces
1:12:05
on their kid's teeth. Yeah.
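The "localized probability storm" in that auditing anecdote is easy to put a number on: if accounting errors were honest noise, each one would favor the label or the artist with roughly even odds, so the chance of every error landing on the label's side collapses exponentially. The error count below is hypothetical, chosen for illustration; the source only says the firm has done tens of thousands of audits with all but one favoring the label.

```python
# If errors were unbiased coin flips, P(all n favor the same side) = 0.5**n.
# n = 30 is a made-up error count, tiny next to the tens of thousands of
# audits cited in the conversation; even so, the odds are about one in
# a billion.
n = 30
p_all_favor_label = 0.5 ** n
print(f"1 in {1 / p_all_favor_label:,.0f}")
```

With any realistic error count, the probability is so small that bias, not chance, is the only plausible explanation, which is the point being made.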
1:12:09
It's nice because you focus on artists. Obviously,
1:12:11
it affects everybody from Amazon warehouse
1:12:14
workers
1:12:15
to burger flippers.
1:12:17
But it, you
1:12:19
know, we artists, and I put myself
1:12:21
in this category, we creators
1:12:24
wanna create and they take
1:12:26
advantage of that fact. Right? They say,
1:12:28
fine. Go ahead. You keep
1:12:30
creating. We'll take care of the rest. Thanks.
1:12:33
Let's take a little break. Wanna
1:12:35
talk more. We've got this
1:12:38
new Journalism Competition and
1:12:40
Preservation Act, which, thanks
1:12:43
to the Ted Cruzes, is moving
1:12:45
forward. Lots
1:12:47
more to talk about, including Elon Musk's robot.
1:12:49
We didn't we didn't even get to that yet. But
1:12:51
first, a word from our sponsor. Cory Doctorow is
1:12:53
here, and Alex Kantrowitz, who is from the
1:12:56
Big Technology podcast. Our
1:12:58
show today brought to you by eight sleep.
1:13:00
If I am well rested tonight, you can thank
1:13:03
my Eight Sleep Pod
1:13:05
Pro Cover two. Now they've gotten
1:13:07
even better. Eight Sleep Pod
1:13:09
cover. If you've got an Eight Sleep
1:13:11
mattress or a pod cover, you already know.
1:13:14
That's how I found out Kevin Rose said,
1:13:16
oh, you gotta do this. Amy Webb got it,
1:13:18
then she said, oh, you gotta do this. So finally about
1:13:20
a year ago, we got it. Eight Sleep
1:13:23
is the only sleep technology that dynamically cools
1:13:26
and heats each side of your bed.
1:13:29
And the reason it's sides is so your spouse
1:13:31
can have her temperature as well to maintain
1:13:33
the optimal sleeping temperature. for
1:13:36
what your body needs. Thanks to Eight Sleep. I'm
1:13:38
getting more than half an hour,
1:13:40
more deep sleep every night, up over
1:13:42
up to an hour now. It
1:13:45
combines dynamic cooling and heating with
1:13:47
biometric tracking. So it's watching you
1:13:49
as you sleep.
1:13:51
So it turns out your body kind
1:13:54
of wants it to get cool. As
1:13:56
you go into deeper sleep, I have my
1:13:58
Eight Sleep set to be warm when
1:13:59
I get in bed because it's nice and cozy. and
1:14:02
then cool off. I go into deeper
1:14:04
and deeper sleep from REM to
1:14:06
deep sleep and then back up, and then it
1:14:08
warms up again in the morning. But it'll
1:14:10
adjust automatically, or you can adjust it manually,
1:14:13
as cool as fifty five degrees Fahrenheit,
1:14:16
which was awesome during the
1:14:18
heat wave we had this summer. And
1:14:21
in the winter, as hot as a hundred ten
1:14:23
degrees Fahrenheit, it monitors not only your
1:14:25
temperature, your movements, but also
1:14:27
the room temperature. Clinical
1:14:28
data shows Eight Sleep users experience
1:14:31
up to a nineteen percent increase in recovery,
1:14:33
up to thirty two percent improvement in sleep quality,
1:14:36
up to thirty four percent more deep
1:14:38
sleep. More deep sleep makes a
1:14:40
huge difference. You know, I just
1:14:42
feel it the next day. I feel
1:14:44
like I am, you know,
1:14:47
alive and awake, and everything feels
1:14:49
better; my mental clarity is better.
1:14:53
Eight Sleep just launched, in fact, we gotta get one, this
1:14:55
new next generation pod, the Pod 3,
1:14:58
which has more sensors, double the amount
1:15:00
of sensors so they've got more accurate sleep and
1:15:02
health tracking, giving you the
1:15:04
absolute best sleep experience
1:15:06
on Earth. I'm still loving
1:15:08
our Pod 2 Pro cover. I'm very
1:15:11
happy with it. It's
1:15:12
not magic, but it definitely feels
1:15:15
like it. I love the night's
1:15:17
sleep I get. And you might need this.
1:15:19
More than thirty percent of Americans struggle with
1:15:21
sleep. One of the primary
1:15:23
causes of poor sleep? Feeling hot at
1:15:25
night. Feeling
1:15:27
hot at night. You don't wanna be
1:15:29
hot at night. Go to eight sleep dot com slash
1:15:31
twit. I'm gonna get you a hundred fifty dollars
1:15:34
off at checkout. Sleep cozy
1:15:36
this fall. Eight Sleep currently ships within
1:15:38
the US, Canada, the UK, select
1:15:40
countries in the EU, and Australia.
1:15:42
Eight sleep dot com slash
1:15:45
twit. And
1:15:45
if you go there, you'll save a hundred fifty dollars
1:15:48
at checkout on your
1:15:50
Pod. Get the Pod.
1:15:52
Trust me. Trust
1:15:53
me. You'll sleep better, and everybody needs
1:15:55
a better night's sleep. Thank
1:15:56
you, Eight Sleep, for your
1:15:59
support. What keeps me up
1:16:01
at night is the idea of having a humanoid robot
1:16:03
walking around through my house. Fortunately,
1:16:06
it probably isn't gonna be Elon
1:16:08
Musk's
1:16:10
robot.
1:16:11
Elon
1:16:13
and Tesla have an AI event
1:16:15
every year. Last year,
1:16:17
you may remember Musk
1:16:20
talked about his humanoid robot and
1:16:23
brought out a dancer in
1:16:25
a costume
1:16:28
to show it off. Let me see if I can find a video
1:16:30
of it.
1:16:31
Optimus is what he's calling it.
1:16:33
I
1:16:33
think because he's a fan of the Transformers,
1:16:36
I don't know. They've certainly
1:16:38
made progress over the leotard -- Yeah.
1:16:40
-- person. Yeah. Definitely better than the
1:16:42
leotard person. It
1:16:45
barely could walk by itself. Here
1:16:47
it comes. I'll turn off the sound because it's
1:16:49
just too annoying.
1:16:52
Elon wants to put these in his
1:16:54
factories, but I think he also feels, he says,
1:16:56
it could be transformative for civilization.
1:17:00
He's still a little bit behind Boston Dynamics,
1:17:03
not exactly doing back flips or opening
1:17:05
doors for people, But
1:17:08
do you really That's
1:17:10
the dancer from last year. It
1:17:12
is not doing that. It was
1:17:14
not doing that. I would
1:17:16
take the dancer. I would take the dancer.
1:17:18
This is Elon's video
1:17:21
of the robot, very shakily,
1:17:24
delivering. Oh, maybe you can water your
1:17:26
plants. Oh, oh, that's good.
1:17:30
I thought what's her name? Chelsea
1:17:32
Steiner at the Mary Sue had a
1:17:34
good summation here.
1:17:36
Like all Musk promises, this one is vague,
1:17:38
not impressive, and riddled with issues. It's
1:17:40
also wildly unrealistic to imagine that the
1:17:42
robot will be capable enough to replace the labor
1:17:44
force as we know it. Call me when Optimus
1:17:47
can do half the moves of this superstar. Here's
1:17:49
the Boston Dynamics robot
1:17:51
doing back flips, dancing. Even
1:17:53
these, though, are carefully curated videos,
1:17:56
because, you know, half the time it falls over.
1:18:00
Yeah. I know. But who wants this anyway?
1:18:02
Elon's factory already is loaded with robots,
1:18:04
but they're the traditional giant
1:18:06
German and Japanese robots that could pick
1:18:08
up a car, turn it around, and put it
1:18:11
back on the line and so forth.
1:18:13
But can they automate
1:18:15
the racism of his factories? No. No. That's
1:18:17
something only a human can really do. Right now,
1:18:19
their wage bill for doing the racism
1:18:21
is very high, and they're gonna need to automate
1:18:23
that some other way. And I have to bring in Tay.
1:18:27
Facebook's bot? Let Tay do
1:18:29
it. Yeah. Yeah. Maybe it was Microsoft's.
1:18:31
Oh, but here's
1:18:33
another unpopular take since I guess that's
1:18:35
my role on the show today. I think
1:18:37
we should be hesitant before we
1:18:40
discount, you know, this type of thing. And
1:18:42
it might not be the Musk bot. Okay?
1:18:44
Which by all accounts is not a very impressive
1:18:46
machine. But we are seeing
1:18:48
real interesting advances in
1:18:50
robotics right now, humanoid style
1:18:52
robotics. And like we've seen
1:18:55
with AI, these things, you know, we
1:18:57
see these breakthroughs of research. We get
1:18:59
lots of duds. And then all of a sudden,
1:19:01
we type in a sentence and the robot is drawing a picture
1:19:03
for us, or we talk to it, and we'd have
1:19:05
a Google engineer
1:19:07
feeling that it's sentient. And I think that
1:19:09
there's an interesting report that no one's paying
1:19:12
attention to. Everyone pays attention to what Musk
1:19:14
does. But an interesting report
1:19:16
along this line is that Amazon
1:19:18
has been training robots
1:19:21
to pick items out of
1:19:24
boxes and then, you know,
1:19:26
stowing them. So most
1:19:28
of Amazon's workforce in the in the warehouses
1:19:30
are people that pick stuff out of boxes and
1:19:32
put it onto shelves, pick it from the
1:19:34
shelves, and put it into crates to be shipped
1:19:37
off, and they're making real progress there.
1:19:39
And so I think this idea of, like, you know,
1:19:41
Elon's robot is easy to laugh at,
1:19:43
but the advances that we're seeing you
1:19:45
know, in this type of robotics are very,
1:19:47
very real. And the fact that Musk is
1:19:50
talking about this appearing in the warehouse,
1:19:52
so it doesn't come out of nowhere. And there is definitely
1:19:55
some progress here.
1:19:56
And I'm not
1:19:58
laughing. I think that
1:19:59
this is some serious technology to be
1:20:02
reckoned with. You know, maybe not coming from Elon
1:20:04
Musk, but it's here today for sure. There's
1:20:06
a there's a term out of AI research
1:20:08
which is a centaur which is when
1:20:11
you have human AI collaboration, like
1:20:13
a chess robot and a chess player playing
1:20:15
together to do things that neither
1:20:17
could do on their own. But there's also this
1:20:19
term out of labor economics, the
1:20:21
reverse centaur, which
1:20:23
is when the body is the inconvenient
1:20:26
meat puppet for the AI. And
1:20:29
Amazon are kind of the masters of that. I'm gonna
1:20:31
paste a link into the chat of a
1:20:33
thing I wrote about reverse centaurs and Amazon.
1:20:35
So that would be things like their drivers
1:20:38
being subjected to kind of
1:20:41
superhuman conditions or
1:20:43
superhuman constraints by
1:20:46
the AIs
1:20:49
that are monitoring them in the cars, or
1:20:51
packers being driven to do
1:20:54
unrealistic working tempos
1:20:57
in their warehouses. And and Amazon
1:21:00
leads the country in warehouse injuries.
1:21:02
Yeah. And the more automated an Amazon
1:21:05
warehouse is, the more injuries
1:21:08
people incur. I'll
1:21:10
stipulate that, like, automation is
1:21:12
a thing, and robots are
1:21:14
cool, and that Amazon
1:21:17
has every motive in the world to try to make good robots.
1:21:19
But I also wanna sound the note of caution
1:21:22
that a lot of what Amazon has booked to
1:21:24
its shareholders as profits in automation
1:21:27
have really been ways to get people to work
1:21:29
at an unsafe tempo in
1:21:31
ways that put themselves at risk, and in the case of their
1:21:33
drivers, put other people, other users
1:21:35
of the road, at risk. I know a kid who works at
1:21:37
Whole Foods owned by Amazon. and
1:21:40
it's a very different experience. Sometimes, some hours,
1:21:43
he's working for Whole Foods, some hours, he's
1:21:45
working for Amazon. And the
1:21:47
minute he's on the clock for Amazon, there
1:21:50
are very clear, almost
1:21:53
unachievable metrics for his performance,
1:21:55
and
1:21:55
they're made very clear.
1:21:57
and it is a very different
1:21:59
experience.
1:21:59
It's kinda... you nailed
1:22:01
it. You are
1:22:03
at the mercy of the machine. Yeah.
1:22:06
And I need to state that I'm not celebrating this
1:22:08
stuff. But when we look at that... No. No.
1:22:10
I think you're right, Alex. Don't downplay it. I think
1:22:12
we've seen this very, very powerful
1:22:15
technology. I agree. We've seen this with the explosion,
1:22:17
thanks to Stable Diffusion and DALL-E 2,
1:22:19
and Midjourney. I mean, just an explosion
1:22:22
in AI art. Do you
1:22:24
think though you know,
1:22:26
initially, my initial reaction was this is
1:22:28
like a Cambrian explosion. Like, suddenly,
1:22:31
this is taken off, and we've reached
1:22:33
the hockey stick with AI in
1:22:35
some interesting way. And
1:22:37
now more and more I'm thinking it's a
1:22:39
parlor trick, that
1:22:42
the AI is just, it's
1:22:45
not really AI almost. It's just combining
1:22:47
images together in an interesting way.
1:22:50
Wait. What's your take on it, Alex? Have
1:22:52
you tried DALL-E before? Yeah. No. It's very
1:22:54
impressive, and Stable Diffusion and Midjourney.
1:22:56
Yeah. Stable Diffusion seems to be
1:22:59
the fastest moving because it's open
1:23:01
source, people can run it on their own servers, and there are
1:23:03
a lot of people who've adopted it. I
1:23:05
follow the Stable Diffusion
1:23:08
subreddit, and it's pretty amazing
1:23:10
what what they're doing. And yet,
1:23:14
I'm not convinced it's I
1:23:17
don't know. Is it AI? Is it true AI?
1:23:20
Of course. Yeah. Yeah. And and the images
1:23:22
are gonna be as creative as, you know,
1:23:24
the prompts that you write. I don't think we've
1:23:26
had anything like this to, you know, in our history
1:23:29
before. And, you know, there's there's so many
1:23:31
interesting applications. Here's one.
1:23:33
I think that so anyone who's worked
1:23:36
with the marketing department, you know,
1:23:38
knows that it's always a struggle to communicate
1:23:40
to the creative department what you need them to
1:23:42
create or what you'd like them to create, because
1:23:45
words are imprecise and, you know, a picture's worth
1:23:47
a thousand words. So you can't just
1:23:50
be like, I need a picture of this, and they'll know exactly
1:23:52
what you mean. Actually, you know, graphic
1:23:54
designer to product manager or marketing manager
1:23:56
is one of the most difficult pieces of communication
1:23:59
I think in the business
1:23:59
world. and
1:24:00
I've been there, and it's tough. Yeah. And there are ways
1:24:02
you try to get around it, to make this more prominent or we're
1:24:04
trying to get this message across. We have creative briefs.
1:24:07
Now, I think with this stuff, the amateurs
1:24:09
can make a version of the image,
1:24:11
you know, with something like DALL-E and then
1:24:13
pass it to the professionals, and, you
1:24:16
know, vice versa. Both
1:24:18
of these systems seem to really work well
1:24:20
with a sketch. Here's
1:24:22
an example of using old cartoons as
1:24:25
input images. Here's Fred Flintstone
1:24:27
turned into this. So
1:24:29
you start with, if you can see the starting point,
1:24:32
it was Inspector Gadget,
1:24:34
you know. I mean, I don't know. I continue
1:24:36
to be blown away every time I do, you
1:24:38
know, one of these searches. And, I mean, Leo,
1:24:40
it kinda shows you
1:24:42
how many advances we've made. I mean, how
1:24:44
serious the advances we've made are.
1:24:47
If you're sitting there typing
1:24:49
in a sentence prompt, you know,
1:24:51
in your browser, and the next thing you know,
1:24:54
a machine will draw it for you. Yeah. You're
1:24:56
like, man, you know. Like,
1:24:58
think about how far we've come. You know,
1:25:00
if that's your reaction, Yeah. This
1:25:02
stuff becomes unremarkable once
1:25:04
it works because
1:25:05
we expect it, and I think that's the place we
1:25:07
are today with these programs. Look at this
1:25:10
progression from
1:25:11
what is essentially a stick figure,
1:25:13
as this is y
1:25:15
flu and Stable Diffusion combined
1:25:17
together, as
1:25:18
it creates a more and more realistic
1:25:21
little human. So
1:25:22
it is a human partnership, I guess.
1:25:25
That makes sense. The funniest thing I saw on Reddit
1:25:27
last week, I just pasted the URL for it into
1:25:29
the chat, was a
1:25:31
Drake meme whose title
1:25:33
is what makes you a human, and the
1:25:36
pushing-things-away panel is
1:25:38
to love and care about others, and the yes,
1:25:40
that's-right panel is selecting all images with
1:25:42
bicycles. Oh,
1:25:44
isn't that interesting? yeah. That's good.
1:25:47
Talking about CAPTCHAs. Yeah. Yeah.
1:25:49
So it's it's I mean, I
1:25:52
I agree that that this is a a very
1:25:54
powerful technology. It's
1:25:56
it's super interesting to work with.
1:25:58
You
1:25:59
know, I've also been where you are, trying to find
1:26:01
art to use as reference to give to an illustrator.
1:26:04
Being able to describe art is really great. I
1:26:06
mean, Google image search was
1:26:08
a huge phase
1:26:11
change for that as well. Yeah. And,
1:26:13
you know, I was an Imagineer
1:26:15
for a while. And in the Imagineering
1:26:18
archive, there's a room or I don't even know if
1:26:20
it's still there, but there's a room with bankers boxes
1:26:22
filled with magazine clippings of illustrations --
1:26:25
Yeah. -- organized by subject. And
1:26:27
if you had to draw a water
1:26:29
fountain, they just had a box
1:26:31
of clipped-out illustrations of water fountains
1:26:33
that you would ring down to archives, and
1:26:36
they would send up a box of reference
1:26:38
for you. Yeah. You know, so
1:26:40
this is definitely something that is
1:26:42
making the lives of illustrators easier,
1:26:45
making it easier for people who aren't illustrators
1:26:47
to talk to people who are and say, that's what
1:26:49
I mean when I say water fountain, this picture
1:26:51
here is the kind of water fountain I mean, not
1:26:53
this picture over here. And and
1:26:56
certainly, like DALL-E and all the other
1:26:58
ones help there as well. But
1:27:00
I will say that I don't see a path
1:27:02
from statistical
1:27:05
inference and, you
1:27:07
know, deep learning networks to
1:27:10
AGI. I think that
1:27:12
to say that if we keep improving statistical inference,
1:28:15
we get AGI is like saying if we breed
1:28:17
these horses carefully enough, we'll have a
1:28:19
locomotive. Right. Right. And general
1:28:21
artificial intelligence is human-scale
1:28:23
intelligence, right? But is there anybody
1:27:25
actually arguing that Corey? Oh, yeah.
1:27:28
Tons. Oh, yeah. That's the whole argument.
1:27:30
Right? The whole argument that -- Mhmm. --
1:27:32
well, the whole, like, Nick Bostrom, Elon
1:27:34
Musk, Skynet is coming
1:27:36
out of our AI, I have looked at OpenAI
1:27:39
and what it can do and now I'm afraid for the
1:27:41
human race. They're they're basically saying
1:27:43
we are gonna selectively breed this horse
1:27:45
long enough that eventually we're gonna have
1:27:47
a locomotive and then it's gonna kill us all.
1:27:50
And it it is like such obvious
1:27:53
nonsense to me that I'm quite baffled by
1:27:55
it. And what
1:27:57
is the discontinuity? So with
1:27:59
horses and locomotives, it's obviously clear.
1:28:02
But this is about human cognition.
1:28:04
Is there something about human cognition
1:28:07
that is unreachable?
1:28:09
No. No. It's just that human
1:28:12
cognition is not statistical inference. You
1:28:14
know, we don't understand it entirely. It's
1:28:17
not... I mean, statistical inference
1:28:19
might be a component of it, it probably
1:28:22
is. But the idea that that
1:28:25
increasing innovation
1:28:28
in the realm of statistical inference eventually
1:28:30
produces human
1:28:32
cognition is just wrong.
1:28:35
And you know, there's this
1:28:37
corollary, which is the automation
1:28:41
unemployment corollary. Right? The
1:28:43
looming automation unemployment crisis,
1:28:45
which again, I think, just
1:28:47
isn't right. Like,
1:28:49
its foundations are wrong. So, like,
1:28:52
you look at the stories about automation
1:28:54
unemployment, you see things like, oh, the
1:28:56
most popular job in America is truck driver,
1:28:59
and driving trucks is something that we can do
1:29:01
with ML because we can give them
1:29:03
a dedicated lane on the highway and set
1:29:05
them to following each other and basically invent
1:29:07
a shitty train. But you
1:29:12
know, the thing is that the Bureau of Labor Statistics
1:29:14
category for truck driver incorporates
1:29:17
anyone who operates a heavy goods vehicle.
1:29:19
So, like, sixteen-wheeler
1:29:21
driver is not the most popular job
1:29:24
in America. It is a relatively
1:29:26
unimportant part of our overall economy,
1:29:28
not that those people are unimportant, but like they
1:29:30
you know, if all of the truck drivers were unemployed tomorrow,
1:29:33
the change in the unemployment figures would not
1:29:35
be very large. Meanwhile, we're
1:29:37
just not getting anywhere with the self-driving cars.
1:29:40
Right? Like, there's so much
1:29:42
smoke and mirrors in self-driving cars, the only
1:29:44
ones that seem to perform at all are the ones
1:29:46
that actually just have a human
1:29:48
remotely driving
1:29:49
the car, overseeing the car,
1:29:51
And if those people's attention wanders, which
1:29:53
it will inevitably,
1:29:55
then those cars become murder bots.
1:29:57
And, you know, it raises an important point, which
1:29:59
is
1:29:59
that
1:30:00
we already have a lot of human intelligence.
1:30:03
Like, we
1:30:04
have billions of humans. We
1:30:06
don't have enough non human intelligence.
1:30:08
Right? Like that, the capacity to be
1:30:10
vigilant for things that happen very rarely
1:30:13
is is not a capacity that humans
1:30:15
mostly have. a few people may
1:30:17
have it, but it's not a widespread trait
1:30:19
in our population, which is why the TSA
1:30:22
is really good at spotting water bottles
1:30:24
and really bad at spotting guns. because
1:30:26
they
1:30:27
never see a gun. Right? But they see water bottles
1:30:29
all day long. Like, you cannot keep neurons
1:30:31
trained to
1:30:32
do pattern recognition for
1:30:34
a pattern that you never encounter
1:30:36
because those neurons will be retrained to
1:30:39
make you better at the
1:30:40
pattern recognition that you do all day.
1:30:42
Right?
1:30:42
And so they just forget like you
1:30:44
can train them to spot guns on an x-ray,
1:30:46
but then they'll forget not because
1:30:48
they're lazy or whatever, but because they never see guns
1:30:51
on an x-ray. They see water bottles all day long.
1:30:53
That's
1:30:55
interesting. So your argument doesn't
1:30:58
require a notion of a human soul or
1:31:00
some sort of magical capability of
1:31:02
cognition. Yeah. No. It's just also
1:31:04
the cost. Yeah. I think there
1:31:07
are conversations that really
1:31:09
occur on the fringes about this stuff leading
1:31:11
to general intelligence. And I
1:31:13
think that, you know, the mainstream
1:31:15
conversation about this stuff looks at it
1:31:17
rationally and says there's
1:31:19
a lot of stuff that we can do, humans and artificial
1:31:22
intelligence combined. And
1:31:24
sorry. Go ahead.
1:31:26
I beg your pardon. Go ahead.
1:31:29
No. No. You finish. I'm sorry. I thought you're done.
1:31:31
Yeah. I apologize because the latency in
1:31:33
the Skype is killing us as
1:31:35
usual. Not Skype, Zoom. Go
1:31:38
ahead, Corey. I was just gonna
1:31:41
say during the lockdown, the World Economic Forum
1:31:43
asked me to give them a talk on technological unemployment.
1:31:46
And when I sent them the text of the talk, they
1:31:48
withdrew the invitation. So I turned it into a
1:31:50
column. and it's the
1:31:52
it just put it in the URL there. But You weren't
1:31:54
saying what they wanted you to say, Corey. I think, basically,
1:31:57
I said, like, I don't think we're gonna have
1:31:59
AI driven unemployment because even
1:32:01
if we automate some stuff, like, we're gonna
1:32:03
have to, you know, relocate every
1:32:05
city twenty kilometers inland over the next three
1:32:08
hundred years. That's full employment for everyone no
1:32:10
matter how many robots we build, there's just like
1:32:12
more work than than we can imagine, and
1:32:15
they didn't like that at all.
1:32:17
Really interesting. Yeah. Didn't
1:32:19
fit their model. But
1:32:21
they really believe, and that's why I raised
1:32:23
it. They really believe in technological
1:32:26
unemployment, general AI breakthroughs on
1:32:28
the immediate horizon. You know,
1:32:30
these aren't fringe beliefs in
1:32:33
the halls of power or in the
1:32:35
halls of business or even in finance;
1:32:37
they are accepted as gospel.
1:32:40
You know, there are a few things
1:32:43
like general AI, hold on a second.
1:32:45
General AI, fusion, quantum
1:32:47
computing,
1:32:49
where
1:32:50
it seems prudent to perhaps
1:32:53
consider their eventual
1:32:55
invention even if
1:32:57
they're not necessarily around the corner.
1:33:01
Go ahead, Alex. I mean, it's just
1:33:03
not what I hear. You know, I mean, maybe. Do you think it's
1:33:05
gonna happen? No. I mean, will
1:33:07
it happen eventually? Who knows? But
1:33:09
this I I just don't agree with Corey
1:33:11
about, you know, this being an accepted
1:33:14
thing. At least not in the conversations
1:33:16
I hear. You know, maybe the... Oh, you're
1:33:18
saying that people don't believe it, generally?
1:33:21
I don't think so. No. Look at the response
1:33:23
to what happened with this Google engineer who said that
1:33:25
the chatbot was
1:33:27
sentient. Google didn't like it. Google fired him.
1:33:30
Anyone with any standing in the research
1:33:32
community said that's not right. You know,
1:33:34
basically an idiot. Yeah. And -- Yeah. -- yeah.
1:33:36
And, like, I don't know. I mean, Corey,
1:33:38
maybe the, you know, the influencer class
1:33:40
likes to talk about this at their conferences. But
1:33:43
this idea that AGI is right around the corner,
1:33:45
just to me, I've never heard that, you
1:33:48
know, from credible folks in the industry.
1:33:50
Sure. Well, I mean, science fiction
1:33:52
writers certainly think it's nonsense. Right?
1:33:54
Like it is a recurring theme at science fiction
1:33:56
conventions. Why are all these CEOs out
1:33:59
there saying this is
1:34:01
on the horizon? But they are saying Well, I think
1:34:03
the CEOs aren't saying it.
1:34:05
I think the CEOs are saying that it's very
1:34:07
powerful technology. But I don't think
1:34:09
they're saying we're gonna be hand in hand with
1:34:12
artificial general intelligence, and
1:34:14
that's what we're working toward. I mean, you
1:34:16
know, investing in that, and Wall Street can
1:34:18
believe it. Say that
1:34:20
to the Center for Existential Risk, who,
1:34:22
like, you know, have attracted, like, billions
1:34:24
of dollars, hundreds of millions of dollars in
1:34:26
capital. I mean Well, aren't you always saying
1:34:28
that we shouldn't equate money with smarts?
1:34:31
No. It's true. It's true. Right? So what
1:34:33
I'm saying is that it may be a small number
1:34:35
of very rich people who believe this, but there
1:34:37
are some very rich people who believe it and a bunch
1:34:39
of weird stans of theirs who also believe
1:34:42
it. I just
1:34:42
think if you speak with people, credible folks
1:34:44
in the industry, you know, folks who are actually doing
1:34:46
the research. But Nick Bostrom, I mean,
1:34:49
I like the guy. He's a philosopher. He
1:34:51
doesn't work in machine learning.
1:34:53
Yep. and you speak with researchers, you speak with
1:34:55
the tech companies, you know, they might use marketing
1:34:58
terms to talk about the power of their artificial intelligence.
1:35:01
Sundar Pichai calling it, you know, as powerful
1:35:03
as fire. You know, that sounds like marketing to
1:35:05
me. But I never hear him or
1:35:07
anybody at Google talking about you know,
1:35:09
us reaching artificial general intelligence outside
1:35:12
of one guy, you know, who actually has an
1:35:14
interesting story to tell, and I did have him
1:35:16
on my podcast. We had an interesting conversation.
1:35:19
That that being said, you
1:35:21
know, he's the extreme
1:35:23
exception
1:35:24
and not the rule.
1:35:26
So that's good. I
1:35:28
have... this is the... oh, yeah. Here it is.
1:35:32
Blake Lemoine. Right? Blake
1:35:35
Lemoine. Yeah. Yeah. Blake Lemoine. Yeah. Yeah. Yeah.
1:35:37
Mhmm. Nice. Yeah.
1:35:39
I think think the picture
1:35:41
kind of says it all, if that's not Stable
1:35:44
Diffusion. I think it's a
1:35:47
tough one, I guess. Well, he's
1:35:49
an interesting character. He's not dumb. Yeah.
1:35:51
By any means. you know,
1:35:53
obviously. But he's also a priest. He's spiritual
1:35:55
in ways that, you know, I think are pretty relevant
1:35:58
-- Right. -- to the story. And
1:36:00
he had some pretty neat interactions with LaMDA.
1:36:02
He drew the conclusion, and he was, I think, predisposed
1:36:05
to believe that there was gonna be a, you
1:36:07
know, general intelligence that is gonna speak
1:36:09
to him, you know, through a bot. He had, like,
1:36:11
talked about it before he came out saying he
1:36:13
believes LaMDA is this intelligence, but
1:36:15
he also believes in fairies. I mean.
1:36:17
What I told him is that I think that
1:36:20
he's wrong -- Right. -- and he'll be in the history
1:36:22
books. So I think that will be his self-described
1:36:24
reasoning. Yeah. He's a mystic. He calls himself
1:36:26
a mystic. Right? Look, I encourage people who
1:36:28
are curious to give this one a listen. I had
1:36:30
Blake on the Big Technology Podcast.
1:36:33
We spoke for an hour and a half about his
1:36:35
interactions with LaMDA. I kinda thought it was pretty
1:36:37
interesting. You know, it's worth listening to his perspective
1:36:39
-- Yeah. -- talking about how the bot,
1:36:41
you know, hired a lawyer. This again goes to, like,
1:36:44
my my comment before about how
1:36:46
far we've come that if the technology can
1:36:48
now fool a Google engineer into thinking
1:36:50
it's sentient, then It's probably, like, pretty
1:36:52
interesting technology that we should be focusing
1:36:54
on, you know, what it can do, the dangers of it,
1:36:56
etcetera, etcetera. And I do think that often
1:36:59
these conversations about is it sentient
1:37:01
or not, you know, kinda take our eye off
1:37:03
the ball -- Yeah. -- on on that front. And
1:37:05
then I also had Gary Marcus on,
1:37:07
who, anyway, I had Gary
1:37:10
Marcus on, who called
1:37:12
it something like foolishness
1:37:14
on stilts or something like that. Nonsense on
1:37:16
stilts. And we talked through all the other,
1:37:18
you know, counterarguments to Blake.
1:37:21
But yeah. Look, I think the
1:37:23
one thing that we can say is we're in a very
1:37:25
interesting moment in technology where research
1:37:28
is moving forward and the practical
1:37:30
uses of the stuff that is
1:37:32
being developed -- Isn't moving forward. -- Is it pointless
1:37:35
to speculate about general
1:37:37
AI, though? I mean, is
1:37:39
this so far ahead that, why should we think about
1:37:41
it? Or should we be thinking about
1:37:43
it now? I mean, is it fun? You know,
1:37:45
that's, like, we we have to admit that,
1:37:48
like, humans have a high capacity and
1:37:50
interest in fun. And it's fun to speculate
1:37:52
about just like to say, think about how much energy
1:37:54
goes into people trying to game out sports games
1:37:57
before they happen. You know, is Mark Zuckerberg
1:37:59
gonna be in the ring? You know? We like, but I don't think
1:38:01
that it's know, there's there's practical, and
1:38:03
then there's just enjoyable. Like, we
1:38:05
all love to let our minds wander. This is why
1:38:07
this is such a big part of science fiction. you
1:38:09
know, to think about where it could go. But, like,
1:38:12
But, you know, the idea that
1:38:14
it's, you know, it's coming, it's here, and
1:38:16
we better get ready to make our robot friends
1:38:18
pretty quickly, you know, that
1:38:20
seems outlandish. Yeah. Bostrom's simulation
1:38:23
hypothesis is really just a fun thing to think
1:38:25
about, but kinda Yeah. And pointless to spend
1:38:27
any real energy on. I had a
1:38:29
very interesting conversation with Bostrom for
1:38:31
my book. And, you know, I just was like, this was
1:38:33
a Black Mirror chapter that I wrote. And I was like,
1:38:35
alright. Nick, like, go ahead and tell about
1:38:37
all the terrible things that are gonna happen.
1:38:40
He's like, listen, I've kind of
1:38:42
gotten this bad brand because I did think about
1:38:44
it, but it's actually gonna be good, you know,
1:38:46
when it comes here. So
1:38:49
general AI or the simulation,
1:38:51
this is a new
1:38:53
one, Leo. Yeah. Let's talk about
1:38:55
the simulation. You know, is there free will?
1:38:57
Or, you know, is there any difference between a simulation
1:39:00
and what we're living here? We were
1:39:02
debating that in August. Yeah. I don't think
1:39:04
there's exactly there's nothing new about
1:39:07
about it. Right. But the cool thing about this is
1:39:09
that it does help reframe, you know, the discussion,
1:39:11
or, like, right? Give us a new angle to think about
1:39:13
it. But I think right now what it
1:39:15
is, is fun. It's fun. So There
1:39:17
was a wonderful book this year from James
1:39:20
Bridle, who you may know from some
1:39:22
of his weird stunts. He's the guy who built
1:39:24
a self driving car and then surrounded it in a
1:39:26
circle of salt that it thought was
1:39:28
a road marking it
1:39:30
couldn't cross. He's also the guy who
1:39:32
did the research that found that
1:39:34
YouTube Kids was full of all these weird
1:39:36
semi-automated videos. Sure. Yeah.
1:39:39
Yeah. Yeah. So he wrote this book called
1:39:41
Ways of Being that's about extending
1:39:45
a view of personhood to the inanimate,
1:39:48
to regular software,
1:39:50
to machine learning and so on that makes quite an interesting
1:39:52
case for it. I just pasted
1:39:54
a link to my review of the book
1:39:57
into the chat there. He
1:39:59
is a fascinating guy and, you
1:40:01
know, he makes a great case for
1:40:03
the idea that the fact that something
1:40:06
isn't intelligent does
1:40:08
not mean that we shouldn't think of it as a person.
1:40:10
Wow. Ecosystems and
1:40:15
rocks and stuff. You don't need a Gaia
1:40:17
hypothesis to treat the earth with respect. There's
1:40:19
this car in the salt circle.
1:40:23
What happened? Did it just sit there? It couldn't
1:40:26
move? And
1:40:28
he wrote about how it made him feel bad, right,
1:40:30
that he invented a... He hurt its feelings.
1:40:33
And and, well, it hurt his feelings was
1:40:35
his point. And that, like,
1:40:37
the act of,
1:40:39
you know, acknowledging that it feels bad to
1:40:42
design a
1:40:44
thing to do something and then frustrate it
1:40:47
is a step towards a kind of wider empathy.
1:40:50
I really like him.
1:40:52
Do you think he apologizes to
1:40:54
Amazon's Echo if he swears
1:40:57
at it? Do you know there's a lot
1:40:59
of people who think you should? Who think Echo
1:41:01
thinks you should. Voice assistants. Go ahead,
1:41:03
swear at Echo. It'll chastise you
1:41:05
for
1:41:05
for mistreating it.
1:41:07
Yeah. That's creepy to me.
1:41:10
I don't I think that's a
1:41:12
bridge too far. But there's,
1:41:15
you know, my wife and I have an argument. She
1:41:17
says, you shouldn't teach people
1:41:19
to be or require people to be polite
1:41:22
to inanimate objects. That's kind
1:41:24
of imbuing it with more power than
1:41:26
it deserves. It's an
1:41:28
inanimate object. But, you know,
1:41:30
there's a vegan slash vegetarian argument
1:41:32
that says that what whatever
1:41:34
you treat with the least respect is the
1:41:37
floor that
1:41:40
your respect for everyone else won't drop
1:41:42
below. So whatever it is you respect
1:41:44
least in the world, however much respect you afford
1:41:47
that, that's how
1:41:49
little respect you will treat anyone else with.
1:41:51
You'll never treat them with less respect than that.
1:41:53
And so if you raise the floor, for
1:41:55
how much respect you afford to the thing you respect
1:41:57
least, then you end up
1:41:59
raising the amount of respect you bring
1:42:02
to everything else in the world. I like that.
1:42:04
I like that. I think that's good.
1:42:07
Do you do you follow that? Probably
1:42:10
not very well. Do you ever
1:42:12
swear at an Echo? We
1:42:14
don't have any voice assistants in my house.
1:42:16
So that's how little respect you have
1:42:18
for them. Yeah. I have I have enough respect for
1:42:21
them not to have one. Let's
1:42:24
take a little break. Having fun, I have
1:42:26
to say, with Alex Kantrowitz's Big
1:42:28
Technology Podcast. You could
1:42:30
see why you wanna listen to this. Boy, you have
1:42:32
some great people on. That's fantastic. Thanks,
1:42:34
Leo. Yeah. We're having a nice run these days. Yeah.
1:42:36
And his book, of course, Always
1:42:38
Day One, which is about more
1:42:40
than Amazon, apparently. Amazon,
1:42:44
Apple, Facebook, Google, Microsoft. Leaders,
1:42:46
and even Zuck, spoke with me for it. It's
1:42:48
on the tech titans' plan to
1:42:50
stay on top forever. Forever!
1:42:56
There's a fun subtitle. Yeah. I love
1:42:58
it. Did you come up with that or your publisher? We
1:43:00
had a bit of a back and forth about it. They came
1:43:02
up with it. I'm like, I love that. Let's use it. And
1:43:04
they said no. I was
1:43:07
going, really? They said, no. They
1:43:09
said, we don't think we should use it. And I
1:43:11
was like, it's yours. Back and forth, and eventually,
1:43:14
it stuck. We used it. And
1:43:16
it's fun. I think people read into it
1:43:18
what they want. I love it. My
1:43:20
first book I wanted to call it how to get the
1:43:22
dog hair out of the disk drive, but the publisher renamed
1:43:25
it a hundred and one computer answers
1:43:28
you need to know, which
1:43:31
turned out to be a strategic mistake because shortly
1:43:33
thereafter Kim Komando came out with a book called
1:43:35
one thousand and one computer answers
1:43:37
you need to know. Not really. I
1:43:39
think you can guess which one sold better. Right?
1:43:43
It should have been a hundred and one Dalmatian
1:43:45
hairs in the disk drive. There you go. I
1:43:47
think how to get the dog hair out of the disk drive was
1:43:49
a pretty good title, but ah. That's good. Yeah. Yeah.
1:43:52
Our show today brought to you by Podium. You know
1:43:54
text messaging works. If
1:43:57
you wanna reach Elon Musk, text
1:43:59
him. Well, it turns
1:44:02
out businesses can
1:44:04
use text messaging to stay in touch
1:44:06
with their customers. And
1:44:08
this came at a very welcome
1:44:10
time. It's been a tough couple of years for
1:44:13
small businesses, you know,
1:44:15
supply chain issues, COVID.
1:44:19
But one of the things I think we learned
1:44:21
from COVID is
1:44:24
that staying in touch with customers via
1:44:26
text is kind of the way they prefer to
1:44:29
communicate. You know, your food is ready, come
1:44:31
pick it up, your groceries are ready, come pick them up.
1:44:34
A lot of us don't wanna use the phone
1:44:37
to call a business, whether a plumber or a landscaper.
1:44:40
We don't wanna play phone tag. Maybe
1:44:42
we'd prefer to leave a quick message.
1:44:45
But if you're running a business, you could take advantage
1:44:48
of that. If the only way to reach you is with a phone
1:44:50
number, people are actually gonna turn their back.
1:44:52
They're gonna walk away from you. But if
1:44:54
that phone number can be texted or you can have on
1:44:56
your website a widget that says, send me a
1:44:58
text, or you can use text messaging
1:45:00
to reach out to customers. For instance,
1:45:03
every time I leave
1:45:05
my dentist now, I get a text saying,
1:45:07
rate us on Yelp
1:45:09
or Google Business. There's an ice
1:45:12
cream shop in town that uses Podium. Every
1:45:14
three or four weeks, it says, oh, we haven't seen you in a while.
1:45:16
Here's a coupon for ice cream. No.
1:45:19
But it really works. Gets me in the door.
1:45:22
Podium gives businesses the tools to compete.
1:45:25
with the convenience offered by bigger
1:45:27
businesses like Amazon. So this
1:45:29
is really a boon for the small
1:45:31
business from healthcare providers like
1:45:34
my dentist to plumbers. Over a
1:45:36
hundred thousand businesses are texting
1:45:38
with customers using Podium. Customers
1:45:40
love the convenience, but businesses will
1:45:42
love the results. One car dealer sold
1:45:44
a fifty thousand dollar truck in four text
1:45:47
messages. In fact, that's
1:45:49
really that's the only way I wanna interact these
1:45:51
days with
1:45:53
my dealership. Right? In fact, I just got a text because
1:45:55
I'm bringing the car in tomorrow to confirm.
1:45:57
A jeweler sold a five thousand dollar ring,
1:45:59
coordinated curbside pickups, did it
1:46:02
all in text. A dentist who
1:46:04
had gotten really behind on his collections
1:46:07
decided to use text messages; seventy
1:46:09
percent of the outstanding collections came in in two
1:46:11
weeks. It's just easier for people. It's not
1:46:13
that they didn't wanna pay. It just was convenient.
1:46:16
It was easy. It was fast. With Podium's
1:46:18
all in one inbox, you could do more than just chat. You
1:46:20
can get online reviews. Just send
1:46:22
a link. That works so much better.
1:46:25
Collect payments right through podium.
1:46:27
from anywhere. You can send marketing campaigns
1:46:29
that actually get a response. It's just a quick
1:46:31
text and your staff will love it too because
1:46:33
all the communications from your customers come into
1:46:36
one inbox, which makes it easy for
1:46:38
you to keep track of what's going on
1:46:40
with any given customer. I want you to
1:46:42
try Podium. It's really cool.
1:46:44
See how podium can grow your business.
1:46:47
There's a great demo video at podium
1:46:49
dot com slash twit and a pretty
1:46:51
good deal too. P O D I U M
1:46:53
dot com slash twit.
1:46:56
Podium. Let's grow. Podium
1:46:59
dot com slash twit. Let's grow.
1:47:01
Let's go. Let's grow. The ultimate
1:47:03
text messaging
1:47:05
platform. Thank you, podium.
1:47:06
You
1:47:08
actually, Alex, had a good
1:47:10
piece about the
1:47:12
judge in the Court of Chancery
1:47:14
that Elon Musk is going to be facing.
1:47:17
This is coming
1:47:20
up October seventeenth, and
1:47:23
you did a profile of Kathleen McCormick.
1:47:26
Who is she? And is this good
1:47:29
for Elon or bad? She's
1:47:31
a fascinating character. And I would say
1:47:33
it's largely bad for Elon that she's
1:47:35
gonna be there. So just
1:47:37
to give a little context as to who she
1:47:39
is, she grew up in Delaware.
1:47:42
She was the daughter of two public school teachers,
1:47:45
her dad coached football, you
1:47:47
know, her mom eventually rose to become
1:47:50
an administrator, both duties they did
1:47:51
in addition to the teaching. She
1:47:55
is the first person from her town,
1:47:57
Smyrna, which is a middle class town
1:47:59
in Delaware. to go to
1:48:02
Harvard.
1:48:02
And, you know, she thinks she's gonna
1:48:04
go back into education, gets involved
1:48:07
in a legal nonprofit, starts
1:48:09
to see how the law could be used for good. She goes
1:48:11
to Notre Dame, I think because her
1:48:13
dad was a massive Notre Dame
1:48:15
fan,
1:48:18
and gets a law degree there working on
1:48:20
human and civil rights. She actually
1:48:23
takes a job at a nonprofit and goes to argue
1:48:25
in front of the court in private practice and
1:48:28
eventually becomes, you know, a
1:48:30
vice chancellor. And then from there,
1:48:32
she becomes the first female chancellor of the
1:48:34
court in its two hundred and twenty nine
1:48:36
year history. And
1:48:39
she has this very interesting ruling as vice
1:48:41
chancellor where these two
1:48:43
private equity companies: one sells
1:48:46
a cake decorating company to the other,
1:48:48
and then COVID hits. And then they're like,
1:48:50
well, listen, no one's gonna wanna
1:48:53
decorate cakes anymore. The buyer said
1:48:55
that, and they said, okay, we're not gonna
1:48:58
buy this company anymore. And the
1:49:00
seller sued them and it lands in front
1:49:02
of vice chancellor
1:49:03
McCormick. And
1:49:05
she basically says, like, look, you signed a
1:49:07
deal, and I'm the judge. And most
1:49:09
important for me is deal certainty. And
1:49:12
that's what our job is here: just to make
1:49:14
sure that when you agree to a deal, the deal
1:49:16
goes through. And she forced
1:49:18
that cake deal to go
1:49:20
through. That's gotta be scary for
1:49:23
Elon, although
1:49:25
-- Absolutely. -- I don't think DecoPac went
1:49:27
for forty four billion dollars. No.
1:49:29
It was five hundred fifty million, so it was
1:49:31
a much smaller scale. But that's actually
1:49:33
surprising: five hundred fifty, half
1:49:35
a billion, for a cake decorating company.
1:49:38
It's a pretty legit cake decorating
1:49:40
company. Wow. Yeah. Their tagline
1:49:42
is hilarious. It's like we decorate the world's
1:49:44
best cakes or something like that. I mean,
1:49:47
that's pretty good. Half a billion in a decade.
1:49:49
Wow. Not not bad. So Yeah.
1:49:51
No. This one is much bigger forty four billion
1:49:53
dollars. And
1:49:55
I just don't think the size of the deal is gonna
1:49:58
come into it. Everything I know about her doesn't
1:50:00
lead me to believe the size of the deal
1:50:02
or the court's ability to enforce is gonna come
1:50:04
into the ruling that she's gonna give.
1:50:06
Now now the question is, you know, what happens
1:50:08
with this whistleblower testimony, which
1:50:11
I think can throw a wrench in
1:50:14
Twitter's ability to force it to close.
1:50:16
But he did sign the deal saying that
1:50:18
he was gonna buy it. And if you look at
1:50:20
the past precedent of this judge,
1:50:22
it would lead you to believe that, you know, she'll make
1:50:25
it go through, all things being equal.
1:50:27
So very interesting person. It might
1:50:29
be a scary proposition. I'm not sure
1:50:31
I want Elon Musk to own Twitter, to
1:50:34
be honest. Well, neither does Elon. Yeah.
1:50:36
I feel like he should do something to make Twitter
1:50:38
whole. I mean, this whole -- Mhmm. --
1:50:40
drama has caused Twitter staff,
1:50:43
a lot of upset, probably a
1:50:45
lot of money, Elon,
1:50:48
you know, was pretty flippant in the whole
1:50:51
thing. So I feel like he
1:50:53
needs to be somehow chastened,
1:50:55
but boy, I'm not sure I want him to be forced
1:50:57
to buy. Of course. But
1:50:59
that won't come into consideration. No. That's
1:51:02
not a consideration. It's just -- Exactly. -- a contractual
1:51:04
matter. That makes it interesting. And then,
1:51:06
okay, then you get into, okay, so maybe they say,
1:51:08
oh, how much will Elon need to pay Twitter to make
1:51:10
it whole? It has to be,
1:51:12
you know, twenty billion dollars, something
1:51:14
in that range. Yeah. Because you look
1:51:16
at Facebook, right, which is maybe an analog,
1:51:18
which whose stock has gone down, I don't know,
1:51:21
fifty seven, sixty percent this year.
1:51:23
And
1:51:23
without Musk, you know, Twitter
1:51:25
stock would probably fall by that
1:51:27
same amount. So
1:51:28
now
1:51:30
they're gonna really struggle to
1:51:32
operate. Their CEO has, like, very little credibility
1:51:35
inside the company and everyone's leaving. Yeah.
1:51:37
So what's it gonna take? It
1:51:39
might end up being that.
1:51:41
And I've always thought, okay, very little chance
1:51:43
this deal actually goes through. Now I'm
1:51:45
like, well, maybe there is a chance. Well,
1:51:47
the market's inclined to agree with you because the stock
1:51:49
price has been slowly ratcheting
1:51:52
up, not fifty four twenty as Elon promised
1:51:54
to pay, but it's been going up.
1:51:57
And it might be a bargain. I mean, this
1:51:59
is not investment advice. But I mean, if
1:52:01
they got some many billions of dollars
1:52:05
in cash, who knows? They might find
1:52:07
a way forward. The
1:52:09
the question I don't know the answer to, but
1:52:12
maybe you guys do. given
1:52:14
how leveraged and
1:52:17
how
1:52:19
great the profit to
1:52:22
valuation or earnings valuation ratios
1:52:24
are and how much room there is for those stocks to
1:52:26
move. If Elon
1:52:28
has to flog twenty billion dollars
1:52:30
worth of his Tesla stock, does
1:52:33
that trigger a cascade of effects that could
1:52:35
endanger all of his businesses, or some
1:52:37
of his businesses? And would the
1:52:39
judge consider that? I
1:52:41
don't think the judge would consider that, but
1:52:44
yeah, I would say, remember, Tesla
1:52:46
is a story stock. So what happens
1:52:48
-- Yeah. -- to Elon's business genius?
1:52:51
Right? If that takes a hit, that
1:52:53
myth takes a hit. Look, he's a great business
1:52:55
man. He's built some amazing companies, no doubt
1:52:57
about that. But if he makes a very
1:52:59
stupid mistake,
1:53:00
and this might end up proving to be one,
1:53:03
that could really cause an impact to
1:53:05
his other businesses. Yeah. I
1:53:07
think, despite Elon being the richest man in
1:53:09
the world, most of the forty four
1:53:11
billion came from
1:53:13
loans against Elon's Tesla stock and
1:53:16
from -- Mhmm. -- people like Larry Ellison
1:53:18
who threw in a billion. It's just gonna
1:53:20
be tough for him. But two billion from Ellison's
1:53:23
gonna be -- Oh, yeah. -- tough too. So can
1:53:25
the judge compel those
1:53:27
people to follow through or does
1:53:29
the entire burden fall on Elon?
1:53:33
And I think that's gotta be a separate court action.
1:53:35
Yeah. Right? You've gotta say,
1:53:37
does Larry Ellison
1:53:39
sending a DM to Elon going, a
1:53:42
billion or two, you tell me,
1:53:44
hold him to it? That's, literally, by
1:53:46
the way, pretty much word for word. Yeah.
1:53:49
Unbelievable. Yeah. This this thing
1:53:51
this thing will will definitely extend beyond
1:53:53
October because, well, they'll go to appeals and then,
1:53:55
of course, there's the money question. But
1:53:58
but it's it's I don't know. It's definitely
1:54:00
been fun and wild -- Wow. -- to follow.
1:54:03
Wow. But it could do some serious damage to
1:54:05
Elon. No doubt about it. The good news is
1:54:08
he's very rich, but forty
1:54:10
four billion dollars is nothing to sneeze at.
1:54:12
No. And I think you're right. I think that having
1:54:15
to sell that much Tesla
1:54:17
stock, or
1:54:18
a significant portion of it, you know, twenty
1:54:20
billion in Tesla stock. Right. Would
1:54:22
tank the stock. And that's a really
1:54:24
interesting and
1:54:26
Tesla shareholders hate this
1:54:28
already. Oh, yeah. It's not exactly like he's in
1:54:30
the middle of a bull market where the market will just pick
1:54:32
it right back up. Right. Markets aren't forgiving
1:54:34
right now. Yeah. Better
1:54:37
work on that humanoid robot pretty
1:54:39
darn hard Elon. Oh,
1:54:42
well, the market opportunities
1:54:44
for a humanoid robot that can't
1:54:46
do much are really, you know.
1:54:48
There was somebody selling the Astro.
1:54:53
I know. You combine that with the
1:54:55
flamethrowers. And the next thing you know,
1:54:59
you're putting holes in the ground
1:55:01
for your Teslas to drive in. Yeah. Yeah.
1:55:03
And somehow defeating geometry
1:55:05
with the power of your mind so that we
1:55:07
can add more cars to the roads without
1:55:10
creating more congestion. You
1:55:13
gotta make the robot buy Twitter. That's
1:55:15
the only solution. Do you remember
1:55:17
that there was that mesothelioma blog
1:55:19
that would just take stories from
1:55:22
Google news about mesothelioma and
1:55:24
then put them on a blog spot blog
1:55:26
with AdSense and then use the
1:55:28
revenue from that to buy Google
1:55:30
stock. And the idea was that if you ran long enough
1:55:32
you would eventually own Google. It's
1:55:35
like a paperclip AI.
1:55:37
Yeah. Eventually, you own the
1:55:39
universe. Why stop
1:55:41
at Google? Take it. I wanna say
1:55:43
Matt Haughey from MetaFilter built it. Sounds like
1:55:46
a Matt Haughey joint. Absolutely.
1:55:48
Absolutely. And how did it
1:55:50
do? Did it run or was it a thought experiment?
1:55:53
It made a bunch of money. You know, for a while, mesothelioma
1:55:55
was the top AdSense word on Google.
1:55:57
Right. One of those clicks was worth, I don't know, like,
1:55:59
twenty bucks or so. That's the asbestos lung
1:56:02
disease that you see late night TV
1:56:04
ads for. A long time.
1:56:06
Yeah. What about
1:56:09
what do you think about the Journalism Competition
1:56:11
and Preservation Act? Amy
1:56:14
Klobuchar's bill,
1:56:16
which was initially blocked
1:56:19
by Ted Cruz. Cruz, last
1:56:21
week, changed his
1:56:23
mind and allowed it through, and
1:56:25
it advanced out of committee with a fifteen
1:56:27
to seven vote. The seven were Republicans voting
1:56:31
against it, but there were enough Republicans
1:56:33
voting for it that it went through. Ted
1:56:36
Cruz says, I think this amendment protects against
1:56:39
this antitrust liability being used
1:56:41
as a shield for censorship, big
1:56:43
tech hates this bill. That
1:56:45
to me is a strong positive for
1:56:48
supporting it. There are a number of
1:56:50
problems with this bill, including this
1:56:53
thing called the first amendment to the
1:56:55
constitution. I'm
1:56:58
not sure if it'll ever get to a vote on the senate
1:57:00
floor, but it is out of committee. Should
1:57:02
I be worried, Alex? because I
1:57:05
could see a lot of problems with this thing.
1:57:07
You know, I don't think you should be worried. I
1:57:09
think this bill I I like where this bill comes
1:57:11
from, right, which is that publishers don't
1:57:14
have the ability to negotiate with platforms in
1:57:16
in a collective way. And the platforms have
1:57:18
this -- Yeah. -- outsized influence.
1:58:20
Cory's been talking about that. That's in the book.
1:57:23
that, you know, you're
1:57:25
enjoined by law not to collude.
1:57:28
Mhmm. So they have no way to negotiate.
1:57:30
So this gives them essentially a
1:57:34
safe harbor so that they can --
1:57:36
Right. -- get together and negotiate with
1:57:38
Google and Facebook. Yeah.
1:57:41
Now here's my I'm gonna go out on a limb here
1:57:43
also. I don't know what's in the water
1:57:45
this week, but hey, let's go for it. Do
1:57:47
it, man. I think that
1:57:49
publishers trying to make their livelihood
1:57:51
over negotiating with Facebook and
1:57:53
Google are playing losing games. Yes.
1:57:56
I don't think you can depend on a platform for
1:57:59
your -- I mean, yeah, I don't think
1:57:59
you could depend on a platform
1:58:02
with
1:58:02
algorithms in the news feed for your distribution.
1:58:04
I don't think you should wonder, about a platform
1:58:06
like Google that's sending you traffic, you know,
1:58:09
how much they should pay you for displaying the link. I think
1:58:11
that's a net benefit, you know, to publishers.
1:58:13
And I run a small media company,
1:58:16
and we don't depend at all on, you
1:58:18
know, Facebook traffic or Google traffic. And
1:58:20
so and I think it's better that way because
1:58:22
we make the decisions, you know, for the reader, not
1:58:24
for the platforms. Right. And
1:58:26
so, ultimately, like, I think that having this
1:58:28
right to negotiate is good. But
1:58:31
do I think that it's like an earth shattering thing?
1:58:33
No. Especially given that Facebook has
1:58:36
made a real effort to reduce news
1:58:38
links in the news feed,
1:58:41
where they used to be a big portion of what was
1:58:43
going on on Facebook. Now they're getting
1:58:45
close to nonexistent. And I think that's
1:58:47
largely good. I don't think we should get our news through
1:58:49
Facebook. I think they should find other ways to do it. There
1:58:51
is the argument. This is based on the Australian
1:58:54
bill which was promoted by Rupert
1:58:56
Murdoch who wanted to you know, get a little more
1:58:58
link money out of the big giants.
1:59:01
And there is the argument that it's
1:59:03
somewhat worked in Australia.
1:59:06
Despite Facebook's retaliatory attempt
1:59:09
for a while, Facebook said, well, no more news
1:59:11
links for you. There's
1:59:13
also the concern though that
1:59:17
it doesn't apply to anybody
1:59:19
who employs more than fifteen hundred people.
1:59:21
And there's some concern that
1:59:24
in order to make this work,
1:59:27
private equity will go around buying newsrooms
1:59:29
and cut it down to fourteen hundred and ninety nine
1:59:31
people. People probably
1:59:33
have wild imaginations. I
1:59:36
mean, it's not gonna be the main part of
1:59:38
a company's business, the idea that private equity
1:59:40
will go and trim. I mean, private equity
1:59:42
trims for its own reasons, but not to
1:59:44
take advantage of the protections under this bill.
1:59:47
Cory, you've pointed out that, you know,
1:59:49
journalism has definitely suffered. Actually,
1:59:52
you have a whole chapter talking about Craigslist
1:59:55
versus Google
1:59:57
links, and which is the most
1:59:59
damaging
2:00:02
to news. Yeah.
2:00:04
I mean, I think that the
2:00:07
problem with this solution is that it
2:00:09
misunderstands what
2:00:11
it is technology did to news.
2:00:14
So it's definitely true that Craigslist
2:00:17
made a more efficient way of doing
2:00:20
classified advertising than newspapers had.
2:00:22
But there's a reason that Craigslist was better
2:00:24
for classified advertising. It wasn't just a different
2:00:27
cost basis. It's that in the run
2:00:29
up to the Craigslist era, the web --
2:00:31
I don't know if it was one point five, web one point
2:00:33
five, something like that. And in the run
2:00:35
up to that,
2:00:36
there was a series of media
2:00:38
roll ups, right, after the telecommunications act
2:00:41
and the Clinton Years. that allowed
2:00:44
radio stations and TV stations and newspapers
2:00:47
in a single market to all come under common ownership
2:00:49
and also for there to be more cross market
2:00:51
ownership. And you saw lots and
2:00:53
lots of regional local newspapers coming
2:00:56
under a single owner,
2:00:58
large corporate owners, as opposed
2:01:00
to the historic basis for news, which
2:01:02
was, you know, outside of the big cities,
2:01:05
the historic basis for news was you had like
2:01:07
a patrician family who owned
2:01:09
the newspaper,
2:01:10
who mostly ran it as a business to
2:01:12
allow appliance manufacturers
2:01:15
or appliance retailers and grocers to
2:01:17
reach people who are interested in the sports
2:01:19
scores. And, you know,
2:01:21
in between, some of that money was peeled off to send
2:01:24
a college kid to the town meeting to
2:01:26
write down what people were saying and whatever controversy
2:01:28
there was. And
2:01:30
those papers were were mostly rolled
2:01:33
up into these big corporate
2:01:34
national organizations. And
2:01:38
one of the ways that the new owners tried
2:01:40
to justify those roll ups was in
2:01:42
part by trimming. They trimmed
2:01:44
a lot of the regional sales force. And
2:01:46
so the local shoe leather sales force who
2:01:48
knew how to sell classified ads to local merchants
2:01:51
were replaced by centralized call rooms,
2:01:53
where you would just call Chicago or New York or
2:01:55
whatever, or somewhere in the Midwest,
2:01:58
when you wanted to place an ad in the newspaper
2:01:59
down the road.
2:02:01
Another way that they realized
2:02:03
new efficiencies was by selling off buildings,
2:02:06
physical plant, and outsourcing
2:02:08
core functions, which exposed them to a bunch of shocks,
2:02:11
like rent shocks, and other
2:02:13
shocks, interest rate shocks, and so on.
2:02:15
where suddenly when things got bad, it
2:02:17
got worse. Right? Because when things got bad suddenly
2:02:19
their rent might go up or the cost of leasing
2:02:21
their presses might go up, and then they would
2:02:24
be really exposed. And so
2:02:26
you had this industry that had already weathered
2:02:28
so many technological shocks. Right? The newspapers
2:02:30
survived the telegraph, radio,
2:02:33
television, cable, satellite.
2:02:36
And suddenly, they were uniquely vulnerable to
2:02:39
Craig Newmark, who is a lovely guy
2:02:42
and
2:02:42
very smart and
2:02:43
who did something really cool
2:02:45
but was not, I think, intrinsically
2:02:48
more disruptive to their business than cable television.
2:02:51
Yeah. And the reason was that they had
2:02:53
made themselves vulnerable. So
2:02:55
that's one of the things that's hurt the
2:02:57
newspapers. Right? It's this combination
2:02:59
of consolidated ownership and changes
2:03:02
in the technology? But the other thing that's
2:03:04
really hurt them and that is not addressed in
2:03:06
this bill at all
2:03:07
is fraud in the ad markets. So
2:03:09
the ad duopoly, Facebook and Google,
2:03:11
have --
2:03:12
now there's a pretty strong evidentiary
2:03:14
basis to say that they steal from
2:03:16
publishers. The Operation
2:03:18
Jedi Blue or Project Jedi Blue, which
2:03:21
was disclosed in the Texas AG case
2:03:23
against Facebook, shows that the
2:03:25
senior management team of Facebook and Google
2:03:28
sat down and illegally colluded to rig
2:03:30
the ad market so that publishers would
2:03:32
get less and and advertisers would pay
2:03:34
more. You add to that other
2:03:36
forms of fraud like the pivot to video,
2:03:39
which was based on lies about how many people
2:03:41
are watching videos which cost the newsrooms
2:03:43
of the country and around the world, you
2:03:45
know, billions in aggregate and made them even
2:03:48
more vulnerable. And you have this thing where
2:03:50
where you have tech platforms that
2:03:52
are stealing money from
2:03:54
news organizations, and we're acting like the
2:03:56
problem is that they're stealing content
2:03:59
from news organizations.
2:03:59
and allowing your users to talk
2:04:02
about the news or providing links to the
2:04:04
news so people can talk about it is
2:04:06
not stealing the news. If it's a secret,
2:04:08
it's not the news. Right? The thing that
2:04:10
makes it the news is that we talk about it.
2:04:12
But stealing your ad money is
2:04:14
an actual problem that we can put our fingers
2:04:16
on and that we can actually put our hands
2:04:18
around and we could say, alright, we're gonna have transparency
2:04:21
rights. We're gonna follow the model of Sarbanes
2:04:23
Oxley and create individual criminal liability
2:04:25
for executives who knowingly sign or
2:04:27
produce false reports
2:04:30
on financials, you know, like, things
2:04:32
that actually address themselves to the problem
2:04:34
instead of trying to take something off to
2:04:36
one side, which is, you know, creating a
2:04:38
stream of payments based on the bizarre
2:04:40
idea that you should pay to link to the news or
2:04:42
let your users talk about the news instead
2:04:45
of unrolling the fraud, which
2:04:47
which is a thing that would benefit all kinds
2:04:49
of creators, including newsrooms, but also individuals
2:04:52
and so on. Agreed.
2:04:54
And in the meantime, I would say publishers
2:04:56
should not do business with these companies,
2:04:58
do whatever you can to stay away from them and
2:05:01
be immune to --
2:05:03
Well, I mean, to the best of your ability. And there's ways to -- They
2:05:05
own the stack. Right? They own,
2:05:08
like, it's very hard not to do business
2:05:10
with them. I disagree with you. I disagree.
2:05:12
I think that there's independent
2:05:14
ad tech that you can use out there. There's also subscriptions.
2:05:17
And I think podcasts are also, you know,
2:05:19
another revenue source that you don't need to
2:05:21
go through them. Here's another great
2:05:23
story about fraud in the ad world,
2:05:25
and it has to do with podcasts.
2:05:29
And this is the story that's gonna get
2:05:31
me fired from my employer,
2:05:33
iHeart, the fabulous iHeartMedia Corporation.
2:05:36
according to Bloomberg podcasters have
2:05:39
been buying ads in mobile
2:05:41
games that
2:05:43
when you click on the ad, download
2:05:47
the podcast. Every time a player
2:05:49
taps on one of these fleeting in game ads
2:05:51
and you get some virtual loot for
2:05:53
doing it, A podcast episode
2:05:55
in the background begins downloading, which
2:05:59
means the podcast company can claim
2:06:01
the gamer as a listener and
2:06:03
add a download to its overall tally.
2:06:07
You might think, well, how
2:06:10
big a deal can that be?
2:06:12
Obviously, those priest people are not listening.
2:06:15
They don't even know, in most cases,
2:06:17
that they downloaded the podcast. It
2:06:20
just goes out into, as Carrie explained,
2:06:22
the ether, the bit bucket; it doesn't
2:06:25
even get saved
2:06:27
on
2:06:28
your phone. One
2:06:30
game referenced in this paper, this the study
2:06:33
came from a company called Deep Se.
2:06:36
EE, Bloomberg's
2:06:38
writing about it. One
2:06:41
company, a popular mobile
2:06:43
app from ribos called
2:06:45
subway surfers. If you played it downloaded
2:06:47
three billion times since it came
2:06:49
out ten years ago. Over a period
2:06:52
of two weeks in August, Bloomberg found multiple
2:06:54
publishers using the game
2:06:56
to rack up podcast downloads. including
2:07:02
I'm sad to say iHeartMedia, which
2:07:06
is my employer
2:07:08
for the radio show. iHeartMedia
2:07:11
apparently was one of the biggest
2:07:14
users of this, and
2:07:18
they shelled out more than ten million dollars
2:07:20
and gained up to six million unique listeners a
2:07:22
month, and they've been doing it
2:07:24
since twenty eighteen.
2:07:27
You know, it's funny because we don't lie about
2:07:29
our numbers. In fact, if anything we're
2:07:31
conservative about our numbers. And
2:07:34
I always was puzzled when I see numbers
2:07:37
from some of these big
2:07:39
podcast companies, like, really, you get
2:07:42
forty million downloads a month? Please.
2:07:44
Well, now we know.
2:07:46
They don't. They don't.
2:07:49
This
2:07:49
is, of course, just one
2:07:51
more example of the click fraud that
2:07:53
you were talking about, Cory. I
2:07:56
mean, apps because they're so opaque
2:07:58
are a natural environment for this stuff.
2:08:01
Yeah. You know, you do have to get through the app
2:08:03
store's heuristics and analytics.
2:08:05
But if you can smuggle something through, the
2:08:08
platforms by design don't
2:08:10
let users closely monitor
2:08:12
how the apps themselves are working. Like, you
2:08:15
know, Apple famously
2:08:17
sued a company that made a VM that you could
2:08:19
run iOS apps in and that would
2:08:21
allow you to do, like, really deep forensics on
2:08:24
them, you know, because
2:08:26
they don't want
2:08:28
you to think of this as software that you can,
2:08:30
like, stick your own controls on. I
2:08:32
saw you had in our rundown today, something
2:08:34
a little later on, about the OG App,
2:08:36
which as I understand it is a very similar
2:08:38
kind of thing, in
2:08:41
terms of allowing users to gain more control
2:08:44
in that it kind of acts as an
2:08:46
overlay to your social media
2:08:48
and then loads the feed that the social media
2:08:50
company wants to send you, but throws away the
2:08:52
ads. And, you know, Apple
2:08:55
chucked it out of the App Store. And
2:08:57
this is the point. Right? It's to be able to exercise
2:08:59
control so
2:09:00
that when your interests diverge from the
2:09:02
App Store's interest, their interest
2:09:05
comes first. and that produces
2:09:07
the space in which all this mischief can take
2:09:09
place because as soon as you design computers
2:09:11
to treat their owners as adversaries -- right?
2:09:14
adversaries of the manufacturer -- then
2:09:17
you make design decisions that,
2:09:19
you know, necessarily increase
2:09:21
their opacity. There's
2:09:23
a lot one of the reasons OG App exists
2:09:25
is because there's a lot of dissatisfaction with
2:09:27
what Meta has decided to do with Instagram,
2:09:29
which is essentially to take it from
2:09:31
a very lovely site where you could share photos
2:09:34
with your friends and family and turn it into
2:09:36
TikTok because
2:09:37
that's the flavor of the month.
2:09:39
The
2:09:39
OG App eliminated ads.
2:09:41
That was probably part
2:09:43
of the problem from
2:09:44
Meta's point of view, but it also eliminated everything,
2:09:46
all the algorithmic recommendations. And
2:09:49
frankly, I loved it. It turned my
2:09:51
Instagram back into my Instagram like
2:09:53
it used to be. And it
2:09:55
immediately caused problems. One of the
2:09:57
first things Instagram
2:09:58
did
2:09:59
after I installed the OG App -- I
2:10:02
recommended it on MacBreak Weekly on Tuesday. It's
2:10:04
gone today, by the way. One of the reasons
2:10:06
I recommended it
2:10:08
and liked it was because it gave
2:10:10
me my traditional Instagram. But as soon as I went
2:10:12
back to Instagram, Instagram said, well, there's
2:10:14
been a security event in your
2:10:16
Instagram app. Please, you
2:10:19
have to re authenticate. We gotta make sure you
2:10:21
are you. okay, fine. But
2:10:23
that happened every time I used the OG App,
2:10:25
and then, of course, somehow
2:10:28
somebody got Apple to pull it from
2:10:30
the App Store, so it's gone. I
2:10:32
don't know. I think it's still on Android.
2:10:34
Yeah.
2:10:35
I mean, Apple is a good
2:10:38
proxy for defending your interests when they're
2:10:40
coterminous with Apple's interests. Yeah.
2:10:42
And they do have enormous resources
2:10:45
and very skilled personnel who do that.
2:10:47
but when your interests diverge
2:10:50
from theirs, and this is not
2:10:52
unique to Apple. This is true of all the big firms.
2:10:54
Facebook has an incredible security team
2:10:56
that defends you from all kinds of threat actors.
2:10:59
The one threat actor that they won't defend you from
2:11:01
is Facebook. Right. And Apple is
2:11:03
the same. So if you're a Chinese iOS
2:11:06
user and Apple has decided
2:11:08
that access to Chinese consumers in Chinese
2:11:10
manufacturing is more important than
2:11:12
the integrity of Chinese users,
2:11:15
you have no recourse when they remove all
2:11:17
the working VPNs from the App Store and add a
2:11:19
backdoor to their cloud servers. For
2:11:21
the Chinese state to use, because
2:11:23
by design, you can't modify and
2:11:26
intervene. One of the things
2:11:28
that I think we need to understand is that
2:11:30
The outer periphery of
2:11:33
how badly a firm can
2:11:35
treat you has historically been, and
2:11:37
I think still is, determined by what
2:11:39
you can do if you're dissatisfied. Yeah.
2:11:41
You can just leave, and there's someone in the chat saying, if
2:11:43
you don't like Apple, you should just leave. Well,
2:11:46
leaving Apple incurs a switching
2:11:48
cost. Right? There's the technological cost
2:11:50
of adjusting to something new. There's throwing away
2:11:52
the media that is specific to your Apple device.
2:11:55
There's other intangible
2:11:57
problems like losing the ability
2:11:59
to do rich IM sessions
2:12:01
with your fellow Apple users. You remember,
2:12:04
Tim Cook recently said, if you wanna
2:12:06
share videos and pictures with your mom,
2:12:08
you should just buy her an iPhone. So
2:12:10
the corollary of that is if you
2:12:13
switch away from iOS, then you
2:12:15
can say goodbye to doing that
2:12:17
kind of messaging with your mom. So all of those
2:12:19
switching costs have to be weighed
2:12:21
against the benefit that you get
2:12:24
from going somewhere else. And the firms understand
2:12:26
this very intimately. Again, in the
2:12:28
Texas AG case against Facebook,
2:12:31
one of the documents that was released
2:12:34
is a set of very frank memos between
2:12:36
product design teams who
2:12:38
are saying we are going to design, for
2:12:40
example, Facebook photos, such
2:12:42
that it is
2:12:44
very good to use not
2:12:46
because we think that is something people
2:12:48
will value, but because we think it
2:12:50
will lure people into adding their family
2:12:53
photos to Facebook. And once
2:12:55
they do, they will endure the high switching
2:12:57
cost of leaving behind their
2:12:59
cherished family photos if they quit Facebook
2:13:01
and go to Google Plus, which is the rival they
2:13:03
were worried about at the time. And
2:13:05
so firms very deliberately add very
2:13:08
high switching costs to their products, and
2:13:10
one of the things that these interoperable technologies
2:13:13
do, like OG App and like
2:13:16
VMs and all these other things
2:13:18
that people build, that are part
2:13:20
of the long history of how technology
2:13:22
companies, including Apple and Facebook,
2:13:24
have confronted their own competitive challenges:
2:13:27
they allow users to have an intermediate state
2:13:30
between leaving the firm altogether,
2:13:32
leaving the service altogether, and enduring those high
2:13:34
switching costs and enduring whatever
2:13:36
ration of crap the firm wants to shovel
2:13:38
down their neck. Right? Ad blockers are
2:13:40
a great example. Pop-up blockers are a
2:12:43
really good example. You
2:12:44
know, pop-ups were once everywhere. Then
2:12:46
browsers started blocking them by default. Right?
2:13:48
They were doing adversarial interoperability with
2:13:51
the publishers whose content was being loaded
2:13:53
in the browser. publishers stopped
2:13:55
displaying popups because advertisers stopped
2:13:57
asking for them because users blocked them.
2:13:59
That is one of the mechanisms by which we make
2:14:02
technology better: by giving users
2:14:04
control over their technological destiny so
2:14:06
that firms' own behavior is
2:14:08
limited because when they act
2:14:10
badly, the users just take a corrective.
2:14:13
You called, I remember, ad
2:14:15
blockers the largest consumer
2:14:17
boycott in history. That's Doc
2:14:19
Searls. Yeah. That's quoting Doc. Oh, good.
2:14:21
Yeah. Alright. Good to credit him, host
2:14:23
of our FLOSS Weekly show. We're
2:14:25
gonna I wanna take a break. We're gonna talk about
2:14:28
Google doing what they can to make ad blockers
2:14:30
no longer work at all.
2:14:33
2:14:36
Alex,
2:14:36
before I take the break, anything you
2:14:38
wanna add to the last
2:14:39
few minutes of conversation? Thanks
2:14:42
for the opening. I would say that,
2:14:44
if we're thinking about everything we
2:14:47
just talked about on the podcast side, I
2:14:49
thought the podcast scam story is really interesting. Isn't
2:14:51
that wild too? Oh, it's sad news. It
2:14:53
makes me so sad because -- Yeah.
2:14:55
-- it hurts us. You know? It really hurts us
2:14:57
badly. Absolutely. I assume we're doing the same
2:15:00
thing. Or Yep. Yeah.
2:15:02
The bad news is the bad news is obviously, like,
2:15:04
when you try to build a podcast in an authentic,
2:15:07
real way like you and I are doing, you know,
2:15:09
you gotta build it brick by brick. It's tough to build
2:15:11
them. And so, yeah, it does screw us.
2:15:14
The good news, if you wanna look at a silver lining,
2:15:16
is that podcast ads work. Yes.
2:15:18
They're a very valuable -- Yeah. -- business tool.
2:15:21
Yeah. So that's the positive note I'll give, as an aside.
2:15:23
I agree with you. Were you
2:15:26
asking for a second? Yeah. Thank you too.
2:15:28
I know we hadn't supported. Yeah. Yeah. So
2:15:30
I don't know what your experience has been, but lately,
2:15:32
it's been tougher and tougher
2:15:34
because companies mostly it's
2:15:36
agencies, not individual companies, but they want ad
2:15:38
tech. And they'll
2:15:40
raise the specter of, well, I can get all this information
2:15:42
from Spotify or iHeart. Or
2:15:45
they'll say to us, well, look, iHeart's
2:15:47
CPMs, their cost per thousand
2:15:49
listeners, are so much lower. Yeah. Because
2:15:52
eight million of those are fake. Right.
2:15:55
We can't... it becomes hard to compete
2:15:57
with that kind of fraud
2:15:59
or
2:16:01
co-opting of the market. And,
2:16:03
you know, I think, honestly,
2:16:05
independent podcasting like you
2:16:07
and I do,
2:16:08
is really gonna be facing an uphill
2:16:10
battle over the next few years. Yes. But on the other
2:16:12
hand, yeah, I think that we renew
2:16:14
more easily because people see the stuff works.
2:16:17
And, you know, they know it's not fraud.
2:16:19
And they're more inclined to spend again. Yeah.
2:16:21
That's good. So Ultimately, if you're
2:16:23
running a business, what you wanna do is deliver for
2:16:26
customers, create loyalty, and
2:16:28
produce a good product. And I think that you can
2:16:30
do that in an upstanding way
2:16:32
in this industry, and it's worth a heck of a lot
2:16:34
to advertisers, which is why you
2:16:35
see people wanting to mimic it.
2:16:38
Yeah. So
2:16:38
they should stop. But the the headline
2:16:41
for me is, you know, It's
2:16:43
good. It's great news about podcast ads. Yeah.
2:16:45
I wanna get the word out to advertisers because
2:16:47
if, you know, they get,
2:16:49
oh, it's cheap to buy iHeart. but
2:16:51
then what they don't get is return because
2:16:54
they're fake
2:16:55
numbers.
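The arithmetic behind that point can be sketched in a few lines. These are hypothetical figures for illustration only, not real numbers from any network mentioned on the show: an inflated audience makes a quoted CPM look cheap, but the effective cost per thousand real listeners tells the true story.

```python
# Hypothetical numbers for illustration only -- not real figures
# from any podcast network.

def effective_cpm(quoted_cpm, claimed_listeners, real_listeners):
    """CPM actually paid per thousand *real* listeners.

    Total spend is quoted_cpm * claimed_listeners / 1000; dividing
    that spend by the real audience gives the true cost per thousand.
    """
    total_spend = quoted_cpm * claimed_listeners / 1000
    return total_spend / (real_listeners / 1000)

# A network quoting a $5 CPM on 10M downloads, 8M of them fake:
inflated = effective_cpm(5.00, 10_000_000, 2_000_000)

# An independent show quoting a $25 CPM on 100k honest downloads:
honest = effective_cpm(25.00, 100_000, 100_000)

print(f"inflated network: ${inflated:.2f} effective CPM")
print(f"honest indie:     ${honest:.2f} effective CPM")
```

On these assumed numbers, the "cheap" buy ends up costing exactly as much per real listener as the honest one, and the advertiser only sees the shortfall when the ads don't convert.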
2:16:56
Unfortunately, sometimes
2:16:59
the reaction is, well, I guess, podcasting doesn't
2:17:01
work. Let's
2:17:02
put our ad dollars back in NFL football.
2:17:04
Right? But
2:17:05
clearly, there's enough interest in podcasting
2:17:07
that they're going to these great
2:17:09
lengths in order to fulfill it. So look,
2:17:12
I think that there's I still believe that podcasting
2:17:14
is you know, more than radio, more than television,
2:17:17
more than print. You know,
2:17:19
I think it's the most intimate
2:17:22
form of media. You're there
2:17:24
with a person, you know, in my case
2:17:26
for an hour a week, in your case for many
2:17:28
hours a week. and you're there with them on the
2:17:30
book. Get tired of me. I could tell you. don't
2:17:32
know them. But you're there with them on your
2:17:34
I don't think so. Every time I'm
2:17:36
with Humana, I hear You know what? I
2:17:38
was thinking about to be kinda fun is
2:17:41
I was
2:17:42
talking about this with Micah. I haven't
2:17:44
told my wife yet, but it'd be
2:17:46
kind of interesting to just make
2:17:48
it more like a radio station where you just have
2:17:51
hosts come on for two or three hours and you
2:17:53
just... you're always on, like twenty-four
2:17:55
seven. I always say that. Mhmm.
2:17:56
Wouldn't that be interesting? Kind of
2:17:58
a, like, a live That's a continuous live stream.
2:18:01
That was my first I started out my
2:18:03
first real media gig was working
2:18:06
a sports show at Nassau Community College
2:18:08
Radio. before I was in college, actually,
2:18:10
I was in high school when I wrote to these guys.
2:18:13
And it was a blast. It's amazing.
2:18:15
So I I recommend it. Yeah. It's
2:18:18
just a thought. We have that's one of the reasons we've got
2:18:20
our fabulous Club TWiT. I
2:18:22
wanna give it a little bit of a plug because
2:18:25
it makes a big difference to our bottom
2:18:27
line. In fact, right now, it's about twenty five percent
2:18:29
of our total revenue. And that
2:18:32
helps a lot as ads start
2:18:34
to get harder and harder to sell. If I think
2:18:36
if there is a long term future for us,
2:18:38
it's through something like Club
2:18:40
TWiT. Now here's what you get for seven bucks a month,
2:18:42
which I think is a pretty affordable dollar
2:18:46
figure. You get all ad free versions
2:18:48
of our shows. All the shows ad free.
2:18:51
You also get access to the club to a
2:18:53
discord, which a lot of people are saying
2:18:55
that's really the benefit they like. Not
2:18:58
everybody who joins the club ends up in the Discord,
2:19:00
but a lot of us use it and love it.
2:19:02
It's a place to discuss not only
2:19:04
the shows, but you know, everything geeks
2:19:06
like books and comics and ham
2:19:09
radio and hardware and Linux and music.
2:19:12
That Discord is a lot of fun. Here's our
2:19:14
TikTok corner. I guess they're talking
2:19:17
about TikTok in there. And of course, here's
2:19:19
the twit discussion going on
2:19:21
right now. We have animated GIFs, which
2:19:23
makes it very much
2:19:25
more fun also. It's a great place
2:19:27
to put links in to other shows.
2:19:29
This interview I did with you, Corey,
2:19:31
on triangulation was not ad
2:19:33
supported. It was club supported. That's the only
2:19:35
way we could do it, because
2:19:37
it's an ad hoc show that,
2:19:39
you know, we bring in people when we wanna
2:19:41
talk to them. And
2:19:42
so no advertiser can buy a show where they don't know
2:19:44
when it's on, but the club makes it possible. That's
2:19:46
why we also do hands on Mac in
2:19:48
the club with Micah and hands on windows
2:19:50
with Paul Thurrott. Our entire Linux
2:19:52
show is in there. The club
2:19:54
has really given us a chance to do a lot more
2:19:57
interesting stuff. We launched our new space show in
2:19:59
there, This Week in Space,
2:19:59
which has since gone public.
2:20:03
Also, the Twit Plus Feed, that's where those other
2:20:05
shows show up. Seven bucks a month, if you're
2:20:07
interested; it
2:20:07
really helps us. There's a yearly membership
2:20:10
too, if you don't wanna be nickel-and-dimed
2:20:12
every month. You can even get corporate
2:20:14
memberships if you wanna have
2:20:16
everybody in your company listen, that would
2:20:18
be good. You
2:20:19
can also buy individual shows for two dollars
2:20:22
and ninety nine cents a month.
2:20:23
all at twit dot tv slash club
2:20:26
twit, and we thank you
2:20:28
for your support. We do have great advertisers.
2:20:31
Thank goodness.
2:20:32
because, you know, it's expensive
2:20:34
to do what we do, including
2:20:37
Policygenius. Policy-
2:20:39
genius is, well, genius.
2:20:41
We pay hundreds of dollars a year
2:20:44
for insurance, fire insurance for
2:20:46
our house, or, you know, you have to for your car.
2:20:48
Right? some of us even buy insurance
2:20:50
for our phones, the fumble-fingered among
2:20:53
us. But how many of you are taking
2:20:55
steps to protect your family?
2:20:58
You know, think about your costs
2:21:00
to your family: mortgage payments,
2:21:02
student costs,
2:21:04
student loans, and other
2:21:06
loans that don't disappear if something happens
2:21:09
to you. That's what a life insurance policy is for. When
2:21:11
we had kids, the first thing I
2:21:13
did was make sure my family was taken
2:21:15
care of if something bad happened to me.
2:21:18
It can provide your loved ones with a financial cushion
2:21:20
they can use to cover those ongoing
2:21:23
costs, and it gives you peace of
2:21:25
mind. to
2:21:26
know that they're gonna be protected.
2:21:29
Having life insurance through your job,
2:21:31
yeah, that's nice. It never was enough to
2:21:34
really solve this problem.
2:21:36
In fact, most people, it turns out need about ten
2:21:38
times more coverage than their jobs offer
2:21:41
to provide for their families. Inflation
2:21:43
is driving up prices for everything these days, but
2:21:45
life insurance rates are actually down
2:21:48
from this time last year. Since life
2:21:50
insurance typically gets more expensive as you
2:21:52
age, the best time to buy that
2:21:54
life insurance is today. And
2:21:56
the best way to do it is Policygenius.
2:21:59
It's not an insurer.
2:21:59
They're
2:22:01
an insurance marketplace, so
2:22:03
you can get quotes from all the best
2:22:05
companies, AIG, Prudential, all of the
2:22:07
best companies in one place, and
2:22:09
get the lowest price on your life insurance,
2:22:12
which means you could save as much as fifty percent
2:22:14
or even more on life insurance just
2:22:16
by comparing quotes on Policy-
2:22:18
genius. Options start at
2:22:20
seventeen dollars a month for half a million dollars
2:22:22
of coverage. Of course, it's gonna depend on you, your age,
2:22:24
and so forth, that's why. But it's not getting
2:22:26
cheaper, so do it right now. Click the link
2:22:29
on our show notes or head to policy
2:22:31
genius dot com slash twit, and
2:22:33
you can get personalized quotes in minutes
2:22:35
to find the right policy for your needs. Now
2:22:38
I should tell you the people you're talking to at
2:22:40
Policygenius are licensed agents. They have to
2:22:42
be in order to advise you about insurance.
2:22:45
You're buying insurance from these big companies, but Policygenius
2:22:47
is not working for those companies. They work for
2:22:49
you. And they're on hand through the entire
2:22:52
process to help you understand your options, help
2:22:54
you make decisions with confidence.
2:22:56
They do not add extra fees. They
2:22:58
do not, you'll be glad to know, sell your info
2:23:01
to third parties. Check the reviews
2:23:03
on Google, Trustpilot, and elsewhere thousands
2:23:05
of five star reviews. People really love
2:23:07
this service. They have options
2:23:09
that offer coverage in as little as a week.
2:23:12
Some avoid unnecessary medical exams.
2:23:14
Thirty million people have shopped for insurance
2:23:17
at Policygenius. They've placed more than a hundred
2:23:19
fifty billion dollars in coverage. And again,
2:23:21
people love it. While you're there,
2:23:24
of
2:23:24
course, they do offer quotes for home, auto,
2:23:26
pet, renters, and more. So you should check that
2:23:28
out as well. But I thought it was appropriate
2:23:30
to talk about life insurance because I think so
2:23:32
many of
2:23:33
us, especially you younger folks. You don't think
2:23:35
about that, but you got kids. You
2:23:37
got a spouse. If you got a family, you need
2:23:39
that life insurance. Policy genius
2:23:41
dot com slash twit. Get those life
2:23:43
insurance quotes free. There's no cost,
2:23:45
and see how much you can save.
2:23:48
Policygenius
2:23:49
dot com
2:23:51
slash twit.
2:23:55
Oh, gosh. There's so many stories and so little time.
2:23:57
Let me see. Before we go to
2:24:00
talk about that... sorry... I wanna
2:24:02
ask you about the Amazon hardware
2:24:04
event, Alex. Before
2:24:07
we go to the A17
2:24:10
chip price increase. Before
2:24:12
all of that, let's
2:24:13
run a quick promo for some of the things
2:24:15
that happened this week on
2:24:16
TWiT. I think you might recognize some of the voices. It's only
2:24:18
the people in the c suites and the stockholders
2:24:21
that are really getting rich on this. And the people
2:24:23
are waking up and we gotta get there,
2:24:25
get out before the guillotines go up
2:24:27
because, I mean, I have one over here.
2:24:30
No. No.
2:24:32
Hold on. Previously on
2:24:35
TWiT: Triangulation. Cory
2:24:37
Doctorow just coauthored a book,
2:24:40
now out, called Chokepoint Capitalism.
2:24:43
Cory and his coauthor Rebecca Giblin
2:24:45
are joining us. So Amazon's got
2:24:47
this flywheel that it loves to boast about.
2:24:49
It says we've got this lower cost structure. It
2:24:51
leads to lower prices. And everybody is
2:24:53
super happy. That's so funny. We've got so excited.
2:24:56
Yeah. Yeah. Because everything
2:24:57
that Amazon has ever done in its
2:24:59
business has been designed, first and foremost,
2:25:01
to lock in its customers. But also so it
2:25:03
could squeeze other competitors
2:25:05
out of the market. Tech news weekly.
2:25:08
Stadia, Google's cloud gaming service
2:25:11
is done. Google's basically announced
2:25:13
that it's going to be done. It's killing it. time
2:25:15
and time again, they prove that this is
2:25:17
just part of their brand identity, and that's a bad
2:25:19
thing to be part of your brand identity. MacBreak
2:25:22
Weekly. Yeah. I'll be
2:25:24
testing this Apple Watch Ultra in
2:25:26
the hurricane that's actually headed
2:25:29
my way. Oh, where are you right now? My
2:25:31
word! I am near Tampa, Florida.
2:25:33
Give kind of the bottom line review, Steven,
2:25:35
on the Ultra? Worth it. I do
2:25:37
like the bigger screen, and I was always
2:25:39
a titanium watch guy. So if you want titanium...
2:25:42
Well, there it goes.
2:25:44
Oh, bye bye, Steven. Oh, my, Steven.
2:25:46
My pleasure talking to you. Good luck in the
2:25:48
hurricane. Twit. It keeps
2:25:51
going and going and going and
2:25:53
going. This show is the wild shot.
2:25:55
The hurricane happens now,
2:25:57
comes in and removes you.
2:26:01
Wait. Wait.
2:26:03
He's back. I'll ask as long
2:26:05
as my APC runs on my Maximus.
2:26:07
I hear the beeping. I hear the beeping.
2:26:10
The good news is the hurricane
2:26:12
dodged Tampa and Steven
2:26:14
is just fine. Just in case, and
2:26:16
now our friends in Fort Myers, maybe that's another
2:26:18
story, but our thoughts are
2:26:21
going out to all of you. Of course, I hope
2:26:23
you did survive and do well in the hurricane;
2:26:25
if you didn't, I'm sorry.
2:26:28
Sorry.
2:26:28
Did
2:26:30
you watch any of the or read any
2:26:32
of the coverage of Amazon's big event
2:26:35
this week, Alex? And what were your thoughts?
2:26:38
I was most intrigued by that sleep monitor
2:26:40
that they have. I don't know about
2:26:42
you, but I think you said earlier in the show, maybe
2:26:44
for the Eight Sleep ad, that we could all use a better
2:26:47
night's sleep. Yeah. The Halo, they call it. Right?
2:26:49
Because, you know, the Halo... And it doesn't have a camera
2:26:51
as far as I could tell. Yeah.
2:26:53
Now, it doesn't have
2:26:55
a camera, and I'm always interested
2:26:57
in trying to use, like, you
2:26:59
know, any way I can to sleep better.
2:27:01
and and learn a little bit more about how I sleep and
2:27:03
the things that run on your wrist, I don't sleep with almost
2:27:05
anything on my wrist. So,
2:27:08
you know, I'm I I guess I'm out on those
2:27:10
Something like this is interesting. However,
2:27:13
I just don't trust Amazon with this type
2:27:15
of thing. So, you know, maybe
2:27:17
someone can go the reverse
2:27:19
route and copy this from Big Tech, so
2:27:22
I can use it. Actually, Google with their display
2:27:24
will also do that, because that
2:27:26
one has a camera and also listens to -- Right.
2:27:28
-- snores and stuff like that. That's how it is for me. Yeah.
2:27:30
Yeah. So so I don't know.
2:27:32
I like the idea, I like
2:27:34
this idea of using technology to improve
2:27:36
our wellness and improve our health. It's
2:27:39
interesting. This is part of what Amazon calls
2:27:41
its Halo series,
2:27:44
because they have this halo band too, which is
2:27:46
a fitness tracker. Mhmm. And
2:27:48
then now this is the Halo View, which
2:27:52
seems
2:27:53
more of a watch. Right?
2:27:55
And then the Halo Rise, which doesn't
2:27:57
have a camera. That doesn't see. Yeah. It doesn't sit. Yeah. And
2:27:59
it
2:27:59
has a cool light. It wakes you up with some light in the
2:28:02
morning. I'm not gonna buy it, but a
2:28:04
lot of people are gonna buy this -- Yeah. -- for sure.
2:28:06
Amazon also announced that they are
2:28:08
gonna let your Amazon Echo act
2:28:10
as an Eero beacon. Amazon owns the
2:28:12
mesh networking company Eero. And
2:28:15
in a very interesting synergy, when you
2:28:17
buy one of the new fifth generation Amazon
2:28:20
Echoes, it can be used
2:28:22
to extend your WiFi using
2:28:25
Eero. I have
2:28:27
to say all of these things look like
2:28:31
thinly veiled attempts by Amazon
2:28:33
to get more sensors into your home.
2:28:36
Cory, do you have
2:28:38
any thoughts on that? Yeah. I mean, I
2:28:41
think that is exactly what this is
2:28:43
as is the
2:28:45
robot Astro, and the acquisition. Yeah.
2:28:47
I mean, I think that,
2:28:50
for me, this drives home the problem,
2:28:52
or the poverty, of "just don't do business with
2:28:54
them," because I own some Eeros.
2:28:56
In fact, I own many Eeros. I
2:28:59
bought them before they were an Amazon company.
2:29:03
So if the answer to if
2:29:05
you don't like Amazon, don't do business with
2:29:07
them, If that's
2:29:09
the answer, then what do I do when Amazon buys
2:29:11
the company? Right? What do you do when Google buys
2:29:13
Fitbit? What do you
2:29:15
do when Facebook buys your beloved daily
2:29:17
visit, Giphy?
2:29:20
You're stuck. Right? And
2:29:22
so, you know, it's not like I can
2:29:24
add my own firmware. It's
2:29:26
not like I can untether it. In fact,
2:29:29
the firmware updates for the heroes
2:29:31
that I own have become increasingly Amazon
2:29:34
linked. Every time I get one, I get
2:29:36
a bunch of notices about how
2:29:38
this is becoming less and less functional
2:29:40
with Alexa. They build out as it becoming
2:29:42
more and more functional with Alexa. But
2:29:44
obviously, these are the the opposite
2:29:47
signs of the same coin. And,
2:29:50
you know, it's it
2:29:52
tells you the problem with acquisition
2:29:54
driven growth in
2:29:56
a market where we choose winners based
2:29:58
on what people buy because
2:30:00
companies who have access to the capital markets
2:30:03
can just buy the companies that
2:30:05
are successful. You know, if everything in your grocery
2:30:07
store is made by Procter and Gamble or
2:30:09
Unilever. And if there is a local,
2:30:12
you know, artisanal, oatmeal cookie place
2:30:14
that takes off, one of them will buy it.
2:30:16
And
2:30:16
when they do their press release, they
2:30:19
will say, we here at Procter and Gamble
2:30:21
understand that our customers value choice, and
2:30:23
that's why we bought the company you chose to buy
2:30:25
things from instead of us.
2:30:29
Yeah. This is pointing to an interesting
2:30:32
battle, a slightly different lens, but
2:30:34
interesting battle between Amazon and
2:30:36
Apple. And I think that
2:30:38
Apple has largely given up
2:30:41
in the battle for ambient computing.
2:30:43
Remember, it sort of rolled back HomePod.
2:30:46
Siri still is garbage. Meanwhile,
2:30:48
Amazon's gonna be everywhere in your home.
2:30:51
as a whole new computing layer.
2:30:53
And I think there you know, as this stuff gets,
2:30:55
you know, more and more embedded in
2:30:57
our lives, the fact that we're walking around with sensors,
2:30:59
the fact that, you know, you have echoes in your house.
2:31:02
They're also mesh networks. You can speak
2:31:04
to it. It will enhance your WiFi, by the way.
2:31:06
I would like to hear Cory's thoughts about this,
2:31:08
but now they're gonna acquire Roomba, or try
2:31:10
to at least. Right? So now they're vacuuming.
2:31:12
You heard the latest
2:31:14
news, right? Elizabeth Warren doesn't
2:31:16
like that. I mean, neither does the FTC. Yeah. Yeah.
2:31:19
So but, yeah. But, anyway, I think that I think
2:31:21
that it's interesting, and I wonder what will happen,
2:31:23
you know, as Apple sees you
2:31:25
know, that Amazon run away
2:31:27
with this screen less computing in
2:31:29
your home and whether it then
2:31:31
pushes Apple to try to, you know,
2:31:34
basically do what it's done, you know, at
2:31:36
every other level, which is to say, we're gonna do this,
2:31:38
but we're gonna do it privacy, you know, privacy
2:31:40
focused, and whether that will work. It's
2:31:43
an interesting battle that I'm looking
2:31:45
at with this. I think Apple and Amazon just
2:31:47
inch closer and closer together,
2:31:49
the farther we get
2:31:51
towards you know, them pushing beyond
2:31:53
the markets that they own, and I think that's gonna be a
2:31:55
really interesting battle. But anyway, why
2:31:58
don't we tee up Cory on
2:31:59
the Roomba question? That's
2:32:02
all I really have to say. The iRobot
2:32:05
acquisition. I mean, I think it's the same thing. Amazon
2:32:07
would like to know
2:32:09
about the inside of your home and its
2:32:11
geometry. And they'd like
2:32:13
to know that so that they can leverage
2:32:15
it for parochial advantage, like
2:32:17
to make you use that,
2:32:19
or to make it so that the
2:32:22
only company that you can use that geometry
2:32:24
with is Amazon. It
2:32:26
is to your benefit if you own a mobile robot,
2:32:28
and this is the kind of thing that you're interested in
2:32:31
to get accurate maps of your home from
2:32:33
that robot. But it's not to your benefit
2:32:35
to have that robot only allow
2:32:37
you to take the data that was generated by the
2:32:39
robot that you bought, that you charged
2:32:42
with your electricity, going around your home
2:32:44
that you own or rent, but that data
2:32:46
not being yours to use to your maximal
2:32:48
advantage, that data's uses being constrained
2:32:51
to uses that are good for
2:32:53
the shareholders of the company that
2:32:55
made the robot. In fact, not even the shareholders
2:32:57
of the company that made the robot, the shareholders of the company
2:32:59
that bought the company that made the robot, I don't
2:33:01
understand why we as the owners of that
2:33:03
robot should feel like
2:33:05
it's our duty to make sure that those shareholders
2:33:08
are happy. I'm on team Cory
2:33:10
here, but I still have a confession to make
2:33:12
here towards the end. But, you know,
2:33:14
I bought the Roomba on Prime Day,
2:33:16
so go figure, Amazon got me to buy it. But
2:33:18
this is really the lesson of all of this
2:33:20
is that we are willing captives.
2:33:24
Captives, exactly. Right? Yeah. Yeah. That's
2:33:26
a great lesson. Yeah. You point out that
2:33:28
in your book, Cory, that -- Yeah. -- only one
2:33:30
percent of people who have Amazon Prime ever
2:33:32
shop for better deals outside of
2:33:34
Amazon. It's just Yeah. Once you pay
2:33:36
for prime, you're there for life.
2:33:39
You know? Go
2:33:39
ahead, Alex. I'm
2:33:41
just gonna do a quick ode to the Roomba
2:33:43
that
2:33:44
Does it work for you? You like it?
2:33:46
I love it. I'm a Roomba-holic. I run
2:33:48
it a couple times a week, and my floor is real
2:33:50
clean. So Can
2:33:53
I ask you, as a practical matter, how does
2:33:56
that work? Because I have never owned a Roomba
2:33:58
that didn't immediately get hopelessly lost
2:34:00
on, you know, like, a crack
2:34:02
in the pavement or, like, a cable
2:34:04
or something. They just Yeah. Yeah.
2:34:07
You probably have a very square-built
2:34:09
house with no... you know, it's just
2:34:12
No wires. No.
2:34:15
It does get my wires. I have to be a little
2:34:17
careful about it, but my house
2:34:19
is, yeah, pretty boxy. I mean, it's an apartment,
2:34:21
so it's pretty small. We had a Roomba that would
2:34:24
get stuck. It wasn't optimal for the Roomba. We had this
2:34:26
little, uh, chair kinda
2:34:28
side table thing that had just
2:34:31
enough clearance on the floor for the Roomba to
2:34:33
think it could get under it, but not
2:34:35
quite enough clearance for it to continue. So
2:34:37
it gets stuck. Oh, yeah. It's amazing.
2:34:39
They're pretty nice, Roombas. The Roomba is the only
2:34:41
thing in my house that's more stubborn than me. It's pretty
2:34:44
amazing. And it would go bang, bang,
2:34:46
bang, trying to get in there. And I said
2:34:49
Three in the morning. Sorry. You don't have that every three
2:34:51
days? Every night. Three in the morning, I'm up, picking
2:34:53
up the little Roomba, bringing it back to its
2:34:56
home. Yeah. I look at it. There's
2:34:58
these moments where the Roomba just goes for
2:35:00
it and keeps going for it. And then eventually it
2:35:02
wins. And it's quite a
2:35:04
celebratory moment for me. I'm like, you did it.
2:35:06
You did it. There's a
2:35:08
great machine learning kind
2:35:11
of teachable
2:35:13
moment, cautionary tale
2:35:16
where an engineer used
2:35:18
an ML algorithm to get his
2:35:20
Roomba to minimize forward crashes
2:35:23
with its forward bumper so that it wouldn't
2:35:25
bang into the walls. And
2:35:27
it just started going in reverse. It
2:35:31
gave up on going forward entirely. It's a
2:35:33
smart Roomba. By the way, I'm just nominating
2:35:35
Roomba halyxt for the show title. I like
2:35:37
it. Roomba makes
2:35:40
it happen. Surround your Roomba with a little
2:35:42
bit of salt and see what happens.
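That Roomba anecdote is a classic specification-gaming story. Here's a minimal sketch of the failure mode; everything in it is invented for illustration (the one-dimensional world, the 20 percent collision chance, the two policies). If the objective only penalizes front-bumper hits, "always reverse" scores perfectly while doing no useful cleaning.

```python
import random

# Toy 1-D world: the "Roomba" picks a direction each step. Moving
# forward sometimes hits a wall with its front bumper; reversing
# never triggers the front-bumper sensor. Invented for illustration.

def run_episode(policy, steps=100, rng=None):
    """Return (front_collisions, forward_progress) for a policy.

    policy() returns +1 (forward) or -1 (reverse).
    """
    rng = rng or random.Random(0)
    collisions, progress = 0, 0
    for _ in range(steps):
        if policy() == +1:
            progress += 1
            if rng.random() < 0.2:   # 20% chance of a front-bumper hit
                collisions += 1
        else:
            progress -= 1            # reversing: no front sensor, no hit
    return collisions, progress

def reward(collisions, progress):
    # The mis-specified objective: only front collisions are penalized,
    # forward progress (the thing we actually want) earns nothing.
    return -collisions

always_forward = lambda: +1
always_reverse = lambda: -1

for name, policy in [("forward", always_forward), ("reverse", always_reverse)]:
    c, p = run_episode(policy, rng=random.Random(42))
    print(f"{name}: reward={reward(c, p)}, progress={p}")
```

Because only the front bumper is penalized, any learner hill-climbing on this reward converges on driving backward, which is exactly the behavior in the story.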
2:35:44
Yeah. Talking about browser extensions
2:35:47
and browser ad blockers, Google
2:35:50
had announced that, with
2:35:52
Manifest V3,
2:35:55
something called the blocking web
2:35:57
request API was
2:35:59
going to be eliminated from Chrome, and
2:36:01
hence Chromium, its open source parent,
2:36:06
and hence probably from many open
2:36:08
source projects based on Chromium, which
2:36:11
means that
2:36:12
ad blockers like my favorite, gorhill's
2:36:15
uBlock Origin, would no longer work. They
2:36:17
require Manifest
2:36:19
V2 and access to the web content
2:36:22
via that API. Google has some good
2:36:24
reasons to dump it. It
2:36:26
can really hit performance
2:36:28
if every single extension
2:36:31
starts asking for the content. It
2:36:33
can also be a privacy problem,
2:36:35
but I think people who install the Honey Chrome
2:36:39
extension really want it. They
2:36:41
know that Honey's watching every move
2:36:43
they make. But after a lot of
2:36:45
criticism, Google has said we're gonna delay
2:36:47
this till twenty twenty four. This has been,
2:36:49
by the way, kind of, a standard for
2:36:51
Google. They'll announce some big change
2:36:54
to something or other. Everybody will complain.
2:36:57
And then Google says, oh, well, never mind. We're gonna
2:36:59
do Topics. We're
2:37:01
not gonna do that other thing.
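To make the V2-versus-V3 difference concrete, here is a rough conceptual model in Python. This is not actual extension code; the class names are invented, and the 30,000 figure is an illustrative stand-in for the kind of per-extension rule budget Chrome imposes, versus the arbitrary per-request logic and unbounded filter lists the old blocking model allowed.

```python
# Conceptual sketch only -- not real browser extension code.

# Manifest V2 style: the extension's own code sees every request
# and can apply arbitrary logic over an unbounded filter list.
class BlockingFilter:
    def __init__(self, filter_list):
        self.filter_list = filter_list          # no size cap

    def should_block(self, url):
        # Arbitrary code may run here: regexes, counters, heuristics...
        return any(pattern in url for pattern in self.filter_list)

# Manifest V3 style: the extension hands the browser a precompiled
# rule list up front, capped in size; no extension code runs per request.
class DeclarativeRules:
    MAX_RULES = 30_000                          # illustrative cap

    def __init__(self, rules):
        if len(rules) > self.MAX_RULES:
            raise ValueError("rule budget exceeded; excess filters dropped")
        self.rules = rules

    def should_block(self, url):
        return any(pattern in url for pattern in self.rules)

# A filter list the size of a real ad-blocking list (made-up entries):
big_list = [f"tracker{i}.example" for i in range(80_000)]

v2 = BlockingFilter(big_list)                   # fine: no cap
assert v2.should_block("https://tracker79999.example/pixel")

try:
    DeclarativeRules(big_list)                  # too big for the cap
except ValueError as e:
    print("MV3:", e)
```

This is the crux of the objection from filter-list ad blockers: the cap and the loss of per-request code remove expressiveness their rules depend on, even though the declarative model is faster and more private.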
2:37:03
So I hope that this is delayed forever, but
2:37:05
it's just one more reason you should not use
2:37:08
Chrome or Chromium based browsers. Vivaldi
2:37:10
says our ad blocker will continue to work, we'll continue to
2:37:12
support V2. I think Brave
2:37:14
has its own ad blocker in there, but
2:37:16
I use Firefox for that reason. I think
2:37:18
it's
2:37:19
good to have a
2:37:20
competitor
2:37:21
to Google. And
2:37:23
finally,
2:37:26
as you know, McDonald's has left
2:37:29
Russia, which
2:37:31
has given rise, I think, to
2:37:34
a number of stores that
2:37:36
look just like McDonald's, called "Tasty,
2:37:38
and That's It." And
2:37:40
now Russia's former Lego stores,
2:37:44
Lego has also left the country, have
2:37:46
been rebranded as World of Cubes.
2:37:49
But as Rob points out,
2:37:52
the LEGO patent
2:37:54
has expired, so making
2:37:56
a LEGO clone is not hard
2:37:58
to do. Unclear whether
2:38:01
they'll stick with World of Cubes. World of Cubes is
2:38:03
pretty good. Better. Yeah. Yeah. Although
2:38:05
Rob says they should have called it Eastern Blocks. I mean,
2:38:07
really, come on. Oh,
2:38:10
Rob. Is
2:38:13
it Pascissa? I don't know if I say it right.
2:38:15
I say Pascissa. I don't know what he says. Pascissa.
2:38:18
Pascissa. Yeah. Well, that's the correct Italian
2:38:20
pronunciation -- Yeah. -- excuse me.
2:38:24
He suggests also it'd be nice if
2:38:26
there were some locally themed replacement product
2:38:28
lines, such as sets for Beria's
2:38:30
execution in the Lubyanka building's
2:38:33
basement, and so forth. Oh
2:38:35
my god. Okay.
2:38:40
We can laugh as we watch
2:38:42
the world burn. That's that's pretty
2:38:44
much the
2:38:46
story there. I hope we don't have a nuclear
2:38:48
war, World War three,
2:38:50
or any of that. And if, with
2:38:53
any luck, we don't, you can listen to the
2:38:55
Fantastic Big Technology Podcast
2:38:59
as created by the wonderful Alex
2:39:01
Kantrowitz. Boy, you get some great people.
2:39:03
Prabhakar Raghavan
2:39:04
is on
2:39:06
the most recent one, Google's senior vice president
2:39:08
of search. Really
2:39:10
good stuff. Yeah. I'm about to drop
2:39:13
an episode with Frances Haugen,
2:39:15
the Facebook whistleblower. So
2:39:17
that's coming by the time this is live, that
2:39:19
will be live. And then later this
2:39:21
week, I have Tom Alison, who
2:39:24
many people don't know, but he runs the Facebook app.
2:39:26
So Wow. It'll be some
2:39:28
some fun conversations coming up and
2:39:31
lots of AI stuff on the
2:39:33
way as well. So if people like that conversation
2:39:35
we had, about sentience and stuff
2:39:38
like that, trying to bring all views in. It's gonna
2:39:40
be fun. Yeah. It was Kevin Kelly in August,
2:39:42
so this is the second time we've dropped his name.
2:39:44
Right. And Kevin Kelly's episode was just his
2:39:46
life advice. We didn't really talk about technology at
2:39:49
all. It was just his life advice. That's his new thing. That's
2:39:51
amazing. something like that. Yeah. He has a book coming
2:39:53
out about it, and he has these lists
2:39:55
of a hundred things, you know, for his
2:39:57
seventieth birthday or something like
2:39:59
that. Right.
2:39:59
Right. Right. I just loved it so much
2:40:02
that I kinda said, you gotta come on, I'm gonna ask you about
2:40:04
these things. Good. So it was a blast. Oh, I'll listen to
2:40:06
that one for sure. I love it. Also,
2:40:08
of course, the
2:40:09
big technology newsletter at
2:40:12
big technology dot substack
2:40:14
dot com.
2:40:16
There's
2:40:16
DALL-E images right there up top. Oh,
2:40:18
are you using DALL-E this week for your
2:40:21
illustrations? I am. I was using --
2:40:23
I'm not a big business, so I don't
2:40:25
have money for illustrators, but
2:40:27
I was using Unsplash before, the
2:40:29
free -- Right. -- stock photos.
2:40:31
And I think this is my first week trying
2:40:34
DALL-E for the illustrations. And I like it.
2:40:36
It's dope, man. I don't
2:40:38
believe it. And don't forget Always Day
2:40:40
One: How the Tech Titans Plan to Stay on
2:40:42
Top,
2:40:43
Alex's excellent book.
2:40:45
Speaking of books, Cory Doctorow's
2:40:47
Chokepoint Capitalism is
2:40:49
now available.
2:40:52
I'm ashamed to say I read it on Kindle
2:40:54
Unlimited, but I understand that you put a
2:40:56
block on that. The best
2:40:58
thing to do is buy it
2:41:01
from chokepoint capitalism dot com
2:41:03
or Corey's really
2:41:05
fantastic Pluralistic
2:41:07
blog, which I love.
2:41:09
Or anywhere else books are sold,
2:41:11
just to be clear. Okay. You don't mind. Yes.
2:41:13
You don't mind if it's somewhere else. Okay. Yeah.
2:41:16
Anywhere finer books
2:41:18
are sold. And don't forget pluralistic dot
2:41:20
net. Corey has yet to use DALL-E
2:41:23
for his illustrations. You
2:41:25
know, the -- I use them in bits
2:41:27
and pieces. So do you? Oh, okay.
2:41:29
Yeah. Like,
2:41:31
that illustration for today's
2:41:33
Medium column with
2:41:36
the TED Talk stage. I couldn't find
2:41:38
a good image of a TED Talk stage. So I said,
2:41:40
empty TED Talk stage, and
2:41:43
I got that, and then everything else
2:41:45
came from it. And my thread about
2:41:47
Palantir in the NHS, haunted
2:41:49
NHS hospice hall was my prompt to
2:41:51
DALL-E, and then everything else -- That was
2:41:53
really creepy. That that one is really,
2:41:56
really creepy. Whoof.
2:41:57
And
2:41:58
the TED -- did you ask for a donkey in
2:42:00
the TED Talk, or did it just -- No. No. No. I
2:42:03
shooped that. So the TED
2:42:05
stage is
2:42:07
DALL-E. The jeans
2:42:10
are from a public domain source.
2:42:12
The torso is Steve Jobs's
2:42:14
torso. I thought it might be. Yeah.
2:42:17
That looks like the donkey's
2:42:19
turtleneck. Yeah. And the donkey
2:42:21
and the lounge shoes are both Creative
2:42:23
Commons images. And is that
2:42:25
what you mean by a centaur? I think, right, a
2:42:27
human centaur. That's the
2:42:30
chickenization of the TED Talk. Do you
2:42:33
and Shoop, is that the official
2:42:35
terminology for -- That's the verb for, yeah.
2:42:37
For, like, you can tell by the pixels. That's when
2:42:39
you go in and mod an image
2:42:41
with Photoshop: you have shooped it. But in
2:42:43
my case, I GIMPed it. I
2:42:45
think because you are the open source
2:42:48
guy. Always a pleasure to have you
2:42:50
on. Thank you. Likewise, Leo.
2:42:53
Yeah. And I like that you ended
2:42:55
this with your own controversial take that World
2:42:57
War three is bad. It's gonna
2:42:59
be a bad thing for children
2:43:01
and other living creatures. Of
2:43:03
course. Yeah. Unhealthy for children.
2:43:05
Let me see. Yeah. You remember that poster, you old
2:43:07
hippie, you. Yeah. I had one in my bedroom
2:43:10
growing up. My god. I figured you did
2:43:12
have a little flower drawing. Yeah. Yeah. Hey.
2:43:16
Thank you. Both of you. Really a pleasure.
2:43:18
I knew that with the two of you, I didn't need anybody
2:43:20
else. What a great show. Alex Kantrowitz
2:43:22
with Cory Doctorow. Thanks for being
2:43:24
here. Thanks to all of you. I think you're probably glad
2:43:26
you were here for this as well. We do TWiT
2:43:29
every Sunday afternoon, two PM Pacific,
2:43:31
five PM eastern, twenty one hundred
2:43:33
UTC. You can tune in and watch it live
2:43:35
at live dot twit dot tv if you're watching
2:43:37
live. Our IRC is open. Both
2:43:39
Alex and Corey were in the IRC,
2:43:41
actively participating. That's fantastic. irc
2:43:45
dot twit dot tv. They're even
2:43:47
putting in plugs for where you could buy
2:43:50
the book, Corey. They love you. Oh.
2:43:52
And, also, if you're a member
2:43:54
of Club TWiT, you can do it in the Discord.
2:43:56
After the fact, shows are available
2:43:59
at the website, ad supported twit dot
2:44:01
tv. Also on YouTube, there's a
2:44:03
dedicated YouTube channel for all of our
2:44:05
shows. And of course,
2:44:06
the best way to get it, in my opinion,
2:44:08
would be to find a
2:44:10
podcast player and subscribe. And that way, you
2:44:12
get it automatically every Sunday night, just
2:44:14
in time for your Monday morning
2:44:16
commute
2:44:17
from the bedroom to the
2:44:20
living room. Hey.
2:44:24
Oh, it's a sad old world. It's a
2:44:26
sad old world these days up to you.
2:44:28
Thank you everybody for joining us. Alright. See
2:44:30
you next time. Another TWiT is in the can.
2:44:33
Bye bye. Crazy.
2:44:37
Doing the TWiT. Alright. Doing
2:44:39
the TWiT, baby. Doing the
2:44:41
TWiT. Alright.