Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
0:00
The business of OpenAI gets weird. Tesla
0:02
now wants to be a robotics company as
0:04
its stock price drops. Plus, when is it
0:06
time to get off the hedonic treadmill? All
0:09
that and more coming up with the cast
0:11
of Slate Money right after this. The
0:13
LinkedIn Podcast Network is sponsored by TIAA. TIAA
0:17
makes you a retirement promise.
0:20
A promise of a guaranteed
0:22
retirement paycheck for life. Learn
0:25
more at tiaa.org backslash promises
0:27
pay off. Welcome
0:29
to Big Technology Podcast, a show for
0:31
cool-headed, nuanced conversation of the tech world and
0:33
beyond. We have such a fun show today.
0:36
One I've been looking forward to, the cast
0:38
of the Slate Money Podcast, is here to
0:40
talk about a series of fun stories where
0:42
tech, economics, and finance meet.
0:45
We're going to do a home and home series. So they're here,
0:47
and then I'll come over to their show in a couple of
0:49
weeks, and I'm pumped for that. And so let's kick it off.
0:51
I just want to welcome the cast
0:54
here. Felix Salmon is here. He's the
0:56
chief financial correspondent at Axios. Felix, welcome.
0:58
Thank you very much. Emily Peck
1:00
is also here. She's the markets
1:02
correspondent at Axios. Emily, welcome. Hello,
1:05
hello. Excited to be described as
1:07
a member of a cast, because now I
1:09
feel like I play myself on Slate Money.
1:11
So that's spinning my head. Last
1:14
but not least, Elizabeth Spiers is here. She's a
1:16
contributing writer for the New York Times' opinion
1:18
section, and she writes Slate's Pay Dirt. Elizabeth,
1:21
welcome. Thanks for having us. Thanks
1:23
for coming in. I think that
1:25
this combination of economics and tech
1:28
is very fascinating right now, because we
1:30
have this very weird situation where companies
1:32
and investors keep plowing money into
1:34
these AI startups, and we're not really sure
1:36
what the return is going to be, what
1:38
they're actually using that money for, what the
1:40
business outcome is. And yet, we start to hear
1:42
numbers like trillions of dollars of investment. Really,
1:45
that's what we're hearing now. That
1:47
was Sam Altman with one of the
1:49
craziest numbers I've ever heard.
1:52
I think I might have called it deranged on Axios.
1:55
He came out and said,
2:00
Well, he didn't quite come out and say
2:02
he was reported to have said that he
2:04
was looking to raise $7 trillion
2:07
to build a whole new infrastructure
2:10
around AI, which is so
2:12
far beyond any amount
2:14
of investment that has ever been put
2:16
into anything ever. It
2:19
kind of makes you think that maybe he
2:22
just doesn't understand numbers. Well, it's
2:24
also double the most valuable company in
2:26
the world. That's
2:28
sort of what makes this conversation he had on
2:30
20VC with Harry Stebbings really interesting, and sort
2:33
of we can riff on it because
2:35
we are trying to find out what the economics of
2:37
this AI business is. And so here's
2:39
what Stebbings says. He goes, in terms of
2:41
marginal cost versus marginal revenue, how do we
2:43
think about when marginal
2:46
revenue exceeds marginal cost? Basically, like, are you
2:48
going to have a profitable business? And
2:51
Sam goes, I mean, truly, I think of all the
2:53
things we could talk about. That is
2:55
the most boring, no offense. That's the most boring question
2:57
I can imagine. Stebbings
2:59
goes, why is that boring? And Sam goes,
3:03
well, you have to believe that the price of compute
3:05
will continue to fall and the value of AI, as
3:07
the models get better and better
3:09
will go up and up. And like the
3:11
equation works out really easily. So that's all.
3:13
I want to know what his
3:15
equation is because he does seem to talk
3:17
about all of this as if the numbers don't
3:20
really matter. You're just putting in one bigger number
3:22
and there's a smaller one for the input. And
3:25
for you, where's the line
3:27
between techno-optimism and techno-naiveté? So
3:31
the argument he's making is
3:33
like he's making two different
3:35
arguments and both of them make sense, but I feel like
3:38
he can't make both at the same time. The
3:41
first argument he's making, and this is the
3:43
same argument that Jensen Huang has been making
3:45
at NVIDIA, which is that the price
3:47
of compute has been coming down for decades and
3:50
has now reached the point at which AI is
3:52
possible and there is no indication that it's going
3:54
to stop coming down. And so long as it
3:56
keeps on going down at the same rate
4:00
it has been coming down, which is more
4:02
or less Moore's law, you
4:06
don't need to worry about the
4:08
long-term price of compute; it's going
4:10
to go to zero very quickly. And then all you need
4:12
to worry about is, do I get any revenues?
4:14
And if the revenues are going
4:16
up, then at some point
4:19
OpenAI will become a very profitable company. That's
4:22
a perfectly reasonable position
4:25
to hold.
4:27
And then, however, at the same time, Sam
4:29
Altman has this other position, which is
4:32
basically: in order for the price of
4:34
compute to come down to a
4:37
level where AI is profitable,
4:39
we need seven trillion dollars
4:41
of investment. And that
4:43
is objectively something that is never going
4:45
to happen. There just isn't seven trillion
4:47
dollars of freely available cash in the
4:49
world to invest in anything. And if
4:51
there was, it would not be
4:53
going into AI, mostly. So
4:55
that is a case
4:57
of his own rhetoric
5:00
undercutting his own rhetoric.
5:03
Let me inject some nuance here, which is
5:05
that the seven trillion dollars is a
5:07
number that Sam hasn't fully confirmed yet,
5:09
and it's also something that would
5:11
be for compute and potentially
5:13
other things, maybe training. But
5:16
I think the core argument is
5:18
totally right, which is that this stuff is
5:20
going to cost a ton of money to
5:22
train, and it does sort of contradict this
5:24
idea that the cost of computing is gonna come
5:26
down. It has already cost a ton of
5:29
money. If you look right
5:31
now at, you know, how much it costs
5:33
for ChatGPT to give a single answer to
5:35
a single question, or how much it
5:37
costs to produce a single image, it's a
5:39
huge amount of money, and all of these
5:41
companies are losing money
5:44
on every, you
5:46
know,
5:48
response, basically. And
5:50
this is a
5:54
reprise of the famous blitzscaling
5:56
model. There was
5:58
a great column in The
6:01
Information about this. Basically, what
6:03
is going on right now is
6:05
you have a handful of companies,
6:07
led by OpenAI, who are
6:10
trying to invest as much money
6:12
as they can as early as
6:14
possible in order to gain market
6:16
share and IP, and get
6:19
their AI to
6:21
basically own the AI space, and
6:23
reach a point where no one else can
6:25
afford to build one, or they
6:27
own it in a way that, you know,
6:30
they have patents on it or something,
6:33
it was very unclear, but they
6:35
want to monopolize AI
6:37
going forwards. And all of
6:39
these incredibly high valuations,
6:41
these multi-billion-dollar valuations
6:43
that we're seeing, only make
6:45
sense in a world where
6:47
the companies have some kind
6:49
of comparative advantage, have some
6:51
kind of monopoly on something.
6:54
And this is definitely the way
6:56
the US tech industry has evolved
6:58
over the course of this
7:01
century, right? That is, you
7:03
have a small handful of
7:05
multi-trillion-dollar tech companies that got that way by investing
7:07
huge amounts of money and getting a
7:09
bunch of market share before anyone else,
7:12
and then creating that kind of moat
7:14
and becoming, you know, basically impossible to
7:16
compete with. And the
7:18
bet that the investors are making is the
7:21
same thing is going to happen, and there's
7:23
just gonna be a handful of AI companies,
7:25
rather than AI being a sort of
7:28
broad public utility, like
7:30
TCP/IP, that everyone can
7:33
use. It also just creates
7:35
an incentive for any tech CEO who follows
7:37
that model to kind of stick a finger
7:39
in the air in terms of determining how much
7:41
capital they need, and take the biggest number possible,
7:43
which seems to be part of what's happening.
7:46
The difference
7:48
here, versus the last time around,
7:50
the building of the big tech monopolies,
7:52
which seems to be how we ended
7:54
up, is that the cost of entry
7:56
into the AI space from the start is
7:59
so high that you already have
8:01
monopolies. OpenAI already has pretty much
8:03
a monopoly on AI, and
8:05
it's mostly funded, right, by a big
8:07
tech company. There doesn't seem like there's
8:09
a lot of innovation around
8:11
the startup space, because the
8:13
cost of entry is so high. What we
8:15
have is one genuine
8:17
monopoly, and I would choose
8:20
Nvidia, right? Everyone, everyone in the
8:22
AI space is using the same
8:24
H100 chip. And one
8:27
of the, one of the reasons why
8:29
Sam Altman wants a much bigger, broader ecosystem
8:31
is that he feels that it is unhealthy
8:34
for Nvidia to be the only company
8:36
making AI chips. So he says, I want
8:38
to build fabs. For the
8:40
fabs, it's not going to cost seven trillion dollars;
8:42
it's more like fifty billion dollars. But
8:45
like, even raising fifty billion dollars
8:47
to build a fab is very hard,
8:49
given that, you know, a
8:51
large number of companies have tried
8:53
to build cutting-edge
8:55
fabs and have failed.
8:57
Really, only TSMC has shown
9:00
itself capable of building those
9:02
fabs at that size. So insofar as
9:04
there is, like, an Nvidia-TSMC duopoly
9:06
that still really owns most
9:08
of this, that is the moat around
9:10
here. Say Sam Altman's right,
9:12
and the cost of compute does come down.
9:15
There's also the other side of the scale.
9:17
Right. And what about the revenue piece? And are
9:19
we at the point yet, Alex, you follow
9:21
this more closely, that there's, like,
9:23
a lot of money to be made in AI
9:26
for real? Like, I know Nvidia's making a lot
9:28
of money selling chips to companies who hope that
9:30
they make a lot of money from AI, but
9:32
has anyone done anything worthwhile? Like, what's the
9:34
iPhone of AI or whatever? There certainly is no
9:37
consumer-facing AI product that is
9:39
making any real revenue, but, like, you're right, it's in
9:41
that kind of middle. There's,
9:43
like... who was it? There
9:45
was some big consultancy company,
9:48
Accenture or something, that said they
9:50
just made six billion dollars on AI
9:52
consulting. I guess those are all, like,
9:54
derivative plays. Like, it is
9:56
very, very hard. One of
9:58
the things that we have seen in the, whatever,
10:00
year and a half since ChatGPT
10:02
came out and caused, you know, all
10:04
of the, you know, craziness, is
10:07
that there is very little
10:09
real consumer demand from normal
10:11
human beings
10:14
who want to pay cash for this.
10:17
The one last thing I think could be
10:19
a money maker down the line is the
10:21
labor cost savings. Like, Jeffrey Katzenberg had a
10:23
quote in Axios
10:25
where he said, like, with
10:27
AI, the timeline for making a movie
10:29
is basically cut in half. The amount of
10:31
labor you need is cut in half.
10:34
That seems like amazing amounts of money, but
10:36
it's not sexy, like. So.
10:40
I do want to push back on the
10:42
idea that OpenAI has a
10:44
monopoly in this, because you do have other
10:46
companies, and this is going to lead
10:48
into the other question, but you
10:50
do have other companies building these frontier models,
10:52
or foundational models:
10:54
Meta with Llama 3 and Anthropic with
10:56
Claude. Like, Claude recently surpassed OpenAI
10:59
for a moment. And this is
11:01
where the interesting question about the economics happens
11:03
for me, which is that, you know,
11:05
if all these models become commoditized,
11:07
like, you're going to have
11:09
Meta's Llama 3 available
11:11
for free, open source, then where's the
11:13
actual value created? And does it
11:16
actually accrue to the model
11:18
makers, or to the people that build
11:20
on top? And I strongly believe that
11:22
it's gonna be accruing to the companies
11:24
that build on top of these models,
11:26
whether that is a consumer product
11:28
or a business. This is, like, the labor saving
11:30
and the business efficiencies. The companies
11:32
that use them innovatively are the ones that,
11:34
like, will actually make the money here. And that
11:36
sort of goes to Meta's bet, which is, like,
11:38
Meta can just give this away for free. It's not
11:40
going to be worth really anything. And
11:42
maybe that goes to Sam's point that
11:44
the cost of intelligence is gonna be
11:46
low. But then you have a real
11:48
ROI question. Yeah, no, this
11:50
is exactly correct. And Emily's point is
11:52
very well taken on Jeffrey Katzenberg.
11:54
Very well taken. One
11:57
way to make lots of money out of
11:59
a technology is to take a
12:01
technology and charge for it. Another
12:03
way to make lots of money out
12:05
of a technology is to take a
12:08
technology and use it to cut your
12:10
costs. And that does seem to be
12:12
something that people are already doing with
12:14
some genuine profit.
12:17
Businesses are doing it, and it's
12:19
gonna become much more common
12:21
over the next few years, and that
12:23
is going to be good for the
12:25
economy, and that is gonna be good
12:27
for all of the companies that do
12:29
it. And on some level, you know,
12:31
if the cost savings are high enough,
12:33
then the companies will be willing to
12:35
pay some non-trivial amount of money
12:37
for the AI that they're using. On
12:39
the other hand, if
12:42
the cost savings are
12:45
the same no matter which AI
12:47
they use, and, you know, some
12:49
of the AIs are open source, or they can
12:51
even just build their own with open source
12:53
tools, then
12:56
they'll probably go that way, and there
12:58
won't be a lot of revenues
13:00
going to the AI companies, and AI
13:02
will be this force for productivity and profitability
13:05
in the economy, and the AI companies
13:07
themselves, OpenAI and
13:09
Anthropic and the rest of them,
13:12
will turn out to be not particularly
13:14
valuable. And this, by the way, is
13:16
an outcome that OpenAI has always
13:19
envisaged, right? Like, in the early days
13:21
when they were asking for funding, they
13:23
said, we would like you to consider
13:25
your, you
13:28
know, funding
13:30
to be in the spirit of
13:32
a donation, and it's still a
13:35
nonprofit. If
13:37
that is the outcome that OpenAI
13:39
winds up with, making everyone else profitable without
13:41
being profitable itself, that is a good
13:44
outcome for the economy, and that is a
13:46
good outcome for the world. Another factor here
13:48
that I am not sure
13:50
if Altman has spoken directly to is
13:52
that, yeah, you don't have infinite
13:55
data, and the cost of data acquisition
13:57
certainly isn't falling the way that the cost of computing is falling.
14:00
You know, right now you have AI companies
14:02
looking at buying traditional book publishers just so
14:04
that they can add to the corpus of
14:06
things that they're training the models on. So
14:08
the inherent value of the business isn't
14:10
just about the
14:12
algorithmic model, it's about what you can
14:14
do with it within the limitations of the
14:17
data you have to train it on. Are
14:19
we thinking too small here? That's like the other
14:21
question that's coming up because there's another thing that
14:23
Sam Altman said last week that went even more
14:25
viral than the thing that I mentioned. Whether
14:29
we burn 500 million a year or 5 billion
14:32
or 50 billion a year, I don't
14:34
care. I genuinely don't. As long as
14:36
we can, I think, stay on a
14:38
trajectory where eventually we create way more
14:40
value for society than that, and
14:42
as long as we can figure out a way to pay the bills, like
14:44
we're making AGI, it's going to be expensive. It's totally
14:46
worth it. So, yeah, I mean,
14:49
it really is... I
14:51
hate to say this, but it kind
14:53
of smells a little bit like Sam
14:55
Bankman-Fried, you know? Not
14:58
saying that he's a... But that kind
15:00
of like, it doesn't matter how much it
15:02
costs just as long as you have a
15:04
positive EV somewhere down the road. That
15:08
idea of you
15:11
can lose any amount of money just so
15:13
long as the value
15:15
of your company is rising faster than
15:17
the losses are piling
15:19
up, is a very
15:24
dangerous game to play if you
15:26
don't really have a sort
15:28
of, let's call it, three
15:31
to four year plan for turning
15:33
it into profits. Like, Sam's idea here seems
15:35
to be like, well, maybe at some point,
15:37
10 years down the line, we will have
15:39
AGI and then we will make lots of money.
15:41
And there are two problems
15:44
with that, which is one, that 10 years down the line
15:46
is a very long time to be burning $50 billion a
15:48
year. But two
15:50
is that he seems
15:52
to just assume that once there's
15:55
AGI, then OpenAI will be a trillion
15:57
dollar company and worth lots of money.
16:00
Again, that's not
16:02
obvious either. To
16:05
Elizabeth's point, I think
16:08
Sam is already trying to move on
16:10
from the LLM model. Right
16:16
now, most of the AI
16:18
that is getting most of the buzz are
16:20
these large language models that need to
16:23
be trained on a bunch of existing
16:25
language. I think everyone is in
16:29
agreement that if you're going
16:31
to get AGI, artificial general
16:34
intelligence, it's not going to
16:36
be a chatbot
16:38
that basically gives
16:41
language answers to language questions because
16:43
it's trained on language models. He's
16:46
going to need to invest a huge amount of money in something
16:48
much bigger than that. Yeah, we don't even know if
16:50
AGI is possible. So it's sort of putting
16:53
any timeline underneath it is
16:55
speculative and I realize that
16:57
that's part of your
16:59
job if you're working
17:01
in an innovative frontier tech
17:04
company. But in the case
17:06
of AGI specifically, even experts
17:08
who have been studying this for decades aren't
17:10
sure that we will ever get to
17:12
AGI. So even making
17:14
estimates about what sort
17:16
of resources it would take and how long
17:19
it would take. It's very strange
17:21
to look at a company, it's not
17:23
a public company, I guess, again, Altman
17:26
can make speculative statements about it,
17:28
but he does so with such
17:31
confidence when the underlying goal
17:33
is not even something we know
17:35
can be achieved. Right,
17:37
but this is one of the
17:40
things that Silicon Valley VCs love
17:42
is people who have great confidence
17:44
about things that are highly improbable.
17:47
They have a bunch of, you know,
17:50
they've learned by looking
17:52
that if you fund someone
17:55
who is very confident about something that seems
17:57
impossible, then there's a good chance you'll lose
17:59
all your money. But there's also like
18:01
those are the ones that have the biggest
18:03
upsides as well. He said it's
18:06
going to create way more value for society.
18:11
What are the problems that AGI solves? Like,
18:14
I can, like, rattle off many problems with
18:16
society, and none of them, in my head,
18:18
can be solved by Sam Altman and his
18:20
company. I think the big answer
18:22
on that front is scientific discoveries like I think
18:24
it's no accident that one of the things that
18:27
DeepMind will tell you about is
18:30
AlphaFold, like, in the first breath, where they've
18:32
been able to decode proteins because they think
18:34
that, you know, will help with drug discovery,
18:36
and maybe there's an idea that you train
18:38
these bots on You know all
18:40
the scientific literature and you give it some problem
18:42
sets and the things that they're able to
18:44
do now. Or, the thing everybody's working
18:46
on is reasoning, so they can break it
18:48
down to the component parts and then you
18:50
know Try different solutions on each step and
18:53
eventually get you to a solution and
18:55
I do wonder, let's say we
18:55
don't get to AGI, but let's say we get some
18:57
things that might be, you know, maybe short of it
18:59
but close, right? So these agents that
19:02
take action for us, this ability
19:04
to reason, scientific discovery, making
19:09
our everyday business operations more efficient. Maybe
19:12
that is something that's that's quite valuable
19:14
I don't I wouldn't you know happily
19:16
burn 50 billion dollars a year on
19:18
it But you know to earnestly take
19:21
up Sam's case like maybe
19:23
there is something there I guess it's like
19:25
cure for cancer. That's like the answer, but
19:27
it's not just a cure for cancer It's
19:30
like a highly individualized cure for
19:32
cancer, right? It's
19:34
the ability for an individual
19:37
person with an individual genome to go
19:40
in and with an individual, you know
19:43
cancerous growth and get
19:46
a treatment that
19:48
is tailored for them at a
19:50
very low cost right now that kind of
19:52
exists But it costs like over a million
19:54
dollars and if we can bring that down
19:56
from a million dollars to you
19:58
know, a hundred dollars... But all
20:01
the money being spent
20:03
on the healthcare system,
20:05
it is so inefficient and expensive, and the
20:07
problems are so basic. Not
20:10
to be all like there are starving people
20:13
in other countries kind of an argument, but
20:15
there are more immediate and solvable
20:17
healthcare problems that these billions and
20:19
billions and billions of dollars could
20:21
go to solve to better society
20:24
right now, versus spending $50
20:26
billion or $5 billion a year on
20:28
something that may not ever
20:30
come to fruition, and that maybe no one
20:32
can afford in the final analysis. If
20:38
you look at the people who
20:40
are funding this, some of them
20:42
do have what you
20:45
might call quasi-philanthropic goals. They
20:48
do think of this as a form
20:51
of for-profit
20:53
philanthropy, Matthew Bishop would call it
20:55
philanthropic capitalism and okay,
20:58
fine, we can have a whole other segment on that if
21:00
we want, but I
21:03
don't think anyone is ... There's a small pocket of
21:05
true believers saying that this
21:09
is the first best place to invest money for
21:11
the sake of the well-being of the planet and
21:13
if you want to help the poor, then this
21:16
is the best way to do it. That
21:19
small pocket kind of lost
21:21
a lot of credibility when FTX imploded
21:23
because a lot of them were effective
21:26
altruists of some flavor and
21:28
I think we've kind of moved
21:30
on from that. I think that to say that it
21:32
is not the first best philanthropic
21:34
place to invest your money to help the
21:37
poor is not to say that it's a
21:39
bad investment. I agree with that, but I
21:41
think there are two other things that we
21:43
have to look at. One is that a
21:46
lot of the strategic money that's going into AI
21:49
right now is still just about AI
21:51
hype and whenever you sort of scratch
21:53
the surface of what a lot of
21:55
people like Altman are saying, they're clearly
21:57
relying on the fact that most people when
21:59
they think about AI can't distinguish between, say,
22:01
a large language model
22:03
or machine learning
22:05
or image-based visual
22:08
or image-based generative AI. It's
22:11
all just one category. And these are very
22:13
different technologies. And I know
22:15
we were going to talk about Tesla a little
22:17
bit. Elon Musk is now claiming that Tesla
22:19
is an AI company. And when
22:22
I see that, I just see an
22:24
attempt to get money that's already
22:26
flowing into a very specific sector to
22:29
start flowing in his direction. Well,
22:32
he's calling it a robotics company, which is different than AI.
22:34
And we can talk about that. Yeah. He's
22:36
also got a separate company called XAI, which is
22:38
an AI company that he's raising like $5 billion
22:41
for. He also says that his robotics model will
22:43
be out by 2025. And
22:46
if that's, you know... Elon
22:48
says lots of things. But to your
22:51
point, Elizabeth, insofar as the people making
22:53
these investments, and to be clear, these
22:55
investments are large, but they're not enormous.
22:57
They're like, you know, some
22:59
fraction of the VC money out there. And
23:01
the VC money out there is some small
23:04
fraction of the total, you know, investor
23:06
base. Insofar as
23:08
the people making these investments are
23:11
being silly and making category errors and doing
23:13
all of the things that you say that
23:16
they're doing, like these
23:18
are VCs. Being
23:21
silly and making category errors is
23:24
what VCs do. And the whole
23:26
point about VC money is it's risk
23:28
capital that literally everyone who is invested
23:30
in a VC fund can afford to
23:32
lose. Like, this is the correct money
23:34
to make dumb bets that are going
23:37
to lose. It is
23:39
not dangerous for VCs to light
23:41
a billion dollars on fire. It is perfectly fine.
23:43
They always have and they always will. Can
23:46
I... Let me give you a
23:48
counterpoint on that one. A lot of the money
23:50
funding these companies have come from the big tech companies.
23:53
Right, so you think about Microsoft as being a
23:55
huge funder of OpenAI and Google and Amazon have
23:58
been huge funders of Anthropic. And
24:01
Meta has used its own money to build Llama
24:03
3. So, like, try to find someone who's
24:03
really taken VC money
24:07
and put it into the development of large language
24:10
models and it's a little bit tough to find
24:12
it without big tech money. So actually I think
24:14
what you have is instead of VCs taking this
24:16
money and sort of squandering it, you
24:18
have these tech companies taking
24:21
the investment capital of retail
24:24
investors and institutions. It is not the
24:26
investment capital of retail investors, it is
24:28
their own profits. It is all highly
24:30
profitable companies. Microsoft is famously
24:32
just giving Azure compute more
24:34
than actual cash
24:36
dollars. Google has
24:38
definitely invested a lot of money into
24:40
DeepMind over the years and
24:44
Facebook has famously bought billions
24:47
of dollars worth of H100 chips and
24:49
yeah, fine. But this is money
24:51
they can afford to spend and
24:54
again, I'm not, these
24:56
are already multi-trillion-dollar companies. So
24:59
if you look at the other multi-trillion
25:02
dollar companies, if they
25:04
burn a few billion dollars, they will
25:06
still be multi-trillion dollar companies. It's
25:09
kind of no harm, no foul. Okay,
25:11
so we've talked a little bit about Elon Musk in
25:14
trying to pivot to robotics within Tesla. Why
25:17
don't we take a break and come back and unpack that. So
25:19
we'll be back right after this. The LinkedIn
25:21
Podcast Network is sponsored by TIAA. In the
25:23
last 100 years, we've seen financial markets swing,
25:28
decades of savings lost in days,
25:30
all showing that a retirement plan
25:33
without a guarantee quite simply isn't
25:35
enough. So more than a retirement
25:37
plan, TIAA makes you a retirement
25:39
promise. A promise of
25:42
a guaranteed retirement paycheck for
25:44
life. A promise that pays
25:46
off. Learn more at tiaa.org
25:48
backslash promises pay off. Hi,
25:52
I'm Tomer Cohen, LinkedIn's chief product officer.
25:55
On my podcast, Building One, we dive deep
25:57
into what it takes to build great products.
26:01
Recently we had Zach Perret, the CEO of
26:03
Plaid, and he shared his struggles building
26:05
a financial app for consumers and how he
26:07
was able to turn it all around with
26:09
a critical pivot. Take a listen. I
26:12
personally couldn't resonate as much with the consumer set
26:14
that we were trying to reach. I just didn't
26:16
have that level of empathy. When we
26:19
made the shift to building a B2B product though,
26:21
I was building the product that I wanted. My
26:23
co-founder and I were creating the product that we wanted
26:26
ourselves and we had so much empathy for what that
26:28
product was. Such a great insight. You know, in that
26:30
sense we got lucky because we were creating a thing for
26:32
ourselves. And then the people that we were talking to also
26:34
had the same problems we did. They were FinTech developers. We'd
26:36
been the FinTech developer. We've been trying
26:38
to build a FinTech product for a year. And
26:40
so we had such deep empathy. We
26:42
had such a clear ability to... If you want
26:45
to hear more of Zach Perret's story and the
26:47
lessons that follow, listen and subscribe
26:49
to my podcast, Building One. And
26:53
we're back here on Big Technology Podcast with
26:55
the cast of Slate Money. Great to have
26:57
you all here. We're cast members. The
27:00
cast members. I remember when I
27:02
joined Disney, they were like, congratulations on becoming a
27:04
cast member. And I was like, okay, this is
27:06
a really weird company to work for. Yes.
27:09
Well, okay. So maybe we'll use a different word.
27:11
The hosts of Slate Money. How's that? Anyway,
27:14
so we talked a little bit before the
27:16
break about the Tesla robotics play. It's happening
27:18
in this moment where Tesla seems to be
27:20
in rough shape. And I know you've talked about
27:22
it on the show, but just
27:24
for context, it's down 25% year to date.
27:28
It's up 8% over the past one
27:30
year, which is interesting, sort of kind of
27:32
lost in this narrative. But BBC just
27:34
had a story asking if the wheels have come
27:36
off for Tesla, saying there was a time where
27:39
it seemed like it could do no wrong,
27:41
but now the company is struggling and it really
27:43
captures it with falling car sales, intense competition from
27:45
Chinese brands, problems with the Cybertruck,
27:47
low sales have hit revenues and hurt profits,
27:49
and the share price has fallen more than
27:52
a quarter since the start of the year.
27:54
It's now in the process of cutting 14,000 employees
27:57
and it's also cut the entire team responsible
27:59
for its much-admired
28:01
Supercharger network. So
28:04
what is going on with Tesla? And then we
28:06
can get in a little bit to this robotics
28:08
pivot. But what's the thing? I know you've talked a lot about it. My
28:11
big picture theory of Tesla is
28:14
that it had first-mover
28:16
advantage, and for a
28:18
long time its EVs
28:20
were three years ahead of everyone else.
28:24
And they're not anymore. And
28:27
now they're basically zero years ahead of
28:29
everyone else. Or
28:31
maybe like a
28:33
tiny bit, depending on what you're looking for.
28:37
And if you look at the stock
28:40
market valuation, it is trading at
28:42
50 times forward earnings compared to
28:44
standard car companies that trade at
28:47
like four or five times forward
28:50
earnings. And good ones like Toyota.
28:56
So something doesn't compute. Something doesn't
28:58
add up there. The idea behind
29:01
that massive multiple
29:03
that it trades on is
29:06
that it has some kind of
29:08
unique competitive advantage over the
29:10
rest of the car industry. And
29:13
if you look around at who's making the
29:15
best EVs and the best value EVs out
29:17
there, it's BYD. It's
29:19
not Tesla. And we're talking about global
29:22
companies here. Tesla has
29:25
a nice little advantage in the United States because
29:27
the United States government is doing everything it can
29:29
to avoid Chinese EVs being sold here. So it
29:32
gets to avoid that competition in the US. But
29:34
that's not the case in the rest of the
29:36
world. And the rest of the world is all
29:38
that. Also, on our show, we've talked about Tesla
29:40
not infrequently as a meme stock. And while it's
29:42
not GameStop, I think
29:44
a lot of the value of
29:47
the stock is heavily wrapped up in
29:49
Elon as a personality and a
29:51
brand. And so some of this, I think,
29:53
is at least Ross Gerber, who's
29:55
a big Tesla shareholder, argues that some of
29:57
the fall in the stock price is really...
30:00
about Elon sort of being a
30:02
chaos monkey within his own company. Right.
30:05
And Elon, he can't stop founding new
30:08
companies, right? He's
30:10
got xAI now, he's got Neuralink,
30:12
he's got the Boring Company, he's
30:14
got Twitter. I'm sure
30:16
there's a few I'm forgetting. He's
30:19
very... SpaceX. SpaceX, of
30:21
course. Yeah. And like, you
30:24
know, he's trying to do all of these
30:26
things at once while tweeting maniacally through the
30:28
whole thing. And
30:30
so at some point you have to
30:32
ask when does Elon stop being the
30:35
reason why Tesla's multiple
30:38
is 10X everyone else and starts
30:41
being actually a weight
30:43
on the stock, such that if he
30:45
left, you know, the stock price would go
30:47
up rather than down. I
30:49
wonder if he's just... So on
30:51
our show, I guess last week,
30:54
we talked about the supercharger situation,
30:56
you know, layoffs and
30:59
cutting out this part of Tesla's
31:01
business that is widely admired and
31:03
believed to someday be profitable.
31:05
And why does this make
31:08
sense? And I tried to argue that I think one of
31:10
our readers calls it the 4D chess,
31:12
you know, argument that like, it
31:14
seems so irrational. There has to
31:16
be some reason that Elon Musk
31:18
did this, that like, he
31:21
can't be this like unhinged and
31:23
wild. And so I kind
31:25
of thought that even though I'm not like exactly
31:28
like an Elon Stan or anything. And
31:30
someone wrote in and was like, no, this
31:32
was just really unhinged and wild.
31:35
No one wanted him to do this, his own company
31:37
didn't want this to happen.
31:39
It's just possible the man is out of control. He
31:42
has a long history of erratic and impulsive behavior too.
31:44
And sometimes people... I think part of
31:46
his lore is that you
31:48
can be a certain kind of charismatic entrepreneur and
31:50
there's a class of people who admires you
31:52
for that kind of chaos, or
31:55
the sort of very
31:57
confident, you
31:59
know, impulsive way his decisions get framed,
32:01
as, you know, I live by my gut,
32:03
and he sort of embodies that,
32:06
and some people admire it. But
32:08
I personally think it's a sign
32:10
of a CEO that's not terribly
32:13
stable. I wouldn't like it as an
32:15
investor, but I understand the appeal
32:17
to certain people. But
32:20
at a certain point, it's like the wheels have
32:22
come off and the stuff he used to do
32:25
isn't working anymore. Like, you know, you used to never
32:27
do your homework and get great math grades, and then
32:29
at some point
32:31
you realize you don't understand how any of it works
32:33
anymore. Exactly. But
32:35
isn't that selling him a tiny bit short,
32:37
though? I agree with a lot of this, but
32:40
also, like, he has been able
32:42
to build Tesla, and SpaceX is doing well.
32:44
I mean, X is, I think, a disaster, but.
32:48
This may sound silly, but like, I think,
32:50
I think this is a super
32:52
interesting question, which is that the more
32:55
that what you're doing is
32:57
solving an engineering problem, the
33:00
better he tends to do. SpaceX
33:02
has two big advantages. One is that
33:05
he kind of doesn't touch it
33:07
very much. He doesn't spend much time on it.
33:09
He has a woman named Gwynne Shotwell who
33:11
runs it, who by all accounts is
33:13
excellent, and he kind of trusts her to do
33:15
the right thing, so it runs itself.
33:18
But also, it's solving engineering problems:
33:20
how do we get really heavy
33:22
things up into space? And he, by
33:24
all accounts, can solve that problem. In
33:27
the early days of Tesla, what he
33:29
had was an engineering problem: how do
33:32
I build an electric car? Electric cars
33:34
were something that didn't really exist. He
33:36
wanted to build a car
33:38
that was, you know, more powerful than a gas car
33:40
and just as affordable,
33:43
and everyone said it couldn't
33:45
be done, and he did it, and
33:47
that was an engineering problem, and that
33:49
was his great contribution to the world,
33:51
right? He showed that it could be
33:54
done. But then,
33:56
having shown that it could be done, other
33:58
people said, well, then we can
34:01
do it too. And now they are doing it
34:03
too, and they're doing it, frankly, just as well
34:05
if not better than he is. If
34:08
you go further away from engineering
34:10
problems, then, say, take the
34:13
Boring Company. He thinks it's an engineering problem,
34:15
like how do you build a tunnel. In fact,
34:17
it's, you know, a zoning problem and a
34:19
mass transit problem and a
34:22
trying-to-deal-with-local-government problem, and
34:24
he's terrible at that, and it's going nowhere,
34:26
and it's a disaster. If
34:28
you buy Twitter, there's no engineering
34:30
at all. It's all about like
34:32
working with humans and networks in
34:34
moderation in all of this kind
34:37
of stuff. and he has no
34:39
idea how to do that. So
34:41
I think that, you know, there
34:43
are things he's good at, but the
34:45
kinds of things that Tesla needs to
34:48
do in order to be successful going
34:50
forward are not really engineering problems.
34:52
The world is not sitting here going,
34:54
EVs
34:57
need to be technologically much more advanced in
34:59
order to be successful. No one is desperately
35:03
holding their breath waiting for full
35:05
self-driving cars and autonomy to arrive. If
35:07
it comes, it comes. Great. For
35:10
the time being, if
35:12
you want to compete on EVs, you've
35:14
got to compete, frankly, on cost, and it's
35:17
very hard to compete with the Chinese on cost.
35:19
In fact, it's impossible. I agree with
35:21
the top-line thesis that Elon's
35:23
success at companies is correlated with whether
35:25
or not it's an engineering problem, but I
35:27
believe it for exactly the opposite reason,
35:30
which is that, like, I don't think
35:32
Elon's really an engineer. And I
35:34
don't mean
35:36
to imply that he knows how to solve
35:39
these engineering problems, and I don't think that's
35:41
what's happening. I think where you see him
35:43
being successful is at a very early stage
35:45
of a company, when his specific skills
35:48
are writing a check for a capital-intensive business
35:50
nobody else wants to put money into. And
35:52
then managing shareholder expectations. And the
35:54
more mature these companies get,
35:57
the more he's not mediated
35:59
by PR people and lawyers,
36:02
people sort of begin to understand that
36:04
he's not the best manager, that his engineering
36:06
capabilities barely exist. Is he
36:09
an engineer by, you
36:11
know, education or trade? Or...
36:14
Is that a yes or no? So the question is,
36:16
is he a good CEO? He's not
36:18
a good product person necessarily. And
36:21
in the things you expect a CEO to
36:23
manage well: managing shareholder expectations, communicating
36:25
well externally. And that's where
36:28
he's shooting himself in the
36:30
foot constantly. right?
36:32
And I think the more he gets
36:34
involved in that, the
36:36
weirder it gets. Like,
36:38
say, with Twitter, which is clearly a
36:40
creature of, you know, Elon Musk as
36:42
product manager, or all of the crazy
36:44
back-and-forth insanity around Twitter
36:46
Blue and who gets a check mark and
36:48
who doesn't. Those kinds
36:50
of product decisions, when he makes them,
36:52
tend to work out very
36:54
badly. That said, you know, the
36:58
model S when it came out
37:00
was as a product, genuinely revolutionary
37:02
and amazing and everyone's mind was
37:04
blown. Or, you know, the amazing
37:07
videos of his SpaceX rockets, like, landing
37:09
vertically and standing upright after
37:11
going to space. Like, okay,
37:13
that's a really, legitimately impressive product.
37:16
Did Elon Musk personally design them?
37:18
No. But, you know, he
37:20
has enough engineering knowledge to
37:23
kind of understand what's going
37:25
on there. So can we then
37:27
think about this robotic thing as the next
37:29
in the line of engineering problems that he's
37:31
tackled and tried to solve? Is that basically what's
37:34
happening with this? In
37:36
terms of, like, this framing of Tesla
37:39
as a robotics company, and,
37:41
you know, well, that makes
37:43
sense. They're actually building a robot: they have
37:45
a humanoid robot called Optimus that
37:47
they say is going to really dramatically
37:49
automate the auto company. If he announced
37:51
that, that makes sense for Tesla. But
37:53
if he's talking about robots for
37:55
general use, I don't understand it at
37:57
all. What's,
38:00
What's this robot supposed to do?
38:02
Alex? I don't fully know.
38:04
I mean it is supposed to be,
38:06
I guess, the humanoid
38:08
robot you would imagine. You can
38:10
sort of put it into action the
38:12
same way you would, like, an
38:14
LLM, except in the real world. So
38:16
something that's assistive, something I imagine can
38:18
do work. But
38:21
we've had, like, Boston
38:23
Dynamics has been doing these, like, robot
38:25
demos for a while, but they're not exactly,
38:27
like, mass produced, outside of, like, sometimes
38:30
the NYPD will buy one
38:32
and there'll be, like, this whole blowup
38:34
around it. There's a creepy
38:36
robot at my supermarket that...
38:38
Yeah, yeah, no, I've,
38:40
I've certainly had a couple of, like,
38:43
cute little robots in hotels that, like,
38:45
deliver your room service to you.
38:47
But the other thing
38:49
that we have to mention about this
38:52
sort of extended Elon
38:55
universe is that he
38:58
can kind of put whatever he
39:00
likes wherever he likes. This robot
39:02
that he's talking about is,
39:04
you know may be part of Tesla
39:06
right now, but maybe it could suddenly
39:08
turn out to be part of
39:11
xAI, you know, if he woke
39:13
up one morning and decided to
39:15
change his mind. And the weird
39:17
part is that xAI is
39:20
actually him basically
39:22
threatening
39:25
the board of Tesla and
39:27
saying like, unless you give me another
39:29
hundred billion dollars worth of pay, I'm
39:31
just gonna do all of my
39:33
AI stuff somewhere else. He said that
39:35
quite explicitly. Yeah.
39:38
He famously brought a bunch of his Tesla
39:40
engineers over to Twitter after he bought
39:42
it to, you know, evaluate
39:44
the Twitter engineers. As
39:46
an investor in any of Elon's
39:48
companies, you kind of don't
39:51
know what you're investing in, because that
39:53
money could just wind up benefiting
39:56
a completely different company altogether. Yep,
39:58
So this is from
40:00
engineering.com. They say the robot is
40:02
designed to be a general-purpose
40:04
machine that can help humans in
40:06
various domains such as manufacturing, construction,
40:08
health care, and entertainment. And
40:11
I will add, you
40:13
know, if you want to revolutionize
40:17
the American economy, a robot
40:19
that can build houses would be amazing
40:21
because the cost of building a house.
40:23
The labor cost of building a house
40:25
is not only extremely high, but there
40:27
just isn't enough labor to go around
40:29
that there's a massive labor shortage of
40:31
people out there who are skilled enough
40:33
to build a house and if we
40:35
could get a bunch of robots to
40:37
do that, that would be amazing for
40:39
making housing more affordable. Yeah, I'm
40:42
watching a video of it now and this
40:44
robot is, like, taking things off an assembly
40:46
line and stocking them in, like, special compartments in
40:48
some container. So, who knows.
40:50
I mean, you know, a lot of
40:52
robotics is used in manufacturing. It's not like...
40:55
Yeah, but that's on, like, assembly lines, and the
40:57
idea is that if you put
41:00
a sort of, you know, AI
41:02
chip inside it, it can work
41:04
in, like, real-world
41:07
situations, like a building site.
41:11
Okay, as we're coming towards a close, I just
41:13
want to talk about this thing that I've, like,
41:15
had in my prep doc with Ranjan
41:17
for, like, months and haven't gotten around to,
41:19
but I think this is the right crowd
41:21
to talk about it with. And that is,
41:24
sort of, when it's time to get
41:26
off the hedonic treadmill and retire, and
41:28
whether retirement is still gonna be a
41:31
thing. So just to set it up, there was
41:33
a Reddit post where this person
41:35
posted, and they said: after the
41:37
first two or three million, a paid-off
41:39
home, and a good car, there's no difference
41:42
in quality of life between you and Jeff
41:44
Bezos. Basically, like, the sooner
41:46
you figure this out, the happier you're going
41:48
to be. And time is the currency of
41:50
life, not money. And
41:52
Austin Rief, the founder of
41:54
Morning Brew, he posted this, and
41:57
he, like, summarized the responses, and he
41:59
said it's funny. Everyone I know who
42:01
has two to three million dollars thinks the
42:03
magic number is ten million, and everyone I
42:05
know with ten million thinks the magic number is
42:07
twenty-five million, and everyone I know who
42:09
has twenty-five million thinks the magic number
42:11
is one hundred million. So, basically, he's saying
42:14
I know a lot of rich people, which is
42:16
kind of a humblebrag. But,
42:18
I guess, like, let me turn it
42:20
over to the Slate Money crew on this
42:22
one. What do you think
42:24
about this? And, I mean,
42:26
I guess, like, the
42:28
first thing we need to
42:30
ask is, like, you know,
42:32
let's be clear about defining our terms.
42:34
What are we defining here? Is it
42:40
how much money can you
42:42
be happy living
42:45
on in the
42:47
absence of any income? How
42:49
much money you need to have in
42:51
order to retire comfortably and have basically
42:53
the same standard of living as Jeff
42:55
Bezos, to within, you know, five percentage
42:58
points of Bezos? So
43:01
And that's, that's an interesting question.
43:03
But one of the,
43:06
one of the ways you need to think about it, the next
43:08
question that you need to ask is: how
43:11
much money are you making right
43:13
now in income? Because
43:15
to your point, Alex about the hedonic treadmill.
43:18
The whole point of the hedonic
43:20
treadmill is that you are
43:23
a little bit unsatisfied with your current
43:25
income and you want a little bit
43:27
more income, and that is a
43:29
function of income, not of a lump sum of cash.
43:32
Now, a lump sum of cash,
43:34
an amount of wealth, will generate
43:36
a certain amount of income. And for
43:38
our purposes, let's just say four percent.
43:41
Let's just say that you know a
43:43
lump sum of cash will generate a
43:45
certain amount of real income in perpetuity of roughly
43:47
four percent. If you have a million
43:49
dollars, that will give you forty thousand dollars a
43:52
year in real income
43:54
in perpetuity. So if
43:57
You have amassed your million dollars
43:59
of wealth by earning one hundred
44:01
and fifty thousand dollars a year.
44:03
And then you retire with a
44:05
million dollars and suddenly have to
44:07
live on forty thousand dollars a year,
44:10
that's a major decrease in your
44:12
standard of living. If, however, you
44:14
just graduated from college and you inherited
44:16
a million dollars, you know, you've never
44:18
had forty thousand dollars a year to live
44:20
on, and you suddenly have a
44:22
forty-thousand-dollar-a-year income stream, then it's
44:25
an increase in standard of living and you can probably
44:27
do that. So I think
44:29
that there's two variables here, right? It's
44:32
not just the question of how much
44:34
money is enough; it's also how much
44:36
income you are used to.
44:38
And if you can reach that
44:41
point where the amount of money
44:43
you have, divided by
44:46
twenty-five, is equal
44:48
to your current income, then
44:50
I think you can happily retire. And
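Felix's rule of thumb above can be sketched in a few lines. The 4% real withdrawal rate is the assumption he states on the show, and the function names here are just for illustration:

```python
# Hedged sketch of the "4% rule" arithmetic described above: a lump sum
# throws off roughly 4% a year in real income, so the wealth needed to
# replace your current income is that income times 25 (i.e., 1 / 0.04).
WITHDRAWAL_RATE = 0.04  # assumed safe real withdrawal rate

def annual_income(wealth: float) -> float:
    """Real income in perpetuity generated by a lump sum of wealth."""
    return wealth * WITHDRAWAL_RATE

def number_to_retire(current_income: float) -> float:
    """Wealth at which retiring holds your standard of living steady."""
    return current_income / WITHDRAWAL_RATE  # same as current_income * 25

print(annual_income(1_000_000))   # $1M lump sum -> $40,000 a year, as in the example
print(number_to_retire(150_000))  # a $150K-a-year earner needs about $3.75M
```

So the $150K earner who retires with only $1M takes the big cut in living standard Felix describes, while the new graduate who inherits $1M sees an increase.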
44:53
Let me just talk about, like, the
44:55
retirement question overall, because more and more we
44:57
see that our social systems are overburdened, and
44:59
I think a big political issue for
45:01
the next couple years is gonna be whether
45:03
these things like Social Security need to
45:06
kick in at the ages they do
45:08
now. Here's one quote from Ben Shapiro.
45:10
He said: no one in the United States
45:12
should be retiring at sixty-five years old.
45:14
Frankly, I think retirement itself is a
45:17
stupid idea unless you have some sort of problem.
45:20
And he also said that anyone who sits
45:22
down and decides he has enough money
45:24
should keep working for the good of society.
45:26
Yeah. So, first of all,
45:28
thirty percent of people retire
45:30
between the ages of sixty-two and
45:33
sixty-four. And a
45:35
bunch of people retire at sixty-five.
45:37
People retire, I think, earlier than
45:39
they think they're gonna retire. I
45:41
don't really know what he's talking about. For
45:44
many people, when to retire isn't
45:46
really... they don't have as much agency
45:48
in making that decision, I think,
45:50
as someone like a Ben Shapiro is
45:52
imagining. You know, you
45:55
got laid off from your last job because
45:57
you're too expensive. Your company would
45:59
rather hire someone thirty years younger than you, so
46:01
that's what happens. And all of a sudden you're out of work,
46:04
and, you know, you're sixty-one years old
46:06
and no one wants to hire a sixty-one-year-old
46:08
anymore. So you're consulting and you're
46:10
basically retired. Or you get hurt on
46:12
the job. There's so many people,
46:15
you know, without college degrees, that are doing some
46:17
kind of physical labor, and their bodies can't
46:19
make it to sixty-five, or
46:21
sixty-seven, or... And I think Ben
46:23
Shapiro is just incapable of imagining the
46:25
lives of people who are not, like, white-collar
46:27
elites. You know, when you look
46:30
at people who retire earlier than
46:32
sixty-five, a lot of people don't even
46:34
have retirement plans, and they end
46:36
up doing it because the work
46:38
is exhausting. If you're doing a
46:41
job where you have to do hard
46:43
labor or even in positions where you're
46:45
on your feet all day in retail
46:47
for five maybe six days a week.
46:51
And I think some of this, when
46:53
Ben says he doesn't think people should retire,
46:55
I think he is reflecting a sentiment
46:57
that's a little bit political, which is
46:59
that work is inherently good. And
47:02
everyone should strive to work. And
47:04
the reality is a lot of people work in
47:06
really crappy jobs that make them miserable.
47:08
So you sort of have to ask
47:10
yourself who benefits from that. And
47:13
then, in terms of how much you need, I
47:15
think, and Felix has written about
47:17
this, like, no one really
47:19
knows how much they need in retirement. Like, it's
47:22
a real big mystery. Like, you get to the
47:24
end and you have a lump sum of
47:26
money, but then, like, you don't know one
47:28
of the parts of the equation, which
47:30
is, like, how long you're gonna live. It's kind of
47:32
a mystery, and you hope for the best, but you also
47:35
need the money to last. And so that magic
47:37
number you've reached, I think there's a
47:39
lot of anxiety there in terms of, like, making
47:41
that decision, like, I'll stop bringing
47:43
in money and, like, hope what
47:46
I have lasts for the next twenty,
47:48
thirty years or something. And,
47:50
to be clear, for, like, the people with
47:52
twenty-five million who think they need the hundred million,
47:55
like, at that point, just
47:57
to be clear: those people do not
47:59
think they need a hundred million because
48:01
they are worried about burning through that
48:03
twenty-five million. Those people think they need
48:06
the hundred million because at that point you
48:08
start becoming more ambitious in terms of how
48:11
much money you want to have when you
48:13
die, and you want to leave money for
48:15
your family and your kids, you wanna leave
48:17
money to charity, you know, you want
48:20
a certain amount of wealth, you want a certain
48:22
amount of legacy. It's a little bit like,
48:24
once you have twenty-five million, there is
48:27
almost zero chance you're just gonna
48:29
spend it all. And
48:31
that brings me to the government.
48:33
Do we think the government, which
48:35
has borrowed against Social Security, is
48:37
about to wage, like, a war
48:40
on retirement, as they try to figure
48:42
out a way to raise the retirement
48:45
age so they can avoid paying it out?
48:47
That won't happen, because Social
48:49
Security enjoys enormous bipartisan support. And
48:51
there are people, Republicans specifically, who
48:54
would rather that not be the case,
48:56
because it makes it hard to kill
48:58
entitlements generally. But it's a
49:01
given that their base is
49:03
one of the most rapidly aging segments
49:05
of the population, it's gonna be
49:07
very difficult to get anything passed
49:10
politically that would actually
49:12
put a dent in
49:14
Social Security as a program. I
49:18
think that raising the retirement age, that's
49:20
something I could see happening. It's
49:22
happened in other countries. People hate it,
49:24
and it has happened there anyway.
49:26
It has happened in this country before, too.
49:28
Yes. And it does, I mean, it
49:31
makes sense, since people do live
49:33
longer, but unfortunately poor people,
49:35
low-income people, don't live
49:37
that much longer. So I'm not
49:39
sure about it as a policy overall. Okay,
49:43
we talked about AI, Tesla,
49:45
and the hedonic treadmill. That's a pretty
49:47
diverse but super fun conversation. So
49:49
thank you to the Slate Money crew,
49:52
the co-hosts Felix, Emily, and Elizabeth. Great
49:54
getting a chance to speak with you
49:56
about this stuff, and I really can't
49:58
wait to hang out in your neck
50:00
of the woods sometime. Thanks, Alex,
50:03
it's been a pleasure. Thanks again, thanks
50:05
everybody. I will be back on Friday
50:07
with Ranjan Roy to break down
50:09
the week's news. Until
then, we'll see you next time.