The US has found a common enemy to unite against: China. AI is not going to kill you - not yet at least. The Fed will get the slowdown it wants, maybe even recession. Europe does not realise just how far the policy agenda has changed in DC.

Released Friday, 5th May 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.

0:00

It all started with mom. She's always

0:02

had a way of making everyday moments feel like

0:04

an adventure. So this Mother's Day, Kate

0:06

Spade New York is here to help you thank mom and

0:08

all the mother figures in your life. From our

0:10

newest arrivals, like springy dragonfly

0:13

adorned handbags and jewelry, to gifts

0:15

under $100, to the best sellers

0:17

on mom's most wanted list, there's something

0:19

for everyone. Maybe you can even treat

0:21

yourself too. Shop the Mother's

0:23

Day gift guide at katespade.com.

0:26

Welcome

0:28

to The Other

0:31

Hand,

0:34

a podcast

0:36

by Jim Power and Chris Johns that looks

0:38

at the major political, economic and

0:40

financial developments around the world from

0:43

a uniquely Anglo-Irish perspective.

0:52

All our podcasts can be found at our Substack

0:55

site and all good podcast platforms.

1:00

We're doing something a little bit different today. We

1:02

have a guest joining us from San

1:05

Francisco, Noah Smith. Noah

1:07

grew up in Texas. He studied physics

1:09

in college. He was an economics PhD

1:11

student at the University of Michigan.

1:14

He was an assistant finance professor at Stony

1:16

Brook University in New

1:19

York state. He was then an economic

1:21

columnist for Bloomberg Opinion,

1:23

left that in 2021 to become

1:25

a

1:26

full-time blogger. Although he had

1:28

been blogging, I think for about a decade

1:30

prior to that. The blog is called

1:33

Noahpinion. I would strongly recommend to

1:35

all our listeners to check it out,

1:37

subscribe. It's absolutely fantastic.

1:40

He covers economics, politics,

1:42

finance, technology. An interesting

1:45

fact about Noah is that he believes that

1:47

rabbits

1:48

are an underrated pet

1:50

and he lives in San Francisco with two

1:52

rabbits, Cinnamon

1:54

and Constable Giggles. Noah

1:56

is also a self-described

1:59

techno-

1:59

optimist. And it's on

2:02

that note, Noah, I'd like to start off. In

2:04

The Economist today, there was

2:07

an interview with Yuval Noah

2:09

Harari. Just to throw a quote

2:11

at you. He said that AI

2:14

has hacked the operating system of human

2:16

civilization. And storytelling

2:18

computers will change the course of human

2:21

history. As a techno optimist,

2:23

how do you respond to that? Great,

2:26

let's, let's see what they can do. That's

2:28

pretty much my response.

2:29

It's, um, I don't know about the

2:32

metaphors of the operating system of human

2:34

society. I don't really know what that necessarily

2:36

means, um, which is not a criticism.

2:39

I do think that, you know, large language

2:41

models, chatbots like GPT-4

2:44

or whatever, you know, they'll

2:46

change a lot of things because a lot of what we spend

2:48

our time doing is sort of talking to each other that

2:51

will, that will change. Also those,

2:54

those machines will provide a way for us to

2:56

tell computers and robots

2:58

and all, you know, mechanized systems what

3:00

to do simply by talking at them. So instead of

3:02

having to actually write code

3:05

and essentially write the instructions

3:07

of what the machine should do, you say, Hey

3:09

machine,

3:10

create a window on my screen that

3:12

looks like

3:13

this and you know, has this in it

3:16

and the machine will just do it. It'll understand

3:18

your language very naturally. And it will

3:20

be able to convert that into some sort of, you know,

3:23

computer code,

3:24

write that code and execute the code in

3:26

a way that you want. Or suppose you're using a robot

3:28

in a factory, uh, you can say, Hey robot,

3:31

you know, bring me that part from over there, and the robot

3:33

just knows which part to bring you and knows sort

3:35

of how to do it. And I think that this is going

3:37

to change the way we use technology

3:40

because we'll be able to talk to technology

3:42

the way we are trained to talk to each other. So

3:44

I guess maybe if that's the operating system of human

3:46

civilization, you know, we, we talk to each other

3:49

to tell each other what to do. And we understand our

3:51

own

3:52

language and linguistic instructions, at

3:54

least some of the time. And I think computers will

3:56

now be able to do that. And that's a big change.
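To make that concrete, here is a minimal sketch of the pattern being described: natural language in, code out. It assumes the OpenAI Python client (openai >= 1.0) and an API key in the environment; the model name and the prompt are placeholders, not details from the episode.

# Minimal sketch: turn a plain-English request into code via a chat model.
# Assumes the OpenAI Python client (openai >= 1.0) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

request = "Create a window on my screen titled 'Hello' with a single OK button."

response = client.chat.completions.create(
    model="gpt-4",  # placeholder model name
    messages=[
        {"role": "system",
         "content": "Translate the user's plain-English request into runnable Python code."},
        {"role": "user", "content": request},
    ],
)

generated_code = response.choices[0].message.content
print(generated_code)  # a person (or a sandbox) would review and run this next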

3:58

Uh, I think

4:01

one of the underrated possibilities that

4:04

people are not thinking about yet in connection with

4:06

this technology is that this will make technology

4:09

accessible to people who are

4:11

much less technically trained.

4:13

So right now, to

4:15

use computers effectively or to use robots

4:17

or to use anything technological,

4:20

you essentially have to think like a simple

4:22

machine and you have to write lines

4:25

of code, you know, open your text editor, write lines

4:27

of code in a language and then compile it. And

4:29

that requires you to essentially

4:31

work out the machine's instructions in

4:34

your head before translating them onto

4:36

the page and telling the machine to do them.

4:38

So I think that has really given

4:40

a lot of advantage in our society to people

4:42

who have the capability to think like machines.

4:45

And now through the magic of

4:47

large language models, I think we are able

4:50

to, we will be able to skip

4:53

that step of forcing ourselves to think like a

4:55

machine. I think people who are not very

4:57

good at thinking like a machine will still be able

4:59

to use machines. And I think that

5:01

early evidence already points to this happening. So

5:03

I think that's a big underrated change

5:05

that people aren't talking about. There is obviously

5:08

a lot of debate and disagreement. You know,

5:10

I've seen you write about Noam

5:12

Chomsky, Tyler Cowen, people like that who

5:14

have very different views. Geoffrey

5:16

Hinton, who's just stepped down from

5:19

Google, has been saying a lot of

5:21

negative stuff really about

5:23

AI and ChatGPT over the last

5:26

few days. You are very much on

5:28

the positive side of the debate. I

5:31

actually do agree

5:31

pretty strongly with Tyler Cowen on

5:33

this one. And he has also been on the positive

5:36

side of the debate and been fairly harshly

5:38

critical

5:39

of the people who want a pause in

5:41

AI research. I think that his

5:43

arguments are persuasive and the arguments of people who

5:45

don't want a pause

5:47

are persuasive to me. And, you know,

5:49

I don't want to put words in people's mouth. But to summarize those arguments,

5:53

I think

5:54

the first argument is that many people

5:56

will not pause. So if you make a pause,

5:58

that simply cedes the field to the people who

6:01

refuse to pause,

6:02

which in this case is China. Researchers in

6:04

other countries, these people aren't going to follow the US

6:06

if, if we declare a pause, even if

6:08

we have an ability to do that. I think

6:10

the second, the second argument is that

6:13

we don't actually know what these technologies

6:16

are capable of yet. So to pause

6:18

simply delays the day when we find out. And

6:20

if we pause AI research for six months,

6:22

we won't necessarily be able to

6:25

figure out things that would help us make AI safer

6:28

within those months. Because after

6:30

the AI research resumes, changes

6:33

will happen that will invalidate the findings we

6:35

found during the pause. And so I don't think,

6:37

I think a pause is not particularly effective in

6:40

this case because it will

6:42

essentially just delay the day

6:45

when we find out what the new technology is

6:47

capable of and are thus able to design

6:49

effective, you know, safeguards on

6:51

the technology.

6:52

Also, I think there has been a great

6:55

muddle

6:56

in terms of what the danger

6:58

is from AI,

6:59

I think that most people, when you

7:01

talk about AI, including Geoffrey

7:04

Hinton has talked about this. They will

7:06

talk about the danger in terms of

7:09

replacing human jobs. I don't

7:11

know so much about the details of the technology

7:13

of AI, only the basics, but I do

7:15

know a bit about the economics of, of,

7:18

you know, job replacement, things like that. And I

7:20

think that

7:21

not only is that threat overblown,

7:24

but it also is not the kind of thing

7:26

that's amenable to a pause. So if you pause, if

7:28

AI is destined to make humans obsolete, which

7:30

I don't think it is, but if it is, then

7:33

pausing for six months and

7:35

giving humans an extra six months before

7:38

they go obsolete

7:39

is not going to do a damn thing.

7:41

And so I think that

7:44

is effectively useless. Now, the other

7:46

danger that people talk about

7:48

from AI is, you know,

7:50

sort of the technology going haywire

7:52

and inflicting harm upon humanity

7:55

via, you know, nuclear launches,

7:57

bioweapons, financial fraud,

7:59

sowing social division, you know, there's

8:02

a bunch of ideas that people

8:04

have for how this could happen. And I think

8:06

that a pause there would,

8:09

that would, you know, I mean, preventing the destruction

8:11

of humanity for six months is a useful thing

8:13

to do. And I don't actually know whether

8:16

or not, you know, AI will destroy humanity.

8:18

I think the current crop of AI will

8:20

definitely not

8:21

destroy humanity. That's not a thing that could happen.

8:23

But I think that we might reach relatively

8:26

soon an AI that could by

8:29

adding a bunch of stuff that the current AIs do

8:31

not have, such as the ability to act

8:33

as a perpetual agent that's always sort of on

8:35

and thinking and acting. Various other things

8:37

we need to add in order for it to be able to do that,

8:39

you know, we need to connect it, have APIs

8:42

to connect it to the financial system. And we'd have to

8:44

have it be able to, like, synthesize voices and

8:46

make calls and basically have to be extremely

8:48

multifunctional. I think that

8:50

we can see that it can do those

8:52

things. But the question is, would a pause

8:55

make us safer from that? And

8:57

I think what makes us safer from

8:59

AI trying to screw with our

9:01

systems, our, you know, technological

9:04

and weapons and finance systems and these things,

9:06

what makes us safer from that is having it done

9:09

in a small way and building safeguards

9:11

into it. So if you watch the Terminator

9:13

movies where the AI wakes up and launches

9:16

nukes and destroys humanity, it's the first thing

9:18

it ever does.

9:19

When the AI, the minute the AI comes online, it

9:21

thinks, oh, my God, humans are threatening me. And

9:24

the first thing it does is launch a global thermonuclear

9:26

war. We must absolutely not hook up AI

9:29

to any sort of weapon systems, especially

9:31

weapons of mass destruction. You know, if it's like some

9:33

little drone or something that could just cause

9:35

an accident as the worst thing or kill a civilian

9:37

or something, maybe I don't know. But then

9:40

I think that hooking up AI to the nuclear

9:42

weapon systems is a very bad idea. And

9:44

we shouldn't. There seems to be a high

9:46

likelihood that AI will do small bad things before

9:49

it does the big bad thing that kills everybody. And

9:51

we'll see it do that. And we'll understand how it does

9:53

that. And we'll come up with countermeasures for it.

9:55

We'll know not to, you know, we'll have

9:57

we'll have changed the way our phones work so they don't

9:59

just trust, you know, voices from

10:02

trusted people or we'll, we'll have some way of verifying

10:04

that a phone call comes from the right person,

10:07

or we'll have some way, we'll just add

10:09

verification systems for communication so AI

10:11

can't

10:12

deep fake its way into tricking people to do things.

10:14

We'll implement stricter controls on biological,

10:16

you know, synthetic bio kits that you can get at home. We

10:18

should be doing that anyway. But I think one

10:22

pretty consistent thing we've seen is that human beings

10:25

as a group, not individually, but as a collective

10:27

group are very, very, very good

10:29

at finding ways to destroy other human beings

10:31

and doing that for stupid, crazy reasons

10:34

of our own. We're very good at that. You have

10:36

Ted Kaczynski

10:37

mailing bombs because he's concerned about the environment,

10:40

I mean, that's crazy.

10:41

You'd have to work to get an AI that crazy,

10:43

or you have, you know, terrorists doing 9-11

10:46

or whatnot. And we already have, we've

10:48

been implementing safeguards to, you

10:50

know, guarding against nuclear terrorism, theft of nuclear

10:52

materials, things like that for a long time. And we

10:54

probably have to update those safeguards in the age

10:57

of rogue AI. We'd also know

10:59

AI doing small bad things would also tell us where

11:01

the rogue AI is coming from, who made it, how

11:03

to, how it works, how to stop it. You learn

11:06

about bad things by bad things happening. And that's just sort

11:08

of an unfortunate fact of the world. You know, it's,

11:10

you can't learn about bad things by having

11:13

a few internet enthusiasts sit

11:15

around and dream and ponder ways that

11:17

things could go bad. You don't learn much because

11:19

there's so many possibilities that they can dream

11:21

up that you don't know which possibilities are more likely. The way

11:23

you know which possibilities are more likely is by having AI

11:26

do some small bad things and figuring out, oh

11:28

my gosh, a small bad thing happened. Like

11:30

crashing a car,

11:32

right? You know, you can sit there,

11:34

I can sit there and think of ways that a car could

11:36

crash or that AI could accidentally crash a car.

11:38

But that's essentially useless compared to

11:40

testing AI at low speeds,

11:43

in a non-fatal situation or in a safe situation,

11:46

watching it crash and then saying, okay, why did it crash?

11:49

That's how engineering works. A pause just

11:51

delays the day when we will start

11:53

that process of learning how the machine

11:55

can crash. Thank you, Noah. I'd like to move

11:57

the discussion on from tech in the interest of time

11:59

(we could talk about it all day) and

12:02

ask you to briefly comment

12:04

on a piece that you wrote recently about

12:06

the slow banking crash,

12:09

I think you called it. We've had three

12:11

banks now fail in the United States,

12:13

the second, third and fourth largest in

12:15

US bank failure history. Each

12:18

one has been described as a one-off, it's

12:20

self-contained, it's not going to happen again,

12:23

everything's okay, we are told. It

12:25

strikes me that from

12:27

a behavioral point of view, the business model of these

12:29

banks, which was to essentially, and I

12:32

oversimplify a little, to bet on

12:34

low interest rates staying low forever,

12:36

was clearly wrong. And if three of them

12:38

have done it, there must be more surely, aren't

12:40

there? They all have to some degree. Yeah, that's

12:42

right. So the way they did this primarily

12:45

was to pay, was to do

12:47

two things. Number one, they bought long-term

12:50

bonds, and the banks that failed bought

12:52

many more long-term bonds than other banks. So the

12:54

large banks that we have like Chase

12:56

or Citibank, their

12:59

ability to buy a lot of these long-term bonds is reduced

13:01

by the regulation that we implemented after the global

13:03

financial crisis.

13:05

And after those banks almost failed

13:06

because they're systemically important financial institutions. But

13:09

during the Trump years, we changed that

13:11

law so that banks of

13:14

a slightly smaller size, the size of Silicon Valley

13:16

Bank and First Republic, both of which just failed, are

13:18

allowed to do a bunch of risky stuff. And unfortunately,

13:21

we changed that regulation at

13:24

a very inopportune time. And

13:26

so

13:27

those banks bet heavily on interest

13:29

rates staying low by buying long-term bonds. Because

13:31

remember, when interest rates go up,

13:33

the price of bonds goes down and the price of long-term

13:35

bonds goes down a lot. That is why those

13:37

bond portfolios took a hit. The second thing

13:39

they did was in order to keep making a profit margin,

13:42

they did not pass along higher interest

13:44

rates to their depositors. So that if you

13:46

had a savings account at Silicon

13:48

Valley Bank or at First Republic, it

13:51

was still only paying you 0.2% interest in

13:53

an era of 4% interest rates.

13:55

So you were losing, they were setting your money on fire. That

13:58

is an invitation to deposits to leave. Remember

14:00

that banks collapse when their deposits leave and when you can't

14:02

pay them out. So when the value

14:04

of your assets goes down, because

14:07

interest rates went up, so bond prices went down, it means

14:09

that you can raise less cash to pay your

14:11

depositors. So that makes it more likely for a bank to collapse.
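To put rough numbers on that mechanism, here is a back-of-the-envelope sketch using the standard duration approximation (price change is roughly minus duration times the change in yields). The figures are illustrative, not ones cited in the episode.

# Back-of-the-envelope duration approximation: dP/P ≈ -duration * change_in_yield.
# Illustrative numbers only, not figures from the episode.
duration_years = 10      # a portfolio of long-term bonds
yield_rise = 0.03        # rates up by 3 percentage points

approx_price_change = -duration_years * yield_rise
print(f"Approximate mark-to-market loss: {approx_price_change:.0%}")  # about -30%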

14:14

And in addition, having depositors

14:16

want to leave, not because of panic, but just because

14:18

they're not getting a good rate of return on their deposits, is

14:21

another incentive for depositors to leave. So

14:24

the era of high interest rates created

14:26

these incentives for depositors to leave. And in March,

14:28

we saw that happen very fast to three banks. The

14:30

idea that First Republic Bank, so First

14:32

Republic Bank officially failed this week, but

14:35

really it failed due

14:37

to deposit outflows that happened in March, because

14:39

it acknowledged those deposit outflows in its quarterly report.

14:42

The first quarter does not include

14:44

April. Really, when we're talking about, you

14:46

think it's one after another because the news stories

14:49

come one after another, but in fact, these all happened at the same

14:51

time. When we talk about a one-off, we don't mean a

14:53

one-off in terms of one bank doing bad business

14:55

models. We mean there was a big

14:57

outflow and then the Fed and the FDIC

15:00

and the Treasury came in and basically backstopped

15:03

deposits. And so we haven't seen, we have

15:05

not seen banks fail since that event.

15:07

We may, there's two banks that are quite

15:09

wobbly, Western Alliance and PacWest

15:12

Bancorp. And those two banks are

15:15

much smaller. They're about $40 billion in assets

15:17

each compared to $200 billion for both Silicon Valley

15:20

and First Republic. They had some deposit

15:22

outflows, not as bad. They may also fail.

15:24

It's not clear. Since the Fed and

15:26

the FDIC and the Treasury backstopped

15:29

uninsured deposits, we haven't seen

15:31

other bank failures. So I think that the

15:34

idea that, oh, they backstopped the deposits

15:36

and yet we're still seeing one after another of these

15:38

bank failures. No, those happened at the same time. However,

15:40

that said, the incentives

15:43

for deposits to flow out of the banking system

15:45

remain. And the only way that banks

15:48

can keep money, keep

15:50

deposits in the system and

15:52

strengthen themselves by

15:54

keeping deposits in the system.

15:56

The only way they can do that is by raising interest

15:58

rates that they pay to depositors.

15:59

And the easiest way to do that is

16:02

to simply make the depositors more aware of

16:04

the high interest rate savings account alternative

16:06

that they already have called a money market account.

16:09

So they can do that, which is also FDIC

16:11

insured. It's basically just better than a traditional

16:13

savings account in every way. And banks have been frankly

16:16

a little

16:17

dishonest in letting people keep their money in traditional

16:19

savings accounts, when the money market account is what the traditional

16:22

savings account used to be, it does pay you

16:24

interest. So banks can do that. That

16:26

just weakens the value of their business because that

16:29

means they don't make as much of an interest rate spread. The spread

16:31

between their bonds or their loans,

16:33

whatever their assets are, the interest rate that those pay

16:36

the banks, and the interest rate the banks have to pay their

16:38

depositors goes down. So that difference goes down.

16:40

And then banks get weaker in terms of profitability.

16:43

So banks have to sacrifice profitability in order

16:45

to survive. That will happen. And it will also

16:47

reduce lending because that will make

16:49

them cut back on risky long term loans.

16:52

And it'll just make them cut back on loans in general to sort of

16:54

preserve their cash, just

16:56

in case depositors want to take money out. That

16:59

decrease in loan activity will

17:01

cause

17:02

either a recession or something like a recession

17:05

in that it will decrease aggregate demand. It

17:08

will decrease lending. It will decrease the amount of money flowing

17:10

into people's pockets and

17:12

businesses' pockets. And it will decrease the amount

17:14

that people and businesses spend on things, which will

17:17

hold down inflation, but will also cause a slowdown

17:19

in economic activity, which is probably

17:22

what the Fed wanted in the first place when it raised

17:24

interest rates. In other words, this is kind of

17:26

just how monetary policy works. This

17:29

is how we slow the economy

17:31

in order to quash inflation. Is it not the

17:33

case that, yes, the deposits

17:36

flowed out because they

17:37

weren't getting any kind of return on those

17:39

deposits and they had a good alternative for the first

17:42

time in a long time? But

17:43

it was also the case that perhaps slightly

17:46

smarter depositors worked out that

17:48

the losses on those bond portfolios

17:51

had on paper

17:53

technically maybe rendered the

17:55

bank's capital ratios skinny

17:57

to the point that they were technically insolvent. There

18:00

was actually a solvency worry as

18:03

well as a chase for yield. Sure.

18:06

Yes, absolutely. So the chase for yield

18:09

is the thing that is still in

18:11

action. So the solvency worry

18:13

was there. It's no longer there because if your

18:16

bank is 100% insolvent and if your bank's assets go

18:19

to zero tomorrow,

18:21

it doesn't matter a bit to you because the

18:23

government has guaranteed

18:25

your entire deposits now.

18:27

You will not lose a penny of your money. Your money is

18:29

fine. Who is responsible for maintaining

18:31

the

18:32

paperwork that gives

18:34

you your money will change as

18:36

it changed with First Republic being acquired by JP

18:38

Morgan, but as a depositor, you're in

18:41

zero danger from your bank going insolvent.

18:45

Surgeons keep our hearts beating.

18:48

They do the amazing, help save lives,

18:50

and so can you. Your CSL Plasma

18:53

donation can help create 24 critical

18:55

life-saving medicines that can give grandpa the

18:57

chance for his heart to swell when he meets his

18:59

new grandson or give a bride the chance

19:02

for her heart to skip a beat on her wedding

19:04

day. Every Plasma donation helps

19:06

more than you know. Do the amazing,

19:09

help save lives, donate today

19:11

at your local CSL Plasma Center and

19:13

be rewarded for your generosity.

19:18

So why wouldn't the feds just simply

19:20

say, okay, rather than going through all that rigmarole

19:24

complicated stuff of

19:26

the deal that was done with JP Morgan because

19:28

there are all sorts of cross guarantees

19:30

and loan loss sharing

19:33

potentially going forward. It was a pretty complicated

19:35

setup. Why didn't they just shut First Republic

19:37

and give everybody a check?

19:39

Because that is annoying and disruptive.

19:41

I mean, if your bank account vanishes and then you get a check

19:43

and then you have to go find another bank, open another bank

19:46

account and deposit that check,

19:48

that is annoying and disruptive, possibly

19:50

even scary,

19:51

but certainly would disrupt normal economic

19:53

activity. Whereas if instead now you just have

19:55

a JP Morgan account and your account just switches

19:58

to a different bank.

19:59

That is a very easy and smooth transition.

20:02

So you think that today's Federal

20:05

Reserve interest rate rise is probably

20:07

the last one? That's

20:09

my guess. I think that they will raise

20:11

interest rates to the level they

20:13

said they would raise interest rates to, and I have to go check

20:16

to see if this gets them all the way there, if they need one

20:18

more hike. I think this gets

20:20

them all the way there.

20:21

Or they're close. And the

20:23

point is that the Fed places an extremely

20:26

high premium on its credibility,

20:28

and people who want to predict the Fed need

20:30

to understand that fact first,

20:32

is that the Fed credibility over

20:34

everything, if the Fed goes back on what

20:36

it says, then it feels rightly

20:39

or wrongly that that will crush

20:41

its ability to make monetary policy work in the

20:43

future. So it places a huge premium on credibility.

20:45

It will raise interest rates a little too high

20:48

not a lot too

20:50

high, but it will raise interest rates a little too high, even

20:53

if that means hurting the economy a little too

20:55

much, just to preserve its own credibility. Noah,

20:57

could I just ask you about,

20:59

you've written a lot about globalization,

21:02

free trade, industrial policy, as

21:04

Chris mentioned in recent days. Since 2016,

21:08

I guess, we've seen Trump,

21:10

we've seen Brexit in the United Kingdom. There

21:13

has been a definite backlash

21:16

against globalization and the impact that

21:18

free

21:18

trade has had. How

21:20

do you assess the globalization

21:23

agenda at this juncture? Is it

21:25

irreparably damaged or is this

21:27

just a blip in a long-term trend?

21:31

Globalization in terms of trade as a percent of the economy has been

21:33

going down at a gentle rate

21:35

since the global financial crisis of 2008, which is now 15

21:37

years ago. Weird

21:39

to think that, right? It was such an epochal event and now

21:42

it's 15 years in the past. So the global financial

21:44

crisis

21:45

happened a long time ago, and

21:47

globalization has been shrinking slightly since then. What

21:50

has happened

21:51

in the last decade is

21:53

an increasing series of steps to

21:57

break up some of the specifics of the

21:59

old trading regime

21:59

that prevailed during the 2000s and early 2010s. So

22:04

people think this started with Trump

22:06

and his tariffs on China, but in fact it did not. It

22:08

started with China. In the early 2010s and

22:10

mid 2010s, Xi Jinping starts, although

22:14

the plans were laid out in the Hu Jintao administration,

22:16

but Xi Jinping comes in and basically

22:19

decides to execute all these plans to

22:21

move China up the value chain by onshoring

22:24

production of high value components and building

22:27

brands, which is just good for

22:29

Chinese companies and value capture, et cetera, and

22:32

also to start securing

22:34

supply chains for the purpose of national

22:36

security. The idea that we don't

22:38

want our supply chains to get disrupted in the event of a war

22:41

is something that Americans are thinking of now and that

22:43

China was thinking of long ago because

22:45

they were actually thinking about the possibility of war

22:48

much earlier and much more seriously

22:50

than we were. And so China,

22:52

because they're the ones who would start it, and

22:55

so China started making

22:57

efforts toward decoupling long before. Of

22:59

course, everyone knows the story of Trump coming in and slapping

23:01

tariffs on China and all these sort of sometimes

23:04

haphazard attacks

23:05

against China that were then

23:08

kind of refined by the Biden administration. The Biden administration

23:10

kept many of the tariffs, kept and

23:13

expanded export controls a lot and kept investment

23:15

restriction through CFIUS, the

23:17

committee that reviews Chinese investments. The Biden

23:19

administration

23:21

preserved or extended many of the Trump

23:23

administration's efforts toward decoupling

23:26

and then added on a lot of its own.

23:29

And now is adding another layer,

23:31

which is industrial policy. Trump didn't have really

23:33

industrial policy beyond yelling at companies

23:36

to put factories in the United States, which didn't work.

23:38

Biden has passed the CHIPS Act, the

23:40

Inflation Reduction Act, and those two

23:43

bills together basically promote two

23:45

industries, the semiconductor industry and the

23:47

clean energy industry. And those

23:49

two

23:50

industries are the focus of our new industrial policy,

23:53

and that's going to add a layer. And in addition,

23:55

on top of those things, Biden has come up with this

23:57

concept called friend-shoring, which in practice

24:00

is going to mean putting factories

24:02

anywhere but in China and this concept

24:04

of de-risking. So I know we don't

24:06

have that much time, but let me take a minute and say why

24:09

de-risking is the right word for this. Companies

24:13

that put all of their manufacturing in China are

24:15

in an existential danger from a

24:17

war over Taiwan or a war over anything,

24:19

South China Sea anywhere. The minute that the United

24:22

States and China start shooting at each other, the

24:24

value of direct investments in China,

24:26

of factories in China

24:28

goes to zero. It goes poof because you won't be able to get

24:30

your stuff out. You certainly won't be able to get your capital out,

24:32

but you won't be able to even get your stuff out.

24:35

Maybe even not your people out. If

24:38

all your manufacturing is in China and there's a war, which

24:41

is not something ... And because the war

24:43

would be started by China, that's not something

24:45

that American business people can even ... They

24:47

can yell at America to be peaceful and reasonable

24:50

and blah, blah, whatever they want all day, but

24:52

Xi Jinping is just not listening to them. If

24:54

he wants to start a war, he'll start a war and the United States

24:56

will fight if we get attacked.

24:59

And that's just how ...

25:01

There's no way that's not going to happen. Everyone

25:05

now realizes that being in China poses an existential

25:07

risk. The idea of de-risking

25:09

is basically to take your stuff and put some

25:11

percent of it in India, in Vietnam, in Indonesia,

25:14

in Malaysia, or in rich countries, Japan,

25:17

Korea, the United States, Taiwan,

25:19

et cetera. Maybe not Taiwan because it

25:21

could be blockaded. I don't know. But essentially

25:23

to take some portion of production

25:26

out of China, put it elsewhere to lessen

25:28

the existential risk so that even if half of your

25:30

stuff goes poof, half will not and you'll

25:32

survive.

25:33

One of the things that strikes me about the

25:36

extraordinary post that you put up on your

25:38

Substack earlier this week

25:40

on all of this when you explained

25:43

this new industrial policy, there were so many

25:45

things that I could talk to you about.

25:48

First is the extent to which over here in Europe,

25:51

I don't think the body politic,

25:53

I don't think policymakers, I don't

25:55

think the business community fully

25:58

gets what Biden has done with

26:00

these two acts, I think that it

26:03

is all so new and so

26:05

counter to

26:06

many deeply ingrained beliefs, not

26:09

least the belief that free trade is good

26:11

and that we must do everything that we can to protect it

26:13

and anything that we do that might damage free

26:15

trade is bad. The old free

26:17

trade orthodoxy seems to be much stronger

26:20

here in Europe, particularly the UK, than

26:22

it is in the United States. The

26:25

narrative that economists or some

26:27

economists had about the issues that

26:29

were raised by the consequences

26:31

of the free trade that you talk about in your article,

26:34

in particular, the destruction of

26:37

jobs that went to China and elsewhere.

26:40

The first

26:41

narrow technical question I would ask

26:43

you is, was the destruction

26:45

of all those smokestack industry jobs,

26:48

industrial jobs, as much a function

26:51

of automation as it was of jobs going to China? That's

26:53

very difficult to tell

26:54

because when

26:56

China comes here and starts

26:58

outcompeting you, one way that you stay alive is to

27:00

automate. Those things are hopelessly entangled

27:02

in the data and we can't really tell the difference. Our

27:06

general best

27:07

impression is that up

27:10

through the early 2010s, automation

27:14

maybe was half the size of China

27:17

in terms of this shock to jobs, in

27:20

terms of from 2000 to 2012, for

27:23

example. But then you

27:25

run into periodization issues

27:27

like what period you're talking about. The

27:30

big China shock is generally reckoned to

27:32

be the 2000s mainly, probably

27:34

ending in around 2012 or 2013.

27:37

My bigger question is over

27:39

the consequences of all of this, the disappearing

27:42

jobs, which went, particularly in

27:44

the UK, either to automation or to China

27:46

or to both. The consensus, as I

27:49

understood it, amongst some economists

27:51

was that

27:52

all of that at the very macro

27:54

level was still a good thing. Countries

27:57

benefited from the free trade, from the

27:59

globalization,

27:59

from the rise

28:02

of China because we were able to import

28:04

a whole load of cheaper stuff than we were before.

28:07

And that what we messed up was the distribution

28:09

of those gains.

28:10

I sense from you that you disagree

28:12

with that consensus. I

28:15

don't know because it depends on

28:17

what you care about. So I think

28:19

that the lifting of a billion

28:22

Chinese people out of poverty

28:24

is such a good thing for the world

28:26

that I would feel morally

28:28

bankrupt saying that we should not have done this.

28:31

At the same time from a narrowly nationalistic

28:33

national interest perspective,

28:36

I think that

28:37

there were a lot of you know, overall, this was probably

28:40

a net negative for the United States and probably, we will find out,

28:42

for Europe as well. It

28:45

not only destroyed those smokestack

28:47

jobs or whatever, but you know, I think

28:49

more importantly, what was ultimately more important

28:51

is that it created a superpower competitor

28:54

that does not like us and was never prepared to like

28:56

us. And so national

28:58

security is what is driving these things.

29:01

If you tell people, well, now your stuff

29:03

will be slightly more expensive.

29:06

That carries almost no water.

29:08

Economists, you know,

29:10

like the IMF

29:12

recently did estimates of consumer surplus

29:14

losses from

29:17

friend-shoring and decoupling and all these things. These

29:19

are all just, like, valued in dollars. Americans

29:21

do not think of national security in terms of dollars

29:24

and economists do not think about national security period. And

29:26

no one's listening to economists. And so this

29:28

is something that people need to understand. In 2006, a lot of people listened

29:31

to what economists had to say. And

29:34

now very few people listen to what economists have

29:36

to say. During the pandemic, they were completely

29:38

ignored. On environmental stuff and climate tech,

29:41

They've been completely ignored. And so

29:43

really,

29:45

you know, economists have no influence with

29:47

either party right now. The

29:50

people driving economic policy are

29:52

people at think tanks who may have an economist

29:54

background or may not. And

29:57

so that is a thing that needs

29:59

to be understood,

29:59

that no matter what economists think about

30:02

free trade now, or no matter what their papers or models

30:04

say, no one's listening to them at all.

30:06

And so- Does that worry you? Do

30:08

you think that economists, even though they're not being listened

30:10

to, still have something to say?

30:12

That's a very good question, and I'm gonna write

30:14

about that soon. But I think that economists

30:17

are fairly behind the curve

30:19

on thinking about these things. No one has,

30:22

only a very few researchers have bothered to think about

30:24

industrial policy, and it's all been in the context

30:26

of developing countries, whether or not,

30:28

say, Indonesia can get

30:30

richer faster by promoting this

30:32

industry or that industry. Very few economists,

30:35

if any, have thought about industrial

30:37

policy for

30:39

rich countries, in terms of national

30:42

security competition and things like that. It

30:44

is just not a thing that economists have thought of. They're

30:46

incredibly behind the curve, and if they're smart,

30:49

they'll scramble to catch up. And if they're not smart,

30:51

then they'll simply keep doing whatever, retreat

30:53

into the ivory tower and do whatever useless theory

30:56

that they're doing that no one cares about, or

30:59

just continue to focus on other issues

31:01

where they do have more clout, like minimum

31:03

wage or some welfare

31:05

policies on which some people still

31:07

do listen to economists. But no one's listening

31:10

to them about trade and about

31:12

national security and these issues. Am

31:15

I worried about this? No, economists,

31:17

it is incumbent upon economists to do the

31:19

work that forces me to worry that no one's listening

31:21

to economists, because so far, there's nothing to listen to them

31:24

about.

31:25

There are a lot of people in the UK who

31:27

would listen to

31:28

you and look at what Biden has

31:30

done, and you must have seen it's come out of Brussels

31:32

as well, people complaining like hell about

31:35

protectionism and subsidies and all of

31:37

the things that policy over here is

31:40

geared towards eliminating, trying

31:42

to reduce barriers to trade, the old

31:44

fashioned belief that free trade

31:46

is good. It may well be simplistic,

31:49

it may well be over, and is not being

31:51

listened to in the States.

31:52

But I gotta tell you now, it's still being listened to

31:54

over here. There's a big mismatch

31:58

of what people are thinking about on both sides of

31:59

the Atlantic on this very issue. And

32:02

I don't think people over here,

32:03

they should read what you've written because it's

32:06

so big and it's so different to

32:08

the way policy is still thought about

32:10

here in Europe. Well, right. And I think

32:12

that for Britain, global Britain,

32:15

Brexit was all about free, buccaneering free

32:17

trade agreements all around the world. Britain

32:20

has much bigger problems than the question

32:22

of protectionism or industrial

32:24

policy versus free trade.

32:27

Britain just

32:29

has deep economic problems. It's

32:31

in a period of

32:33

potentially secular stagnation. And

32:35

I think that while industrial policy

32:37

might to some degree help it out of that,

32:40

I think that there are probably deeper things

32:42

that need to happen, including an improvement

32:44

in the leadership capabilities,

32:47

because I just saw Britain almost

32:49

select Jeremy Corbyn and actually

32:52

choose Liz Truss as prime minister,

32:54

who was not only outlasted by a head of lettuce,

32:57

but probably could be outgoverned by a head of lettuce

32:59

as well. And she immediately just grabbed

33:01

onto the most, you know, like

33:04

absolutely

33:06

dead 1982 level

33:09

Reaganomics dogma that economists

33:11

themselves would already have tossed out the window.

33:14

And so my point is, what they haven't

33:16

tossed out is this fundamental belief

33:19

that they took in with their mother's milk, the belief

33:21

in free trade, which you say has just

33:24

gone from the American agenda.

33:25

It's gone from the American agenda, but I think Britain

33:27

has bigger fish to fry, honestly. The people

33:29

who need to be thinking about the free trade versus, you

33:32

know, industrial policy kind of thing are primarily

33:34

in Germany and France right now. Those people

33:37

need to be thinking hard. And of course, that affects Ireland

33:39

because Ireland is sort of along for the ride in the EU

33:41

with that, though small. But Britain itself

33:44

is going to spend a decade

33:46

digging out from the self-inflicted wounds

33:49

of Brexit and from the dysfunction of its

33:51

political

33:52

class. And I just, it's

33:54

almost daunting to me to tackle the problems

33:58

of Britain because I just.

34:00

Who's listening in Britain?

34:02

We have a strange psychodrama in Britain

34:04

in that what we must talk about

34:06

is anything but the problems.

34:08

I know. Ah, and then

34:10

when you do talk about the problems, the

34:13

vitriol is immediate. No

34:15

matter which side you take, you can even express

34:18

a point of view very reasonably

34:20

and mildly and the amount of vitriol

34:23

that people on social media will respond with

34:25

is

34:26

just beyond even what you

34:28

encounter in America,

34:30

which is already a very high level of vitriol. Let

34:32

me give you an anecdote. When I was younger,

34:34

I used to really enjoy

34:36

watching negative British

34:39

reviews of video games and books. And

34:41

I would look on YouTube just to find

34:43

people in a British accent trashing

34:46

a video game or book. Even if I kind

34:48

of liked the video game or book, it was just funny to

34:50

see it get trashed by a person in a British

34:52

accent because British people were just

34:54

so good at trashing stuff. Unfortunately,

34:57

that appears to have been applied as

34:59

a philosophy of government.

35:02

Noah, can I ask you, Donald

35:04

Trump is in Ireland tonight actually

35:06

visiting his golf resort? I'm

35:09

sorry.

35:10

Exactly. Thank you. I'm

35:12

just looking ahead to the next US presidential

35:15

election, I mean, how vitriolic

35:17

is that going to be? How divisive?

35:19

Very much, very much. My assessment

35:22

is that the popular unrest

35:24

in America has largely peaked.

35:26

Enthusiasm for

35:28

street actions, riots, insurrections,

35:32

fights with your neighbors, running down protesters

35:34

in a car. It has

35:36

intensified among an increasingly shrinking

35:39

tiny set of people, an ever tinier set

35:41

of people are going crazier and crazier

35:43

like they did in the 1970s and will be violent. While

35:47

the mass of people just think, oh my God,

35:49

is there anything else I can pay attention to other than

35:51

this? I've overdosed on this. I don't just

35:53

want to see people yelling about Donald Trump

35:55

forever and ever. Please make it go away.

35:58

I think people

35:58

obviously have reservations about Biden because of his age,

36:01

but I think that really very few people, relatively

36:03

few people in America would like to see a return of Trump

36:06

and the years of chaos

36:09

that he brought. And so I think

36:11

that the 2024 election is a big

36:14

danger, but if America

36:16

can make it past the 2024 election safely without either

36:20

electing Donald Trump or falling into some sort

36:22

of constitutional crisis or chaos, I

36:24

think that you will very rapidly see

36:26

a move toward

36:27

a relative amount of social

36:32

rest or peace or whatever the opposite of unrest

36:35

is in America,

36:36

because most people are just so

36:39

tired of it. And I think you saw this in the

36:41

late 70s and early 80s, you

36:44

saw, although there were still some

36:46

extreme crazies who were blowing stuff up and killing

36:48

people and tried to kill Gerald Ford twice

36:51

in one month, you definitely saw this movement

36:53

of the mass of people of like, oh my God, can we

36:55

just stop this? And I think you'll

36:57

see that again, but 2024 is a big hurdle

36:59

that we have to make it past.

37:00

This is the last thing I will say

37:03

and leave the concluding remarks

37:05

and all questions to Jim. But one of

37:07

the things towards the end of your piece on industrial policy

37:09

that struck me consistent with what you're

37:12

just saying about things shifting is that,

37:14

you know, over here, we assume that America is fundamentally

37:16

divided, can't agree on anything, and

37:19

that it's going to be horrible whoever wins the election.

37:22

All this industrial policy stuff, with

37:24

one notable exception, which we won't

37:26

have time to go into, but most of this industrial

37:29

policy stuff, one of the jaw dropping

37:31

aspects of it that I hadn't fully appreciated

37:34

is that it's bipartisan.

37:36

That's right. And the entire reason is

37:38

national security because both Democrats

37:40

and Republicans have become freaked out

37:42

by China. Democrats more because

37:44

they're wedded to labor and they see the smokestack

37:47

jobs disappearing, but also because they're upset

37:49

about China's human rights abuses, Republicans

37:51

because they see a threat to, you know, Western

37:54

civilization and power and whatnot,

37:56

and also because Republicans increasingly draw

37:58

their political support

37:59

from the working class in America, a shift

38:02

that has happened in the UK as well, I know, with the

38:04

working class in general drifting to the right. They're

38:07

bipartisan because they both perceive

38:09

the same threat for some different

38:11

reasons and some overlapping reasons. And

38:13

as long as that threat is there, there

38:16

will be some amount of unity in policy.

38:18

And that's why these bills, the Inflation

38:20

Reduction Act was not a bipartisan bill.

38:22

It was passed by only Democrats

38:25

with a slim majority, because the

38:28

Republicans and Democrats do not see eye

38:30

to eye on climate change yet, which is unfortunate

38:32

in my opinion. They also do not see eye to eye on

38:34

things like poverty reduction or whatever, but

38:39

they definitely see eye to eye on China. It

38:42

is an amazing unifying force in American

38:44

politics to watch. And it's

38:46

scary to a certain extent. I've seen even

38:48

sort of national security hawks a

38:51

little freaked out by the degree to which there

38:53

has

38:53

been a sort of unshakable anti-China

38:56

consensus taking hold within our halls

38:58

of policy. And I think that it's

39:00

a dangerous thing, but

39:03

it presents opportunities for actually getting

39:05

bipartisan stuff done and building infrastructure

39:08

and building some of the capacity

39:10

to increase industry again, provide

39:13

broad-based middle-class jobs and things. So

39:16

it's a danger and an opportunity, this bipartisan

39:18

consensus about China. Jim, I'll leave the last word

39:20

to you. In 30 seconds, I

39:22

have a brother living in San Francisco. I visit

39:25

regularly. And when I travel

39:27

into the city,

39:29

I am appalled at what I see. I

39:31

mean, it seems like a city that

39:33

the whole liberal agenda

39:35

has really destroyed over

39:37

the last couple of decades.

39:39

Am I being harsh? Harsh but fair. I

39:41

wouldn't say that it's necessarily the same liberal agenda

39:44

as you'd find in New York, but I'd

39:46

say there are some things that are uniquely bad about

39:48

San Francisco and dysfunctional and have been that way for

39:50

a while and are just reaching a breaking point.

39:53

And I would say that there's other

39:54

ways in which progressive ideas that are

39:56

common throughout the country have not helped San Francisco.

39:59

So I think we...

39:59

We are heading for a reckoning.

40:01

We are heading for sort of a breakdown

40:04

in San Francisco over urbanist

40:06

issues and over how a city

40:08

can survive and thrive in the modern

40:10

world. And that breakdown is happening now. It'll

40:12

be interesting to see how it turns out. Listen, Noah,

40:15

on behalf of myself and Chris,

40:17

I'd like to thank you very much for that. It was fascinating.

40:20

You've been listening to Noah Smith, who

40:22

has a blog called

40:24

Noahpinion. I would strongly recommend

40:27

people check it out. It's fantastic

40:30

stuff. You'll hear a lot more

40:32

of what we've been discussing tonight. So, Noah,

40:34

thank you very much. Yes, thank you, Noah. All

40:36

right. Thank you very much. It

40:38

was great. And one thing, the blog is Noahpinion, with Noah

40:41

in the middle.

40:42

Okay. And it's a Substack. You can

40:44

find it on Substack. It is Substack. Please,

40:46

everybody, sign up.

40:57

You have been listening to Chris Johns and Jim

40:59

Power on The Other Hand. We

41:04

hope you enjoyed it. Our

41:07

back catalog of podcasts can be found

41:09

on our Substack account, www.cjpeconomics.substack.com.

41:17

Or on podcast platforms such

41:19

as Apple and Spotify. If

41:21

you would like to listen to the podcast free

41:24

of advertisements, you can sign up

41:26

to our Substack account. Comments

41:29

and feedback are much appreciated.

41:38

Surgeons keep our hearts beating.

41:40

They do the amazing, help save

41:42

lives. And so can you. Your

41:44

CSL Plasma donation can help create 24

41:47

critical life-saving medicines that can give grandpa

41:50

the chance for his heart to swell when he meets

41:52

his new grandson or give a bride the

41:54

chance for her heart to skip a beat on her

41:56

wedding day. Every Plasma donation

41:58

helps more than you know. Do the

42:01

amazing. Help save lives. Donate

42:03

today at your local CSL Plasma

42:05

Center and be rewarded for your

42:07

generosity.
