Mesmerized: With Guests Mara Rockliff & John List

Released Monday, 7th November 2022

Episode Transcript

0:00

Hi, Choiceology listeners. It's Katie.

0:02

We have an opportunity for you to

0:04

help the show. Stick around until

0:06

the end of this episode, and I'll give you the details

0:08

on how to do that. Now

0:10

on to our show.

0:19

When it comes to the effect of coffee

0:21

on our health, some studies have linked consumption

0:23

to heart risk factors such as raised

0:25

cholesterol or blood pressure. Those

0:27

who drank

0:27

six or more cups of coffee a day had

0:30

a twenty two

0:30

percent higher risk of developing

0:32

a cardiovascular disease.

0:33

Another study found that up to twenty-five cups of coffee

0:36

a day had no ill effects on your

0:38

arteries. A study out of Australia found

0:40

that two to three cups of coffee a day

0:43

is not only associated with a lower risk

0:45

of heart disease and dangerous heart rhythms,

0:47

but also with living longer.

0:55

It

0:56

seems as if every year there's a new study

0:58

on whether coffee is good or bad for

1:00

you. You've

1:01

probably seen other contradictory reports

1:04

on how vitamins or water consumption

1:06

or salt or sugar affect some other health

1:08

outcome you care about.

1:10

So why is it that we see so many conflicting

1:12

reports in the media? Why

1:14

is it so difficult to determine

1:16

whether something like coffee is good or bad

1:19

for, say, your heart?

1:21

It has a lot to do with the challenge of separating

1:23

out all the factors that influence our health.

1:26

Things like age, weight, height,

1:29

genetics, mindset, sleep

1:31

patterns, stress, exercise,

1:34

and, well,

1:35

you get the idea.

1:36

Health is the sum of many things, and

1:39

coffee is just one small piece

1:41

of a much larger puzzle. Teasing

1:44

out its influence is no easy feat.

1:46

In

1:46

this episode, we'll dig into a tool

1:48

for tackling a common mistake that affects

1:51

how we think about everything from

1:53

coffee to medicine, to

1:55

education. You'll hear

1:57

a colorful story from history that illustrates

1:59

how this tool works, and I'll speak with

2:02

renowned economist John List about

2:04

how the very same tool can

2:06

generally help us all think more

2:07

clearly and make better decisions.

2:10

And

2:13

I'm

2:15

doctor Katie Milkman, and this is Choiceology,

2:18

an original podcast from Charles Schwab.

2:21

It's a show about the psychology and economics

2:23

behind our decisions.

2:24

We bring you true stories involving

2:27

dramatic choices and then we explore

2:29

how they relate to the latest research in behavioral

2:31

science. We do it all to

2:33

help you make better judgments and avoid costly

2:36

mistakes.

2:52

Think of somebody with an

2:54

enormous powdered wig that goes several

2:56

feet above their head and maybe has a ship in

2:58

it. That's the time that we're talking about.

3:00

This is Mara. Hi, my name is

3:02

Mara Rockliff. I write books for

3:04

kids, usually about strange

3:06

and fascinating things in history that

3:08

have been forgotten.

3:09

This particular strange and fascinating

3:12

story takes place in France.

3:13

So this is not very

3:16

long before the French Revolution, when

3:18

Louis the sixteenth was in Paris and his

3:20

queen Marie Antoinette. They

3:22

wore these very fancy

3:25

outfits, and the ladies of the court

3:27

had these giant hairstyles.

3:29

The year was seventeen seventy

3:31

eight, and the French aristocracy was

3:33

about to discover an extraordinary German

3:36

physician.

3:37

This guy came to Paris from

3:39

Vienna, and everybody goes a little bit crazy

3:41

over him. His name was Franz

3:43

Mesmer. He's a very dramatic

3:46

kind of character. He's elegant and

3:48

mysterious. He thought he was pretty

3:50

important. He wears a powdered wig

3:52

and a fine coat of purple silk, and

3:54

he carries an iron wand. And

3:57

he says he's discovered this astonishing

3:59

new

3:59

force, a force called

4:01

animal magnetism. This

4:04

force worked on people the way magnets work

4:06

on metal. And he said it

4:08

was this invisible force

4:10

that you couldn't see or smell

4:12

or taste, but it was all over the universe

4:14

and it just flowed from the universe

4:17

into his body and then out through his

4:19

magic wand. And

4:21

he got this reputation for being

4:23

able to perform miracle

4:24

cures.

4:25

Mesmer said, I dare to flatter myself

4:27

that the discoveries I have made will

4:29

push back the boundaries of our knowledge

4:31

of physics as did the invention

4:33

of microscopes and telescopes for the

4:35

age preceding our own. So

4:37

he thought that he was probably

4:39

the most

4:40

important scientist in the world. It

4:42

might sound a little strange in today's

4:44

world, but in eighteenth-century

4:46

France, a phenomenon like animal

4:48

magnetism didn't strike people

4:50

as terribly far-fetched. There

4:52

was a good reason why people

4:54

might have believed that

4:56

something like animal magnetism could

4:59

exist. So many unbelievable

5:01

things were actually happening that

5:03

it was really hard to know what

5:06

to believe. It was hard to know what could

5:08

be true and what couldn't be true.

5:10

For instance, Antoine Lavoisier

5:13

was a famous French scientist. He's

5:15

known as the father of modern chemistry, and

5:17

he had just done experiments with

5:20

hydrogen and oxygen, two of these

5:23

things that nobody can see or

5:25

smell or taste and yet

5:27

suddenly he's setting

5:29

fire to them and, you know,

5:31

what appears to just be air is actually

5:33

this invisible force.

5:35

If somebody said, hey, there's this

5:37

force out there. You can't

5:39

see it. You can't tell

5:41

that it's there, but it's there and it has

5:43

these powerful effects. It

5:46

was plausible.

5:47

Mesmer claimed he could use the invisible

5:49

force of animal magnetism

5:50

to cure any sickness.

5:53

And there was some evidence that it worked.

5:56

Wealthy patients flocked to Mesmer and

5:58

eventually named this treatment after him.

6:00

They called it

6:02

mesmerism. So

6:04

pretty soon, everybody who's anybody in

6:06

Paris wants to be mesmerized. So

6:09

dukes and countesses pull up at

6:11

Dr. Mesmer's door in their fancy

6:13

carriages and they disappear

6:15

into this room. There would be

6:17

this music

6:19

to create a spooky atmosphere. There were

6:22

velvet curtains, and the lights

6:24

would be low and it would be kind of

6:27

airless and everybody

6:29

sits around this sort of big

6:31

wooden tub with iron rods.

6:33

It's this very odd-looking thing and looks

6:35

very scientific. And so people

6:37

would come into this very

6:39

dramatic scene with

6:41

this very dramatic guy and then

6:43

he's staring into their eyes and he's waving

6:46

hands and they start having

6:48

all these reactions. Shrieks, tears,

6:50

hiccups, and excessive laughter. People

6:53

were fainting and screaming and

6:55

falling all over the

6:55

place. And then they would

6:58

say that they felt better. And

7:00

maybe they did. But

7:01

some people were not fans of

7:03

this new trend. Not everybody

7:06

was absolutely delighted by what was going on

7:08

with doctor Mesmer, and the people whose noses

7:10

were really out of joint were the doctors

7:12

because nobody wanted their treatments

7:15

anymore. So they went and complained to

7:17

the king.

7:18

King Louis the sixteenth decided

7:20

to establish a commission to

7:22

investigate this new

7:23

medical phenomenon, and

7:24

he appointed a famous outsider to lead

7:27

it. His name was

7:29

Benjamin Franklin. Benjamin

7:30

Franklin, who was

7:32

a celebrity in France.

7:35

He was very respected by

7:37

all the best scientists in France and at the

7:39

same time was super popular with the

7:41

people. Ben Franklin had been in

7:43

France for

7:43

a couple of years, where he had

7:45

helped achieve official diplomatic recognition

7:47

for the United States in the Revolutionary

7:50

War. So Franklin

7:52

was a pretty old man at this point.

7:54

He had gout, he had kidney stones,

7:56

and he was not able

7:58

to get into a carriage and go

8:00

jostling over the cobblestones in Paris

8:03

to go see Mesmer. He was living

8:06

outside of Paris in the country. So

8:08

he asked for Mesmer to come to him, and

8:10

Mesmer refused because

8:11

Mesmer was, in

8:14

his own eyes, an extremely important person,

8:16

and

8:16

he wasn't gonna go to him. But

8:19

Mesmer's

8:20

second in command, Charles Deslon,

8:23

went out there to demonstrate

8:26

for Franklin and the other members of the commission

8:28

how mesmerism worked. Franklin,

8:30

of course, the first thing he did

8:33

was have it tried on himself. It

8:35

must have been quite the scene with

8:37

Benjamin Franklin submitting to this strange

8:39

mesmerism from Charles Deslon.

8:41

He

8:41

sort of makes some woo woo gestures at

8:44

you, either with his hands or his

8:46

wand. Ben

8:46

Franklin and some of the members of the commission

8:49

just stand there and say, I don't

8:51

feel anything. And so the word

8:53

got back to doctor Mesmer, and doctor Mesmer

8:55

said, well, there must be something strange about this

8:57

American because it's not

8:59

working on him for some reason.

9:01

At

9:01

this point, Ben

9:02

Franklin observed Charles Deslon mesmerizing

9:04

some regular patients of doctor

9:06

Mesmer's. And the people

9:09

would scream that they felt like their body

9:11

was burning all over and, you

9:13

know, they would fall down in a faint and all

9:15

this kind of thing. So Franklin and

9:17

the Commission were at a crossroads. On

9:19

one hand, the procedure seemed to

9:21

work on some patients. On the

9:23

other hand, when Charles Deslon

9:26

attempted to mesmerize Franklin and the

9:28

other commission members, they felt

9:30

nothing. Benjamin

9:31

Franklin needed a way to figure out what

9:33

was going on. Franklin

9:35

was skeptical in the

9:37

first place, but he kept an open

9:39

mind. Even though he was the world's most

9:41

famous scientist, he was open

9:43

to things. He just wanted to find out, well,

9:45

is it real or is it not real?

9:48

He observed what was happening to

9:50

himself and other people. And

9:52

he asked himself, well, could this

9:54

be in their minds? So

9:57

he's created this hypothesis

9:59

and he needs to figure out, well, how can I

10:01

test that? That was when

10:03

Franklin and the other members of the commission came

10:05

up with the idea of blindfolding

10:07

the patients so that

10:09

they wouldn't know what was being

10:11

done. This decision to

10:13

blindfold the patients was important to

10:15

the tests and to the future of

10:17

scientific research. So one

10:19

of the tests that Franklin

10:21

ran was they took this young boy

10:23

who was one of doctor Mesmer's patients

10:25

and they blindfolded him and they took him outdoors

10:28

into a grove of apricot

10:30

trees. This boy was

10:32

supposed to be especially sensitive

10:34

to animal magnetism. They

10:36

told him that one of the

10:38

trees had been mesmerized, meaning

10:41

this tree had had a wand

10:43

waved over it and had had

10:44

this invisible force funneled

10:46

into it. And they said, find the

10:48

tree. It's been mesmerized. He

10:50

started moving from tree to tree.

10:52

And first he's coughing,

10:55

and then he complains of a

10:57

headache. And then he says, oh, I

10:59

feel really dizzy. I must be getting

11:01

closer. And finally, he

11:03

gets to the last tree and he just

11:05

faints dead away. It

11:07

was a dramatic moment, but

11:09

in fact, he hadn't gone

11:12

anywhere near the particular apricot

11:14

tree that had been quote

11:16

unquote mesmerized. It

11:19

seemed that the boy couldn't detect where the

11:21

mesmerized tree was at all if he was

11:23

blindfolded. Ben Franklin

11:25

repeated the experiment on several other

11:27

patients. There was this one patient

11:29

who reacted very strongly

11:31

when he wasn't blindfolded and

11:33

was being mesmerized. So they blindfolded

11:35

him and Franklin

11:37

said to him, hey, you're being

11:39

mesmerized right now. Can you feel it?

11:41

And he said oh, yes. Yes.

11:43

And in fact, Charles Deslon

11:45

wasn't even in the room at that time.

11:47

And so then they had Charles come

11:49

back into the room very quietly

11:52

without this patient knowing he

11:54

was there. You can imagine this

11:56

patient just standing there blindfolded while

11:59

Charles Deslon is just pulling out

12:01

all the stops, going around him and waving

12:03

his hands, and pointing his

12:05

wand, and staring at him in a

12:07

Mesmer kind of way. And the guy

12:09

has no idea that he's there and he's just not responding

12:11

at all, which was

12:14

really a shock because Deslon

12:16

was a true believer in mesmerism

12:18

and he expected him to

12:20

respond as people normally did. It

12:22

had never occurred to him that they were

12:24

reacting that way because they expected to.

12:26

Ben Franklin had his answer.

12:29

Through multiple blindfolded

12:32

tests like that, they were able to

12:34

show that it wasn't actually

12:36

what Charles was doing that mattered. It

12:37

was what the patient believed. All

12:40

of these patients who had responded

12:42

so dramatically to mesmerized

12:44

treatments, once they're blindfolded, that

12:47

just went away or their response was

12:49

clearly not a direct

12:51

result of the

12:51

treatment. They proved that

12:53

animal magnetism as such didn't really

12:56

exist, but that there was

12:58

something going on that came out of

13:00

the

13:00

patients' minds rather than from an

13:03

invisible force that was flowing out of a

13:05

wand. This may

13:06

be the first time a blind

13:09

trial was used to test a scientific

13:10

hypothesis. This

13:12

kind of test

13:13

would become the basis for proving that

13:16

one thing causes another.

13:18

Instead of simply examining

13:20

whether mesmerizing people seemed to

13:22

work,

13:23

Franklin realized it was critical to

13:25

assign some people to be mesmerized and

13:28

others not to be. And it

13:29

was crucial that people not know which

13:31

group they were in. This

13:33

was in

13:34

essence a controlled scientific experiment.

13:38

Today

13:38

we run these kinds of tests with far more

13:40

people, and we use random number

13:42

generators to decide who will get a treatment,

13:45

like being mesmerized, and who will

13:47

be in a control

13:47

group. In

13:48

this case, merely being told they're

13:51

mesmerized. The procedure

13:53

allows us to tease out cause and effect.

13:56

If there are different outcomes for the people in the

13:58

treatment and control groups, well,

14:00

then we know it's due to the treatment.

14:03

If

14:03

not, well, it

14:05

was all in our head.

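To make that treatment-versus-control logic concrete, here is a minimal Python sketch. It is not from the episode; the effect size and the purely belief-driven response are invented assumptions. Unblinded, patients' beliefs track their assignment and a large effect appears; blinded, the same comparison returns roughly zero.

```python
import random
import statistics

rng = random.Random(42)

def response(believes_treated):
    # Toy outcome: the "reaction" is pure placebo, driven entirely
    # by belief, never by the treatment itself.
    return (5.0 if believes_treated else 0.0) + rng.gauss(0, 1)

def trial(n, blinded):
    # Randomly assign n patients to treatment or control, then
    # return the difference in mean outcomes (treatment - control).
    treated, control = [], []
    for _ in range(n):
        gets_treatment = rng.random() < 0.5
        # Unblinded: patients know their assignment, so belief tracks it.
        # Blinded: patients can't tell, so belief no longer tracks it.
        believes = gets_treatment and not blinded
        (treated if gets_treatment else control).append(response(believes))
    return statistics.mean(treated) - statistics.mean(control)

print(trial(10_000, blinded=False))  # ~5.0: a large apparent "effect"
print(trial(10_000, blinded=True))   # ~0.0: the effect was belief, not treatment
```
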
14:07

The scientific method was well established

14:09

already. The idea that

14:11

you observe a situation and then you ask a

14:13

question and then you make a hypothesis

14:15

and then you test that

14:17

hypothesis. But what

14:19

had not been done before that Franklin

14:22

invented here was the blind protocol,

14:24

the blind test, and that

14:26

was really an important

14:28

development. Ben Franklin

14:30

and the Commission published their findings in

14:32

a report. It was an

14:34

immediate bestseller. Twenty

14:36

thousand copies were snatched up right

14:38

away. And so

14:40

Mesmer, who had been a celebrity in a good

14:42

way was now sort of infamous

14:44

and mocked. He was the subject of parodies.

14:47

There was this one stage play where

14:49

they show Mesmer mesmerizing the patient and

14:51

the patient says, please doctor

14:53

tell me does animal magnetism really

14:55

do any good? And the guy playing Mesmer

14:57

jingles some coins and says, well, I can

14:59

assure you it does me a lot of good.

15:01

So this report completely

15:04

ruined his reputation. He fled

15:06

Paris. He went back to

15:08

Germany where he spent his

15:10

last days with a canary, which would

15:12

wake him up every morning by landing on his

15:14

head. Mesmerism fell out of

15:16

favor. Mesmerism got

15:19

such a bad reputation that

15:21

eventually when scientists wanted

15:23

to go back and work with it some more, they

15:25

had to rename it hypnotism, so

15:28

that it wouldn't be associated with this

15:30

big scandal. But the

15:32

real lasting impact of this story

15:34

comes from Franklin's use of

15:36

blind trials. The blind

15:38

protocol is basically the gold standard

15:40

now for any kind of new medication.

15:43

You have to test every medication

15:45

against a placebo. It needs

15:47

to be blind, meaning the

15:49

patient needs to not

15:51

know whether they're getting the placebo

15:53

or the medication. And that way,

15:56

if the medication has better effects,

15:58

then you know it's not

15:58

just because

15:59

they believed that it was going to help.

16:02

And today, we have the double blind protocol,

16:04

which is even better, in which

16:05

the doctors who are giving

16:08

out

16:08

the medication don't know whether they're giving

16:10

the medication or placebo so that they can't

16:13

have an impact on the results

16:15

either. It's what the FDA requires

16:17

before any new medication would come on

16:19

the market. It's super important.

16:23

Mara Rockliff is the author of several

16:25

historical books for children and teens,

16:27

including the award-winning Mesmerized: How

16:30

Ben Franklin Solved a Mystery That

16:32

Baffled All of France. You can

16:34

find links in

16:34

the show notes and at schwab dot com

16:36

slash podcast.

16:44

The

16:44

story of Benjamin Franklin debunking

16:47

Franz Mesmer's discovery of

16:49

animal magnetism may be the

16:51

earliest recorded example of a blinded

16:53

experiment. And

16:54

today, I want to focus on the amazing

16:56

power

16:56

of experiments like Franklin's to

16:58

cut

16:58

through the challenges we usually face

17:00

when we try to understand cause and effect

17:02

in the world. It's easy to

17:05

be tricked just like Mesmer's

17:07

patients into

17:07

thinking one thing causes another

17:09

when in fact it doesn't.

17:11

For example, maybe you

17:13

heard for years that coffee was great

17:15

for some health outcome, only to read

17:17

a headline later that, oops,

17:20

that wasn't true.

17:21

What happened? Well, the

17:23

kinds of people who drink coffee aren't

17:26

exactly the same

17:26

as other people. If

17:28

they're just a little say

17:30

wealthier than average, it's easy to

17:32

form the false impression that

17:35

coffee leads to great health outcomes

17:37

when it's actually wealth that's so good

17:40

for you.

17:40

And when researchers get around to doing an experiment,

17:42

cause and effect can be

17:45

untangled. In

17:47

an experiment, some people are randomly assigned

17:50

to drink

17:50

coffee and others aren't, and then

17:52

health outcomes are measured.

17:55

Differences in things like wealth are all

17:57

washed away by a random assignment experiment

17:59

because wealthy people are just as

18:02

likely to be randomly assigned to

18:04

drink coffee as to abstain from

18:06

it. And

18:06

so you're left

18:07

with the truth, which

18:08

is often that there's no causal

18:10

relationship between a food or

18:12

drug that was believed to have superpowers

18:14

and the outcomes we seek. I remember

18:16

one headline pronouncing that abstaining

18:18

from alcohol entirely leads

18:20

to a shortened lifespan.

18:23

Maybe. But

18:24

a much easier explanation is that

18:27

people who abstain from alcohol completely

18:29

do so because they're a little different.

18:32

Maybe a decent

18:32

chunk of abstainers have a health issue

18:34

that prevents

18:35

them from drinking, and that's the reason

18:37

they, on average, die younger.

18:40

A

18:40

full proof way to get around all this

18:42

mess is with the experimental method.

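Here is a toy simulation of that coffee-and-wealth trap in Python. The numbers are invented assumptions, not data from any study: health depends only on wealth, and wealthy people self-select into coffee. The naive comparison looks like a coffee effect; random assignment makes it vanish.

```python
import random
import statistics

rng = random.Random(7)

def study(randomized, n=100_000):
    # Toy world: health depends only on wealth; coffee itself does nothing.
    coffee, no_coffee = [], []
    for _ in range(n):
        wealthy = rng.random() < 0.5
        if randomized:
            drinks = rng.random() < 0.5                        # coin-flip assignment
        else:
            drinks = rng.random() < (0.8 if wealthy else 0.3)  # self-selection
        health = (10.0 if wealthy else 0.0) + rng.gauss(0, 1)
        (coffee if drinks else no_coffee).append(health)
    return statistics.mean(coffee) - statistics.mean(no_coffee)

print(study(randomized=False))  # ~5: coffee drinkers look healthier (confounding)
print(study(randomized=True))   # ~0: randomization washes the wealth gap away
```
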
18:45

My

18:45

next guest is a renowned economist

18:48

and his area of expertise is

18:49

actually experimental economics.

18:52

That means he uses the experimental

18:54

method in essentially all of his

18:56

work. John List

18:58

is the chief economist at Walmart

19:00

and the Kenneth C. Griffin Distinguished

19:02

Service Professor of Economics at

19:04

the University of Chicago.

19:06

Hi, John. Thank you so much for joining

19:08

me today. Hey, Katie.

19:10

Thanks so much for having me. How's

19:12

everything going? Everything is great.

19:15

Okay. I'm gonna dive right in because I have

19:17

so many questions for you today.

19:21

So

19:21

my first question is if you could just describe what

19:23

it

19:23

means for two variables to be

19:26

correlated with one another. Yeah, that's

19:27

a good question. I think if you

19:30

ask people to

19:31

define correlation, if you ask thirty people,

19:33

you'd probably get thirty different answers.

19:36

My preferred definition is

19:39

two variables that move

19:41

together, either directly,

19:43

like when one goes up, the

19:45

other goes up, like, crime and ice

19:47

cream sales, say. Exactly,

19:50

or like ice cream sales

19:52

and drownings. That would

19:54

be something where when one goes

19:56

up, the other goes up. And correlation

19:59

can

19:59

also be when one goes up, the

20:02

other goes down. Something

20:04

like when the price of a

20:06

good goes up, the

20:07

quantity demanded in economics goes

20:10

down. Now, I would say that's causal.

20:13

But, of course, correlations

20:15

simply means that two variables are moving together. Great.

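As a quick illustration of that definition, the sketch below computes the correlation between two hypothetical monthly series that move together without either causing the other (statistics.correlation requires Python 3.10 or newer):

```python
import statistics

# Hypothetical monthly figures: both series rise in summer, so they
# move together even though neither one causes the other.
ice_cream_sales = [20, 25, 40, 60, 90, 120, 130, 125, 80, 50, 30, 22]
drownings = [1, 1, 2, 4, 7, 10, 11, 10, 5, 3, 1, 1]

r = statistics.correlation(ice_cream_sales, drownings)
print(f"correlation: {r:.2f}")  # close to +1: strongly correlated, not causal
```
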
20:17

I love that you brought up causality

20:20

because that was my next question. Could

20:22

you describe what it means for two

20:24

variables to be causally related, meaning one

20:26

causes the other? Causality is

20:28

a special form of correlation where

20:32

when one variable moves,

20:35

that

20:35

causes another variable

20:38

also to move. So

20:40

again, it could be one variable

20:42

goes up that could cause

20:44

another variable to go

20:46

up, or when one variable

20:48

goes up, it causes another

20:51

variable to go down. A

20:53

causal relationship now is

20:56

fundamentally different than

20:58

a relationship that is merely

21:01

correlational. And of course,

21:03

a huge amount of your work and

21:05

mine too is about trying to

21:07

figure out what's

21:08

causal and what's not. And

21:10

I'm gonna get there in just a second.

21:13

But I wanted to ask

21:15

why you think it is so hard

21:17

to disentangle causation and correlation

21:19

and why people get them mixed

21:21

up?

21:21

Yes. So I think it's hard because

21:23

in many cases,

21:25

you don't know

21:27

the assignment mechanism. So what

21:28

I mean by that is if

21:31

you ask yourself does

21:33

Head Start work? Head

21:35

Start being the early childhood education

21:37

and health program for low income

21:39

children and families. Exactly. So

21:41

what Head Start did early

21:43

on was they looked at

21:45

outcomes third

21:46

grade test scores, kindergarten readiness,

21:49

etcetera, of kids who went

21:51

to Head Start versus

21:53

kids who did not go to Head

21:55

Start. And

21:56

what they reported was that

21:58

kids who went to Head Start

22:00

had

22:00

better outcomes. So

22:02

they argued that Head Start

22:05

is good

22:05

for kids.

22:07

Now what

22:08

you have mingled in

22:10

here is that parents

22:12

who really care about their child's

22:15

education are more

22:17

likely to put their child in Head

22:19

Start. So that's the

22:21

assignment mechanism that I'm talking

22:23

about. It's actually parents

22:26

choosing or kids selecting

22:28

into that particular

22:30

program. And in many cases, it's

22:33

that that happens to be the most

22:35

important, not Head Start. People

22:37

think just because there's

22:39

a relationship there, that

22:41

it's causal. So it kinda

22:42

makes sense. Yeah. That's such a

22:45

great example of a situation where

22:46

a random assignment experiment could

22:48

help clarify things. So the

22:51

beautiful aspect of randomization is

22:53

you can go into

22:55

a really dirty environment.

22:57

And

22:57

as long as you control

23:00

the assignment mechanism, and what I

23:02

mean by that is I

23:05

control who

23:05

goes into control, and

23:08

who goes into treatment? If you wanna think

23:10

about a medical trial, you can. If

23:12

you wanna think about an early childhood

23:14

program, you can. So who

23:16

gets Head Start, who doesn't, as long as you control

23:18

that assignment. Exactly.

23:20

You have a bunch of parents who say,

23:22

I want my kid in Head Start,

23:25

and let's say I'm oversubscribed so

23:28

that I wanna be

23:30

fair, so

23:30

I use a lottery system.

23:32

and I randomly put some of them

23:34

in control and some in treatment. I

23:37

realized the world is a

23:39

messy environment. But what's nice

23:41

about randomization is it

23:43

balances that dirt

23:45

across

23:45

the treatment and control groups

23:47

So then when you difference off the

23:50

outcomes, you also

23:52

difference off the dirt because the

23:54

dirt is equally represented in

23:56

each of the two groups. Meaning like the

23:58

kids that, you know, the lottery

23:59

assigned higher numbers and lower numbers

24:02

whose parents

24:02

all wanted them to be in Head

24:04

Start. The same number of kids have older parents

24:06

and younger parents and kids who have high IQs and

24:08

low IQs because it's just a flip of a

24:10

coin. There's nothing different about the two

24:13

groups. A hundred percent. So you have ambitious

24:15

parents represented in both groups.

24:18

You have siblings of

24:20

two, or two boys and two girl siblings,

24:22

represented in both groups.

24:24

So these are the background features

24:26

that might matter And in

24:28

many cases, we want to measure those as

24:31

well because those will give us an

24:33

indication of, you

24:36

know, who does the program work the best

24:38

for? And are there

24:40

certain moderators of

24:42

the relationship that we need to be

24:44

aware of, that then when we

24:46

roll it out to the big time, we scale

24:48

it up, we sort of know which

24:50

types of families our program

24:52

works the best for? Who should we scale to?

24:55

And which kinds of families it really doesn't

24:57

work that

24:57

well for? And maybe then we need to scale

24:59

a different program

25:00

to those types of families. So

25:03

the nice thing

25:04

here that we're talking about is, first

25:07

of all,

25:08

I have an approach here that you

25:10

and I and many others are

25:13

doing called experimentation, and

25:15

it allows us to control

25:17

the assignment mechanism, which allows

25:19

us to establish an

25:22

estimate that has internal validity.

25:24

Do you

25:24

have a favorite research study you've run

25:27

that

25:27

disentangles correlation from causation? Oh,

25:29

gosh. I

25:30

would say every one. So

25:33

let's think about charity. A lot of

25:35

your

25:35

listeners might be interested in charitable

25:38

giving. And

25:39

what

25:41

happened when I started my

25:44

own research in the late nineties

25:46

on charitable giving is that there

25:49

was sort of this bible

25:51

that was written by Kent Dove.

25:53

And the bible basically

25:56

said that when

25:56

you're trying

25:57

to raise money, you

26:00

should

26:00

use a matching

26:03

grant. And

26:03

what that basically means is we all hear this on NPR

26:06

when they fundraise. If you give

26:08

one hundred dollars today, we

26:10

will match it with a hundred dollars from

26:13

an anonymous donor. Okay.

26:15

So the

26:18

fundraising Bible argued that

26:21

if you use a two to one

26:24

match, that will be better

26:25

than a one to one match. And

26:28

if you use a three to one

26:30

match, that

26:30

will be better than a two to one match.

26:32

So three to one's really good. Right? You

26:34

give a hundred bucks. It's matched for

26:36

three hundred bucks. Now,

26:39

I talked to fundraisers back then

26:41

and I said, what is

26:42

the empirical evidence? And

26:45

they would show me evidence that

26:47

was, you know, in some cases, like

26:49

around Christmas time, they did

26:51

three to one, and it worked really

26:54

well, whereas in the summer, they

26:56

did one to one, and it didn't

26:58

work so well. And I said, well, do you ever

27:00

have data in the same time period?

27:03

They said, we don't have any studies like that.

27:06

So I tried

27:06

it. I worked with Dean Karlan

27:10

from Northwestern. And

27:11

Dean and I decided to help a

27:14

charitable organization raise money.

27:16

And we wanted to

27:17

test the theory about one

27:19

to one versus two to one versus three to one.

27:21

So what we did is we took

27:23

thousands of households and

27:26

put some in a control group, which

27:28

just received a letter with no

27:30

match. And then another group was one

27:32

to one, another group was two to one, another group

27:34

was three to one. And people were randomly

27:36

assigned to those groups. Yes. And we find

27:38

kind of two things that jump out.

27:40

First, having a match

27:42

matters a lot. So

27:44

if you just have match money

27:47

available, you raise more money.

27:49

And we can say that in a

27:51

causal sense, because we found that

27:53

the one to one, two to one and three to

27:55

one groups raised a lot more

27:57

money than the control. Okay.

27:59

Now

27:59

what about the one to one versus two to one

28:02

versus three to one? What we find

28:04

is that it makes no

28:06

difference. So

28:07

the one to one raises

28:09

the exact same amount of money as

28:11

the two to one, which raises the exact

28:13

same amount of money as the three to

28:15

one. So what we can say

28:17

now is what they were telling

28:19

us before

28:21

was a correlation and they

28:23

were finding a result because of Christmas

28:26

giving. The three to one

28:28

was working better in

28:30

December because a

28:31

lot of people give more anyway,

28:33

and they never really had a control

28:36

group to compare their three to one

28:38

with. So now we can say in a

28:40

causal way that

28:41

a higher match rate does not influence

28:43

giving patterns, but

28:46

having a match does. And

28:48

because we used a field

28:50

experiment, we can make a strong

28:52

causal statement. Yeah. I love that. That's a great

28:54

example, and it's one that really matters. Right?

28:56

Because we don't want to

28:58

be matching three to one and

29:00

trying to raise that kind of extra

29:03

capital when it's actually not necessary

29:05

to motivate our donor base if we're

29:08

an organization. No,

29:09

a hundred percent. So really, you're just throwing

29:12

away,

29:14

let's say, rewards.

29:16

You don't realize it, but

29:18

they're not helping you. You could take the three

29:20

to one dollars, give everyone one to one,

29:22

and then you can use those dollars

29:25

for a new drive. And those

29:28

dollars can go a long

29:30

way. So what

29:30

that means is I can make

29:32

a strong causal statement if

29:35

I understand the assignment

29:37

mechanism, I need a few other

29:39

assumptions too, like compliance

29:41

and attrition. Those are our exclusion

29:43

restrictions in the experimental world.

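As a rough sketch of the four-arm matching experiment described above (the donation behavior is invented to echo the finding, not the study's actual data), random assignment plus a comparison of group means is essentially the whole analysis:

```python
import random
import statistics

rng = random.Random(1)

def donation(match_ratio):
    # Invented behavior echoing the result above: offering any match
    # lifts giving, but a larger match ratio adds nothing extra.
    gift = max(0.0, rng.gauss(5, 10))  # many recipients give little or nothing
    return gift * (1.3 if match_ratio > 0 else 1.0)

arms = {0: [], 1: [], 2: [], 3: []}    # 0 = no-match control; else N:1 match
for _ in range(40_000):
    arm = rng.choice(list(arms))       # random assignment to a letter type
    arms[arm].append(donation(arm))

for arm, gifts in arms.items():
    label = "control" if arm == 0 else f"{arm}:1 match"
    print(f"{label:>9}: mean gift {statistics.mean(gifts):.2f}")
```
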
29:45

After those are in place, I

29:47

can be

29:48

confident that I'm estimating a causal

29:50

relationship. I also find, and

29:51

I'm curious if this is true for you

29:54

too, that once you learn to think this

29:56

way, it helps you be more skeptical

29:59

of

29:59

the information that

29:59

others are feeding to you. And it's

30:02

easier to poke holes in when someone's

30:04

giving you

30:06

useful data or useful information or

30:09

information that they've merely constructed to

30:11

align

30:11

with their goals.

30:14

So I'm sort of curious to what

30:16

extent doing this kind of work has changed the way you think about

30:18

the information the world is feeding you?

30:21

No. I think you're a hundred percent

30:23

correct. Another side

30:26

benefit of understanding

30:28

correlation versus causality

30:30

is that it's really

30:33

much easier for

30:35

you to understand what's happening

30:38

in the world. You could ask yourself,

30:40

what are the incentives that the

30:42

person has who's giving

30:44

me the information or who

30:46

has generated the information. Do they

30:48

have certain incentives to

30:51

give me a particular result?

30:54

If so, you should

30:56

think twice whether

30:57

the decision making is to vote

30:59

for a particular candidate or to

31:02

believe the information that

31:04

you're receiving, to

31:06

think about: is that truly a

31:08

causal result or is there a lurking

31:11

variable? I think all of this really

31:13

helps make you a better

31:15

decision maker as well.

31:16

That is a wonderful place, I

31:18

feel, to wrap. John, thank you so much

31:20

for taking the time to talk to me today. I

31:22

really appreciate it. Katie, it was so

31:24

great to be here. I can't wait to come back. John

31:27

List is the Kenneth

31:29

C. Griffin Distinguished Service Professor

31:31

of Economics at the University

31:33

of Chicago. He's also the chief

31:36

economist for Walmart. You can

31:38

find a

31:38

link to his terrific new book, The Voltage

31:40

Effect: How

31:42

to Make Good Ideas Great and Great Ideas

31:45

Scale, in the show

31:45

notes and at schwab dot com

31:48

slash podcast. Whether

31:51

you're interested in better understanding correlations between

31:54

market and economic data or just

31:56

want to make smarter financial decisions,

31:59

say around your

31:59

own charitable giving, check

32:02

out

32:02

the financial decoder podcast.

32:04

It's a

32:04

great resource for digging into the

32:06

financial implications of the phenomena we

32:08

explore here on Choiceology. You can

32:10

find it at schwab dot com slash

32:13

financial decoder or wherever you get

32:15

your podcasts. As

32:17

John List mentioned, becoming

32:20

more discerning about the distinction between

32:22

correlation and causation can help

32:24

you better understand the world and make

32:26

stronger decisions. When I teach my

32:28

Wharton MBA students about the

32:30

experimental method and its power to help

32:32

disentangle correlation from causation,

32:34

the

32:34

first thing I suggest is that they

32:37

start greeting causal claims in news

32:38

headlines and from friends and colleagues

32:41

with a healthy dose of skepticism.

32:43

When

32:43

someone tells you that eating garlic can prevent headaches

32:45

or that owning more

32:46

books will improve your kids' lives,

32:49

no matter how plausible their

32:51

story, it's important to ask yourself,

32:53

could there

32:54

be some other explanation for this

32:56

besides a causal one?

32:58

Next,

32:58

if the claim is important enough,

33:01

I ask, what

33:02

kind of evidence would convincingly prove

33:04

this is true? The ideal

33:06

evidence, of course, would be experimental.

33:09

And sometimes you'll find it. After

33:12

all, experiments are how

33:14

doctors test new vaccines and medications.

33:17

Increasingly, experiments are being used

33:20

by economists and companies

33:22

to test everything from the value of

33:24

charitable matching campaigns

33:25

to microfinance.

33:27

In

33:27

fact, the economics Nobel

33:30

Prize in twenty nineteen was

33:32

awarded to a group of development economists

33:34

for their new experiment based approach to

33:36

fighting global poverty. And

33:39

pioneers

33:39

like John List are bringing experiments

33:41

inside big companies like Walmart

33:44

to improve decision making. But experiments

33:46

are still rarer than they should be in business

33:48

and policy making given the importance

33:50

of understanding cause and effect.

33:53

Experiments can be

33:54

costly and complex, but it's

33:56

often worth the effort to determine what's

33:59

real and what's

33:59

not when you're facing high stakes

34:02

decisions. A key takeaway is that you

34:04

should constantly be on guard for correlations

34:07

misrepresented as causal relationships. Just

34:10

remind yourself how easy it is to be mesmerized and

34:13

don't fall for it.

34:18

You've been

34:24

listening to

34:26

Choiceology, an original

34:28

podcast from Charles

34:30

Schwab. If you've enjoyed the

34:31

show, we'd be really grateful if you'd leave

34:33

us a review on Apple

34:36

podcasts. You can also follow

34:36

us for free in your favorite podcasting app.

34:39

And if you want more of the kinds

34:41

of insights we bring you on

34:43

Choiceology about how to improve your decisions, you

34:45

can order my book, How to Change,

34:47

or sign up for my monthly newsletter,

34:49

Milkman Delivers, at katie

34:52

milkman dot com slash

34:53

newsletter. That's it for this season. We'll have new episodes

34:55

for you in early twenty twenty

34:57

three. I'm doctor

35:00

Katie Milkman.

35:01

Talk to you soon.

35:08

For disclosures, see the

35:10

show notes, or visit schwab dot com slash podcast.

35:13

As I mentioned at the

35:15

beginning of this episode, we're looking for

35:17

a little help with Choiceology. We're

35:19

launching a listener survey, so you can share your thoughts

35:21

on the show and help us better understand

35:23

our audience. Check it out at schwab

35:26

dot com slash

35:28

podcast survey.

35:28

We'd really appreciate you taking a few minutes to provide feedback. That's

35:32

schwab dot com

35:32

slash podcast survey.

35:34

Thanks, and we hope to hear

35:36

from you.
