Evolution Has Pros and Cons

Released Tuesday, 14th November 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:00

Hey,

0:04

this is DeRay. Welcome to Pod Save the People. On this episode,

0:06

it's me, Miles and Kaya talking about the

0:08

news that you didn't hear about

0:10

in the past week or that you didn't think about from

0:12

a lens of race, justice and equity.

0:15

And we are offline next week for the holidays,

0:17

but we will be back.

0:19

Here we go.

0:23

Welcome, welcome family to another episode

0:26

of Pod Save the People. My name is Kaya

0:28

Henderson and you can find me on Twitter at HendersonKaya.

0:31

My name is Miles E. Johnson. You can find me at

0:34

PharoahRapture at Twitter, Instagram,

0:37

threads, Black Planet, Live Journal,

0:39

MySpace, Facebook. This

0:43

is DeRay, at deray on Twitter.

0:45

So shout out to De'Ara

0:48

who's not with us this week, but

0:50

we'll see her next week. And

0:53

I wanted to start out by

0:55

getting your reactions to this clip

0:58

that DeRay posted in

1:00

our group chat. It always goes down in the

1:02

group chat about

1:05

The View. It's an episode from The

1:07

View where the ladies are

1:09

discussing a recent report around

1:12

why millennials are not having

1:14

children. And they all have some very interesting

1:17

perspectives. But

1:19

apparently Auntie Whoopi done got the people all

1:21

riled up. And so DeRay,

1:23

you put this in the group chat. Talk about it.

1:27

Honestly, I think I want to hear Miles' comments

1:29

first. Miles

1:33

who said he saw it earlier in the week and it pissed

1:35

him off. Say more. Miles,

1:37

lead us please. Well, hold

1:39

on. Hold on, because

1:44

I felt like y'all sending me over the mountain

1:46

first for Auntie Whoopie. Let

1:48

me give a little background.

1:51

The report basically says, there's

1:53

a new report out from

1:55

Pew and somebody else that basically

1:58

says that millennials... There's

2:00

a lot of data that shows that the

2:03

US birth rate is the lowest that it's been and

2:06

that millennials are not having children.

2:08

And when asked why, some

2:11

of the reasons are things like the student debt

2:13

crisis, the economy, the housing

2:15

crisis, the fact that these folks are basically

2:18

not set up for economic

2:20

success and many of them feel like they

2:22

can't support a family.

2:25

And so they are choosing not to have children. They

2:28

say that the economic climate has made

2:31

millennials feel like big life

2:33

milestones like buying a house and having a

2:35

family are out of reach for them.

2:38

So that's the background. And the ladies begin

2:40

to discuss this with different people having

2:42

different takes

2:43

on why this is the case.

2:47

Whoopi's takeaways. Whoopi's takeaways. Well,

2:50

okay, wait. Including,

2:56

before I go to Whoopi, including, you

2:58

know,

3:00

Farrah, is that the lady's name,

3:02

DeRay? Farrah Haynes, who we don't know. No,

3:04

no, not me. We all just Google Farrah. The

3:06

Farrah Alyssa somebody. Alyssa

3:08

Farrah, who is the former head of strategic communications

3:11

in the Trump White House. Meghan McCain's understudy.

3:14

Who is making a

3:16

halfway

3:17

reasonable case for people

3:20

making an informed decision around

3:23

not having kids. And

3:25

she does not sound super crazy. Then

3:28

there's the Sara Haines lady who I never

3:30

heard of before, who says, I think

3:32

this is great because there are so many people

3:34

who basically shouldn't be having kids. What

3:36

about all the kids who get beaten and abused and

3:39

blah, blah, blah? I think it's great that people

3:41

are choosing not to have kids. And

3:44

that sounds crazy.

3:47

And then Auntie Whoopi comes on and says,

3:50

millennials, basically, she basically

3:52

says that millennials are lazy and

3:55

that they

3:57

cannot access

3:59

the economic benefits of

4:02

the American dream because they only

4:04

want to work four hours a week

4:06

and

4:09

that in her generation people

4:11

just got it together and worked

4:13

really hard and bought houses and had kids

4:15

and just did the thing. Is that a fair?

4:18

Is that a fair recap

4:20

and there was this, like, flippancy, because

4:23

you always know it's tone. You

4:25

always know it's tone. There was that flippant response

4:28

that essentially she says, well, if millennials

4:30

wanted to work more than four hours a week, then

4:33

maybe they will be able to get a house with some kids too, which

4:36

kind of minimized the

4:38

millennial struggle and also

4:41

it just divorces how

4:44

generational conversations start. There's

4:46

always an elder, authoritative

4:49

generation who says you know

4:52

what all the problems that this generation inherited

4:54

is because of their

4:57

own self individual depravity

5:00

or laziness. So this was

5:03

true when people were saying the same thing about those

5:05

Black Panthers and those hippies. This

5:08

was true when people were saying the

5:10

same things about this. It's

5:12

literally just happens in every generation. Just look

5:14

at what was happening with rock and roll and

5:16

look at the generation before and see what they said. So

5:19

it just hurts because

5:21

Whoopi...

5:24

It's just, in my head, Whoopi had to have a conversation

5:26

with God, right? And she looked

5:28

in the mirror and said, God. And God

5:30

said, hey Whoopi. And Whoopi

5:33

said hi back. I'm about to shave these eyebrows

5:35

off. Give me some, um,

5:37

locks. And God, I need to ride

5:40

on your grace in order to be a star

5:42

And I'm also gonna talk about race. I

5:45

watch her one-woman show every

5:47

year. It's one of the most brilliant performances

5:50

of a woman on stage. She was in Sister

5:52

Act. She's the one who let Lauryn Hill

5:54

know she could sing, when Lauryn

5:57

Hill's mean-ass mama was like, oh her? She can't sing, ain't no

5:59

money in this thing. It was Whoopi who restored her magic.

6:01

So it hurts to see

6:04

that woman who's kind of the, she was clearly

6:06

in The Color Purple. So

6:08

it hurts to see that kind of woman who

6:10

was just this embodied rebellion

6:13

say something so utterly conservative

6:16

and say something so utterly disconnected,

6:19

which proves that doesn't matter how

6:22

radical the roots might be. Enough

6:25

money and enough Barbara Walters-signed checks, you

6:31

too might forget that you don't got no eyebrows

6:34

and you got locks. And if you were born in 1991, you'll

6:37

find it hard to work too. You'll

6:40

find it hard to work too. You

6:43

don't look like a good culture fit for WeWork.

6:46

You don't look like a good culture fit for any of these

6:48

things. You're right

6:51

with the WeWork. So

6:53

it's just wild because I'm

6:55

aware of millennials of every generation, but

6:57

mostly I'll speak for myself and so many other

7:00

people: we had to collage our legacies

7:02

and our career opportunities together. So my

7:04

career opportunities and my legacy

7:06

is collaged over a whole bunch of underpaid,

7:09

under-benefited jobs that I

7:11

maximize on in order to make my

7:15

way monetarily and also

7:17

for my voice to be heard. And I know so

7:19

many people who are like that, who are doing art

7:22

in these gigs and these gigs and these gigs and then

7:24

also do entrepreneurship just so they can

7:26

leave a little footprint in their lifetime. So

7:29

for that to be reduced to us wanting four

7:31

hours a week jobs

7:33

and that's why we can't have families traditionally,

7:36

it's just so disrespectful

7:40

and so disconnected.

7:43

What you gotta say, DeRay?

7:44

Miles, I'm just gonna plus one

7:47

the whole thing there. But I'll add

7:49

a few things that made me think of. One is that,

7:51

you

7:51

know, cause she really does do pull

7:53

yourself up by the bootstraps. If you just worked

7:56

hard enough, you would get it. Like we did

7:58

it too and y'all need to do it.

7:59

and every generation had it hard. That's

8:02

like her mantra. And I think about, I was

8:04

having a really good conversation with one of my friends this weekend

8:06

about the trauma that our parents

8:08

had. And I think about my father, somebody

8:11

who, I wish he

8:13

had had a four day work week. I wish he had had

8:15

the like time to

8:17

do field trips and all that stuff when

8:19

we were kids. I wish that he had

8:21

grown up in a time where people

8:24

weren't, and my great grandparents, I wish that like

8:26

working until you die wasn't the way that

8:29

we told people that was success. Like that is

8:31

what we did to people. And I think that you see

8:33

that in the way that people were raised and like what

8:35

were we loved? Yes, but Lord

8:37

knows I came from a long line of Black people who killed

8:39

themselves because they were told

8:42

that like to do work meant that you worked

8:44

this many hours a week and you stayed in one place, mind

8:47

you, the worst bosses they ever had. People

8:49

treated them like crap at the workplace every

8:51

day, but it was sort of just what work was. And

8:53

you're like, no, Whoopi, yes, people did do this

8:55

for a long time. That doesn't make it

8:57

right. It doesn't make it make sense. And all

8:59

of us have been managers of people at some

9:02

point. And I'll tell you managing people today is very

9:04

different than it was, you know, 20 years

9:06

ago for a lot of people that like, yeah,

9:09

I am listening to you when you complain about

9:11

the thing that I made an expectation about

9:13

because like we're in a workplace together. Nobody

9:15

was listening to my father when he was starting his career.

9:18

It was like, boy, go do the thing that we hired you to do. So

9:20

that's like my kinder thing

9:22

I have to say about her. What

9:25

my heart said, as soon as I saw it was,

9:28

I'm reminded that the reason we like her

9:30

is that she performs the words and thoughts of other

9:32

people. And every

9:34

time I'm confronted with her words and thoughts, I'm

9:37

disappointed.

9:38

Let me read this.

9:40

When we get

9:43

to Ted Danson and her defense of

9:45

blackface, honey, this is where I was going. And

9:47

I quote, it takes a whole

9:50

lot of courage to come out in blackface. I

9:52

don't care if you don't like it, I

9:54

do. I will then

9:57

take you to Ray Rice's,

9:58

domestic abuse scandal.

10:02

Don't be surprised if you had a man, he hits you

10:04

back. I know I'm gonna catch a lot of hell and I don't care,

10:06

but you have to teach women, do not live with this

10:08

idea that men have this chivalry thing with them. Don't

10:11

assume that that is still in place.

10:13

See, like that, that. I will

10:15

bet.

10:15

See, like that.

10:17

See, like that. See, they didn't read the book.

10:19

Like what? Let me take you to

10:21

the Oscars So White controversy. She said,

10:24

this is not the academy. Even if

10:26

you fill the Academy with black and Latino and Asian members,

10:29

if there's no one on screen to vote

10:31

for, you're not gonna get the outcome that you want. And

10:33

she ends with I've won once,

10:36

so it can't be that racist.

10:39

And then let's remember her historic

10:42

defense of Bill Cosby.

10:44

I bring all this up just

10:46

to say that I am reminded again

10:49

that why people love her is that she performed

10:52

the words and thoughts of other people, which

10:54

are way more progressive than her own. And when

10:56

she speaks, I'm continually disappointed.

10:59

And it has pushed me to remind myself that like

11:02

actors and performers are delivering

11:04

other people's words and other people's thoughts. And

11:06

Lord knows she's a case study for the disappointment

11:09

of someone's own.

11:10

Oh, Whoopi, I wasn't even gonna do it

11:12

to you. I was trying to keep

11:15

it. If it

11:17

didn't happen before 14 days, I

11:19

wasn't gonna bring it up. If this

11:21

episode is not titled The Takedown, the Take-

11:24

down of Whoopi Goldberg, I don't know what it

11:26

is, but that was brilliant.

11:28

That was a brilliant deconstruction.

11:31

You know, what was so interesting to

11:33

me was, and I am probably

11:35

somewhere between Whoopi's generation

11:38

and y'all's generation, but what was

11:40

so interesting to me is that it provoked

11:42

so much emotion

11:44

from you all. And I was just, I mean, on the

11:47

one hand, I'm not a millennial,

11:49

but I made the choice to not have biological

11:52

children. And that's my business and I don't

11:54

really care what people think about that or why

11:57

or whatever, whatever. And so

11:59

there's a part of me that's like, Whoopi, you can think whatever you

12:01

want about these millennials making their own choices, but

12:04

we's free, boss, and we get to do whatever

12:06

we want to do. So who cares

12:08

what you think about them, whether you think they're

12:10

lazy or whether you think they're whatever, none of

12:12

your business. So, number one. B, number

12:15

two, I realized I was done with Whoopi

12:17

as a serious

12:20

influencer when the whole Ted Danson

12:22

thing happened because there

12:25

is just no way on my,

12:27

I've been brown my whole

12:29

entire life on this earth

12:31

that there

12:33

would ever be a time where I thought that

12:35

bringing my white boyfriend to the club in

12:38

blackface, saying a whole bunch of N words

12:40

and eating from a watermelon tray and talking about our sex

12:42

life would ever be appropriate. So

12:44

at that point, there was a psychic break between

12:46

me and Whoopi Goldberg because there's no

12:49

way that I could ever understand

12:52

how she thinks. And so for me, this was Whoopie

12:54

just whooping and

12:56

me just moving on along, cause

12:59

whatever. But I do think Miles, that

13:02

this generational thing is, I

13:04

mean, I

13:05

think about how my girlfriends and I respond

13:08

to

13:09

what it means to manage millennials. It is

13:11

a completely different

13:13

ball game. And I think what

13:15

we have to recognize is the world of work

13:17

has changed, expectations

13:19

have changed. And like, we just got to roll

13:22

with it. This is how like humanness

13:25

works. Every generation feels

13:27

like the previous, the successive

13:29

generation isn't like them. And

13:31

they can't be because context

13:34

has changed significantly. So

13:36

yeah,

13:37

I mean, this has inspired

13:39

much more conversation than I ever thought, cause

13:42

I was like, Whoopi, whatever. But

13:44

here we are. Cause you can feel

13:46

it, you know, Auntie Kaya, you can feel it

13:49

because I'm like, I just came back

13:51

from the mountains. We all had to come together

13:54

as grownups and young people and pool

13:56

money and resources and cooking skills.

13:58

And this is what we're doing, what we've decided to

14:00

do with our partners, with other single

14:03

people, all, you know, this

14:05

is how we decided to do family and

14:07

we're making the best of it and we love it, but

14:09

it's also, we're making

14:11

this lemonade. And

14:13

for somebody to say, well,

14:16

you're just making that lemonade like that because

14:19

you're too lazy to go get it. No, there were

14:21

just no lemons. So we had to use this little

14:23

juice and we had to use this Sweet'N Low and

14:26

now we're getting it together and it tastes

14:28

good, but don't make it seem like we didn't

14:31

get it because of that. And then again,

14:33

I believe in media symbolism.

14:36

There's a way that she kind of represents

14:38

the transgressive black

14:41

women for so many people because of the roles she decided

14:43

to pick. So a lot of people don't know about

14:45

the things that DeRay listed and

14:48

that were all disgusting. But

14:50

they're

14:53

finding those things out.

14:55

I mean, my, like you just

14:57

said it, right? You went to the Catskills, you did

14:59

family your way. And for

15:01

me, like that is what freedom really is.

15:04

Freedom is the ability

15:06

to live and be however you wanna

15:08

be. And the thing is, any time

15:10

you are living and being in a way

15:12

that is different from the status quo, you

15:15

know, people got something to say about it. Well,

15:17

that's just life. My grandmother used to say, when

15:20

people are not talking about you, that's when

15:21

you should be worried. I'll segue this into

15:23

my news, but I'm always reminded, especially people in

15:25

these big media platforms, that they are

15:28

often, unless they're talking about random

15:30

cultural things, they are often making

15:32

political statements, even if they don't understand

15:35

or care about the significance of the

15:37

power dynamic. And

15:40

it was wild to watch the view thing and

15:42

see Alyssa Farah, the former Trump woman,

15:45

making a pretty cogent defense

15:47

of millennials, right? Being like, you know, people

15:49

ain't got money, inflation's high, da, da, da. And

15:52

Whoopi's like, they always said that. And you're like, whew, not the

15:54

Trump people. Don't go

15:56

anywhere. At

16:04

Crooked, we love Cariuma and their comfortable, cool,

16:06

sustainably made sneakers. Crooked loves

16:08

them so much that we just released our second

16:10

collaboration with them, a Love It or Leave It sneaker. They

16:13

come in pink and black and have a really fun LA inspired

16:15

design with lots of details Love It or Leave

16:17

It fans will recognize. Now is the perfect

16:19

time to step up your shoe game with these

16:22

super comfortable sneakers crafted with consciously

16:24

sourced materials. Plus Cariuma plants

16:26

two trees in the Brazilian rainforest for each

16:28

pair purchased. Head

16:30

to Crooked.com slash store to grab a

16:32

pair.

16:43

But I wanted to talk about Trump because there

16:45

is something happening in this moment where

16:48

people are, some people are disillusioned

16:51

with Biden, definitely young people. I think the

16:53

older black people I talked to are like, it is what it

16:55

is. But definitely some younger black

16:57

people are disappointed and certainly

17:00

organizers are. Now

17:03

I wanted to bring though what Trump

17:05

has said he is going to do if he is reelected.

17:08

He has said, and I'll just let, my news is just

17:10

to bring them here, is that he

17:12

is going to start rejecting asylum claims

17:14

from countries

17:18

using the change in the law that happened

17:20

with COVID-19. He will not use COVID-19

17:23

as the claim anymore. He will just claim

17:26

other infectious diseases are coming in from

17:28

migrants. He said that he is going

17:30

to deputize local police and

17:32

the National Guard troops to deal with immigration.

17:35

He is going to get over the restrictions

17:38

around ICE because he is just going to build huge concentration

17:41

camps. He is going to do

17:43

one of the biggest redirections of Pentagon

17:45

funds in American history to

17:47

redirect to deal with immigration. He

17:50

is, and I quote, going to build the largest domestic

17:52

deportation operation in American history. He

17:55

wants to redo a 1954 campaign. He

17:58

does not name the campaign, but the campaign was

18:00

called Operation Wetback, and

18:02

it was a campaign to expel Mexicans.

18:05

He wants to mimic that in this moment.

18:08

He also wants to end birthright citizenship

18:10

for undocumented parents. As you know, if you are

18:12

born in America, you become a citizen.

18:14

He does not want

18:16

the kids of parents

18:19

from

18:21

some countries to become citizens, and then

18:23

he wants to revoke temporary protective status

18:25

of people from countries he deems unsafe.

18:29

These are all categorically bad things,

18:31

and they will be the worst

18:33

things for black and brown people. End

18:36

of story. That is just true.

18:39

There's something about the caricature of Trump

18:41

that people either don't take him seriously or don't think

18:43

about it. That's one thing. I want

18:45

to bring it here because we've got to contend with

18:47

that and figure out how we tell people that story. It's

18:50

juxtaposed by the arrogance

18:53

of the Democrats that are sort of like, well,

18:55

he's just so wild. They will have to choose

18:57

us. I actually think that made a

18:59

lot of sense 20 years ago. It made a lot of

19:01

sense 50 years ago. It worked. When there were

19:04

three stations, two

19:06

news programs, you really could just tell

19:08

one story, and it just is the story. It

19:12

worked. We live in a moment

19:14

where you just can't do that anymore.

19:16

I think the party has not realized

19:18

that "he is the

19:21

wildest thing I've ever seen" doesn't cover it

19:23

up. He's not speaking in code. That

19:26

doesn't mean that people will just vote for the Democrats,

19:28

and the Democrats are playing in a world that

19:31

has changed. I am nervous about

19:33

that because they just don't realize there's not one

19:35

story. I think about Karine, the

19:38

White House press secretary. I don't know that woman,

19:41

but her calling the people

19:43

who call for a ceasefire repugnant, the

19:45

black and brown congresspeople, she

19:48

will never, ever be a credible source to me

19:50

ever again. She just lost all of her credibility

19:53

in that moment by calling them repugnant. 20

19:57

years ago, I would have never seen that woman. I would have maybe

19:59

caught that clip on one news program.

20:02

I've seen that a lot now because people are like,

20:04

is this the black woman they put out there to

20:06

call the black Congress people? That's,

20:09

and that's all I think that is literally who she

20:11

has become to me because I can't defend

20:13

that. That is wild to me. And I do,

20:16

I worry because he is, he sort

20:18

of played fast and loose when he ran the first time. It was like,

20:20

oh, he's not playing fast and loose. He's like,

20:23

I'm going to lock up my political opponents. I'm

20:25

going to do these things. And we

20:27

cannot have that again. And the Democrats

20:29

are just arrogant. I do think we should be

20:32

having a serious conversation about a nominee.

20:34

I ran across an article this week

20:36

that talks about this Agenda 47,

20:41

which is Mr. Trump's policy platform,

20:44

and this project

20:46

where they are literally trying to

20:49

pre-screen and vet thousands

20:52

of foot soldiers to

20:54

join the Trump administration the

20:56

moment that he wins, who

20:58

are ideological purists. He feels

21:00

like he hired a bunch of regular

21:03

Republicans the last time and they put constraints

21:05

on him. And so they are literally

21:08

pre-screening. They have thousands of resumes

21:11

and they're pre-screening people. They want

21:13

up to 54,000 loyalists

21:17

to

21:17

come into the Trump administration

21:20

across

21:21

lots of different dimensions of government.

21:25

And literally they're asking these people

21:27

like, who's most influenced your political

21:29

philosophy? Name one living

21:32

public policy figure whom you greatly admire

21:34

and why. There's a ton

21:36

of attention being paid to these people, social

21:39

media histories, literally

21:42

anybody who, and you just

21:44

got to be an all out Trumper. And

21:47

their plan is to flood the government

21:49

with these

21:50

extreme people. And if

21:52

Trump doesn't get elected, they are all set

21:54

to

21:55

push this apparatus to Nikki

21:57

Haley or to Ron DeSantis or whoever the

21:59

nominee is. And again,

22:02

it just goes to show, I mean,

22:05

they're using generative AI,

22:08

they've contracted with Oracle to help

22:10

them, you know, vet all of these resumes,

22:12

like, this thing is happening.

22:15

The Heritage Foundation is running it.

22:18

And like, once again, I feel like

22:20

the conservatives

22:24

are playing chess, and the Democrats are

22:26

playing checkers, to think about having

22:28

a legion of thousands of people

22:31

who all believe Trumpism

22:34

going into the government and literally breaking

22:37

government the way you described

22:39

DeRay, is hella

22:41

scary, and very,

22:42

very possible. And so I

22:46

thank you for bringing this to the pod because I feel like people

22:48

are in a fog, they do not realize

22:51

how serious these people are, how organized

22:53

these people are, and how many people are

22:55

ready to go into the government and pull

22:56

it apart. We got a sneak preview of it on

22:59

January the sixth, but it could be way

23:01

worse than that.

23:02

When I saw Biden's Veterans Day post

23:06

on Twitter, I kind of understood

23:08

even more deeply the disconnect. So

23:11

on the post that Biden

23:13

did, it's basically a

23:16

collage of all the

23:18

horrendous things that Trump

23:21

has said about the veterans. And

23:25

that was the Veterans Day post.

23:29

And to me, I would think

23:32

that we're learning that

23:34

being anti-Trump hardly

23:37

helped us win when we were

23:39

fresh off our presidency. And now

23:42

doing that now that we're so many years

23:44

removed from that presidency is definitely

23:46

not going to help us win. And definitely

23:49

not going to help us on the left be unified.

23:51

And I don't describe myself as a cynical or

23:54

pessimistic person when it comes to the

23:57

political destiny of man, or

24:00

the globe. I'm kind of like eternally

24:02

optimistic around those type of things. However,

24:05

I understand that there are

24:07

a lot of people who are cynical and that this, how

24:10

bad Trump is, is not it anymore.

24:13

And even when I think about it, I have a friend

24:16

Richard Brookshire, who founded something

24:18

called Black Veterans Project, got invited to the White

24:20

House. That was to me such an easy

24:23

way. Like obviously they're aware of him. He

24:25

got invited, somebody's aware of him. I'm not saying Biden himself

24:28

was aware of him. To me, doing

24:31

something around black veterans for

24:33

that day and highlighting that, that would

24:35

have resonated with the millennials. That would have resonated

24:37

with Gen Z. That would have resonated with

24:39

the people who are being activated.

24:41

And I think that's a broader statement about

24:43

how the Democrats are doing this all

24:46

in all. It can't be another anti-Trump

24:49

movement. I don't necessarily understand

24:51

what

24:52

Democrats are

24:54

thinking. Like

24:57

I wish I knew I understood. I think we

24:59

focus so much on what Republicans

25:02

or Trumpers or whatever language you want

25:04

to put on them are thinking. And I'm

25:06

really curious, like what are Democrats thinking? Like what

25:09

do they think this is working? Are

25:12

they jumping ship? Are they panicking? I'm

25:15

so at a loss as to why they think that

25:19

running the same campaign from 2020 was

25:22

going to happen again

25:26

or work again. So my

25:28

news is Keke Palmer and

25:30

Darius Jackson. I'm going to

25:33

say Darius Jackson a lot

25:35

of times because I think when

25:37

patriarchal abuse happens,

25:40

we overuse the victim's

25:42

name and we underuse

25:46

the perpetrator's name. So Darius Jackson

25:50

has been accused by Keke Palmer,

25:52

his ex partner of just heinous physical

25:55

abuse, including shoving her,

25:57

breaking her prescription glasses, choking

26:00

her, threats to kill himself,

26:02

all types of bodily threats to her

26:04

physically, and then also to her psychically.

26:07

I think one of the reasons, because I always

26:10

play like, there's always a

26:12

Wendy Williams and a Malcolm X in my

26:15

head of why I bring

26:17

things to the podcast. And

26:19

this

26:20

is gossip,

26:22

but I wanted to bring this in because I think

26:24

we all love Keke Palmer. And we've talked

26:26

about her for the last year so

26:29

many times because she's had moments like,

26:31

nope, she's had just the

26:33

best viral moments because

26:37

of Darius Jackson and not because of Darius Jackson.

26:40

And I wanted to bring

26:42

her in because we've seen her smile so much. We've

26:45

seen her be so comfortable. We've seen

26:47

her be the light in our day and

26:49

create products. And all the while she was being

26:51

abused. Her mother got

26:54

on camera and said that she talked to Darius Jackson's

26:56

brother a year ago about

26:58

the physical abuse. At least for that

27:00

year, this was something that was happening behind the scenes. And

27:02

of course we don't know, but goodness.

27:06

It is so, you can hear things theoretically,

27:08

like you don't know what somebody's going through or

27:12

the strong friend needs help or the strongest

27:15

woman can be going through the most heinous

27:17

of events you would never know. But literally

27:20

the strongest woman, the

27:22

most funny, jovial, jolly

27:26

woman, was going through something so heinous.

27:28

And we only saw a crack towards

27:31

the end of it, towards that usher controversy

27:34

and how he responded. And it was really, he showed

27:36

us the crack, right? So it wasn't something that was

27:38

exposed. He was really well-hidden. He

27:41

actually had to work to be

27:43

found out as abusive because he could have just been

27:45

quiet and been quietly abusing her for even

27:48

longer. He did that. I

27:50

wanted to bring this to the podcast and get y'all's

27:53

opinion on this obviously

27:55

very viral topic that's been

27:57

going around the internet, not just because it's... sensational

28:00

because of celebrity involved, but also

28:03

just you all as black people, Auntie

28:05

Kaya as a black woman, I

28:07

think we're always asked to show

28:09

up as our most excellent

28:11

selves, as our most jovial selves,

28:14

and things crumble in the background. So

28:17

I just thought there was no more interesting people

28:19

to ask around what

28:21

they felt about this and the dynamic

28:24

of performing something in public while

28:26

suffering private and how has that tension

28:29

affected you all's personal life too. So I didn't want

28:31

to keep it in the Wendy Williams gossip

28:33

place. I wanted to make it a Super Soul

28:35

Sunday moment by

28:40

redirecting the conversation to our own personal

28:43

vulnerable moments.

28:45

I will say, you know, it was

28:48

the team of people around her, it is clear

28:50

that there's a little bit of the Beyonce playbook

28:53

that people are like, Beyonce is like, there's one

28:55

narrative and it's mine. And you're like, I

28:57

got it. It makes sense. And

28:59

I was so sad that she had to

29:01

release those images. Like

29:03

that made me really sad that she had to

29:05

and I can see the thought process from her team,

29:08

because it happened immediately. People were like,

29:11

didn't happen. Why was she with them so long?

29:13

Why would you have kids or somebody who beat you that

29:15

like that was immediately what happened.

29:18

And then the pictures come out and everybody's like,

29:20

Oh, she wasn't lying. And you're like, wow,

29:23

that is clockwork. Like,

29:26

the first thing people said is, but Keke gave

29:28

that interview talking about how they met. Keke

29:31

said this about them. Keke has the pictures of

29:33

the baby and they laugh and joke and all that.

29:35

And so like

29:38

the pictures made me, I was like,

29:40

this sucks for you, Keke, and

29:42

to have to prove to people

29:44

on the front end that

29:46

this is even real that you didn't just make this

29:49

up is so disappointing. So there's that.

29:52

The second thing is, you know,

29:55

Darius,

29:56

I was I'm never

29:58

shocked by men because Lord knows

30:01

people do all types of things. So

30:04

Darius made a post that

30:07

has him in it, the baby, and it ends

30:10

with this image of Homelander from The Boys.

30:12

Now I have a lot of critiques about Homelander. I don't actually

30:14

love Homelander, and I think that The Boys is actually

30:17

like a, I think The Boys is a

30:19

not great show around racial justice. It is

30:21

an entertaining show, but that is another episode. But

30:24

if you've seen The Boys, him stitching

30:26

himself into an image of Homelander is, it

30:29

is so beyond nuts that

30:31

I don't even know what to say, because in The

30:34

Boys, Homelander does have a kid

30:36

who has the same powers as him, and it's a big deal,

30:38

because he

30:38

has a kid with a

30:41

mortal.

30:42

But he has a kid because he rapes her. So

30:45

he rapes the baby's mother, and

30:47

essentially threatens to kill her if

30:50

she doesn't exist

30:52

in this random place with the little boy. So

30:55

Homelander goes and sees the boy often, and

30:57

eventually takes the boy from the

31:00

mom. Like that's, that

31:02

whole, that storyline is central to

31:04

who Homelander is, because the boy is sort of the

31:06

only thing that makes him a mildly

31:09

caring, it's like he's 99% evil or bad or awful, and

31:13

then this 1% is he sort of likes the

31:15

boy. But even his love of the boy, he like pushes him too hard,

31:18

and the boy's not gonna, whatever. But

31:20

the relationship between Homelander and the baby's mother

31:23

is rape. That is, it's like the central storyline

31:26

of the child's existence is that he

31:28

rapes her, and she knows

31:31

it's rape. It's not like she is like, this is wrong,

31:33

this is bad, but I love my son. So

31:35

she's trying to figure out what to do, and he's Homelander.

31:37

So he's like one of the most powerful people in the universe, so

31:39

she can't just like hide, and she does try

31:42

to hide, and he has supersonic hearing. But

31:44

for him to stitch himself

31:47

into an image of Homelander, I'm like, I mean

31:49

that is,

31:50

the lawyers are having a field day

31:52

with that one, and if that's not somebody

31:54

telling on themselves, I don't know what is. So

31:56

I'll just stop there, but the Homelander thing got

31:58

me as like.

32:00

I didn't doubt her before and

32:02

I believed her. But

32:05

now I believe you because you just

32:07

told on yourself too. You know, one of the

32:09

reasons that I love being on this podcast

32:11

is I learn new things all the time. No

32:14

idea who Homelander or

32:16

The Boys or anything about that

32:18

is, so I'll have to look that up. But

32:22

I mean, one, this,

32:23

you know, my first reaction

32:26

was here

32:29

we go again, right? And I think

32:31

that, you know, when I think about

32:34

black women celebrities and domestic

32:36

abuse, you know, there

32:39

are tons, the most recent sort of

32:41

the ones that popped to mind quickest for me

32:43

are Rihanna,

32:43

Megan Thee Stallion and

32:46

now

32:46

Keke. And what

32:49

I

32:50

will say is I'm thankful

32:52

that,

32:54

I mean, we don't know how long this has

32:56

been going on,

32:58

but she

33:01

was like clear. He

33:03

did this, he has the evidence, da

33:05

da da. And

33:08

I, you know, I guess like

33:10

my one sort of hope is that as

33:14

we learn about these things

33:16

that women will understand that they have

33:18

a choice, that they

33:19

have options like

33:21

domestic abuse is horrible,

33:24

terrible.

33:25

You know, Miles, you asked about our personal experiences.

33:28

Thankfully, this hasn't been my experience.

33:31

And so I can't really speak to, you

33:33

know, how people respond. And I think everybody

33:35

gets to respond in whatever way makes sense for

33:38

them. But we love Keke, she's like

33:40

our little sister. We watched her grow

33:41

up and we all want what's best for

33:44

her. And I think that's

33:45

what you see in the public response

33:47

finally, or I think there's more, there's

33:50

been more of a public response that is positive

33:52

and supportive of her.

33:54

You know, even I think about how

33:56

Meg was treated when, I mean, the dude shot

33:58

her in the toe. And

34:01

it also makes me think a lot about the

34:03

power dynamics between women

34:05

and men, especially when women are,

34:07

are when their careers are further along

34:09

or when they make more money, that

34:12

that

34:12

oftentimes seems to be a

34:15

real indicator of a potential

34:18

problem in relationships that escalate

34:21

sometimes. And so I

34:23

don't know, it's just so sad

34:25

to me that like these poor

34:28

women who have beat the odds,

34:31

are talented, have done all of the things,

34:33

you know, are, are, I

34:36

mean, nobody should actually

34:37

have to face this, but

34:40

it just makes me sad. It makes me want to hug my

34:42

little sisters and make sure that they're

34:44

okay. And I think we have to, as

34:46

a black community, collectively hug Keke

34:48

and let her know you're doing the right thing and send

34:51

the same message. My hope is that other young

34:54

ladies watching this understand that this is not

34:56

how they're supposed to be treated.

34:59

And I'll say, you know, I actually took Miles's

35:02

question to me, you guys, to

35:04

be what, how do you negotiate

35:06

the, the public

35:08

life when, when things don't go well,

35:11

right? Like, like not necessarily as much as I was. And

35:13

I do think you probably, I'd be interested, I want to know

35:15

what your answer to that is. I will say I remember,

35:17

I mean, so many things where I learned people are mad

35:20

at me and I just didn't say anything

35:22

or like, just let it go. But

35:24

there are times I'm like, you know, if I

35:26

talk about it, then I'll have to talk about it forever.

35:29

That's what it feels like. If I address this thing in

35:31

a public forum, it will create

35:33

a public record of it and then people will feel like

35:35

they can ask me about it for the next 3000 years.

35:38

And I just cannot, there are some things

35:40

I'm like, I can't do it. And then there are other

35:43

things when not talking about it means that

35:45

I'll have to talk about it for the next thousand years and on somebody

35:47

else's terms. And that is sort of

35:49

like, when I'm trying

35:51

to negotiate what I think about, like I even think

35:54

about, you know, think about people's

35:56

frustration with Teach for America. I will never forget. I

35:58

was at a panel like in. and these activists

36:00

literally, they

36:04

asked me some questions on the panel and I give the

36:06

normal answer, which is like, hey, did you teach? I'm

36:09

always interested in the way people talk about classrooms

36:11

who've never been in them, but anyway. And

36:13

then afterwards, they literally come

36:15

up, put the camera in my face and

36:18

they're like, you needed to do that. I'm being

36:20

heckled by black activists

36:23

about Teach for America, I'm like, this is what I want to be? I

36:25

don't work for Teach for America, I've never worked for them, this

36:28

whole thing, but I remember like, am I

36:30

gonna say anything? And I didn't in the end, but like I was

36:32

heckled, or somebody, this is way before the

36:34

current situation in Gaza, this

36:37

like reporter for like

36:39

Breitbart chased me at the DNC

36:42

and was like, what is your statement

36:44

on Palestine? And I'm like, not you running,

36:46

like you're like running me down trying to get me to say something.

36:49

So I just say no comment, cause he's driving

36:51

me nuts.

36:52

Then no comment goes viral.

36:54

I get activists who I was with in Ferguson

36:57

being like, I can't believe you won't stand with the

36:59

Palestinian people. And I figure like, this man

37:01

is literally running me down,

37:03

trying to like force me to, I'm not doing

37:05

it on his terms, but then now I'm having to explain

37:08

all this stuff. And it just is a really

37:10

crazy thing that sometimes like, what

37:12

people expect and demand from you is really wild.

37:14

That's why I have a lot of sympathy for Kiki, not only as

37:17

a victim, but it sucks

37:19

to be the victim and have to do all this work

37:21

to like anticipate the public

37:23

response and get in front of it is just

37:26

awful.

37:28

Yeah, I, thanks for bringing

37:30

that dimension to it, DeRay. I think people

37:32

have no idea what

37:35

living life in the public really feels

37:37

like. And, you know,

37:40

my public life pales,

37:42

I wasn't a celebrity, right? I was a local school

37:45

superintendent, but the way-

37:48

Celebrity, she was my favorite. She was

37:50

a young superintendent. My sister,

37:52

and I was just saying, my sister called me about kind

37:54

of like, she is who made me believe that

37:57

I could be a school leader cause I saw somebody

37:59

young and black in office

38:02

and Kaya was there for 10 years, y'all. She was

38:04

the coolest, because a lot of times you don't really want

38:06

to be like them you just like them but we wanted

38:08

to be like Kaya. We were like, that's a young

38:10

girl over there.

38:12

I still want to be like Auntie Kaya.

38:15

Thank you but I mean I will

38:17

say and I loved that job

38:19

and

38:20

we did great work but the worst

38:22

part of it the best and the worst part of it

38:24

was the public part because

38:26

people think that

38:29

they own you and that they have

38:31

access to your life and you're doing

38:33

a job and you're still a person and there

38:36

were so many times where people would

38:38

come at me and you know

38:40

one it requires you to have a lot of empathy

38:43

in that role because like people are not

38:45

really thinking when they come at

38:47

you but

38:50

it also, you're like, aren't I a person? Like

38:52

I'm a person I get to decide

38:54

what parts of me you consume

38:57

and what parts of me you don't consume and

38:59

people don't believe that people think

39:02

that because you are a celebrity

39:04

or you know you have this public life

39:07

that that means that you belong to people

39:09

wholly and you know

39:11

Kiki clearly held

39:13

back what part she wanted to hold

39:15

back and that's her business that's how she

39:18

you know, nobody gets

39:21

access to anybody

39:23

in this way and I think

39:26

we've got to remember that these are people

39:28

yes they're entertainers but these are people

39:31

and politicians same thing you

39:33

just don't get to comment

39:36

on or have access to these people in this

39:38

way and I think you know there

39:40

are people who will say well then they shouldn't

39:43

be celebrities not true not true

39:45

because you are a great actress or a great game

39:47

show host or great whatever, you

39:49

should not have to give up your whole entire personal

39:51

life in order to do that job and so

39:54

we have to you know I think clearly social

39:56

media has warped our

39:59

thinking

39:59

about that.

40:02

But I do think that this

40:05

whole moment of where we are in

40:07

the world calls

40:07

us back to our humanity

40:10

and I think we have to extend

40:13

Kiki that and and everybody else

40:15

around the place. Everybody gets to live

40:17

their lives the way they want to. We get

40:19

to comment on some of it but not much and they

40:21

get to decide.

40:22

Don't go anywhere. More Pod Save the People.

40:41

My

40:41

news this week is about

40:44

artificial intelligence.

40:45

In fact last week

40:47

according to CNN was

40:49

the most significant week for artificial

40:51

intelligence since the launch of ChatGPT

40:54

last year.

40:54

And I brought this

40:56

to the podcast because I

40:58

feel like

41:02

people are not paying attention.

41:04

Some people are not paying attention

41:06

to what's happening in the AI

41:08

world and

41:11

I can talk a little bit more about that later.

41:13

But the

41:15

announcements that were made last week

41:19

are indicative of the speed that

41:21

the AI market is moving at. Like literally

41:23

last year ChatGPT burst

41:25

on the scene. A zillion people were using

41:27

it in 15 minutes and

41:29

like a year later all of these different

41:32

iterations, improvements, enhancements

41:33

have been happening and

41:36

if you're not paying attention you will get left

41:38

behind. So a couple

41:40

of things that happened last week.

41:42

OpenAI which is one of the largest

41:44

producers of artificial

41:47

intelligence had their first developers

41:49

conference and at this developers

41:51

conference

41:52

they did a bunch of things. They announced

41:54

new updates to AI tools. They

41:57

announced the ability

41:59

to create

41:59

custom versions of Chat

42:02

GPT, called GPTs. And

42:04

anybody can create their own GPT,

42:08

whether you have coding experience or not,

42:11

which is going to make accessibility

42:13

super wide for

42:16

generative AI. They announced a GPT

42:18

store, much like the App Store, where

42:21

you can create a GPT for

42:23

education or a GPT for productivity

42:26

or lots of different GPTs based on whatever

42:28

you do. And the GPT

42:30

store will allow anybody

42:32

to search the different GPTs that are available,

42:35

that people have created, and have access

42:38

to them. So it

42:40

means that loads more people will have

42:42

access to these tools

42:44

and technologies, not just what the

42:47

big AI producers are creating,

42:49

but what any developer is creating. They

42:53

also announced at this

42:54

conference, GPT-4 Turbo, which

42:56

is the latest version of ChatGPT, which basically

42:58

does 16 times more

43:01

work than the previous

43:03

version. And this is literally,

43:05

I mean, GPT-4 was released

43:07

in April. This is now November.

43:10

And the tool does 16 times

43:13

more work than the tool that was released

43:15

in April.
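
For a sense of scale, the "16 times more work" figure tracks the jump in context window, from roughly 8,000 tokens in the earlier GPT-4 to 128,000 tokens in GPT-4 Turbo. A minimal sketch of calling it, assuming the OpenAI Python SDK; the model string "gpt-4-1106-preview" was the preview name announced at that developers conference and may since have been superseded:

```python
# Minimal sketch: one chat call to the GPT-4 Turbo preview model.
# Assumes the OpenAI Python SDK (v1+) and OPENAI_API_KEY in the environment.
from openai import OpenAI

client = OpenAI()

response = client.chat.completions.create(
    model="gpt-4-1106-preview",  # assumed GPT-4 Turbo preview name; ~128k-token context
    messages=[
        {"role": "system", "content": "You are a concise assistant."},
        {"role": "user", "content": "In one sentence, what is a context window?"},
    ],
)
print(response.choices[0].message.content)
```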

43:19

That all happened at one conference.

43:22

Also, right after that

43:24

conference, there was a targeted attack. So two days

43:26

after the conference, the developers conference, OpenAI

43:31

experienced large scale outages when

43:33

somebody attacked their servers. So

43:35

this is ground zero

43:38

for the tech wars and what's

43:40

happening. And GPT

43:42

went down for the first time. And that was

43:44

a huge thing for anybody who uses this. Also

43:47

last week, Humane released

43:50

their AI Pin product,

43:52

which is the first AI wearable device.

43:54

It attaches to your clothing and it

43:56

projects

43:57

the information onto your

43:59

hand.

43:59

So you can answer calls and

44:02

emails and all kinds of stuff

44:04

without holding a smartphone, friends.

44:08

And in fact, Humane's AI

44:10

Pin goes on sale November

44:12

16th for six hundred and ninety nine

44:14

dollars. Like this stuff is not in the

44:17

future. This stuff is next week.

44:19

You can get this thing and start projecting

44:21

stuff onto your hand. And

44:23

then there's your boy Elon Musk

44:26

who runs

44:29

xAI.

44:31

xAI, they are working on a chat

44:33

bot called

44:35

Grok.

44:36

Where does this man get these names? Like, his nomenclature

44:39

is so wack. Like, I'm

44:41

the hashtag friend. Like my friends are like, what's the

44:43

hashtag for this weekend? Call me, Elon, I can

44:45

help you with better branding for

44:48

this stuff. But this thing is called Grok, and

44:51

it is a chatbot that

44:53

will be included in X

44:56

or Twitter's premium plus paid

44:58

service in the United States. And the

45:00

chatbot has a sarcastic

45:02

sense of humor similar to Elon

45:05

Musk's, because we need that in our lives. Anyway,

45:07

Coming soon on the AI front

45:10

are things like Amazon's

45:12

Olympus. Apparently they're pouring zillions

45:15

of dollars into Olympus,

45:17

which is projected to

45:19

be smarter than ChatGPT-4.

45:21

YouTube is

45:23

testing

45:24

AI tools to improve its products

45:26

and services And what's happening

45:29

is you know, basically everybody

45:31

is using this technology to enhance

45:36

and enrich what they do and I

45:39

brought this to the podcast because you know

45:41

I think about it

45:43

in our work at Reconstruction for those

45:45

people who don't know we teach African American history and culture

45:48

online to young people. Boom, quick,

45:50

easy, simple.

45:51

But you know, we also are

45:53

using generative AI

45:54

to help teachers create more

45:56

culturally responsive lesson plans and part

45:59

of the reason why we did that was because

46:03

we see white teachers, or teachers in

46:05

wealthy white areas already

46:07

incorporating chat GPT into how

46:09

they teach into the assignments that

46:11

their kids are getting. And in

46:14

communities where teachers

46:16

are teaching low income students

46:18

and low income students of color, teachers

46:20

are wary about generative

46:23

AI, they don't know what's out there, our communities

46:26

are very distrustful. And

46:28

my real worry is that this stuff is

46:30

happening at the speed of light, it's not being

46:32

designed for us, it's not being designed

46:35

with us. And if we're not in here,

46:37

engaged in what is happening, we

46:40

will once again be left behind on this

46:42

technological frontier. And so I

46:45

brought this one to just

46:47

say here's what's happening, but to

46:50

to remind us that, you know,

46:52

as scary as this stuff might seem, it's

46:54

happening. And we need to get in here, we need to know

46:56

about this stuff. And we need

46:59

to make sure that our kids

47:01

are

47:02

primed to pick

47:04

this stuff up. I'll say one more thing. I was in

47:07

Phoenix this past week. And

47:10

I went to this AR

47:12

VR lab called Dreamscape Learn,

47:15

where they basically harness the power

47:17

of movie making and virtual reality and artificial

47:19

reality to

47:20

change how

47:22

you teach college classes.

47:24

And so, and it

47:26

was honestly, like the most mind blowing thing

47:28

that I've seen in a long time. And, you

47:32

know, I asked the people, I was like, so

47:34

let's talk about

47:34

diversity. Do you have diverse coders

47:37

and designers? Are our kids able to

47:39

see themselves in their communities? And

47:43

that is a live conversation

47:46

for lots of people. And so we can't

47:48

be afraid of this technology, we've got to

47:50

know, keep abreast of what's happening. And

47:52

we've got to lean in on this so

47:55

that we are not left behind, Xed out,

47:58

whatever.

48:00

I had a little question and I

48:03

know if I have a little question that maybe probably

48:05

listeners do too. So is

48:07

there, can you like tell

48:09

us the difference between the chat

48:11

being on top of the GPT?

48:14

What is the difference between Chat

48:16

GPT and GPT? Yeah so

48:20

ChatGPT basically

48:22

you ask it a question, it scours

48:24

the

48:24

whole entire internet and

48:27

it comes up with an answer right?

48:30

And sometimes that's great, sometimes it's

48:32

horrible. You've heard all about hallucinations

48:34

and wrong information or racism

48:37

or whatever, whatever, but most times

48:39

it gives

48:39

you a fairly decent answer. In

48:43

many things it gives you a superior answer

48:45

to what people are

48:49

doing, what people might otherwise

48:51

do. A GPT is basically

48:53

a closed system so

48:56

it wouldn't be pulling from the whole entire

48:58

internet, it might

48:59

just pull from your

49:01

particular

49:02

database. So if YouTube

49:05

is working on one, YouTube might not want

49:07

it to go to the whole

49:09

internet. YouTube might want it to answer

49:12

questions just from the YouTube

49:14

catalog, right? And so

49:16

that would be a YouTube

49:18

GPT right?

49:21

So this is just the ability

49:23

to determine what

49:26

you load in, where your information comes from,

49:28

as opposed to the whole kit and

49:30

caboodle.
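
A minimal sketch of that closed-system idea, assuming the OpenAI Python SDK: embed your own small catalog, retrieve the best match for a question, and instruct the model to answer only from that context. The catalog, model names, and prompt here are illustrative assumptions, not how OpenAI builds GPTs internally.

```python
# Minimal retrieval sketch: answer questions from "your particular database"
# instead of the whole internet. Illustrative only.
import numpy as np
from openai import OpenAI

client = OpenAI()

# Our "particular database": a tiny catalog of documents.
documents = [
    "Lesson: the Harlem Renaissance and its poets.",
    "Lesson: Reconstruction-era politics, 1865 to 1877.",
    "Lesson: the Great Migration and Black urban life.",
]

def embed(texts):
    """Turn texts into vectors so they can be compared by meaning."""
    result = client.embeddings.create(model="text-embedding-ada-002", input=texts)
    return np.array([item.embedding for item in result.data])

doc_vectors = embed(documents)

def answer_from_catalog(question: str) -> str:
    """Answer using only the best-matching catalog entry, not the open web."""
    q = embed([question])[0]
    sims = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    context = documents[int(np.argmax(sims))]
    chat = client.chat.completions.create(
        model="gpt-4-1106-preview",  # assumed model name; swap in a current one
        messages=[
            {"role": "system", "content": "Answer ONLY from the provided context. "
                                          "If it is not covered there, say so."},
            {"role": "user", "content": f"Context: {context}\n\nQuestion: {question}"},
        ],
    )
    return chat.choices[0].message.content

print(answer_from_catalog("Which lesson covers the Great Migration?"))
```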

49:33

It

49:33

makes a lot of sense. Thank you, Auntie Kaya.

49:40

So there are two parts of this. One is

49:42

that I do wish there was like a crash

49:45

course that I could take on the GPT

49:47

because I'm not watching a long YouTube

49:49

video and my friends who

49:52

like it are so excited about it that they're not good

49:54

teachers right? Like so there's

49:56

that whole piece. So that's one.

49:59

The second is that I just, you know,

50:01

I feel like an old man in this part where I'm

50:03

like, I don't have the bandwidth to learn

50:06

one more thing yet. Like I'm sort of like out of it.

50:08

I'm like, I get it. I see it. And

50:10

I'm at the point where I'm like, I think this should be used for

50:13

entertainment. Like I'm not, I'm not sure it should

50:15

be used for anything real. So if you need a quick

50:17

synopsis, cool. If you need

50:19

to make a little image, cool. I

50:21

don't want it making decisions about things.

50:24

I don't, I just, and maybe that's because I need to learn more,

50:26

but I'm like, I'm still nervous. It's

50:28

just cause I've seen the movie. I

50:29

just play.

50:31

This was made in the most inequitable way

50:33

by people who think they are smarter than everybody

50:36

and technology is going to save their life. That is their

50:38

ethos. And I am certain

50:40

that they have not thought about the way

50:42

this can be used for bad very

50:44

quickly.

50:47

And then we'll be reading the book about how AI killed

50:49

all these people. We're like, how did that happen? I'm like, I know that

50:51

happened already. So I don't know. It makes me nervous.

50:54

As interested in it as I am,

50:56

I am more nervous and interested. So

50:59

you, I think that is a totally

51:02

like reasonable place to be. Um,

51:05

the reason why I would push a

51:08

little bit on this is like,

51:10

think about the internet, right? Like

51:12

the internet was going

51:14

to do all kinds of

51:16

things, right? People were very worried

51:18

about what the internet would be. And

51:20

the internet has tons of really bad uses,

51:23

right? We saw election interference

51:25

and all kinds of things. And

51:28

at the same time, the internet has been ubiquitous,

51:30

has become ubiquitous in how we live our lives.

51:33

Everybody uses the internet, right? And

51:36

my guess is that that is what's going to

51:38

happen with this generative AI stuff. People

51:41

do know what it's like,

51:43

how it could be used for bad. Um,

51:45

it did come about in all of the ways. And in fact, some

51:48

of the people who created it are like, oh my God,

51:50

here's how it can be used for bad, right? We

51:52

covered that on the podcast a little while

51:54

ago too: the Godfather of AI was

51:56

like, war machines, like that is

51:58

what is going to happen, it's going

51:59

to be terrible, blah, blah, and a few other things. And

52:03

at the same time, the train is not

52:05

going to stop. So us

52:08

being wary just

52:11

means that we continue to be out of the conversation.

52:14

And that, to me, is scarier

52:16

because I'll just share

52:19

our experience in working with this.

52:21

We were like, can this help

52:23

us create curriculum more quickly? And

52:25

so we started prompting it and asking

52:27

it the questions to see if it could get the

52:30

quality of lessons that we as

52:32

black human beings create. And

52:34

our lessons are really good and nuanced

52:36

and come from an asset-based perspective

52:38

and have a pedagogical outlook.

52:41

I'm saying all of these educationese terms to

52:43

say, we got some good stuff. And

52:46

it is professional and it's thoughtful

52:48

and all of this, whatnot. And of course,

52:50

ChatGPT could

52:52

not create what we've created. But

52:55

we started trying to train it to get as close

52:57

to what we created as possible. And then

52:59

what we realized is it is not going to

53:01

do that because it pulls from all

53:03

of this junk out in the world. And it

53:06

pulls wrong things and whatnot.

53:08

And so then the question is, can you build

53:10

something that, when it pulls

53:13

out wrong things, will spit

53:15

them out because there's a check or a balance? Can

53:17

you get it to never give you a classroom

53:22

exercise that says, divide your

53:24

students into masters and slaves?

53:28

Because that's what ChatGPT will give you. Or

53:30

ChatGPT, when you ask for great leaders

53:36

in African history, one of the things that

53:38

comes up is Idi Amin. You're like, say what now? And

53:41

so the challenge for

53:43

us, and I'm not a technologist, is could we

53:45

train a GPT,

53:48

Miles, to kick

53:50

out those things to really

53:53

think the way we think. And

53:56

we've created a tool that helps teachers

53:59

add African-American

53:59

history and culture to

54:02

their lessons, that is really good

54:04

and nuanced and is almost as

54:07

good as what we would create ourselves. It

54:09

took a lot to do that. And if we

54:12

could do that, and I'm telling you, we're not technologists,

54:14

we're educators, other people can

54:16

figure out how to do this. I do think

54:19

that a lot of the

54:20

noise, like this thing is getting better,

54:23

like every 10 minutes. And

54:25

so the initial noise that we're

54:27

worried about in three months or six

54:29

months is not, there'll be a whole different set of problems.

54:32

And so I'm simply saying, these

54:35

people are going to fix the initial

54:37

problems, but if we are not in the conversation

54:40

saying what the next problems are going to be

54:42

or what we're seeing, then we

54:45

are going to be left behind.

54:48

So

54:49

I, listen, I'm 53 years old. I

54:53

am the least likely person

54:56

to adopt new technology, but

54:59

I watched what happens in education

55:02

when people of color are

55:04

not in the room designing things. And when

55:06

we're not trying these things out to see whether it works

55:08

for us and with us. And

55:10

I just decided we're not going to be left behind.
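Kaya's "check or a balance" that kicks out bad outputs maps onto a simple reject-and-regenerate loop. The sketch below is a hypothetical illustration, not the actual tool her team built: generate_lesson() stands in for a model call, and BLOCKED_PATTERNS is an invented rule list.

```python
import re
from typing import Optional

# Invented screening rules; a real system would maintain a much richer set.
BLOCKED_PATTERNS = [
    r"masters and slaves",        # the classroom-exercise failure mode above
    r"idi amin.*great leader",    # the "great leaders" failure mode above
]

def passes_checks(lesson: str) -> bool:
    """Return False if the lesson matches any blocked pattern."""
    return not any(re.search(p, lesson, re.IGNORECASE) for p in BLOCKED_PATTERNS)

def generate_lesson(prompt: str) -> str:
    """Stand-in for a call to a language model."""
    return "Divide your students into masters and slaves for this exercise."

def safe_generate(prompt: str, retries: int = 3) -> Optional[str]:
    """Regenerate until a lesson passes the checks, or hand it to a human."""
    for _ in range(retries):
        lesson = generate_lesson(prompt)
        if passes_checks(lesson):
            return lesson
    return None  # escalate to a human editor instead of shipping a bad lesson

print(safe_generate("Create a lesson on Reconstruction"))  # prints None: needs review
```

The design point is the one she makes: the filter does not have to be smart, it just has to sit between the model and the classroom so that known-bad output never reaches a teacher unreviewed.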

55:13

Let's go around. Auntie Kaya is going

55:14

to be part of this conversation, baby. Okay.

55:18

And to your point, Auntie

55:20

Kaya, that is

55:23

here. And I know there's like medical

55:25

fields that are like using these things we

55:27

already kind of like made fun of, but also

55:30

acknowledge that there are like legal

55:32

people, people who are

55:34

in law, people who

55:36

practice law, using the

55:39

GPT and AI technology. So

55:41

it really is integrating. And I think we're

55:43

in the very awkward, uncomfortable

55:45

part of something integrating with you. But by

55:47

the time the technology is able to be projected on

55:49

your palm, the integration has really settled in.

55:53

Next week, next week. Right. So

55:56

by the time it's under, to get that technology, by the

55:58

time it kind of leans under $500 or $1,000, the

56:01

access point to that is different because I just

56:03

paid for my headphones, you know,

56:05

that were $600 from Apple.

56:08

So the price of entry is

56:10

just way lower. So

56:12

it will get lower. It'll get even

56:14

lower, right? Yes, and it is something

56:17

to think about because, to Auntie Kaya's

56:19

really sharp point about the

56:21

internet: we're still discovering the ways

56:24

the internet is influencing

56:26

us. There's always new studies about how

56:29

social media is doing things that we didn't expect

56:31

for it to do. And we also

56:33

understand that there's a whole thing that we call the dark

56:35

internet where the most heinous

56:38

of things are being sold, traded,

56:40

talked about, planned. And that

56:44

will be happening with AI. And listen, y'all,

56:46

I'm not trying to scare nobody. But

56:48

if you think that the Google-to-

56:51

how-to-do-a-domestic-terrorist-

56:53

attack, I hate to say pipeline, but,

56:56

no pun intended, pipeline, that's very

56:58

real; the

57:01

AI-to-terrorist-

57:04

hateful-things pipeline is very real too. So if

57:06

not for anything else, it to

57:08

me is valuable to engage with, even if

57:10

you don't want it in the dailiness of your life. It's

57:13

just good to know

57:16

where the world's going. You remind me of what you brought up

57:18

with Darius Jackson, the Homelander, and

57:21

how so much of that internet

57:23

masculine culture is founded

57:26

on interpretations of

57:28

comic books and anime

57:30

and stuff like that. So it's just good to know

57:33

what's going on, just so

57:36

you can just intelligently say, I ain't doing that.

57:43

Tell

57:45

your friends to check it out and make sure you rate

57:47

it wherever you get your podcasts, Apple Podcasts or

57:49

somewhere else, and we'll see you

57:51

next week.

