Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.
Use Ctrl + F to search
0:00
It is an interesting time these days
0:02
around AI. Over the weekend, the
0:05
board of directors for the nonprofit
0:08
OpenAI, which developed and operates
0:10
ChatGPT, which is super
0:13
duper popular, I use it a lot, they
0:15
abruptly fired the co-founder
0:17
and now previous CEO,
0:20
Sam Altman. So they did this
0:22
in like thirty minutes before the markets
0:24
closed on Friday, and that
0:27
kind of thing only happens in the rarest
0:29
and most inflammatory circumstances
0:32
because it's such a market shifting
0:35
move to do that right before the markets
0:37
closed, especially on a Friday. So
0:39
this is big news in
0:41
AI, as ChatGPT has
0:44
the fastest-growing user base
0:47
for a consumer app in history. They
0:49
did one hundred million users in
0:51
two months. Now, to give you some perspective,
0:54
it took Facebook like four
0:57
years to hit one hundred million
0:59
users. It took Twitter
1:01
five years to get one hundred
1:03
million users. I think it took ig Instagram
1:06
of like two years. Open
1:08
ais chat gpt did
1:11
it in two months, and
1:13
almost as fast as they blew
1:16
up their incredible valuation,
1:19
they blew up their incredible valuation.
1:21
So look, they
1:23
went from looking at like a ninety billion
1:26
dollar valuation to uncertainty
1:29
in a matter of hours and
1:32
days since the board gave Sam
1:34
the boot. In April of this year, the
1:36
company was valued at twenty nine
1:38
billion dollars, and just a week
1:40
ago they were looking at raising
1:43
money at a ninety
1:45
billion dollar valuation.
1:48
From April's twenty nine billion
1:50
dollar valuation, they were on track to do a
1:52
billion dollars in revenue,
1:54
which shows you how fast ChatGPT
1:57
was growing and what the future
2:00
looked like for them. If you recall,
2:02
ChatGPT
2:04
was legit just launched ten months
2:06
ago, eleven months ago. So this is
2:09
incredible growth such
2:11
that AI is just about
2:14
all anybody is talking about these days,
2:16
whether they're building it, they're trying
2:18
to fund it, they are starting
2:20
an AI focused startup, or
2:22
they're afraid of it altogether. AI
2:25
is on many of our minds. At
2:28
the time of this recording, a lot has happened.
2:30
Sam was out at
2:33
OpenAI as CEO. This
2:35
was just Friday. Then the
2:37
employees and investors revolted because
2:41
the way people believed in Sam
2:43
obviously was a surprise to the board, so
2:45
there were attempts to bring him back over
2:48
the weekend. Then that didn't work
2:50
out. Then today Microsoft
2:52
says they're bringing him over there to run
2:55
some AI projects, and undoubtedly
2:57
many necessary, highly
2:59
talented OpenAI employees who
3:02
were on his team will follow him to
3:04
Microsoft. So who knows
3:07
what will happen an hour after I publish
3:09
this episode, because this is all happening so
3:11
fast. Everything I just mentioned happened
3:13
in like the span of forty eight to seventy two hours.
3:17
Usually
3:19
things like this play out over weeks.
3:22
Now.
3:22
The reason I kick off this episode
3:25
with this news is because it's that significant.
3:27
Because, I mean, if
3:30
you've embraced AI at some level
3:33
and you have tried ChatGPT
3:35
at least once, there's no way you're
3:37
not hooked. It's so good at
3:41
figuring out how to say things in
3:43
the way you want to say it, just by you giving
3:45
it some simple commands. And
3:47
so I wanted to start
3:50
this episode off just
3:52
to make a point that things
3:54
are moving so quickly in
3:56
our society. This is a rallying
3:58
call, a rallying call to you, my Black
4:00
Tech, Green Money family, to be 'bout
4:03
it. Nobody asked permission.
4:05
Certainly the board at OpenAI
4:07
didn't do a poll of their employees
4:09
and investors to see if kicking out the
4:12
founder and CEO would be acceptable.
4:15
They just bust a move. So
4:18
we're living in the days where people who
4:20
beg forgiveness versus ask
4:22
permission are going to be in a
4:24
position to succeed. If
4:26
you're waiting for your work situation to
4:28
get right, or your kids to graduate,
4:31
or the weather to warm up and the sun to be out
4:34
again before you start
4:36
making your moves, you'll be at a disadvantage.
4:40
Things that used to take months are happening
4:42
now in hours. So I want
4:44
to bring you a conversation from AfroTech
4:47
Executive in Seattle between
4:50
Jessica Matthews, Jessica O. Matthews,
4:52
who's CEO and founder of
4:54
Uncharted, and Johnny Bradley,
4:56
who was the responsible
4:59
AI official for the Department
5:01
of Energy and senior program manager for
5:03
the Artificial Intelligence and Technology
5:06
Office. Because this conversation
5:09
they're having is about demystifying
5:12
AI for our community. So
5:14
I hope you get something from this. Follow
5:16
along, find your way,
5:19
not necessarily to go build an
5:21
AI startup, but find your way to use
5:23
it, leverage it to
5:26
do what you want to do, and do what you're doing
5:28
even better, more efficiently, faster,
5:31
cheaper, and with a greater punch.
5:33
Okay, hello everyone, how
5:36
are you? Yeah?
5:39
Yeah, yes, yeah, okay,
5:43
So let's get into it, right, yeah,
5:45
all right. So you know I love soccer and
5:47
you made the Soccket. So I just have to start this conversation
5:50
off by asking one question, why
5:54
did you go from hardware
5:56
to data solutions?
5:58
So we don't have to go real deep into like the
6:00
technical stuff, but first, hi, everybody,
6:03
Hello, Hello, Hello, thank
6:06
you. Will, Will's dope,
6:09
just a consistent-ass dude, man. How
6:12
much cursing is allowed?
6:14
Right?
6:15
That's always what I like to check. I tend to check
6:17
after my first fuck falls out. So
6:23
listen, y'all. I flew here from Harlem and I
6:25
was like, where does Morgan have me flying
6:27
to now? That's what I thought when I was on that plane. I
6:29
was like, oh damn, Seattle. Seattle, okay.
6:33
And I was walking down the street and I was like, where
6:35
all the black people?
6:36
Right?
6:38
And then, but then, you're here. Hello, hello,
6:42
gathered them off? No,
6:45
no. So, you know, it's a good
6:47
question. So, uh, what Johnny's
6:49
talking about here is that I started
6:52
my career making energy
6:54
generating play products. So when
6:57
I was nineteen years old, I invented
6:59
an energy-generating soccer ball that
7:01
could harness the energy from play, the kinetic
7:03
energy, and store that power inside
7:06
of the ball. Yeah,
7:08
it was pretty cool. You could play with it, you could
7:10
roll, you know, as it rolled, it was generating energy,
7:13
and about an hour of play would give you three
7:16
hours of light. Fast
7:18
forward. Now I'm thirty five.
7:21
I don't have a problem saying it because, like, you know,
7:23
me, Black is good,
7:28
right? Know what I'm saying? So,
7:32
and I run
7:35
a data infrastructure company that
7:37
uses AI to help disadvantaged
7:40
communities develop sustainable
7:42
infrastructure with more equity
7:45
and efficiency than ever before. Right,
7:49
So how did I get from there to
7:52
here? Well, one
7:54
thing is that my true north was always incredibly
7:57
clear to me. I'm a dual citizen
7:59
of Nigeria and the United States. Always,
8:04
even in Seattle, where I'm
8:06
being so messed up.
8:07
Yeah, yeah.
8:08
The only thing I know about Seattle is Nirvana.
8:11
Right, So, so I like, and I love Nirvana.
8:13
So I was like, yeah, I'll wear my flannel. My
8:16
mom was like oh s ohse glenno.
8:23
So so for me, you
8:26
know, it
8:28
was going back and forth between Nigeria and the United
8:31
States, whether it's for weddings
8:33
or funerals, and just recognizing that there
8:35
were some things that were so dope, like everybody
8:38
had a mobile phone. It became
8:40
ubiquitous very quickly. Oftentimes
8:43
my cousins had a better cell phone than I did.
8:46
But also wondering like why
8:48
is it that it doesn't matter if we're in the village
8:50
or if we're in like the bustling city of Lagos, that
8:53
we're losing power every single day.
8:56
And I knew it wasn't because
8:59
the technologies that needed to
9:01
exist to make that happen didn't exist. I
9:04
knew it was an infrastructural issue immediately.
9:06
In fact, oftentimes in places like
9:09
Nigeria, people are paying more per kilowatt
9:12
hour than we pay here in the United States. I don't know if
9:14
that's true anymore, because, like, my
9:16
bill's getting high, your bill's getting
9:18
high. But at least before, it
9:20
was very much like that. And
9:24
you know, I believe that the first step
9:27
in innovation is the articulation of the problem.
9:29
And at the age of seventeen, eighteen,
9:32
I articulated the problem to be that
9:35
people like my cousins who were trained
9:37
engineers did not
9:40
believe that there could be a world where
9:42
things could be different. They did not believe
9:44
that this problem could be solved
9:47
by some innovation or some public private
9:49
partnership. They
9:51
simply thought that the best way to solve the problem
9:53
was to pretend like it's not happening and to get
9:56
used to it. And so I
9:58
wanted to create something that
10:00
would make them change that view,
10:03
that would make them see in
10:05
the world not just as it is, but as it could be.
10:08
And soccer is the most popular
10:10
sport around the world, my favorite
10:12
sport, the most popular sport. And
10:16
I'm assuming you're probably really good. My cousins
10:18
were not that good.
10:22
I took teams to the championship.
10:24
I was good. Oh,
10:27
you were so good that you were bad.
10:29
Now my cousins were just My cousins were so average
10:32
that they were like, why are we playing this game? This
10:35
is awkward. So
10:37
but in that in that way though, seeing
10:39
their passion and seeing the way that they would play on
10:41
the field, I'm like, you know,
10:43
the way that you approach this game, knowing
10:46
damn well that you are at best average is
10:48
the way you need to approach life. It's
10:50
the way you need to approach all of the problems
10:52
in our community and infrastructurally, and
10:55
so my thought was, like, let
10:57
me create something that would inspire
11:00
them to do it. And what ended
11:02
up happening was that, long
11:04
story short, it inspired me to do it.
11:06
Got it?
11:07
Okay?
11:07
Yeah, all right. It took a while to figure
11:10
out exactly what technologies would
11:12
get there. First it was the play
11:14
products, and then different inventions, and then
11:17
a couple of years ago, right before
11:19
the pandemic, I started to
11:22
realize that the common thread in every community
11:24
wasn't needing, like you know,
11:26
an energy generating speed bump or some cool
11:29
thing here or there that was fun but couldn't
11:31
scale. The common problem was
11:33
data. It didn't matter if it was in Nigeria
11:36
or in the US. The data
11:38
problems were kind
11:41
of compounding on each
11:43
other, and that when governments
11:45
were operating in the right way, they were
11:47
spending at least half of their time tracking
11:49
down, collecting, correcting, and sharing
11:52
solid information to make infrastructure
11:54
decisions. They often did not have
11:56
the right information to know where to prioritize
11:59
who should be getting the solar or
12:02
where should we be replacing the lead pipes,
12:04
And the other half of the time, they were just, quote unquote,
12:06
shooting from the hip, and
12:08
so I was like, well, if we could solve the data
12:11
problem and improve the way that they're organizing
12:13
their data to make decisions to build
12:16
sustainable infrastructure, we can help
12:18
them build it faster, we can help them
12:20
build it for less. And then as
12:22
a result, we can either help them make
12:24
it more equitable or at least make it
12:26
very obvious when they're being fucked up,
12:29
like make it more obvious, like, listen, that's what the data
12:31
say. If you just want to do that, you can do that, but don't act like that
12:33
data didn't say that. And so that's
12:36
how we came up with our current products.
12:38
And we thank you, right,
12:42
we thank you for that transition.
12:45
So we want to.
12:46
Talk about demystifying
12:48
AI. So when we look
12:50
at AI, I always think about that time
12:53
and I don't mind dating myself either. Fifty-three.
12:55
Listen, I
12:57
don't mind that either.
12:58
We should just talk about that.
13:00
I do, right,
13:03
this homegrown. So back
13:06
when I was born, actually
13:08
legislation was passed on cigarettes. But as
13:11
I grew up, you know, cigarettes was just
13:13
like whoa, it was a cool thing to do. You had this slim
13:15
lady smoking a cigarette, right, and it
13:17
was just appealing to me, right? And
13:19
for twenty years I smoked based off of that.
13:21
But what I will say is
13:23
this, they demystified those cigarettes.
13:26
It was not.
13:28
Right, and that's great, that's
13:31
that's that's my exact point
13:33
is that during my time, it was cool,
13:36
right, and it was the thing to do.
13:38
And as time went by and
13:41
the Surgeon General said, no,
13:43
put these warnings on each pack
13:45
of these cigarettes and let people know
13:48
exactly what it does to you,
13:50
and it would curb it, and it just like
13:53
demystified the coolness and the sexiness
13:56
of cigarettes. So my question is,
13:58
how do we demystify AI?
14:01
Girl? Oh yeah,
14:03
listen, because
14:06
we had a whole behind-the-scenes conversation, y'all,
14:08
that we're trying to like, you know, keep civil
14:10
for the cameras and all that. So
14:15
I think that's part of why you guys are here, right.
14:17
Everyone's talking about AI. Everyone's
14:19
acting like it's the new hot thing, and
14:22
they're acting like it's this black box and it's this scary
14:25
monster that cannot be controlled and things
14:27
are just happening, but it's
14:29
really not, you know. Like, you
14:31
should not be afraid of AI. You
14:33
should be afraid of the people who are building it. That's
14:37
it. So, like what
14:39
it comes down to, let's break this down, right,
14:42
AI artificial intelligence,
14:44
right, and we discussed this. It's
14:47
kind of like a child. It's
14:50
a child. It's like a robot baby, right?
14:52
ChatGPT is at best a
14:54
sassy seven-year-old. And we all know, right,
14:56
we all know that seven year old that was born in like you
14:59
know, I don't know whatever
15:01
seven years ago was like you know, like
15:04
but like basically recently grew
15:06
up with all the social media platforms
15:08
and be out here talking to you like they're grown, and
15:11
you'd be like, well, damn, girl, you're grown. No,
15:13
they're just online. No,
15:16
like, do not have this seven-year-old do your taxes.
15:19
It might go well sometimes until it
15:21
does not, right, But
15:24
what it ultimately then comes down to, you have
15:26
to ask yourself, is, okay,
15:28
so this is just
15:31
code that has a great capacity
15:33
to learn, right, and the way
15:35
it's taught to learn is algorithms.
15:38
Algorithms are just processes.
15:41
Like people like to use fancy words
15:43
to scare us and say, oh, that's a distant
15:45
thing. But if you have a process
15:48
for anything. That's an algorithm.
15:51
That's an algorithm. You can call it that. Especially
15:54
half of y'all, y'all women look great in here, I
15:56
know you did whatever you had to do to
15:58
get here on time. That's a very efficient
16:00
algorithm. That's a very efficient
16:03
algorithm. And all it actually comes down to is,
16:05
then how are you teaching that to
16:09
an artificial intelligence?
16:11
How are you teaching that to this robot
16:13
baby? Let's just put it that way so that it
16:15
can start to do that for you. So
16:19
to that end, when we talk about demystifying
16:21
it, and we talked about this a lot,
16:25
you have to wonder who's doing the teaching.
16:29
You have to wonder who's doing the
16:31
teaching and how are they
16:33
framing the
16:35
way that this child
16:38
should observe and
16:40
respond to the world. So
16:43
if someone is not aware of their biases, if
16:46
someone is not aware of the fact, like, can you
16:48
imagine who's
16:50
the guy who does OpenAI, I'm not trying
16:52
to start no shit though, this is going out to the world.
16:54
But let's just, let's not use him specifically, because,
16:56
you know. But can you imagine,
16:59
can you imagine an Elon Musk? Can
17:04
you imagine Elon Musk
17:06
trying to teach AI how
17:09
to help me do my hair. I
17:14
could imagine he thinks he
17:16
can do it. He thinks, of course,
17:18
no problem. And that's a that's a
17:20
fun example, right.
17:22
So when we talk about
17:24
demystifying AI, it's it's
17:27
really saying take the blame away
17:29
from the AI and start focusing
17:31
on the people who are training these models
17:34
right, and start focusing on whether
17:36
they are doing it intentionally or unintentionally,
17:39
if they're actually considering the vast
17:41
globe of people and
17:43
all of their problems. Because all AI
17:46
is really is a tool. It's
17:48
a tool to help us
17:50
do more. To help people do more, you
17:53
need people to train this AI
17:55
and trust, despite all the
17:57
things that you're hearing about AI taking jobs,
18:01
thousands of jobs will be created because
18:03
of what this AI is doing. The people
18:06
who used to drive the carriages, when they saw
18:08
the cars, they were like, I don't
18:10
know, it's getting pretty scary out here. That
18:13
car don't even got no horses. They're
18:15
like, this is just wild, this is crazy.
18:17
I don't trust this. Okay, Sure they
18:20
found new jobs. So there.
18:22
I feel like when people start to say things like, oh,
18:24
well be afraid of it. Oh it's
18:26
gonna take your job. What they're really trying
18:29
to do is make you afraid of
18:31
going behind the veil and
18:34
wonder why can't I
18:36
be part of the team that's building this AI.
18:39
Why can't I be part of the crew that's raising
18:42
this baby. You know, they say
18:44
it takes a village, not just some dude who doesn't
18:46
blink in the corner. So
18:48
why are we allowing you?
18:49
I love the sassy seven-year-old.
18:52
So now let's talk about that sassy seven year old.
18:56
All of you know the landscape today, right,
18:58
you have states that are removing diversity
19:01
and inclusion. You have states that are removing
19:03
African studies. You can't say the word gay. And
19:06
you know I could go on and on, right because I watched the news
19:08
all day every day, and I don't
19:10
watch Fox, sorry, but I will
19:12
say this: what
19:15
do those practitioners
19:17
look like in the future? Now, this sassy
19:20
seven year old has never had anyone tangling
19:22
with it that did not have diversity
19:24
and inclusion. Because you have that today, right, you
19:27
have African American studies today, right, you
19:29
have gender equality, you have these things today,
19:32
but as the days go on, these things
19:34
are being removed slowly. So now
19:36
you have people that are graduating college
19:39
that want to be an AI practitioner,
19:42
but they did not learn what discrimination
19:45
was because they didn't believe in racism.
19:47
Right.
19:48
I had someone tell me yesterday when I was eating.
19:50
She was like, my best friend said that there was no
19:52
racism.
19:53
Do I want that
19:55
person tangling
19:57
with my sassy seven-year-old?
20:00
And how does that look?
20:01
I think you know, you know, you know the answer, all right.
20:05
I want to before I respond, I want
20:07
to do a quick show of hands. How many of you are here
20:10
because you're considering how
20:12
to be more involved
20:14
in the AI industry? Okay,
20:17
okay? How many of you are
20:20
here because everyone's been talking about AI
20:23
and you're like, what what is
20:25
this? And you're just trying to understand what
20:27
it actually is? Oh
20:30
okay, okay, be proud, put your hand
20:32
up. Okay.
20:33
That's right. Yeah.
20:35
How many of you in some way
20:37
already work with AI
20:40
or related to AI? Okay,
20:43
okay. And so your
20:45
concern then is really that you
20:47
feel like there's a lot of things happening around
20:49
you that you don't
20:52
understand or don't like or can't control.
20:56
Okay, so this is
20:58
what you're getting at. Indeed,
21:02
we have to be concerned in
21:04
general, and this goes beyond AI,
21:07
that we're going to start to have generations
21:10
of people who are very
21:12
much driving
21:14
our economy, driving industries,
21:17
developing technologies that
21:19
will have, from our perspective
21:21
and our opinion, a skewed view
21:24
of the world and how things work. And
21:27
they're not only going to teach
21:29
their natural children this, they're
21:32
going to teach the AI this. But
21:35
because of the rapid impact
21:37
and the rapid scale, it's
21:40
it's not just like, oh, those three
21:42
kids grew up in that racist sexist house, so now
21:44
they're racist and sexist. It's that
21:47
AI was developed by this
21:50
person who has racist and sexist biases.
21:53
And because of how impactful
21:55
AI can be, we now have an army
21:57
of racist, sexist, or at
22:00
the very least incredibly aloof, right,
22:02
like things happen
22:05
right. So my perspective
22:08
is ultimately radical
22:10
self reliance. I
22:13
can't help what's going on specifically
22:15
Macro in Florida, in Texas,
22:18
but I do know people who live there who are
22:20
saying, regardless of what they're teaching, my kids
22:22
in school. Here's what I'm going to teach you at home. And
22:25
so that's why I've been kind of recently
22:28
saying, please, please, please. The last
22:30
thing you should be is afraid of AI.
22:33
This is now, more than ever, the time where
22:35
you need to be incredibly excited about this
22:37
tool. But you need to see this as a battle
22:40
and you need to do everything you can to get your hands on
22:42
this weapon as well. I'm gonna keep
22:44
it super super real about this. So,
22:47
as I said earlier, I'm a thirty five year old woman, I'm
22:49
married, I love my husband. I'm getting
22:51
ready and preparing myself to freeze my eggs.
22:55
And
22:57
I'm gonna keep it real. Part of me is doing that because
23:00
I learned that the maternal
23:02
mortality rate has gotten worse. It's
23:06
twenty twenty three. It is
23:08
twenty twenty three in a developed country,
23:12
and you're telling me that over the last couple of years.
23:14
And it's not just because of COVID. If
23:17
I get pregnant, I have twelve patents
23:19
and patents pending. I'm building all these different
23:21
things, and the thing that scares me the most
23:24
is having a baby and dying. But
23:28
the people are developing AI to
23:30
do the wildest,
23:33
most random shit possible. But
23:36
women are dying when
23:39
when they get pregnant, because
23:42
not enough women and definitely not
23:44
enough Black women are sitting there saying, how
23:46
can I use AI as a tool to keep
23:48
more of us alive? And
23:51
the only way that's gonna change is if more
23:53
of us say we're not afraid of AI. Regardless
23:56
you know what, y'all are gonna do what you want to do with it, But here's
23:58
what I'm gonna do with it. Here's how I'm
24:01
going to teach this child at home, regardless
24:03
of what you're doing. And so that to me is
24:05
the only answer. It's it is not too
24:07
late for us all to
24:09
recognize that this
24:12
will not be done for us. This
24:14
will not be something where we can hope that
24:16
the right few people at the top are
24:19
going to be thinking about all the things that we
24:21
need. We know this, or we would not have systemic
24:23
issues right now. We know
24:25
this. So but what I
24:28
now view, though, is that I believe the technology
24:30
is one of the best equalizers, one of the
24:32
best democratizing tools,
24:36
and that to me is exciting.
24:39
That to me is an opportunity. So let's stop
24:41
talking about being afraid. Let's stop
24:43
talking about it as a black box. There
24:45
are several low-code and no-code
24:48
tools that you can use to
24:50
create something in AI if
24:52
you want to. How do we get
24:54
people to see this as a playground
24:57
versus, I don't know, a
24:59
mortuary? Right, right.
25:05
I'm also literally not kidding, my,
25:08
I'm sorry, my dad literally two days ago
25:10
was like, oh, my grandkids are in the freezer.
25:12
I don't know when they're getting out. And
25:15
I was like, actually, Dad, Tiana,
25:18
my older sister, Tiana's, Tiana's
25:20
uh, grandkids are in the freezer. Mine are about
25:22
to be in the freezer. Just want to confirm.
25:24
He's like, oh, when are they coming out of the freezer?
25:27
I'm like, this is what happens when you talk
25:29
to your mom. Your mom talks to your dad, your dad talks to
25:31
you. You don't know what's going on. So but
25:33
uh, it's real, and it's a,
25:35
it's a real thing.
25:37
We have.
25:39
We have a very small group of people right
25:41
now who are focusing AI on
25:43
their problems, and I do
25:45
not blame them. Entrepreneurship
25:47
is problem solving without regard for resources, like
25:50
science is the study of life.
25:53
Like these things should not be scary big
25:55
words. Uh. But when
25:57
we silo ourselves and I say
26:00
we, it's just like, if you're,
26:02
if you're not affluent, if you are
26:04
not a man, there's so many things
26:06
that actually most of us are
26:09
not that kind of, that paradigm
26:11
of the person who's doing this, most of
26:13
us. But when you kind
26:15
of like push something away,
26:19
you are disenfranchising
26:23
yourself in so many ways. It goes
26:25
beyond, it goes beyond anything
26:27
we can imagine. The thing that I'm most scared
26:29
of is the
26:32
number of people who keep
26:34
saying I'm afraid of what
26:36
could literally be the best thing they've
26:38
ever put their hands on in their entire
26:41
lives.
26:43
All right, with that, we
26:52
are privileged to have ten more minutes
26:54
with them, to have some Q and A. So I'm sure
26:57
there's some questions in the audience. Ooh, we got one already.
26:59
I'm gonna come to you. Y'all
27:02
give another round of applause for that, man, that was
27:04
fantastic. Please
27:08
say your name.
27:10
Hi, my name is Sydney. Thank
27:13
you. Oh,
27:15
I got it?
27:16
No, okay, all right, all right. Anywho,
27:20
So, you know, I heard us talking about
27:22
like fear. I personally don't have fear
27:25
of AI. Maybe I should, I don't know, but
27:27
that's not what you're saying. So I shouldn't be afraid. But however,
27:29
how do we harness that you were sharing some ideas
27:32
of there's some low or no code ways
27:34
of leveraging AI. Can you tell us
27:36
more about like how to like leverage it? And
27:39
yes, no, of course that's a really good question.
27:41
I actually think I'm gonna go ahead and
27:43
maybe I can talk to the AfroTech people.
27:46
I'm gonna just list, like, on
27:48
my LinkedIn, just like seven
27:51
platforms. Some require you to know
27:53
a little bit, some of them don't.
27:55
Now there is the underlying issue
27:58
of kind of who created even that no
28:01
code platform.
28:02
But at the end of the day, like
28:04
you know, nothing's ever going to be perfect, and we just
28:07
want people to get closer to something. And
28:09
if your engagement, even
28:12
with these no code platforms can
28:15
better educate the sassy
28:19
seven year olds that are running it, So now all
28:21
of a sudden they're not just kind
28:23
of operating with whatever the hell they're being told
28:26
by the very specific groups of people who are doing
28:28
this. It's a good thing, and so there
28:30
are several I don't want this necessarily
28:32
to be an advertisement for any one or the other.
28:34
But on
28:36
Monday, I'm gonna post If you go to
28:39
my LinkedIn, I'm just Jessica
28:41
O. Matthews, you'll see it. I will
28:43
post five to six that
28:45
I've heard some good things about. Because again
28:47
I want to be very clear, I
28:49
studied psychology and economics. I
28:52
like to tell people I have a PhD in
28:54
Google, which pisses off people with
28:56
real PhDs, I find, But
28:59
the main point is that you know, I also have
29:01
a granted patent for Wireless Mesh Energy
29:04
Networks, which is an algorithm
29:07
that essentially considers
29:10
the communication protocols for decentralized
29:14
micro energy systems. And
29:17
I did that with a degree
29:19
in psychology and economics and
29:21
a PhD in Google. So what I
29:24
actually am really trying to say is that you
29:27
don't have to go to school
29:31
for this. To do this, you
29:34
do have to have quite a ferocity for
29:36
self learning, and again
29:39
I hate to say, a bit of self-reliance in
29:41
this, but with
29:44
the right tools and
29:46
with that kind of interest in researching
29:48
as much as you can you'd be surprised
29:50
what you can do, especially if
29:52
you're comfortable with the prototype being
29:56
very much only a couple percentage points of
29:58
what you actually want. We talked about
30:00
the Soccket earlier. My first prototype
30:02
for the Soccket was a shake-to-charge
30:05
flashlight in a hamster ball. So
30:09
I will post that on what's today
30:11
Thursday. Got to get back to New York. I'll post
30:13
it on Monday. I promise I will.
30:16
Perfect question over here and I'll come over
30:18
there. Yeah, my name is Evan Poncels.
30:20
I'm with the Africatown Community Land Trust, and I just want to
30:22
let you know that Black people are here in Seattle
30:25
and mostly concentrated in the Central District
30:27
of Seattle. So let's all learn a little geography about
30:29
this. So from the Central District
30:32
of.
30:32
Seattle, Ray Charles dropped his first studio album,
30:34
so it's not just Kurt Cobain. Also Jimi
30:36
Hendrix is from there. Shout out to my uncle, high
30:39
school teacher at Garfield High School, if anybody's from
30:41
Garfield. And so what I wanted to
30:43
say was just that, you know, in addition to radical
30:45
self reliance, we also should be
30:48
organizing around data and around
30:50
artificial intelligence. So one thing we're
30:52
doing with Africatown is building programs
30:55
so that you can be exposed to these sorts of things. But
30:57
I was wondering, My question really is where
31:00
can we get exposure to the data sets that
31:02
could help us solve with AI things
31:05
like infant mortality or pregnant
31:07
mother mortality, mortality in the birthing
31:09
scenarios. So that because we
31:11
work with universities that are studying, you know,
31:13
data and like things like coming
31:16
up with the language models for African American vernacular
31:18
English and things like this. And so right
31:21
now we're about to start a consortium where we're
31:23
learning, well, what goals should we have, what problems
31:25
should we solve, and what strategies should we implement?
31:28
And so I'm just trying to see where's our best footing
31:30
for that in terms of organizing?
31:32
I think I have, I can tell you where my company,
31:34
like when we really started looking at disadvantaged
31:38
communities that are black and Latino majority
31:40
communities, and how do we get the data
31:43
to ensure that we're thinking through the
31:45
equitability of what's happening in this once in a
31:47
generation moment with our infrastructure, and
31:49
how we started creating actual
31:53
AI that could support that. But
31:56
I'd love to know what you think.
31:59
First, I can share our perspective, but as
32:02
someone who works with the government, you might know a
32:04
few more.
32:05
Okay, so
32:08
you know the government, you know that's
32:11
a beast by itself, right, we
32:14
do have ways of putting
32:16
out where our data
32:18
is stored, so I will say that. Okay, So
32:20
what my office does. I'm the Artificial Intelligence
32:22
and Technology Office, right, and what
32:24
we do is every year we do an AI
32:27
use case inventory. And actually
32:29
we're sitting here, but that's what's going on back at
32:31
home: all the labs are putting
32:33
together an AI use case inventory
32:35
that they'll turn in to us in
32:37
mid April. Once that's turned in to
32:40
us, we will put up whatever portion of that inventory
32:43
is releasable to the public. So
32:45
let me say that because we're seventeen national laboratories,
32:48
so that way you know, everything's not releasable
32:50
to the public. But what is releasable
32:52
to the public will go up on our website,
32:54
right, it's Artificial Intelligence and Technology Office.
32:57
You will see the inventory there. If
32:59
they are listing the code, it
33:01
will be there so you could actually read what
33:04
the name of the use case is you'll be able
33:06
to see a description of that use case if
33:08
it matches anything that you're trying to do or
33:10
looking to do, and it says where the
33:12
code is. That's where the code is at.
33:15
Right.
33:15
If it's blank and you want us
33:18
to find out, that's Jason Tally. You
33:20
want us to find out if
33:23
that code is available, just send us an
33:25
email. Our email address is on the website.
33:27
You send it to us and we'll get it for you.
33:29
If it's available, we'll get it for you. So that's from
33:31
the government perspective.
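None of the specifics below come from the episode, but the released inventories and public data sets being described here can be pulled into ordinary analysis tooling once you've located them. As a minimal sketch, with entirely hypothetical counties and numbers, here is what computing an infant mortality rate from a small public-health-style table might look like in pandas:

```python
import io
import pandas as pd

# Hypothetical extract: a public-health file listing live births and infant
# deaths by county. Real released data sets would be larger and messier.
raw = io.StringIO("""county,births,infant_deaths
King,21000,84
Pierce,9800,59
Spokane,5600,31
""")

df = pd.read_csv(raw)

# Infant mortality rate is conventionally reported per 1,000 live births.
df["imr_per_1000"] = df["infant_deaths"] / df["births"] * 1000

# Sort so the counties with the highest rates surface first.
df = df.sort_values("imr_per_1000", ascending=False).reset_index(drop=True)
print(df)
```

The same pattern scales to a real released file: swap the in-memory string for the data set's actual path or URL, and join in demographic columns before sorting to see which communities the highest rates fall on.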
33:32
Which matters, right, because I think for
33:34
us we're often
33:37
looking at our data sources, one being the
33:39
actual governmental context. Like a lot of times people
33:41
don't realize that almost
33:43
everything is available to you. You just have to ask for it.
33:46
They're not going to make it clean and easy or create
33:49
an interface that makes it as simple as
33:51
you know, downloading photos from
33:53
whatever app you're using for that,
33:56
but you can reach out. The other thing
33:58
that's been interesting for us over the last two
34:01
years that, to be honest, was a
34:03
bit surprising, was connecting
34:05
and partnering with journalists. Journalists
34:08
are surprisingly good
34:11
at getting real hard data
34:14
in the aggregate. For example,
34:17
it was The New York Times that
34:19
went and actually published the
34:22
infant mortality rates, with incredible support
34:24
when you actually go in and look at what
34:26
they published and what studies
34:29
they were pulling from. And
34:32
I think some of you may have seen that article
34:35
and so, and that's happened before
34:36
we actually struggled and looking
34:38
at a lot of the things related
34:41
to Justice40 and disadvantaged communities
34:43
and this idea that forty percent of the
34:46
infrastructure funds are actually meant
34:48
to go to disadvantaged communities
34:50
across the United States. But we struggled
34:53
to understand how many of those disadvantaged
34:55
communities were majority
34:58
minority, right, because you
35:00
can work some things out there, and that wasn't actually
35:02
available through any government sources. And
35:04
so there was actually a journalist that had
35:06
been doing the work for about
35:09
five years that allowed
35:11
us to actually see every city, township,
35:14
and village that was black
35:16
majority Latino majority
35:18
and black Latino majority. And
35:21
the data set was so massive and
35:24
again readily available. And so I
35:26
think that because if you
35:28
find truly reputable news
35:31
organizations that are pushing data because
35:34
they are often fearful
35:36
of publishing a massive story
35:38
that isn't backed up. They've
35:41
done their homework and you can dig there
35:43
and get their data sets, and when you combine those
35:45
with government data sets, you can
35:47
do some things that are very very cool. You
35:50
know. Hi everyone, my name
35:52
is Asia.
35:53
First, I want to thank you for being
35:55
so honest and real and challenging all
35:57
of us in this room to do more with data.
35:59
And I didn't have a question, but I just
36:01
wanted to tell you, like, as a black woman,
36:04
seeing you dominate this space is just
36:06
empowering.
36:07
Wow. Thank you.
36:10
Would be the last one, regular, last
36:12
one.
36:14
Hi.
36:14
My name is Erica Adams. I'm a grad student
36:16
at UDub and I am in
36:18
the Information School and I sit on
36:20
faculty committee, and we've been talking
36:22
a lot about student use with chat GPT. We're
36:25
already using it most of us, but there's
36:27
a lot of ambiguity around,
36:31
I guess, like cheating and stuff like that. So
36:33
I'm just curious if you have any advice
36:35
on persuading older academics
36:39
on you know, like coming up with guidelines
36:41
for use, because I think it's a great tool for
36:43
us to continue to use and we shouldn't
36:46
be using it with fear.
36:51
See I don't even know when you say older, do you mean people like
36:53
my age,
36:58
because yeah, you're real quick
37:00
and all of a sudden, you like when you go out and you're like, I'm
37:03
not the youngest person out here no more. Right, So,
37:08
persuading that's
37:11
that's a that's a tricky one. That is a
37:13
tricky one because there
37:16
has to be empathy for how
37:18
long they've existed
37:21
and known certain things to be true that are now becoming
37:23
very much untrue. Uh,
37:26
and I think
37:28
starting from that place of empathy is one. So
37:32
I think there's a couple of ways to see this
37:34
if I'm going to be very just kind of direct
37:37
about it. One is, you
37:39
know, I don't know what guidelines are in place
37:41
right now. But obviously
37:44
if everyone goes to chat GPT and
37:46
says, write me a paper on the World
37:48
War, and everyone
37:51
submits a similar paper, the
37:53
teacher will say, oh, clearly there's some sort of plagiarism,
37:56
right because like whether you did this through
37:58
something that you google or you use
38:00
chat GPT, they can tell. If
38:04
you go to the effort of engaging with
38:06
that chat GPT interface
38:08
such that, from what you produce, your
38:11
professor cannot tell. At
38:16
this point I don't really know what else
38:18
to tell you. No, I mean, I'm not even, I'm not. And it's
38:20
not about saying is this cheating or is this not cheating?
38:22
This is about being realistic about the
38:24
world that we're in. Like if everyone,
38:27
if you are lazy with this tool, you
38:29
will be found out to be lazy.
38:32
If you are innovative
38:34
and proactive with this tool,
38:38
you will still rise above. I truly
38:40
believe that there's always still a way to rise
38:42
above and still write
38:44
the best paper with chat GPT compared
38:47
to everyone else. And to be
38:49
honest, if if college
38:51
is meant to prepare you for the real world, acting
38:55
like these tools don't exist when
38:58
they do. And I get
39:00
it that people, Oh, we want to make sure you can write a paper.
39:02
We want to make sure you can do all those things. Yes, yes,
39:05
guess what. I also still don't know how to drive
39:07
stick because I
39:09
didn't have to. So you
39:12
know, we can lament about this world
39:14
of like, oh, we hope people wish they should.
39:16
We want to make sure you still have all these different things.
39:19
Those who care about those skills will get
39:21
them. I believe my husband
39:23
said he could drive stick, but then recently actually I was
39:25
like, were you were you lying? Because this is
39:27
not. I don't think it's supposed
38:29
to make those noises. You
39:32
know, he's from Mississippi, so he's a
39:34
you know, so he was telling me a whole story about Mississippi
39:36
and Texas. So who knows, but
39:39
to that end, right, like I think, but
39:41
he clearly felt that it was important that he said
39:44
that he could drive stick. I was like, boy, I could barely
39:46
do automatic at the time, honestly, like I cannot
39:48
wait for driverless cars. So
39:50
it's one
39:53
of those things where I would say, so, I don't think it's about
39:56
persuasion. I think it's about recognizing
39:59
that all the
40:01
standards will shift, that
40:04
you cannot rest on your laurels here, like
40:06
everyone keeps saying, I use chat GPT to create
40:09
a marketing plan to do this and do that. We
40:12
will see certain similarities
40:14
that will negate that work if
40:16
you do not still put your human intellect
40:18
on top of it. We're not there again,
40:21
sassy seven year old y'all, we're not
40:23
there, and don't let anyone make you think
40:26
that we are. But
40:29
yeah, so it really it sounds to me that like,
40:31
if your professors are like super not into it,
40:34
you might be able to save yourself some time and just do
40:37
what you gotta do and be like, with the chat GPT? No,
40:41
I would never.
40:53
Black Tech Green Money is a production of Blavity,
40:55
Afrotech, Black Effect Podcast
40:58
Network, and iHeart Media, and it's
41:00
produced by Morgan DeBaun and me, Will
41:02
Lucas, with additional production support
41:04
by Sarah Ergin and Rose McLucas. Special
41:08
thank you to Michael Davis and Vanessa Serrano. Learn
41:10
more about my guests and other tech disruptors and innovators
41:13
at afrotech dot com. Enjoy
41:15
your Black Tech Green Money. Share this
41:17
with somebody, Go get
41:19
your money. Peace and love,