Episode Transcript
0:06
l and
0:21
the
0:28
been the AI year here
0:30
now . At the beginning of this year I
0:33
started off . I did
0:35
like three or four episodes that I called the AI takeover
0:37
and that was like in January and that was only after using
0:39
ChatGPT . I remember talking to you about this
0:41
and it was so incredible to
0:44
me at that time , like I was like this is
0:46
incredible , it's gonna change the world and all that stuff . And
0:48
fast forward here four
0:51
or five months roughly
0:53
six at the time of recording . Actually , in
0:56
AI , I'm seeing a lot of AI stuff . There's
0:58
been a lot of AI companies . There's a lot of AI
1:01
. There's people who feel like it's gonna take over the
1:03
world . There's people who feel like it's gonna try to take over humanity
1:05
. There's people who feel like it's gonna help humanity
1:07
And I'm seeing a lot more AI
1:09
stuff . And one of the things that's interesting to me is
1:12
how we feel like AI is gonna affect an aging
1:15
group of people . Now we know
1:17
that when we or at least I should say
1:19
that from even having a computer store or
1:21
just living through the
1:23
internet and the advent of technology
1:25
becoming so much more important in our lives via
1:28
smartphones , whatever it may be that
1:30
you had slower adoption there for
1:33
a lot of people . There's a lot of people who kinda get stuck in their
1:35
ways . And , if you think about
1:37
it , if you're talking to a person
1:39
that's lived a vast majority of their life without
1:41
something , whether it is AI
1:43
, which all of us have lived the vast majority of our
1:45
lives without because it's so
1:48
new , at least in the
1:50
way that it is now , cause there's been Alexa and
1:52
Siri and all that stuff . That's all still AI
1:55
, so we've had that in our lives for a while , but
1:57
AI in its generative form and newest
1:59
form is very new for a lot of people . And
2:03
I wonder about the adoption here
2:05
. Now , you personally , how
2:07
quick are you to adopt new technology
2:09
And we've talked about this with the iPhone for you , Laura , and
2:14
you fought the good fight with that last iPhone that you
2:16
had . How
2:18
quick are you to adopt new
2:20
technology and stuff yourself , Laura ?
2:25
So , believe it or not , Reggie
2:27
, I really am quick to adopt new
2:30
technology , right ? I
2:32
think I'm just old enough to where I question
2:35
things and I'm not gonna go blindly into
2:37
things . With the phone
2:39
situation , I just felt like Apple
2:42
was forcing me to
2:45
upgrade . And so that was my , my
2:47
hesitance . And yeah
2:51
, it really was my protest . Like
2:54
why can't my phone
2:56
work at this number
2:58
as opposed to that number ? I
3:00
think that was what I was grappling
3:02
with , but they won
3:05
ultimately . And
3:08
I gave in . But
3:11
when it comes to AI , I'm excited
3:13
. There's
3:17
an anxiousness not necessarily
3:20
anxious , and
3:22
just kind of anxious to see the
3:25
hype . Like , if it's gonna live up to
3:27
the hype , and if it does , it
3:29
could prove to be as amazing
3:32
as it could
3:35
be detrimental . So
3:38
, like anything , right .
3:40
Correct , correct , and I
3:43
personally I'm a tech guy so
3:45
I love it . I'm always first adopter
3:47
. I'm running in head first
3:50
to see what's what
3:52
may be on the horizon for people , you know , maybe
3:55
on the horizon for humanity , and that's where
3:57
I've been pretty much all my life . So I'm
3:59
pretty big early adopter when it comes to
4:01
a lot of technology . But at the same
4:03
time I do understand
4:06
why some people would not
4:08
be , especially if you're more on
4:10
the successful side . Right , if you've had success
4:12
and you have
4:15
what you want , you live the life you want , you
4:17
have the house you want
4:19
, you got the car you want , you got the things in your
4:21
life that you feel are important are all in order for
4:24
you . I could see more hesitancy
4:26
to change anything and more . You know the tech's
4:28
just part of it . But if you're thinking
4:30
, life's been pretty good for me
4:32
up to this point , why in the world would I move over
4:35
to something else ? You know those
4:37
are your typewriter people , days going over to computers
4:39
. And that was your people who were like oh , why would I send
4:41
an email ? I've been sending mail for years . You
4:43
know those kind of folks who really felt there was
4:45
a lot of those people out there too , for people who weren't
4:47
around when the internet was kind of coming up . There
4:50
were a lot of people even with the phones when
4:52
the Blackberry was around and
4:54
iPhone got introduced and people were like that thing doesn't
4:56
even have a keyboard , it'll never last . You
4:59
know , you had those folks out there , right
5:01
, yeah , yeah .
5:03
Right . So , would you say
5:06
the people around you ?
5:07
Do you see them adopting stuff pretty quickly
5:09
? What has that been like for your experience ? or
5:12
maybe people that you've seen
5:14
, maybe in your job friends , family
5:16
, stuff like that .
5:20
So my friends are not so much talking
5:25
about AI . Like , we're all , you
5:27
know , in our fifties . We're
5:29
talking about , our conversations are
5:31
very different than technology
5:34
. A
5:36
lot of , you know , they run
5:38
the gamut . Most are in education
5:40
, so I think
5:42
but they're in , you know , they're principals
5:45
, assistant principals , and they're really looking to retire
5:47
. I think we all know that
5:52
it's coming . I don't think anybody is like
5:54
scared or anything , even though I've
5:56
heard some things . You know
5:58
that . You know , like I've seen
6:00
, I've actually seen some
6:04
ads where people were talking and
6:06
it looked like them , sounded like them
6:09
. You know , for all intents and purposes
6:11
, if you didn't know any better , you would think it was
6:13
them and I think that can be dangerous
6:15
. But like anything , you
6:18
know , I think we have to , you
6:20
know , we have to be discerning . I've seen some beautiful pictures
6:22
and they haven't
6:24
quite mastered
6:26
people
6:28
yet . I see , but
6:31
I can see , I
6:33
can see where
6:35
, if they did master it , it would
6:37
definitely , like , what ? There's no need
6:39
for a model . They can make beautiful
6:41
people , you
6:47
really can . And I think the
6:49
scary thing is , though , you know , having someone
6:52
you know like the president or dignitaries
6:54
, and you're having them say
6:57
some things that are not
6:59
real , and you know that could cause , yeah
7:03
, some problems . But aside from that , i think I
7:05
think it's really . It's
7:08
amazing . Actually , it really is amazing
7:10
, and I think , the more that we
7:12
explore what it , you know the possibilities
7:15
. Especially , you know we've had this you
7:17
and I talked about . I've talked about this often
7:20
You
7:22
know the workforce and
7:24
You know when you're dealing with
7:27
people , you're dealing with personalities
7:29
and you're dealing with you know the ups and downs of the
7:31
market right and
7:33
then If you
7:35
can have some artificial intelligence
7:38
, come in here and do something , you don't
7:40
have to deal with , you know , people saying
7:42
, well , you know , I'm not
7:44
gonna do that , i'm just not gonna do it and
7:47
you're at their mercy . So
7:49
I it has the potential
7:52
of being amazing
7:54
. Yeah
7:57
, but a lot of people my age , I think until
7:59
we have to like that's not
8:01
a conversation that we have . Hey
8:04
, what do you think about this new AI ? We're
8:07
trying to retire , be
8:10
somewhere , or , you know kind
8:12
of worried about some of the political
8:14
, you know uneasiness
8:16
that that is happening right now
8:18
.
8:20
And that's understandable , because I think that another
8:22
thing to that too and that's not just age
8:24
or person specific . Not everybody's just interested , you
8:26
know , and I feel like if I
8:28
could , I may
8:30
see a lot of it and I might
8:33
think that maybe 20
8:35
, 30 , 40 percent of people really , really
8:37
care , but maybe that number is too high , like I don't
8:40
know that . Even my
8:42
conversations , a lot of them aren't about it , unless
8:44
I'm talking to somebody who's really into it . Laura
8:46
, you know , it's not like I'm having
8:48
AI conversation with random people
8:50
throughout the day , and so I can kind of definitely
8:52
see that , because I think a lot of things , Laura
8:55
. Until they affect your actual life , you
8:57
don't really care about them as much . Yeah
9:00
, you're right , like , until it has an effect
9:02
on me . Yeah , personally I
9:04
don't really care . I'm
9:06
kind of trying to handle business in other areas , other
9:09
facets of my life . Now we
9:11
brought up the working situation . I wonder
9:13
how that does that ? does that affect an aging
9:15
workforce more than it affects like a younger workforce
9:18
? when it comes to people kind of getting out
9:20
like of not wanting
9:22
to deal with other
9:24
, are they human beings ? and when
9:27
you know AI , it's gonna work 24
9:29
hours , seven days a week , no complaints , no
9:31
days off , no vacation . You know
9:33
what I mean ? No , no talk back . And even from a
9:35
from an older standpoint , it
9:37
might not be the person that's actually
9:40
getting fired . It might be the person doing
9:42
the firing because of AI , like You
9:45
know it could fall both ways , right
9:47
could fall both ways in a sense of like . Well , if
9:50
I have an aging demographic , because you
9:52
know the baby
9:54
boomer demographic right there , right , that's an aging
9:57
demographic and that's gonna be massive . I'm not sure
9:59
if all of them have left the workforce , but
10:01
it's a good amount of them that
10:03
are about to if they have not right
10:05
, and that's massive for
10:09
just the amount of jobs
10:11
that are kind of coming out of the workforce as people look to kind
10:13
of go sit on a beach somewhere or chill and
10:15
kind of hang out . Then
10:17
I think maybe you have a situation there where
10:20
you're not dealing with as much , where
10:23
you can bring on AI then
10:25
and kind of blunt the impact of that . You think
10:27
that's a possibility ? It
10:31
comes on soon enough anyway .
10:32
I think , yeah
10:35
, i think so , and I think the more that
10:38
we explore , like
10:41
all of the possibilities , the more people will
10:43
be talking about it , how
10:45
it's going to affect them . I think with
10:47
the older generations , just like every older
10:50
generation , you know you
10:52
look at things differently . You may
10:54
not want your food served to you
10:56
by , you know , automation . Yeah
10:58
you know . But you have a
11:00
younger generation that is so accustomed
11:02
to that That
11:04
they're going to , you know , you have little bitty
11:06
kids now and they know what to do with phones
11:09
Because that's
11:11
what they were brought up on , like they , they know
11:13
automatically , right . But um
11:16
, an older generation
11:18
still is gonna want that human
11:20
touch , that human interaction . But
11:23
eventually those people will , will
11:25
die off and you have this
11:27
whole generation that is that is , you
11:30
know , very accustomed
11:33
to robots
11:35
and automation and no
11:37
one human doing different things , whether
11:39
it's banking or medicine or grocery
11:42
stores , or you
11:44
know what I'm saying ? People , you know , we
11:47
fall in line with things . We kick
11:49
and scream , but eventually
11:51
you learn to adapt to
11:53
whatever is out there and whatever
11:56
you know is present
11:58
.
11:59
Does that make sense ? Yeah , How has that been for you
12:01
? How was that ? Because
12:04
you said that you're a person that's not necessarily averse
12:07
to technology . Because , you know , some people completely
12:09
have more of a hostile take
12:12
on it and technology in general , like
12:14
machines are taking over the world and stuff like that
12:16
. you know How
12:19
has that been for you Just having ? you've always been
12:21
like , just I'm just going to roll with the punches and whatever
12:23
comes , kind of comes .
12:27
Yeah , pretty much . So I mean , you
12:29
know , I can remember
12:31
them making us wear seat belts . I can remember
12:33
a time when you did not have to wear a seat belt And
12:36
I remember when they
12:38
said you're going to have to wear a seat belt , and I remember
12:40
specifically and I'm embarrassed to say this
12:42
I remember saying it's gonna wrinkle
12:44
my clothes .
12:45
Yeah , absolutely .
12:47
I mean , forget the safety . I can't show up like
12:49
this , right
12:51
, right , if I have
12:53
a wreck . You know , imagine that . But
12:55
those , but you do it , you know and then
12:57
you know . Of course now we know
12:59
why , but you think about Walmart
13:02
and how , you know , little
13:04
by little , you saw fewer cashiers
13:06
. Yeah man , and now the self-checkout
13:09
has hit . Like , the
13:11
area is huge now in Walmart
13:13
And no one seems to be there
13:15
. Every now and then , you
13:18
know , you'll see somebody
13:20
that just has to go through the line . Like me
13:22
I don't know .
13:23
Have you been ? When's the last time you've been through the line at Walmart
13:25
yourself ? Like had to go to a person .
13:28
I went . I went the other day . We
13:31
had to get some things for my
13:34
job And
13:36
so I had to go through the line , because we're
13:38
a nonprofit and I use a
13:40
nonprofit card , so we
13:43
don't pay taxes . So I had
13:45
to go through the line And
13:48
it was different . I was waiting
13:51
, yes , and it's a nightmare
13:53
.
13:54
It's a nightmare now . I'm sorry , I
13:57
would never . I would never
13:59
go . I would never go to Walmart if I
14:01
had to go to the line every time . Remember old school
14:03
Walmart when people hated going in there for a book because
14:05
of that reason , like the lines being so
14:07
long that you
14:09
avoided it all together And that
14:11
self-checkout has changed my experience . I want to
14:13
. I mean , I personally tell you that .
14:17
It's changed everyone . It's changed everyone
14:19
. And imagine so you've got six
14:22
self-checkouts I'm thinking of the grocery store
14:24
and you've got now you just pay
14:26
one person to watch everyone
14:28
, instead of paying six people to check people
14:30
out .
14:31
Yeah .
14:32
So imagine for a grocery store like that's heaven
14:34
.
14:35
For sure , for sure .
14:38
Win , win all the way around . Win , win
14:41
, weight loss . And I think about my industry
14:43
, like
14:46
if it will affect my
14:48
industry . I work with people
14:50
, you know , people with disabilities , as
14:52
a matter of fact , and so I
14:54
think the human aspect
14:56
is very important , especially
14:59
for people with disabilities , like they need to feel
15:01
that touch and see a smile
15:03
and things like that
15:05
. And so I wonder , though I
15:07
mean definitely , you
15:09
know , one of the things that we do
15:11
is give meds
15:14
, and it's not always an easy
15:16
thing for somebody who's making $10 , $11
15:18
an hour . They forget or they just throw out meds
15:21
, which is extremely
15:23
huge . Right , I
15:25
mean , that's a life or death situation . A
15:28
machine is not going to make
15:30
as many mistakes , and
15:33
I would take a machine any day to give meds
15:36
to my clients , as opposed
15:38
to a person who's distracted
15:40
or who's not in a good mood or
15:42
who is forgetful
15:45
or whatever
15:47
you know on medicine . So
15:53
I'm really looking forward to
15:56
seeing what it's going to
15:58
be like for professional
16:02
, you know , for businesses
16:05
, for people
16:07
personally , for everything
16:09
I'm just really , I'm
16:12
really . I hope I
16:14
get to see , and I'm pretty sure that the way things are
16:17
moving , that it's not going to take long for
16:20
me to see . God willing , I'll be here . It's not going to take
16:22
me long to see just what
16:24
can be done with AI
16:26
.
16:28
I'm thinking here in the next year
16:30
, the next two years , because you can think
16:32
about how much more it's done in
16:35
less than a year . So , for
16:37
me , I'm just going off ChatGPT , and
16:39
even the different iterations of ChatGPT
16:41
from each other are way different or way better than
16:43
they are from each other . So if you think about all
16:45
the learning that it's doing on
16:48
every single day because of just people asking
16:50
it questions or using it on a daily
16:52
basis , making it smarter , then
16:55
you have a situation that it's going to be so different
16:57
year to year , Laura , like it's going to be so
17:00
much better a year from
17:02
now and then it's going to be so much better a year
17:04
from that , that
17:07
I think the possibilities are endless . It's definitely
17:09
one of the largest things . I keep harping on it . I
17:11
think I've done at least 10 , 15 shows on
17:14
it just this year because
17:16
it's so amazing . I don't feel like a lot of
17:18
times , Laura , that we come across technology
17:21
like this that is going
17:24
to change how people do everything . And people
17:26
are using it to write their dating profiles and they're
17:29
getting a lot more pull than they were when they were
17:31
writing their own . So that's some
17:33
of the stuff I'm seeing . Yeah
17:37
.
17:37
but then what happens when you write this amazing
17:39
dating profile ? and then someone's dating you
17:41
and they're like what happened to this witty person
17:43
? What happened to this intelligent person that I thought
17:45
I was going to get , or this you know . You
17:49
know what I'm saying .
17:53
At some point you have to show up , at some point I'm
17:55
going to crack up with you .
17:56
Yeah , yeah , yeah
17:59
, you do . Yeah , you
18:02
know what ?
18:03
You think I do , Laura . But you
18:05
know , what's funny is , as you were saying , that I feel
18:07
like that's what's happening to people . Anyway , people
18:09
are already feeling like that , whether it's you
18:11
taking creative angles with your photos
18:14
, you saying , hey , you lying about your height , I'm throwing
18:16
a couple of extra inches on
18:18
there , making sure she hits me back . And then I got to
18:20
show up to the date and hope she loves me . There's
18:23
a lot of that stuff that was already going on to
18:26
a certain degree , even without AI
18:29
, that I feel like it'll be perpetuated or
18:31
be much more of the same . You know what I mean
18:33
. It'll be the same . Or you know it
18:35
even because we're talking about the dating aspect . Right , let's
18:37
talk about the resume aspect , because a lot of people are using it to write their
18:39
resumes . Same situation . You're
18:41
a rock star on your resume , right ? You're
18:42
a rock star in that interview .
18:44
You get the person , the person that's working for you , and they're
18:46
just average or maybe
18:48
a little below in some areas
18:51
, and leave me wanting that same person
18:53
that you interviewed or that same resume that you read
18:55
. That was so amazing . So
18:59
I do feel like that's going to probably perpetuate
19:01
more of that stuff . You know what I mean ? It's
19:03
definitely . I don't think any of that stuff gets made better by
19:06
AI . I think definitely AI is going
19:08
to make you sound like more of a rock star . It's
19:11
just imagine . And then the person still does have to
19:13
show up and actually do the job or
19:15
actually show up at the date and actually
19:17
be that person . You know what I
19:20
mean , but I do think it can help . If
19:23
you're just not good at writing profiles , if you're just
19:25
not good at writing resumes
19:27
and stuff like that , maybe it can help
19:29
you in that way , but I do think
19:31
that you're going to have to actually have the experience
19:34
. Now , what do you think about the ? because
19:36
I think the biggest negative , in my
19:38
opinion , is just going to be the repercussion for
19:40
jobs . I think it's the biggest
19:42
negative that's going to come with that . If you don't
19:44
have a job that is a
19:47
that calls for you to actually be there , like
19:49
, let's say , i can't have an AI roofer
19:51
, right , an AI plumber , an
19:54
AI person come through and like
19:56
do my drywall . You know what
19:58
I mean . I think those jobs may
20:00
be a little bit safer because you're going to need a
20:02
human being to come and do it . Maybe they
20:05
have AI to help run their business , but still
20:07
they're going to have to show up and do the actual
20:09
labor . So I think that those jobs are a lot
20:11
safer than
20:14
a lot of like . Let's say that you're a lawyer and all
20:16
you do is work on contracts all day , or you're an accountant
20:18
. I think AI can come
20:20
for your job . I think it can , because
20:23
it can probably do your job a lot
20:25
quicker than you can do it . What are
20:27
your thoughts on that ?
20:28
Yeah , thank you , I
20:31
think you're right . I think
20:34
you're right , um , and
20:36
I certainly think that AI could do a better
20:38
job than SCOTUS . So
20:40
, um , you're
20:43
going to have like nine judges
20:45
.
20:45
You just had to get that one in there , didn't you ?
20:49
I had to get that in there . I mean , AI
20:52
couldn't do any worse , but I
20:54
definitely think , um that
20:57
there are going to be
20:59
some jobs that
21:01
you know , I mean , but that
21:04
has happened throughout time , Reggie ,
21:06
and the
21:09
the cotton gin
21:11
, you know .
21:12
Yeah , man , there's always been something right that
21:14
comes and just disrupts
21:17
things .
21:18
Yeah , there's always and
21:20
it does . It does for a while
21:22
. Like you think about auto plants
21:24
, you know , and you think about you . Look at
21:27
cities like um Detroit that
21:29
are now ghost towns right , yeah
21:31
, because these machines , um
21:35
, you know , did away with their jobs
21:37
. But , um , and so , while
21:40
the machines are amazing and you ask any
21:42
car , you know
21:44
, company or whatever you know , was this
21:46
great ? Yeah , this technology was amazing
21:48
. But then you ask , um
21:50
, the workers and they're like , yeah , i lost
21:52
my job , i lost my house , we had to move
21:55
. And so I think those
21:57
are the pitfalls
22:00
, those are the great things and
22:02
the very worst things
22:04
that come with evolving
22:09
and that come with anything
22:12
, not just AI . Ai just happens
22:14
to be the big
22:17
thing , right , we haven't had like this huge
22:19
thing in a while . And
22:22
so every now and then , every
22:25
once in a while , our
22:27
society comes up with this ah
22:30
, you know whether it's the car that puts
22:33
the horse and buggy out of
22:35
commission , and
22:38
so somebody loses and
22:41
somebody wins big , and society wins
22:43
big , but there's always some
22:45
. What do you call it ? Ah
22:49
, what's the word that I'm looking for ? You know , these are
22:51
the . These are some of the pitfalls that happen
22:53
when you're involved in
22:56
technology . Collateral damage , thank
22:58
you . That's that collateral damage . And
23:00
, yes , you're going to have a lot of collateral
23:02
damage with AI , but when you think
23:04
about , you know what it can
23:06
do . You
23:09
know , hopefully I'm thinking that the pros
23:12
outweigh the cons , right
23:14
.
23:15
Would you use AI to write your dating profile
23:17
, Laura ?
23:20
Then you have to think about it .
23:22
You can just put a few pictures up there , Laura
23:24
. Throw some pictures up there . Let AI
23:27
do the rest . You can just wait for the wait
23:29
for the men to come in there , Less
23:34
time spent .
23:36
You get the same situation , yeah .
23:37
I remember we talked about the Facebook . Remember a while back ago , Laura
23:40
, when we were , like , we're talking about the Facebook situation
23:42
. I don't know if you were on there even
23:44
five days , Laura . You're like , this is too
23:46
overwhelming .
23:49
You're saying you're gay , right Right
23:52
, it was too much So imagine
23:54
an AI profile .
23:56
Man , maybe roll it in , Laura . Roll
23:58
it in .
23:59
Right , yeah , yeah
24:02
, um sure , I'd
24:05
use it to do whatever , as long as it didn't
24:07
hurt anyone . Um , I
24:09
don't want to be deceitful , but , um
24:13
, you know I mean
24:15
, and then what , how do you do the AI profile
24:17
, like ? do you tell them , do they ask
24:19
these questions like okay , what's your age
24:22
, what do you like ? And then , based
24:24
off of those answers , they write
24:26
something . Or do they just write something , not knowing
24:28
who you are , and it could be totally not
24:31
representative of who you are .
24:33
I feel like it would have to be some questions
24:35
, right , like there would have to be some kind of like
24:37
prompt or something or otherwise
24:39
this can write the same profile for every human
24:41
being , like wouldn't even have to like have any deviation
24:43
. If it's just writing you a human
24:46
woman profile like , that's very
24:48
, that's very vague , like . I feel like you would
24:50
have to like narrow it down to something
24:52
, right , like . So I
24:54
feel like maybe , cause I haven't used it , maybe I should
24:57
just go with that and do that on the show one day . I'm just gonna do
24:59
that on the show and just create an AI
25:01
profile and just see what happens , see if see what
25:03
happens there , cause it's interesting . I
25:08
don't really know , cause I've been out of dating game for a while
25:10
, so I don't . I don't know exactly what that would
25:12
be , but I do know that I would have used
25:14
it to save me the time , absolutely Like if
25:17
I could save time on that and I could just
25:19
now just vet people , that kind of like . You
25:21
know , cause it's different for guys and I feel
25:23
like for a lot of women they're
25:26
going to be inundated , no matter what they do , with
25:28
just a lot of dudes . Um , so they're going to
25:30
be inundated with a lot of guys just hitting
25:32
them up , versus , like a lot of guys , you gotta
25:34
go more , like you gotta be more aggressive or
25:36
more assertive . I should say not really aggressive , but you're more
25:39
assertive , like you're going to go after . Go
25:41
after women more so . Cause
25:43
I mean , I looked at
25:45
a friend of mine's , and
25:47
she , her profile had like a ton of just random dudes
25:50
in it And she wasn't even trying
25:52
hard , you know , and it had a ton of dudes in
25:54
there . So I'm like they don't
25:56
have time . I don't feel like a lot of women have time to
25:58
do it , to sit there and just reach out to dudes
26:00
all the time , even though I do respect that more . Um
26:03
, but why would you if you got like a hundred dudes
26:06
in your inbox already ? Like , well
26:08
, you can just spend all your time just going through your inbox
26:10
, you know . So I think that it would have
26:12
to have some kind of prompts for what you like , what your
26:14
interests are right . Because , you
26:16
, even if it's going to be like a matching , and
26:18
a lot of these companies are on top
26:20
of that . They might be using AI to even do the matches
26:22
. So , you have AI that you might
26:24
use to build your profile . The company
26:27
itself might be using AI to kind of do
26:29
like a match making type situation . You know what I mean . So
26:32
you got AI all over the place .
26:33
Well , i was just going to say that , like why can't
26:35
AI find my person too
26:37
? Like , if you're so intelligent , fine
26:40
, you know what I like . So
26:43
find my person for me , yo
26:45
. Laura .
26:46
I think that's possible . Hello Laura , I
26:49
think that's absolutely possible , right Cause ? think
26:51
about it like this , Laura . If AI
26:53
can replace , like somebody writing
26:55
some lawyer writing
26:57
up some massive contract for some company
26:59
, right , and if AI can do that in seconds
27:02
, then it should be able to
27:04
look at some matchmaking possibilities
27:06
. In my opinion , like I don't see
27:08
why it wouldn't be like
27:10
it's intelligent enough to do that first thing , but not
27:12
intelligent enough to do that second thing . Of
27:15
course the people have to meet , right
27:17
? So of course the people have to meet , actually
27:19
like each other , actually be able to gel , have conversations
27:21
, all that stuff . Still , of course , you would definitely need
27:23
all that and that AI can't do that part
27:25
. But AI , can AI give me
27:27
like 10 suitors who are
27:30
my type ? Why not ? Why
27:33
not ? Why can't I get a small group
27:35
of people ?
27:36
That , i think , would be great .
27:38
Matchmaking situation right .
27:40
Like an AI matchmaker , because that would be the most difficult part
27:43
. It really is . I mean , you're looking
27:45
through . that was the most difficult part for
27:47
me . Looking through everything
27:49
and saying , okay , am I making the right
27:51
choice ? Like , how do I know
27:53
that ? you know what I'm
27:56
saying ? That really was the
27:58
difficult part . It's not difficult to
28:00
put your profile and put those things . It's
28:03
difficult to say , gosh
28:05
, I don't want to base it on how he looks , or he looks
28:08
like , he doesn't , like he's not
28:10
spontaneous , but he really may be . So
28:13
yeah , that would be awesome , really
28:16
awesome . I
28:18
think a lot of companies .
28:20
I could see that all day , Laura . Yes , I could see
28:22
it , man , and some companies may already
28:24
be doing that , but I can see it . Why not ? Why not , with
28:26
all the other stuff it's doing ? It's like passing
28:28
. The AI is passing the bar . It's passing
28:31
like physician tests . Why could it not be
28:33
a matchmaker ? Like it seems like , from
28:35
an intelligence standpoint , that's lower on the scale than like
28:37
going and passing the bar or doing some of that other stuff
28:39
that AI has already been doing . Why
28:43
not ? Yeah , why not ? Maybe maybe somebody's
28:45
already done that .
28:45
And why not ? Can't do it Like , yeah
28:47
, yeah , like I'm interviewing
28:49
for a receptionist
28:52
and I'm like how do I know
28:54
I'm picking the right person ? Like they're all amazing
28:56
. I could get AI to like
28:58
do the random you know what
29:01
is really needed and how are they going to fit into
29:03
the culture ? and will they ? you know , do
29:05
they have what it takes to ? that
29:07
would be awesome .
29:08
Yeah , cause I think that you could do that as like a filter
29:10
right Like , if nothing else , filter out
29:13
applicants for me Yeah . Just filter out some applicants
29:15
for me . See who's best from a statistical standpoint
29:17
, who fits what I'm looking for , um
29:20
, and then you know what I feel like . That's what a
29:22
lot of the legwork that AI is going to do . That's
29:25
a lot of the legwork AI is going to do for you . It'll
29:27
have like , if I don't have to like a lot of times
29:29
, if you're looking for , let's just say something as simple
29:31
as this podcast , and I'm going to look at some
29:33
hashtags for this podcast , like what kind of what are the
29:35
best hashtags ? I can just literally type it into the
29:37
AI and have it pop them out . I don't have to try to
29:39
do a ton of research . I don't need to go read
29:42
now into researching podcasts , or you
29:44
don't have to now hope that you can
29:46
. You could just put in a few things about your company culture , what
29:48
you're looking for , a few prompts for the
29:50
kind of specific person that you're looking for , and then
29:52
maybe get some suggestions on the best way to go about
29:54
that hiring process and cut some
29:56
time out of the processes which I think , ultimately
29:59
, AI would do . I think AI , the
30:01
biggest thing it would do for humanity , in my opinion
30:03
, is give it back some time in its
30:05
every day , doing it to every day .
30:07
Yeah , you know what I mean
30:10
, wouldn't that be
30:12
awesome ? But also think about
30:14
it this way , reggie What if
30:16
AI ? you
30:18
know you used AI to get
30:20
all of your
30:23
podcast people
30:25
and what did they just like ? totally
30:28
skipped over me , like
30:30
you see what I'm saying . You would have
30:32
totally missed out on
30:34
me .
30:36
It's a possibility , right ? That's
30:39
the collateral damage , right , that's
30:42
the collateral damage that could be , and
30:44
sometimes yeah . Well
30:47
, you're gonna say finish that statement and I'm gonna Yeah , that's it .
30:51
I was just gonna say . You know , sometimes it
30:54
really is important to have that human touch , like
30:56
sometimes you know you really
30:58
need that , but I could definitely see how it would
31:00
aid Absolutely .
31:03
I agree with that , because I was gonna say that's the same
31:05
situation , kind of going back if we're doing the dating
31:07
situation . But isn't that in general , though ? Cause
31:09
let's think about it like this , Laura . We were just talking about
31:12
the situation with receptionist , right ? You're
31:14
already thinking that without
31:16
AI , without AI , you're thinking that already
31:19
. Right ? It's all kind of like a human
31:21
condition , and we all kind of second-guess ourselves . To
31:23
a certain degree , We make certain decisions
31:25
, right ? Like we're already thinking of like , well , if I'm
31:27
dating this person , the person that's supposed
31:30
to be spending the rest of my life with , is just , I just have less
31:32
time with that person . They're just out there somewhere
31:34
And but
31:36
I'm here , that dream job that I want
31:39
. It's out there somewhere , but I'm working this other
31:41
job . You know Like I feel like we
31:43
would always second guess ourselves as
31:45
to like , what we could miss out on
31:48
, like , or what we could be missing
31:50
out on , even as we make our own decisions , right
31:52
. This
31:54
is true , yes , absolutely , and
31:57
maybe we'll have more answers to this here down the line
31:59
, because I know this is not our last AI show . We'll
32:01
do some more AI shows . Definitely
32:04
appreciate you taking some time out here
32:06
, as always , Laura . Absolutely
32:10
.
32:11
It is always a pleasure , Reggie . This is Reggie and
32:14
ATL .
32:14
Check us out on iHeartRadio , Google
32:17
Podcasts , Apple Podcasts , Spotify , wherever you find
32:19
your podcast . See you next time .