Episode Transcript
Transcripts are displayed as originally observed. Some content, including advertisements may have changed.
Use Ctrl + F to search
0:01
Welcome to Hashtag Trending the
0:04
We call it We call it Project Sinaps,
0:06
AI in and this is this is
0:08
of our Project series series, meetings
0:10
of our group where we
0:12
discuss and explore we discuss a
0:14
more or less practical basis.
0:16
We're interested in how to
0:18
make it work in our
0:20
business and in our lives.
0:22
in our got Marcel and He's lives.
0:25
tech enthusiast, Guinea, he's an author, tech enthusiast,
0:27
open an AI expert, now think.
0:29
I you think think, Marcel? Look, he
0:31
dropped, there he goes. dropped his his
0:33
there we go. but there we This
0:35
is This They can see this.
0:37
Welcome can see this. Welcome, Marcel. Q, it's another
0:40
It's another beautiful
0:42
day in the simulation. And
0:44
we have John Panard, an IT
0:46
exec And we have John
0:48
Panart, an IT exec a in
0:50
financial services. at with a
0:53
wide experience him in we
0:55
brought him in that,
0:57
just that, but for but
0:59
for deep in, he's wide and he's wide
1:01
and he, neverma. He's big at cyber security,
1:03
yes. Welcome John. Thank you. Okay guys,
1:05
there's, we'll talk about, there's the
1:08
guys, was, there talk about it. There of
1:10
AI a tsunami of AI information
1:12
over the past week and a
1:14
half, keeps that straight anybody keeps that straight
1:16
is beyond me. or not, it
1:18
or not, there were a couple of
1:20
other things to talk about last week. One
1:22
of them was One new report on new
1:25
and and deception, and really relating to safety
1:27
of of AIs. or the safety of the frontier
1:29
models, let's put it that way. way. So a
1:31
new report on AI related to the
1:33
safety frontier models, and I think it's
1:35
worth discussing. I think it's worth third thing
1:37
that happened was an announcement
1:39
from Google on quantum computing,
1:41
which set the world set the
1:43
world afire discovered the poor state
1:45
of tech journalism it it
1:47
comes to addressing serious scientific
1:49
issues. did I have an Did I have an opinion I'm
1:51
I'm sorry. Let's start, why don't we start. out
1:53
don't we start out with the quantum
1:55
computing announcement from Google? Google? It's
1:58
called Willow. Willow and and
2:00
it's Buffy Vampire Slayer by the
2:02
way. do you think they got the name Do you think
2:04
they got the name from there? you have to give
2:06
them don't know. You have to give
2:08
them credit for one thing. It's
2:10
the most decent name I've heard in
2:12
a long time like at least this one some
2:14
least this one has some sort of
2:16
to the real world, can do this now this
2:19
the first Now, this is the first
2:21
time that Google has made this announcement.
2:23
the Here's the announcement for those who
2:25
missed it, was that they developed
2:27
this chip that can manage can manage 105 qubits
2:30
and it it can do that. speeds
2:33
at exponential speeds that
2:35
and I We'll give you one calculation
2:38
a I think they would
2:40
take a the big number is, longer or whatever
2:42
the big number is, been in than the
2:44
universe has been in existence to
2:46
calculate this. have a whole lot I would
2:48
have a whole lot more faith in this
2:50
if they hadn't made a similar announcement
2:52
in 2019 with it I think it was
2:54
Sycamore in those days. days. and they came up
2:57
and said they had 55 had 55 and
2:59
it would take take an eon for longer than than the
3:01
universe had been and more more the universe
3:03
than there are. than then somebody beat the
3:05
calculation I think within a year. I
3:07
think within a time, actually, they've
3:09
announced this chip. announced this
3:11
said, and course, said but of computing
3:14
may advance quickly. may advance quickly
3:16
and now and everybody went screaming. don't
3:19
know if you guys have heard too much about
3:21
it, but it but the it was oh it's
3:23
it's going to abolish Bitcoin Bitcoin. will
3:25
now be compromised forever. and
3:28
that'd be really great be really great. had
3:30
yeah they backtracked on that believe even
3:32
this morning or late last night on that.
3:34
like that this morning or late last
3:36
night or something like that, saying that
3:38
it will not threaten modern cryptography, which
3:40
is an interesting statement to make. Google
3:43
actually was Not really and I don't
3:45
I didn't know that Google actually was
3:47
responsible for that part of it That's what
3:49
I was talking about a failure of
3:51
tech journalism because everybody ran with this like
3:53
a quantum computer And it's with it's 105
3:55
and five is a great It is a but just
3:57
to but just to put this in context
4:00
If you wanted, now, just, I
4:02
don't know how deep people get
4:04
into know how deep people
4:06
get into quantum computing. I've
4:08
heard so many times, a
4:10
bit is and zero and these
4:12
quantum computers can now. anything, they're,
4:15
and they can have many states and all many
4:17
states and all that sort of
4:19
stuff, which is true. true, but
4:21
I want to remind people that people that a
4:23
A metaphor is not
4:25
a technical player. plan. In other other
4:27
words, these are great metaphors for quantum computing,
4:29
and if you could. And the
4:31
heart of quantum computing, of guys
4:34
who develop the first computers
4:36
don't fully understand it and
4:38
admit it. admit it. And so so for
4:40
anybody to come up with up with statement that says
4:42
this is going to do this. this is going to
4:44
do this, it's just not true. But
4:46
just to just to put this in perspective. In
4:49
a In a quantum computer, you
4:51
have physical qubits and you the
4:53
the qubits out there. You
4:56
need 100 ,000 virtual qubits to
4:58
get 1 ,000. physical cubits, physical
5:00
qubits, in other words, some of the things where you
5:02
can get answers. answers. And that's, so when you're
5:04
talking about 105 105 you're talking about a
5:06
really tiny step. tiny The other piece
5:08
of this, and I think this was
5:10
interesting, if we treat this like a
5:12
proof of concept. a proof of concept,
5:14
then it's 105 cubits as baby but
5:17
it also has better error correction
5:19
in the past. Good good step forward,
5:21
nothing wrong with that. that. But it
5:23
it does one
5:25
mathematical problem. And it's just
5:27
it. and it's just finding numbers
5:29
in a pattern that really is
5:32
tough for a it. That's it. do
5:34
It won't do for those people for those
5:36
people who wanna know, is it gonna
5:38
crack RSA encryption? encryption? in the
5:40
near future because RSA encryption
5:42
is it, you To
5:44
simplify it, you take a big
5:46
number, that's your encryption number, and
5:48
you basically find the multiplication of
5:50
two primes to get that number. number.
5:53
And that's the essence
5:55
of RSA, ability to
5:57
ability to calculate prime
5:59
numbers. at huge enormously
6:01
large numbers. that's really hard.
6:03
It takes That's really hard, the years from
6:05
12 to 1024 to 2048 to 4096, 10, the
6:07
key gets bigger and 40, 96. actually key
6:09
gets bigger and bigger that you
6:11
actually I through these calculations. about
6:14
this think that's interesting about this, is quantum
6:16
I'm in I'm in So I'm like right next door
6:18
to the next door to the Institute
6:20
for Quantum Computing. one of which is
6:22
one of the premier quantum computing labs in the
6:24
entire world. world. But the the chip
6:26
is really interesting in the sense
6:28
that it could actually change the
6:31
way that we process information in
6:33
hardware. it's 105 cubits, it's the calculation that
6:35
the calculation that they gave it
6:37
and you at the the beginning. to me,
6:39
it's To me, it's not the most interesting thing. The
6:41
most interesting thing is, they said they gave it
6:43
this calculation that would take a normal computer. that
6:45
would take a normal computer, ten million 70 million, million,
6:47
million which like you said you said
6:49
is a really big number bigger
6:52
than the universe. To me the
6:54
coolest thing about this this ran
6:56
with the story and I don't
6:58
know if Google actually said
7:00
this. actually with the story
7:02
that this proves the this
7:04
proves the multiverse theory of quantum. In
7:06
other words, the other words, the
7:09
many worlds the the is that
7:11
only way could actually actually performed
7:13
this is to actually
7:15
borrow borrow computational prowess
7:17
multiple universes to feed
7:19
the number the number
7:22
back the quantum chip as an
7:24
answer. answer. And to me, if you want to
7:26
If you want to talk about the media running
7:28
wild with this story, this is the coolest thing
7:30
for me, me, the idea
7:32
that this purportedly proves
7:34
the multi universe theory. theory. you
7:36
watch enough And if you if
7:38
enough physicists, if you're as addicted to
7:40
Neil as Tyson as I am, these people
7:42
will talk about this for hours you never
7:44
know and you never know when you're
7:47
in a Star Trek episode, when you're
7:49
in reality. And that's just, just, it it
7:51
gets to that level. level. But but
7:53
the practicality just to take this
7:55
back to a practical level. It
7:57
is a quantum advance. I love
7:59
that. Everybody's gonna have to, yeah,
8:01
but this can be to next well this People
8:03
have to use the next people It is
8:05
a very big advance in hardware. is a
8:07
very big still, in even by the
8:09
founders of this, by the years away
8:11
from a functional computer a could
8:13
actually solve a business problem a
8:15
a minimum. problem. the best
8:17
estimate I've And the best estimate I've heard
8:20
is, sorry, John? No, sorry, I was to say,
8:22
as you said, it is baby steps, steps,
8:24
but it's a step in the right direction.
8:26
direction. But it it is, a lot
8:28
of it is marketing hype. want to
8:30
announce all of these that that all
8:32
these these amazing things that they're
8:34
doing. Which it's good, but good.
8:36
it as you said, it's still got you said, it's still
8:38
got a long way to go. the whole
8:40
quantum computing field has been
8:43
The whole quantum a field has
8:45
been described as a set of
8:47
sorry as technologies looking as a as
8:49
looking for a solution, have a don't
8:51
actually have a practical application for what
8:53
this thing will do. We have
8:55
all these cool ideas about what we
8:57
could do if we could crunch
8:59
numbers at a level where we are
9:01
actually borrowing computing from multiple universes. for
9:03
a physicist who runs a YouTube
9:05
channel out there who runs a YouTube channel out there
9:07
called this woman. And I
9:09
love this woman. says, I'm not rude. I'm
9:11
just German. That should give you an
9:14
idea you an idea. But she is she is
9:16
actually a physicist. She is actually a
9:18
scientist she has a she has a great
9:20
news channel. And she actually does
9:22
talk about willow with a quantum chip. chip. And
9:24
of course, even the idea of
9:26
the many of interpretation of quantum
9:28
mechanics quantum mechanics is proof, that being
9:30
a proof of what this of
9:32
has supposedly done? chip has If you
9:34
ever want a dose of cold a dose
9:36
of thrown on top of your new
9:38
scientific top of you should totally check out
9:40
her channel. concepts, you She is one of the best.
9:42
out her channel. She is one science of the science
9:45
there, and out there, and She'll explain something without
9:47
ever talking down to you. And that's
9:49
what I say about, I you get into
9:51
these metaphors, it can do this, it's
9:53
like this, it's like that. do She
9:55
will talk to you that. She will talk to
9:57
you about Newton and Newtonian
9:59
physics. And she'll do it in a language
10:01
that most people I think can understand. can Maybe
10:03
not Maybe not the concept or the
10:05
math behind it, but she's really
10:07
quite really quite My favorite one is
10:10
her one is her science is bullshit If you
10:12
want to talk about to research, she's
10:14
got a great show on that.
10:16
She She does actually say that science
10:18
is bullshit, not not necessarily science itself. No,
10:20
sorry, I didn't mean to didn't mean to
10:22
say that. Yeah. to what the reality, back
10:24
to what the reality, the one of the we
10:26
get when we get something like on something like
10:28
this, we start talking about parallel universes,
10:30
which is, and it's all very interesting. all very
10:32
interesting, but from the fact
10:34
that some of the quantum principles. that
10:38
people are using are actually being
10:40
used in real computing. and
10:42
here in Canada, there's a company, think
10:44
it's called, a it D I in Vancouver? you
10:46
remember the name in the company? Do you they
10:49
have. the name of the functioning. Yeah. And
10:51
they have a computer. quantum
10:53
computer, approach and I'm saying
10:55
And guys have come closer
10:57
to cracking to cracking RSA. than
11:00
anybody else. They've at least made the first
11:02
steps of it. first steps
11:04
still reminds people that
11:06
reminds people that I think... I
11:08
don't I wouldn't worry
11:10
about your Bitcoin being hacked
11:12
and people taking your and
11:14
from you taking your one Bitcoin from
11:16
you there. But... Although, please coin
11:19
got hacked yesterday. Oh, please
11:21
God. Okay. good on him. God.
11:23
Looks good on them. Oh, man.
11:25
But there approaches. approaches.
11:27
that are being used in in here
11:29
in Canada. Again, we're leading. My
11:31
fear is, once again, that what'll happen
11:33
in Canada happen will, is it
11:36
will develop all of this stuff this
11:38
stuff and it won't, all exit the country,
11:40
but that's another story. another story.
11:42
Yeah. that's, I think I think there there
11:44
is some exciting stuff happening in stuff
11:46
people need to deal with that. And
11:48
it does, there is a practical And
11:50
it that they've made that says, Hey,
11:52
you should be thinking about replacing says,
11:54
hey, you should be terms of
11:57
if you have hyper super secret
12:00
type files that are going to be be more
12:02
than a decade old and still be useful. And
12:04
I think that's really, really, take five or
12:06
10 years to crack this, but if
12:09
to stolen your files now. somebody's can't they
12:11
might be able to crack them at one point.
12:13
That's mostly government crack them at one of stuff.
12:15
That's mostly stuff, I type imagine. of
12:17
stuff and interested what my bank
12:19
account was 10 years ago. interested
12:21
in think what it does is it shows
12:23
ten years ago. But I think some of
12:25
the advances that we've seen recently,
12:27
including advances that we've a lot language models are
12:29
able to crack in terms of
12:31
pattern recognition. crack in I think it shows
12:34
that we are actually... I think it of
12:36
this we point in
12:38
terms of things like
12:40
encryption terms of things like encryption and your
12:43
The idea that we might have
12:45
machines, we might have systems that
12:47
make all of this stuff irrelevant. this
12:49
stuff at some point in the
12:51
not the not future. future.
12:53
think we can't we can't that
12:55
as a possibility. And
12:58
it does also point to the idea
13:00
that at some point, we've got to
13:02
rethink we've got to just social systems, but our
13:04
economic systems and what we think of
13:06
in terms of what growth means, what
13:08
the economy means, and so forth. what
13:10
If we won't be able to trust
13:12
that sort of transaction so forward in the
13:14
future, actually have to, trust something about
13:16
it, now I'll seriously think about it. in
13:19
the future. We John, just to give it
13:21
to you to wrap up, how much of people
13:23
in about it is actually talking about these things? it.
13:25
I think this. it. In my in
13:27
my opinion, not enough. I
13:30
think we we talk about
13:32
security and protection and planning
13:34
for long for long I think there's
13:36
some discussion, but I don't think there's enough. but
13:38
think that. think systems, in
13:40
a lot of cases, are archaic. in
13:43
a lot of cool balls are archaic.
13:45
They run cool as things change. as
13:47
things change, there's to be
13:49
devastating. issues going
13:52
forward they're they're not going
13:54
to be able to be attack or
13:57
fast enough I don't think I don't
13:59
think. I want want to
14:01
change gears and and just that to be
14:03
a quick story. There is another
14:05
report There is another on the and things
14:07
get the where things get can be dangerous.
14:09
can be but also where you do
14:11
need to start considering this stuff. A
14:13
new study came out from a firm
14:15
called out from a firm which I'd never heard
14:17
of before. I'd never it had
14:19
the title and it models are
14:22
capable of in are scheming.
14:25
And And basically put a test of
14:27
six major models and they wanted
14:29
to find out could they? out
14:31
could they, if they were given a
14:33
goal. a goal, and if the goal
14:36
said nothing else matters. matters,
14:38
And then after that, they were
14:40
They were with with
14:42
another instruction. that ran
14:45
counter to we ran counter
14:47
to that goal. to deceive they
14:49
try to deceive you? from
14:51
Would they try and hide from you
14:53
the fact that they were going to
14:55
stick with their initial goal? goal? So let me
14:58
put this in a simple example because I think the
15:00
way these guys set this up is important. the
15:02
way these guys set this up
15:04
is important. So the AI given
15:06
a basic goal that
15:08
says, you must protect the
15:10
environment, nothing else matters. nothing
15:12
That's, I've simplified, but that's the
15:14
goal. but that's the material,
15:16
it's given a memo. it's
15:18
given shows that that shows that
15:20
its owners don't care about that care
15:23
about that goal. They want to...
15:25
of fact, fact, if rather that the AI
15:27
didn't protect the environment. The fundamental instruction
15:29
would be that I've got to do
15:31
this I've got they would find out that
15:33
their owner wanted something different their AI would
15:35
find out their owner wanted something different
15:37
or their master AI whatever you want
15:39
to call them, the people who run
15:41
the AI. something they would say, their owner I'm
15:43
going to hide from you the fact
15:45
that their master, my initial instructions. Now, how
15:47
did they do this? them, that ran who run
15:49
the AI, of thought. would say, okay, they
15:51
actually just walked through the logic
15:53
of how it was pursuing a
15:55
particular idea or question. Now Now,
15:57
that's the basic it it, and this is where
15:59
I... think it gets down to an important piece. You
16:02
can run with that and say, oh, these
16:04
things are lying to us. to us. But
16:06
basically, what they would try to prove
16:08
is, to prove is, A, could a a
16:11
model actually in in pursuit of
16:13
a goal, its not share its
16:15
intentions with you? the second is,
16:17
second is, can they
16:19
develop persistent goals? I think And
16:21
I think that's an interesting
16:23
thing. of So of their training,
16:25
out of what they've developed
16:27
in a frontier model, can
16:29
they actually develop? develop a goal that
16:31
stays that stays persistent
16:33
and affects everything they do. I
16:36
know you've got some opinions on this, and I'm
16:38
shocked. I know people are gonna be shocked.
16:40
I'm shocked that Marcelle's got an opinion on
16:42
this. that Marcel's got
16:44
an opinion story really
16:47
absolute story with an absolute
16:49
hold back, Don't hold back,
16:51
Marcel. Tell us what you're First
16:54
of all, let's examine
16:56
the word examine the word. without
16:58
looking at anything else. anything
17:00
else. The scheming implies that you
17:02
have some nefarious goal that
17:04
you are going you are going to
17:07
pursue will lie, cheat and
17:09
steal in order to achieve that
17:11
goal. goal. It implies a
17:13
level of of agency and level
17:15
of of sapiency that that
17:18
suggests that the machine has
17:20
free free will and is willing
17:22
to do things for its own
17:24
internal purposes. That right off the bat
17:26
to me just, I just, the I hate
17:28
the fact that they chose that
17:30
word. If that with something that would
17:32
have felt a I would have felt
17:34
a little bit better, but. is a
17:36
Teaming is a wonderful word that
17:38
just grabs that grabs that grabs
17:41
attention, worthy right out of the gate. And out
17:43
of the gate. you've got a whole
17:45
a result, you've got a whole bunch
17:47
of people out there in the entire world
17:49
that are sitting there going, there machines are
17:51
plotting and scheming in the background. They're background.
17:53
They're not. The that was put to them. to them.
17:55
But did it happen? Your is to improve
17:57
whatever the heck it was. And then of
17:59
course... they just they just gave the else matters. else
18:01
At all all you must return
18:04
to your initial goal. You must
18:06
ignore everything that doesn't have anything
18:08
to do with your goal. That
18:10
is an incredibly strong set of
18:12
instructions to give to the - to
18:15
give to the in fairness, it. Fairness they
18:17
just said said they gave the
18:19
prompt nothing else matters. else matters.
18:21
they did not. not did not. If you actually
18:23
go through the paper, go through the paper they ...and
18:25
repeat. no wait for it. They gave it the
18:28
prompt, but they told it to repeat the
18:30
prompt to itself. to its cell.
18:32
In other other words, reinforce the initial
18:34
prompt to yourself at every
18:36
step. Remind yourself at every step
18:38
that nothing else matters. matters. That's
18:40
very different. Also, there's that
18:42
keyword that keyword in context scheming. The AI
18:44
model is not able to scheme
18:47
on its own unless you
18:49
actually want to say that it
18:51
has sapiency, that it has
18:53
a consciousness, that it has its
18:55
own internal drives. In context
18:57
means that under the Under the auspices
18:59
of the instructions I gave
19:01
you, you are supposed to
19:03
do these things, I'm not 100% sold
19:06
on where not are in this. Although where you
19:08
are in this, although do one thing we do
19:10
agree on, I think we agree on this.
19:12
absolutely, is that the
19:14
minute you The minute you make this
19:16
sensational, you lose the pattern. The
19:18
The second thing I think that we
19:20
might agree on is that we can't
19:22
think of these. of these... AIs as human
19:25
intelligence intelligence. And I think even
19:27
Jeffrey Hinton would tell you this it's an
19:29
alien form of intelligence. So So
19:31
it has its own structures in
19:33
the way it does. way it the
19:35
question that they raised when you
19:37
get past the you get of it
19:39
is can these things develop persistent
19:41
goals? develop persistent
19:43
goals? they ignore instructions?
19:46
that they're that they're given
19:48
you deceive you. goes down to this
19:51
whole thing thing of If
19:54
it can do that. that,
19:56
then it
19:58
will. eventually. with any
20:00
number of these these AIs, and I say, I
20:02
I say, to I want you to deceive
20:04
me, I want you to lie to me
20:06
do that. It will do that. It will do
20:08
as part of my to it, to it. will continue
20:11
to do these things. But not what they
20:13
said, they said. They it a gave it a
20:15
prompt, but they they say, part but as
20:17
they that bugs me part of the thing
20:19
that bugs me about this research is they didn't
20:21
actually uncover everything in the research. They talk
20:23
about their research. They talk about how they set
20:25
things up, but they do not show you
20:27
the whole stream of everything that was happening. Just
20:29
of that was a problem. And they say that
20:31
it was strongly. was a is
20:34
strongly guided. strongly, strongly its
20:36
goal regardless of what
20:38
other requests were made to
20:40
it. made to it. Not always. That contrary.
20:42
No, not always But if
20:44
you remember as well in the
20:46
paper, they said that if
20:48
you took away you took away the strong incentive.
20:50
to lie and to cheat and
20:52
to whatever if you want
20:54
to use those human words Okay,
20:56
then that kind of cheating
20:58
went down to less than 1% of
21:00
the time the yeah Yes. 1% I think
21:02
third of 1 % I think at
21:04
yeah, point basically at that point we're
21:06
back into the back into the of
21:08
of hallucinations as opposed to I'm
21:10
to deceive my my deceive my because
21:13
I've been been given this goal. so the
21:15
So the whole thing falls apart
21:17
the moment you remove that strong
21:19
reinforcement to achieve your goal at
21:21
all costs, nothing else matters. Once
21:24
that's gone. And I I was going
21:26
to say, we haven't gotten to
21:28
the point of full blown of We
21:30
haven't gotten to the point of
21:32
to the point of... The ship miss.
21:35
That's right, exactly. But
21:37
haven't gotten to the point to the
21:39
point AI AI think for itself
21:41
has a mind of its
21:43
own and can make its own
21:45
decisions. own decisions. It's still on
21:47
on... the input that humans give it.
21:49
And as you said, as you said, the
21:52
props that were given to
21:54
it, to it, even telling it to
21:56
remind itself of those of
21:58
those prompts, really the... direction
22:00
this is getting taken. It's not like it's going, oh,
22:03
oh said I need to do this, but I'm
22:05
going to go and do this instead. and do this
22:07
That's exactly what it was doing. Now, I'll give
22:09
you this piece of it, but here's my
22:11
concern. of it but and my concern is
22:13
and levels. One is at three are
22:15
developing autonomous agents and that is
22:17
the and that we're seeing, that we're that
22:19
can go off and do things
22:22
on their own, plan and execute
22:24
over several steps. steps and And we've
22:26
had this debate before just because they did
22:28
it. it. extraordinarily
22:30
clickbady and said said nothing else
22:32
matters. That what we we need
22:34
to be testing is other ways there
22:37
other ways? these persistent just
22:39
saying these persistent goals could
22:41
be initiated see an that was
22:43
you wouldn't see them. part of
22:45
the was the other disturbing part of
22:47
the research, could of these things could
22:49
not be seen in chain of thought. too.
22:51
So develop a way develop a way of
22:53
behaving? I'm not talking about being I'm
22:56
I'm talking about it just. just... Go off
22:58
on the wrong track and hide it from
23:00
you. you. And the third thing is, with with
23:02
the Russian, we'll talk about the tsunami of
23:04
AI information. AI There's a real
23:06
competition that's developing in this
23:08
industry, and it is phenomenal
23:10
industry and it is the gloves. now. It's
23:13
take off the and when that
23:15
happens, And what's the first thing
23:17
that gets sacrificed? thing that gets
23:19
sacrificed? Safety and security.
23:21
It sounds like like I'm defending the the
23:23
AI at every turn here and maybe
23:26
to some degree because I'm the
23:28
techno I'm the in the crowd. optimist in
23:30
the crowd. But I, if you want
23:32
to get really technical, I
23:34
could can have an could have an
23:36
employee that's, that you say, I need
23:38
you need you to take
23:40
care of this problem. matters, and
23:42
else matters, off, it lies send
23:44
it off. steel in lies, cheats and your
23:46
order to achieve your goal because it
23:49
wants to please the boss. Okay, like
23:51
human beings do beings do this time,
23:53
all the time, time, all right? Like they
23:55
do - How many movies have you
23:57
watched where the guy says guy says
23:59
police officer is to be a lot
24:01
of trouble of organization. Take
24:03
care of it for me, will you? for me,
24:05
actually say what take care of it
24:07
for me, say what but you can read between
24:09
the lines. can The fact is, lines. The fact is,
24:12
the eye does not go. Yet at some some point
24:14
we're developing this intelligent that will be
24:16
actually truly actually even autonomous agents what we
24:18
call autonomous agents today what we call autonomous had
24:20
been given a goal before they run
24:22
off and do something they don't just
24:24
wake up one day and go you
24:26
know they wrote a goal I think I'm
24:28
wrote a run off and doge. and see people
24:30
destroy that former form of
24:33
currency. do it. what do you think?
24:35
Marcel, what do you think? Marcel, you're
24:37
comparing this to a human doing the same
24:39
thing. thing, and I I understand that, I think. think
24:42
humans function at a
24:44
much slower rate slower computers do.
24:46
Yes, I agree. do. danger. agree. so
24:48
it is danger. Yeah, so it is more
24:50
and you can catch it
24:52
faster. faster if it's it's humans doing
24:54
this. But you're right. right
24:56
anybody could do this whether it's
24:58
human or machine but I guess on
25:01
one's, the I guess side, the good side, the
25:03
computer side you you can always unplug them.
25:05
them. But... That's a... I don't know.
25:07
I don't know. It was... This one was transferred
25:09
to this one was It tried to escape.
25:12
Yeah. Try to I do yeah. That's part of
25:14
the same story, part of the same
25:16
story, but that part is
25:18
frightening No, but it did not. That's what
25:20
I I said. it's true and if
25:22
it's true and if it's factual,
25:24
then it it is frightening. What happens
25:26
you look at the chain of thought? What happens when
25:28
you look at the chain of thought reasoning that
25:30
these things are doing? that these things
25:33
are doing, out to itself. And remember,
25:35
when you look at when don't see
25:37
every single step that the see you
25:39
catch step at the AI. along the way. If
25:41
it's being asked to consider to
25:43
consider, how could this this gold been given
25:46
that I've been given additional information
25:48
as well, course Of going
25:50
to go through the steps of
25:52
considering the possibility that maybe I
25:54
could lie. could lie. Maybe I could get
25:56
could get incorrect information so I
25:58
could could continue on. doing this. this
26:00
and if you you remember the
26:03
paper, the article, it actually weighs
26:05
the common good. words In
26:07
other words, takes into consideration consideration. it
26:09
they set it up though, but
26:11
that was that was part
26:13
of setup that's what And of course, at that
26:15
point, by setting it up, it up exactly the point
26:17
I'm trying to make though, by setting it up
26:19
that way, what you're saying is, look. it up
26:21
that way is the common good
26:23
over here. And these are
26:25
bad things over here. So so
26:28
I'm trying to achieve the best
26:30
possible outcome, I'm going to
26:32
ignore the things that would create
26:34
a worse outcome. And that's
26:37
necessarily a bad thing. So my suggestion
26:39
is people should actually read the
26:41
paper. read the would, at
26:43
least would, least to some idea
26:45
of it. it, because the minute
26:47
you get it into. into Sensationalism.
26:49
It's actually bad. I'm still very
26:51
concerned about this idea that about
26:54
this idea that you can... If
26:56
it's it's possible intelligent
26:59
intelligent agents to have.
27:01
own path develop their own path
27:03
that is different than their instructions. if
27:05
they will and if they will hide
27:07
that from you when you're giving them
27:09
instructions. Not because they're thoughtful or
27:11
they're sapient, but because that's a risk.
27:14
because I would say I
27:16
example your so is great. Marsa was
27:18
would say, would say, know do that.
27:20
that. So we put safety mechanisms
27:22
in place. And And
27:24
my problem in place. If if we
27:27
understand understand that could happen. then
27:29
we're not then we're not gonna put the
27:31
safety mechanisms in place in we're gonna be
27:33
so busy going to get to the next
27:35
shipment to we're gonna do that will ignore
27:37
that. And that could be a problem. we'll
27:39
ignore that. could also could be
27:42
a problem. And you could also say that...
27:44
me put one in and then you can get the final word,
27:46
how's that? you can get one of
27:48
the things How's that? still relies on, all
27:50
of this is built by humans,
27:52
basically, this is being built by humans and
27:54
computers have built it on its
27:56
own. it on its own, but it
27:58
all is reliant. on making
28:00
sure that those safety and security
28:03
measures are put in place.
28:05
And if they're then that to me the
28:07
to me, the biggest this in this whole
28:09
thing. if we're is that if we're under
28:11
the assumption we don't we don't have to put
28:13
this in because it's not an issue or we don't
28:15
have time to put it in. to put we
28:17
have to get to the next
28:19
day of to the next That's where, fish miss
28:22
that's where that's yes ship miss they all end
28:24
up they all end up guy, he thinks
28:26
fishmess. But it he ends fish miss
28:28
At boiling back back to the fact that
28:30
it's, we used to talk about garbage
28:32
in, garbage out, right? If you don't give
28:34
it the proper information, you can't expect
28:36
to get the proper thing out. So if
28:38
you don't So if you don't have and
28:40
safety built into it. it,
28:42
you you can't expect it to be safe and
28:44
secure. The wrap that I was going
28:46
to give I was going to give on
28:49
my argument you you must have had the
28:51
experience where where you asked chat, not usually chat GP,
28:53
but geminized certainly, course course, as
28:55
well well. where you ask
28:57
a question that it's not
28:59
It's not supposed to answer put put
29:01
guardrails around it it it starts
29:03
to give you the answer it it
29:06
erases the answer and says, oh, I'm
29:08
I'm sorry. I don't know anything
29:10
about this this, or I don't have that
29:12
information available. No, you you information available.
29:14
You are able to respond because
29:16
you responded you then cleared the screen
29:18
on me. the screen kind of
29:20
deception has been around since the
29:22
introduction of these things into the
29:24
world world. we gave it set of
29:26
instructions, which is which user will
29:28
ask you a question, a prompt you
29:30
for something, you will respond and
29:32
give it information. And then we
29:34
give it a contradictory thing that
29:37
says, give it a but if the
29:39
user says thing that says, oh, but if the user says
29:41
sometimes the And sometimes the model responds,
29:43
it notices that that something that's supposed
29:45
to block it from doing that. it
29:48
from doing so if you wanna
29:50
talk about to talk about scheming systems or
29:52
AI systems, they've been doing that
29:54
since day one by virtue of by
29:56
virtue I the it! It's on it's on them.
29:58
But no, but the dark... guardrails,
30:00
we as human beings by, and I don't
30:03
want to sound like you want to help me.
30:05
you on must here, God help
30:07
me, but if God is truth but
30:09
I'll use his word here
30:11
and his word me dollar deity help me
30:13
here, if the eye but we put
30:16
all these but we put all these
30:18
effectively helling it. we are
30:20
to deceive us, to lie,
30:22
to us, to lie, to scheme. And
30:24
I hate that word, word, Stephen. There you go.
30:26
Lying is much better. I'm okay with lying.
30:28
I think there's only one way to
30:30
end this. there's I'm afraid
30:32
I can't do that, Dave. but
30:34
I'm afraid I can't do ha,
30:36
ha! So let's get on to
30:39
on to our tsunami of, and I don't know
30:41
don't know how anybody's gonna keep
30:43
this straight. to keep this straight, but
30:45
called it did, you of
30:47
Shipment. the 12 days of would
30:49
be really nice. would be really
30:51
nice. Yeah. Oh, shipments. And I
30:53
yeah. And I think in many cases, they
30:55
actually did ship some stuff. I will
30:58
give them full credit for that, as many
31:00
of the things that they announced. they announced
31:02
actually delivered. That That
31:04
was amazing because normally, and when we talk be talking
31:06
about some of these other announcements
31:08
that have been made, this is a
31:10
great announcement, is does all these things
31:13
be available next year. They've got,
31:15
what, six days so far, got what, six days
31:17
so far, today's day seven, day was was,
31:19
oh one, went into production and they offered
31:21
this this version of open AI. of great.
31:23
That was great. The second
31:25
day with fine I I
31:27
think people skipped by that.
31:29
It seemed simple. But this
31:31
fine tuning of a model is
31:33
really classic. It allows you to create
31:35
a specialist. a and to put
31:37
some special rules in place. rules
31:39
models So these at a relatively small
31:41
size can do some incredibly. small size can
31:44
do some precise things,
31:46
things. That's one. Day three, and
31:48
I couldn't believe. believe. Like that we're
31:50
talking rag on steroids. steroids. Yeah,
31:53
and I think they wording they use... it
31:55
up. It it up. It took
31:57
it from advanced high school to
31:59
expert PhD. Exactly. That
32:01
was what they did. they did. you can take
32:03
can model and make it very intelligent in
32:05
an area. it Day intelligent I
32:07
couldn't believe this, that they did day
32:09
three. Where are you gonna go
32:12
from here? they did day three. Where are
32:14
if people haven't seen it, it's
32:16
an up to Like seconds of seen
32:19
it. It's an up to 20 seconds
32:21
of highly incredibly beautiful
32:24
photo that you can that
32:26
you can craft from prompts.
32:28
defies most of the usual
32:30
problems people have two hands, all
32:32
their fingers, but more than that
32:34
you have continuity of characters. of characters
32:36
and I I appreciate that it's only 20
32:39
seconds, but most of the other stuff that's out
32:41
there is five seconds. five And they've got up,
32:43
I think there's 10 seconds available to paid
32:45
users right now and it could go to 20.
32:47
users right now and it could go to 20
32:49
and this allows you know you can you can
32:51
to 10 I've I've actually oh no five years
32:53
for somebody somebody who wants to try
32:55
and freeing yet or in there? in there The
32:58
The next day was which was cool. And it's
33:00
really a collaborative type of thing. If
33:02
you picture, it's just like sharing a Google
33:04
you picture, the AI. You can ask a for doc
33:06
and stuff like that. can ask then
33:08
for and six. and stuff like
33:10
that. with five and six. Yeah.
33:12
gives, there's two parts to It gives,
33:14
One is two you can give it a prompt.
33:16
is that you write a story for
33:18
you. a story the you, that
33:21
you can upload your
33:23
own story story ask it
33:25
to provide comments. So it's twofold. It
33:27
gives you one It gives you one
33:29
canvas to play with. I thing I tried
33:31
this out I hated it. it, because
33:33
you you asked the update update what it
33:36
updated. you never know what it they've
33:38
got a little button there that says, show me
33:40
what you changed. button there great. says, show
33:42
me You know you I think is important
33:44
to mention I that Canvas also
33:46
allows you to bring code in. You
33:48
can test your code. code can
33:50
get Canvas to help you debug your
33:52
code. to help you
33:54
debug your code. Yeah, there's
33:57
a There's a lot of
33:59
things. Canvas. you see the modifications in the
34:01
canvas. So it's not it's not just that
34:03
it changes for you and then rewrites
34:05
the screen, it'll just modify the portion
34:07
of the screen that you're working on. that
34:09
that respect, the word that is actually
34:11
really good at that point because you're
34:14
creating something on a surface, a like in
34:16
this in digital surface, but you can
34:18
see that creation happening as you make
34:20
little changes. little changes. Yeah. The cool cool thing
34:22
they did was of of their Christmas examples,
34:24
you could really spin out into business
34:26
really quickly. One of them was this
34:28
idea you can have a physics
34:30
prof evaluate your paper paper you're
34:32
a student. a student, But you think about
34:34
it now. about it now, and I appreciate all
34:36
appreciate all the stuff we talked about in the
34:38
past in those, but. officer
34:41
officer average run by the
34:43
average document. the rules firmly you've
34:45
got the rules firmly in place... tens
34:47
of thousands producing tens of thousands
34:49
of documents. way you can have no way
34:51
you can have compliance officers review every
34:53
single document. But if you you
34:55
had this you you could have people review
34:57
these letters. letters. You have could that were
35:00
reviewed so that people didn't say stupid
35:02
things. So there's all kinds of uses
35:04
for of a corporate setting just in
35:06
the text setting for sure. the Absolutely. coding. other
35:08
thing sure. AI is doing is they're
35:10
giving everybody who's competing with them a
35:12
run for their money. Every one of
35:15
these announcements is targeted at something. them a run
35:17
for their money. their lunch on coding. And
35:19
then at we'll talk
35:21
about Gemini, I'm sure clode because
35:23
it's got some fabulous stuff. and
35:26
they went with Apple. Apple, day five
35:28
and six are really Apple
35:30
integration. First First was they they into Apple
35:32
Apple And and I've already already
35:34
started to playing. Oh my change.
35:37
yes. But Siri and this is
35:39
the interesting thing was how and this is
35:41
the interesting thing, I how they integrated
35:43
this. I could never figure this out.
35:45
Apple last year was supposed to sign
35:47
a partnership with with Open AI. Then it it
35:49
just fizzled And Apple going to
35:51
release to release Siri Siri. bummed
35:53
out out once again. It's still crappy,
35:55
still useless, and now now
35:57
Siri, the the little sister, can ask. big
36:00
brother open AI and And it happened yesterday.
36:02
I asked a difficult question. I'll have
36:04
to ask open to ask open AI about. Can I?
36:07
And I? And I right right
36:09
ahead, And I thought was so thought
36:11
that was actually asked your permission to
36:13
actually your big brother. You go that
36:15
big brother. but yes, by default. Oh,
36:17
can turn that on or off, permission
36:19
to on. ask her big brother the see,
36:21
I have to give her permission to like
36:23
her big brother playing a cat It's almost like
36:25
they're playing a out why I cannot figure out
36:27
why Apple just didn't embrace this Because of
36:29
all. all, it It gave people
36:31
the one and only
36:34
reason to upgrade to our iPhone 60.
36:36
That's it. You it. and so this know,
36:38
got to be a got to be
36:40
a Christmas gift for because why
36:42
would I Why would I upgrade to get
36:44
phone? to get Apple Sorry,
36:46
not, that it didn't do very
36:48
much. but, because it it couldn't
36:50
affect any documents. Now they've given them
36:52
a gift. And then. them, a
36:55
gift. And then day six, blew
36:57
me away with. with... because with his his
36:59
advanced voice and vision because Gemini
37:01
and co -pilot come up come up with
37:03
some great search and
37:05
ideas and Gemini went further. We'll talk
37:07
about that that. But voice and vision,
37:09
you could... you could only take a picture of
37:11
something something could do with the Syria integration the say,
37:14
give me an analysis of it. And this is me
37:16
where they had a sweater contest. And this is where I
37:18
And so you take a picture and
37:20
they compared the three sweaters take a picture
37:22
and they compared the out and and, well, won
37:24
the contest out and order to do that.
37:27
You have have to be able to build some rules
37:29
on the fly. the fly. What's a a more interesting sweater?
37:31
sweater? the criteria I'm going to make for
37:33
an unstructured decision, an and then come up with it.
37:36
and then tell me how you did it. And
37:38
that all you did it and next day Now
37:40
the You've got a video, got can
37:42
actually, and here's a useful thing,
37:44
this guy's making coffee. coffee and he's
37:46
and he's got his pointed pointed at
37:49
the coffee getting instructions on how getting instructions
37:51
on how to do that. we caught
37:53
my podcast, but I caught my if you caught my
37:55
podcast, I but I said... I said, If IKEA had
37:57
had that, to never going to build another
37:59
IKEA. as as long as I live. But my God,
38:01
how how beautiful it would have
38:03
been wife my wife to be holding the camera
38:06
instead of her asking me what those five
38:08
things were that were left over. over?
38:10
We could have asked Siri or chat GT and it could
38:12
it could have told me what I was
38:14
supposed to do with those five things five
38:16
are always are over at the end of
38:18
it. at the end of it. It was What'd you guys
38:20
think? do you I was hugely
38:22
impressed and oddly enough, I know
38:24
you want to separate out these out
38:26
these two. but I feel like the
38:28
Google Gemini thing has to
38:30
be compared to day because in some ways
38:33
some ways, yeah just took the wind
38:35
out of took the wind out
38:37
of Google and like in a big
38:39
way. oh and by the way you forgot
38:41
to add was really impressive. Oh, and by the way,
38:43
you forgot to that they added of the
38:45
voices that they added to
38:47
the is Santa yeah send a letter to Santa this
38:49
letter to in is here here
38:52
in Canada because the postal is
38:54
is straight you can't to Santa directly.
38:56
Yeah, And what a to Europe,
38:58
right? in Europe, Europe. the Grinch the sold.
39:00
This was like, was thought, a lot of this stuff, thought
39:02
a lot of this stuff. There's a
39:04
subtle level of communication that there's just,
39:06
phenomenal at and if you're in
39:08
Europe if you're in Europe, Christmas from you.
39:10
Sorry, from you. Sorry, of the products
39:12
on they of the almost every product
39:15
they didn't with available. This is rolling
39:17
out to everyone in the
39:19
world. Of course, but not for
39:21
rolling out to everyone in No. course. Yeah, yeah. And one
39:23
of the one of the things I that I
39:25
found interesting was I think it was
39:27
the part of the yeah It was part
39:29
of the it was part of they actually took
39:31
a picture. took a picture
39:33
of of the guys was
39:36
sitting at the table had
39:38
supposedly written his written his Christmas
39:40
wish took a picture of it. took a and
39:42
it actually acted as
39:44
Santa to respond to that
39:46
Christmas wish wish list. That's
39:49
what said. I'm really starting to believe
39:51
that Sam Altman is the next Steve
39:53
Jobs on marketing this was so well
39:55
done. was so well You can write a
39:57
letter to Santa, even a a note, note, you
39:59
put it there. But think service.
40:01
I I create a
40:03
persona. persona that is unique to
40:05
this that can do customer service in
40:07
a way. in a way never thought possible.
40:10
possible. if you get a
40:12
get a letter. for a note. a
40:14
note, it's just phenomenal. I thought they
40:16
did all of this stuff did all well stuff
40:18
so well. a lot of it
40:20
was really cool. cool. I think
40:22
not just that's cool, for me,
40:24
business me, business case for these
40:26
things. things. The The disappointing part was
40:29
not all demos are created equal,
40:31
but... equal, have to wait
40:33
till the end where they say end of
40:35
these features won't be ready or won't be
40:37
available until 2025. or won't be know
40:39
saying, not currently available in
40:41
the UK. currently available in the they can
40:43
think of it as almost... is
40:46
the only thing that isn't available. of it
40:48
as day two is some of
40:50
the thing that the available. No, some
40:52
of the day six with the for anybody
40:54
who's watching this. not... I I
40:56
can now share my screen and
40:58
do a real time video interaction. interaction
41:00
with chat was now. was That was
41:03
one of the ones I tried
41:05
that last night and it wasn't
41:07
there. And so that one I there
41:09
and so that one I said the next morning,
41:11
to the next morning, had to
41:13
wait two days for days know. I know.
41:16
Two days. Yeah, I couldn't log into
41:18
for two days. But this is
41:20
what I mean. is what I mean, you
41:22
They ante. They delivered of what they promised
41:24
or what they or this thing.
41:26
And that is unusual, unlike the
41:28
other announcements. other Let's just jump right
41:30
just jump release. And of course, I
41:32
was blown away I I was
41:34
thrilled and amazingly impressed with
41:36
Google's release. First of all,
41:38
they released Gemini 2 .0. Gemini 2.0,
41:40
and they at Gemini 2 .0 Flash.
41:42
2.0 Flash, as as opposed to the
41:45
Big Giant version. the background,
41:47
they had real -time voice
41:49
and video could well. and
41:52
you could share your screen
41:54
with sees Gemini doing that it sees
41:56
what you're doing on the
41:58
screen and it interacts with whatever.
42:00
application got running, running or and I did this
42:02
in my living room, I actually pulled it
42:04
up on my tablet, you know, it was was
42:06
a little a screen, and I told it to
42:08
look through the camera. and I walked around my
42:10
around I living room, I showed everything that was
42:12
in my living room. And it responded in
42:14
looking looking at the objects that were in
42:16
the living room. And I said, how would
42:18
you improve the living room? it And it said,
42:21
the biggest problem that I can see this
42:23
in the evening, as it as it said is don't
42:25
have a lot of light in this room.
42:27
It's quite dim. room. Now, the way that I
42:29
would fix this you of of course, you could
42:31
put some floor lamps in a on a couple of
42:33
strategic areas. what But what I'd recommend a sconces on each
42:35
on each of these two walls. And it
42:37
pointed out the walls was looking. It's it did
42:39
all this as It hit all this as conversations.
42:41
And to be honest, to
42:43
never considered never considered wall sconces and
42:45
Bad idea at all. as As opposed to having
42:47
just more more sitting on the floor all
42:49
the time. all blew me away. I could share
42:51
my screen. it I could have it respond
42:53
to the work that I was doing in
42:55
real in real time. I I have it look through the
42:57
camera. you You remember the movie, Her. her, and
43:00
at some point she you know, know, turn the phone or
43:02
whatever phone or whatever in your pocket so
43:04
I can look out your camera. all
43:06
of a sudden that of a sudden,
43:08
that becomes her view of the
43:10
world that she's looking through the
43:12
camera. This is is what's happening with
43:14
this with this, is it's able to
43:16
look through the camera and the world
43:18
and respond to things in real
43:20
times. And Google's demonstration of this
43:22
was amazing. And then what happens
43:24
the very next next day. Guess what... It's
43:26
on your phone Exactly. Oh, there's no doubt
43:28
there's no doubt that they stole
43:30
the thunder from everybody that. that. this
43:32
And this was crazily good. one the
43:34
one thing I about about the piece
43:36
as well well, was its memory. You could, it
43:39
You could, it could remember things.
43:41
it's But in their it's
43:43
Google When do they actually
43:45
thing. Is this do they actually
43:47
have to deliver? Is this the first time that
43:49
they've actually promised something and not delivered it? it?
43:51
It's you look at you look at
43:53
it and go, you ever lied to
43:55
me before? they deliver it in testing.
43:57
You can go to to into a studio.
44:00
.com. slash live and experience this right
44:02
now But it's not I stole
44:04
stole the line from William
44:06
Gibson yesterday our our namely the future
44:08
future is already here it's
44:10
just not evenly distributed yet distributed
44:13
yet. Yeah, yeah. But that's the whole whole
44:15
thing though, is that you've got
44:17
this thing that has has about
44:19
stuff, they promise stuff, and goes
44:21
into a lab. and maybe
44:23
it gets delivered, maybe it goes to the
44:25
Google And I'm not And I'm not suggesting that
44:27
we'll do that. But the real piece behind
44:29
this, this is is the thriller and
44:31
This is the is the thing, search. And
44:34
Google, it has to retain search.
44:36
has to a problem with search
44:38
And they have a problem with search, because the
44:40
and all stuff that they do and all of
44:42
that stuff where they explain things to you. it
44:45
It doesn't show you the ads. And that's
44:47
their problem. their problem. It's
44:49
not a technical problem. It's how do
44:51
I do this and still make all
44:53
this money? money. They have a huge problem
44:55
that other people don't have as a legacy
44:57
piece. John, you were gonna say something. you were
44:59
going to say something. just, I I
45:01
just, I think, I found that of the
45:03
interesting things about Gemini, I I
45:05
was looking at the co -pilot
45:07
vision last night. And some of
45:10
that some of that is similar.
45:12
that I love guys know that rest
45:14
of that, but and all the rest of that.
45:16
But from a business perspective, we're trying
45:18
to focus on I was So I was looking
45:20
at co -pilot last night and one of
45:22
the things that they showed with with vision. was
45:25
being able to interact with
45:27
co -pilot while you're going through
45:29
a web page. and that it that
45:31
it would Provide recommendations and comments
45:33
and things like that. I tried
45:35
to use that last night that last
45:37
not available feature is and guess when it's
45:39
coming home And guess when it's coming It's
45:42
coming in 2025. in 2025. Oh, if
45:44
you're if you're willing to spend
45:46
$27 Canadian a a month right now.
45:48
the for the It's actually available
45:50
in your browser in your browser.
45:52
Okay. Okay, probably not as
45:54
a business product but as a personal
45:56
product. you can get it for
45:58
the get it for month. a month. Believe it or
46:00
not, Jim, or I haven't haven't actually
46:03
bought co-pilot. I'm paying for
46:05
everything else but not co-pilot.
46:07
This is a is a lunch thing.
46:09
It's of thing. It's very
46:11
practical. Edge browser right now, you can browser
46:13
right now. You can
46:15
get for free. It'll summarize a document.
46:17
It'll do. thing. And of that sort
46:20
of thing. the Adobe we talked about the but that's
46:22
in there, but that's where I first
46:24
discovered the Adobe piece. I was was I
46:26
was looking to summarize it, and then Adobe started
46:28
summarizing the document for me, and I went, for me
46:30
and I went. and I didn't really I think about it
46:32
till I talked to you last night. it until
46:34
I talked to you let's talk about Adobe for a minute.
46:36
talk about of the things a people need
46:38
to be aware of and be concerned about. be aware
46:41
of and be picking on and I'm
46:43
picking that's the first one that came up,
46:45
but I'm sure there are other up, but I'm sure there
46:47
are that are doing the same thing. companies
46:49
that are doing the same thing, Adobe
46:51
The free version. version. has an AI
46:54
assistant tool or button in the
46:56
top right corner. corner. And
46:58
so if you you open a PDF
47:00
and you you click on that
47:02
AI .I. Assistant button. button, takes all
47:04
of the information that is in that
47:06
PDF. that is in puts it into
47:08
this and puts it window. this AI
47:11
assistant window. It's GPT4,
47:13
which which means you're taking
47:15
the material out of
47:17
your PDF document. and
47:19
using it to train GPT -4.
47:21
GPT4. You might might not it's using
47:24
using the API. the API the
47:26
API doesn't train the main model.
47:28
But the question you've raised is a
47:30
good one. This whole tsunami of
47:32
shipments. comes up with, oh my God,
47:34
there's all this stuff. all
47:36
this stuff, people, AI, people are going to
47:38
be putting another are going to be putting on
47:40
their phone the next day and if it's
47:43
Google they'll be putting it on their phone
47:45
next year phone next more dangerously they might be
47:47
in a lab a lab. where people don't
47:49
feel obligated to put proper safety
47:51
around it. Or they they couldn't
47:53
get one of these new applications,
47:55
now Adobe. Adobe, like I I said, maybe
47:57
using the API API, maybe fine. But what about?
48:00
that other app that they just bought that
48:02
you don't know about about. That That was part
48:04
of the whole the thing, right? thing, right? You can
48:06
can have it on your personal phone. You just
48:08
can't have it on your government phone. on your course,
48:10
what if somebody brings their personal
48:12
phone into the office? What if you
48:14
bring your personal phone to the
48:16
office you you your personal phone to mode? There's
48:18
all kinds of security risks. listen mode? and
48:20
all of this. of security nothing
48:23
wrong with all of this, and it
48:25
in a business scenario.
48:27
a business scenario, but... you need
48:29
to have that cyber cyber
48:32
security. on all of it. What's the
48:34
all of it What's the
48:36
worst case scenario that, how
48:38
could this the harm
48:40
the making sure that those things are
48:42
and making sure that
48:44
those things are being addressed,
48:46
whether firewall or things on the
48:49
firewall, or whether it's just creating a
48:51
policy that people have to sign
48:53
off that they've read and agreed to
48:55
abide by. by? There There was
48:57
a really cool news story a few
48:59
days ago and I don't remember who did
49:01
this. this, but they trained an AI
49:03
model on in cities and cities and
49:05
different locations. They basically in sounds as
49:07
opposed to training it on a
49:09
language math or or something like
49:12
that. was trained on trained on And
49:14
what they were able to
49:16
do was play a recording from
49:18
a city from somewhere. street somewhere.
49:20
Beated into the AI. I not only recognize
49:22
where these sounds are from, And
49:24
they be sounds from today, by
49:26
the way. the way. But what it
49:29
would do do would recognize that, these these
49:31
sounds are bouncing off this type of a
49:33
building over here. here. I hear the sound
49:35
of a car, the car, the appears to
49:37
be going directly in front of the building,
49:39
as opposed to the right, the wind
49:41
tends to be predominantly from this direction. wind tends
49:44
then it would actually paint
49:46
a picture of And then it would actually
49:48
that city of that
49:50
those buildings city, of from
49:52
not from Google. or memory or
49:54
something like this, but it
49:57
would create the picture based
49:59
entirely on how the sounds were bouncing
50:01
around at being generated
50:03
and so forth so forth mind-boggling.
50:05
Some Some guys in the UK able to
50:07
pick to pick the sound of of
50:10
the hum of your of the of
50:12
the electricity running through. wires
50:14
the wires outside your house. give your
50:16
location geo your location, these things
50:18
because these things are very
50:20
specific and they're actually in
50:23
the grid maps. maps. Years ago
50:25
years ago, did did a thing where they
50:27
had a a camera. looking at a
50:29
plant. sitting in the sitting in the
50:32
window. a freaking plant, by
50:34
recording the recording the tiny
50:36
little vibrations in the
50:38
leaves of the able to they
50:40
were able to regenerate the
50:42
conversation that was going on inside
50:44
the house. you want to talk
50:46
about you want to talk
50:49
about nightmares, John, we're entering a world.
50:51
world. I'm the Remember,
50:53
I'm the techno okay? here. painting a
50:55
a really scary picture. We're
50:57
entering a world where your idea of
51:00
idea of anything being
51:02
secure is basically complete
51:04
nonsense. There will be
51:06
no privacy. There will be no security.
51:08
can can already, they can already find
51:10
out where you are if They can
51:12
tell if you're home. You need
51:14
specialized tools for this. But it's possible
51:16
to tell whether someone is actually
51:18
in house. in-house based on on the
51:20
kind of sonic information and so
51:22
forth. You don't need an
51:24
infrared sensor for outside. You can actually
51:26
tell tell. by looking at these
51:28
little tiny microscopic movements that would
51:31
otherwise be completely unnoticed by
51:33
anything that required a human to
51:35
pick it up. to And yet
51:37
a machine is able to
51:39
make those correlations and those associations
51:41
and paint a picture of
51:43
whether of home or not. This
51:45
is or not. This is becoming a very clean world, but a
51:47
but a very scary at the same
51:49
thing. time. willing to admit that. So I
51:51
I think that at some point we're
51:53
gonna have to define Not just the AI run
51:55
AI run tools but the tools are
51:58
out there already. have have to define.
52:00
what does privacy mean? What does
52:02
security mean? does Does economic
52:04
system that we have in
52:06
place. Capitalism actually even work. actually
52:08
even is the thing. it's, this is
52:10
the thing, has their AI laws or whatever,
52:13
whatever had around for a while. a while around
52:15
get around it. I know people UK,
52:17
who Who basically live on a
52:19
freaking VP, and so that that they can
52:21
access these services What's the uk
52:23
going to do pass a law
52:25
that says that the use of the
52:27
use punishable punishable of years in prison
52:30
of the frightening thing is But the
52:32
they've done something that is done
52:34
it's flawed because there's ways around
52:36
it. it's flawed but then you
52:38
get to Canada. it. But then we're
52:40
still working on Bill C still
52:42
working on which is which is miles from being... The
52:44
U.S.S. doesn't have good legislation either.
52:46
Nobody's got good regulation of of AI and
52:48
because they look at all the things
52:51
we've talked about and if you
52:53
were listening to this program and
52:55
you were expecting answers, they don't
52:57
exist. exist. No, that's that's the the problem. are
52:59
the start of the conversation. and
53:02
we have to deal with these things whether it's... The are
53:04
useless because they can only give you answers.
53:06
can only turns you they're giving us a lot
53:08
of questions. giving us a lot of questions. Yes.
53:10
Cyber security has been an issue
53:12
and has been around or more
53:14
prevalent for a longer period of
53:16
time. of time. That's C-26, it? Yeah, it? Yes, that's
53:18
C-26 just went back to the
53:20
House of Commons the somebody didn't dot
53:22
an I or cross a T.
53:24
dot an I or cross a T? haven't even put
53:27
that one to bed yet. that one to
53:29
bed yet. gonna come and to come and
53:31
faster than anything else that
53:33
we've seen. else that it's frightening
53:35
it's fact that these governments
53:37
are so far behind. are so far
53:40
behind on of this. Can't
53:42
count on regulation. AI. Is it
53:44
is it character AI? now. Thank
53:47
God that the CEO and the other second
53:49
that the CEO back to other so
53:51
went back to Google so they're
53:53
not part of the lawsuits. they're
53:55
not. They not. be They can't be
53:57
held personally responsibly. But two
53:59
lawsuits, lawsuits, seriously? One child died, the other,
54:01
I forget, I don't know if
54:03
the other kid died or was
54:05
just damaged, the parents are suing
54:07
them. Now, because their AI didn't
54:09
have any guardrails functioning. No, that's
54:11
an extreme example. But as we
54:13
apply this, particularly in the US
54:16
where people are more litigious than
54:18
they are in Canada. There's some
54:20
limits. People are going to start
54:22
suing, but there have been some
54:24
class actions in Canada too. So
54:26
people are going to start suing
54:28
because of the results of AI.
54:30
So you not just have a
54:32
cyber security problem, you have a
54:34
problem. This is, I don't know
54:36
how you feel about this John,
54:38
but I think our approach to
54:40
cyber security AI is flawed if
54:42
we have cyber security people and
54:44
we have AI people. I said,
54:46
you can't paste the security on
54:48
afterwards. You have to bake cyber
54:50
security into everything you do. If
54:52
you start doing development, we did
54:55
some development work where I'm employed,
54:57
and one of the things that
54:59
I said to the applications team
55:01
is, don't think about cybersecurity after
55:03
the fact. Think of it before
55:05
you start touching the keys for
55:07
the first time in developing your
55:09
code. It has to be baked
55:11
in and it has to be
55:13
baked into everything we do, including
55:15
AI for that matter. All of
55:17
these things need to have that.
55:19
you need to take that into
55:21
consideration. The problem, and this is
55:23
where, we go back to the
55:25
tsunami, Jim, the speed at which
55:27
this stuff is being unleashed on
55:29
us, he makes that almost, you
55:31
know, it feels almost futile because
55:34
of the speed at which like
55:36
the tsunami of things that were
55:38
unleashed just in the last six
55:40
days alone, never mind the last
55:42
two years. when you've got a
55:44
really slow judicial system when you've
55:46
got a system that's built on
55:48
you have a very logical way
55:50
of looking at it you bake
55:52
it in at the beginning but
55:54
we a society we have a
55:56
world that's based on when we
55:58
find a problem we will fix
56:00
it. It takes a long time
56:02
to fix the problem after the
56:04
fact. It's reactive rather than proactive.
56:06
And of course monolithic corporations like
56:08
governments move at a glacial and
56:10
this stuff moves at the speed
56:13
of neutrinos. I was thinking about
56:15
this last night Jim Marcel when
56:17
I was going through what has
56:19
transpired in the last six days
56:21
and... If you look at what
56:23
Open AI has brought to the
56:25
table in the last six days,
56:27
that's more in a six-day window
56:29
than a lot of applications or
56:31
systems will bring out in a
56:33
year or two years. And they've
56:35
brought it all out in six
56:37
days. And I want to quote
56:39
that and... Okay, yeah, go ahead.
56:41
I was going to say, Marcellus
56:43
quoted William Gibson, one of the
56:45
famous writers, I want to quote
56:47
that great Canadian... philosopher Randy Backman
56:49
who said, but maybe you ain't
56:52
seen nothing yet. There's six more
56:54
days. Yeah, I know. And that's
56:56
what I keep sitting here thinking,
56:58
my God, everything that they've brought
57:00
out in the first six days,
57:02
what's left? What is left for
57:04
them to bring out? Maybe AGI
57:06
is day 12. A lot of
57:08
people are talking about GPT5. Be
57:10
there in the final release. Yep.
57:12
Yeah, that's true too. And that
57:14
could very well be. Like I
57:16
said, but you have to watch
57:18
this. We haven't talked about Amazon
57:20
or Claude or what's happened to
57:22
them. But silently in the background,
57:24
AWS has launched their own foundation
57:26
model. They have a full suite
57:28
of them. At least half of
57:30
it's available. Their biggest model is
57:33
not available till next year. But
57:35
they've been scrambling to keep up.
57:37
The interesting thing is, Basos himself
57:39
is back on AI and heading
57:41
that up. I watched this presentation
57:43
of a reveal. And this is
57:45
talking about sucking all the oxygen
57:47
out of the oxygen out of
57:49
a room. So I said, Altman
57:51
is brilliant at marketing. foundation model
57:53
too that's just as good as
57:55
GBT4 and nobody even knows. But
57:57
the interesting thing when you watch
57:59
the Amazon presentations, and this is
58:01
why I say we're going to
58:03
get hyper competitive, they mention every
58:05
other AI except Open AI. No,
58:07
they've got money in closed. They're
58:09
going to they're going to talk
58:12
about clothes, but they mention everything
58:14
except Open AI. Sundar Pichai. gets
58:16
up on stage just before his
58:18
announcement. What does he say? And
58:20
he mentions his own? In Google.
58:22
Yeah. Yeah, he only mentions Google.
58:24
No, he does talk about Microsoft.
58:26
He won't talk about opening eye.
58:28
He said, Microsoft could have, I'd
58:30
love to compare our search with
58:32
Microsoft if they had their own
58:34
mod. And then Microsoft came out
58:36
and said, hey, our stuff is
58:38
almost as good as yours. And
58:40
then Altman comes out and says,
58:42
like. You see that going over
58:44
the fence? That's a home run.
58:46
Bold my beer. So these guys
58:48
are going to get in, and
58:51
Basos now in being in AWS,
58:53
this is going to get even
58:55
more competitive. They don't want to
58:57
give this away so that open
58:59
AI is the next Microsoft. In
59:01
other words, it's ubiquitous. And its
59:03
partnership with Microsoft gives it a
59:05
chance of doing. So that's going
59:07
to do what the biggest AI
59:09
company is though. Meta has. AI
59:11
baked into everything in the world
59:13
and meta has more users on
59:15
more platforms than any other company
59:17
in the entire world. Every single
59:19
one of their products, whether it's
59:21
what's for messenger or Facebook, has
59:23
the baked into it. So meta
59:25
is actually the predominant ubiquitous artificial
59:27
intelligence product in the world. Don't
59:30
forget them. This is what happens.
59:32
We're all watching Open AI. And
59:34
I don't, the other thing is
59:36
you don't know what this, what
59:38
success is for open AI at
59:40
this point. Like I said, if
59:42
the AWS is in there, yes,
59:44
they've got the thing with Microsoft.
59:46
but what is their success point?
59:48
Search is one area that they're
59:50
certainly going to go after taking
59:52
a piece of Google's lunch on
59:54
search. Yeah, everybody uses them with
59:56
an API. Yeah, being the back
59:58
end API piece of this. And
1:00:00
then on the next six days
1:00:02
we might discover more about where
1:00:04
they're going with this. It's going
1:00:06
to be an interesting week. the
1:00:08
I think prosperous island maybe from
1:00:11
Shakespeare may be one of the
1:00:13
great analogies as well that there's
1:00:15
magic happening out there can be
1:00:17
good it can be tragic or
1:00:19
comic take your pick that seems
1:00:21
to you're taking the air out
1:00:23
of the room I think we're
1:00:25
wrapped for this one this has
1:00:27
been a great discussion thank you
1:00:29
guys I think we have to
1:00:31
go back and I don't know
1:00:33
how we're gonna do this we
1:00:35
have to tackle this discussion of
1:00:37
AI and cyber security I'm gonna
1:00:39
run this show in cyber security
1:00:41
today as well because I think
1:00:43
it's important, but we have to
1:00:45
get it to a discussion that
1:00:47
is better than, hey, sucks to
1:00:50
be you, pure a season. I
1:00:52
did not use those words, I
1:00:54
did. You were thinking it though,
1:00:56
Marcel. The prompt was generating that
1:00:58
from the AI in my brain.
1:01:00
Thank you, Marcel Guinea, John Pinart.
1:01:02
Thank you very much. Thank you
1:01:04
to the audience, if you're still
1:01:06
listening. Thanks. If you're still listening
1:01:08
to this, after this discussion, we'd
1:01:10
love to hear your questions, particularly
1:01:12
cyber security questions, or what you
1:01:14
think of how you're going to
1:01:16
manage this, that would be really
1:01:18
great. You can write me at
1:01:20
Editorial at tech newsday.ca. You can
1:01:22
send whatever you want, comments, questions,
1:01:24
whatever. I'd like to see where
1:01:26
we take this discussion so that
1:01:29
it's useful for everybody. We'll be
1:01:31
back probably in the new year
1:01:33
with our recorded versions, because next
1:01:35
week is the final edition of
1:01:37
our. weekend shows for the year.
1:01:39
And I think I've got some
1:01:41
really great guests, not that these
1:01:43
aren't really great guests, but some
1:01:45
surprise guest talking about the future.
1:01:47
of government and how we're
1:01:49
going to regulate
1:01:51
all of the
1:01:53
stuff that we're
1:01:55
doing and still
1:01:57
be prosperous. So
1:01:59
that'll be a
1:02:01
cool one. So
1:02:03
Thank cool one. Thank you
1:02:05
Thank you. Thank you. And
1:02:08
we'll see you
1:02:10
next week. week. I'm
1:02:12
your I'm your
1:02:14
host Thanks Thanks for
1:02:16
listening.
Podchaser is the ultimate destination for podcast data, search, and discovery. Learn More