Episode Transcript
0:00
Brought to you by Toyota. Let's
0:02
go places. Welcome
0:07
to Forward Thinking. Hey
0:12
there, welcome to Forward Thinking,
0:15
the podcast that looks at the future and
0:17
says Yoshimi, they don't believe me, but
0:19
you won't let those robots defeat me. I'm
0:21
Jonathan Strickland and I'm Lauren Vogelbaum,
0:23
and I was hoping that you would use Yoshimi
0:25
Battles the Pink Robots, uh quote
0:28
for this particular episode from The Flaming Lips.
0:30
I'm so pleased you recognized where
0:32
that came from. I think this is like the third
0:35
time that's happened in the entirety of the
0:37
what like two or three years we've been recording. There
0:39
was I think that when I used maps once you got
0:41
that one. To
0:43
be fair, my musical references
0:46
are different than Lauren and Joe's
0:48
musical references. That is generally
0:51
true. Speaking of Joe,
0:53
he is not with us today, as you probably guessed
0:55
from the fact that he did not chime in. Joe McCormick,
0:58
our third co-host, will be back very
1:00
soon. In the meanwhile, if you
1:02
couldn't tell, we're talking more
1:04
about robots. Yeah, we mentioned in our last
1:06
episode that we were having a
1:08
duo robot conversation this
1:11
week, and in fact, this topic that
1:13
we're going to talk about today, robot security
1:15
guards, was a suggestion from Joe
1:17
himself, and so we hope to
1:19
do justice to the topic, Joe, and
1:22
we look forward to you being back in the studio very
1:24
soon. So let's talk about
1:27
robots and security guards and
1:29
and and robot security guards. Yes,
1:32
because well, okay, we've talked
1:34
many times on the show before about the
1:36
amazing and sometimes strange
1:38
ways in which robots are taking over
1:40
traditionally human work, the
1:42
type of human work that frankly just
1:44
stinks, uh, you know, work that's
1:47
physically hard or repetitive, or dangerous
1:49
or just boring. Yeah, and all of those
1:52
things are detrimental
1:54
to a person's sense of self
1:56
worth. They can affect job satisfaction.
1:59
Like if your job ends up devolving
2:02
into something that is repetitively dangerous,
2:04
obviously, that can have a profound
2:06
effect on you stress, either physical
2:08
or emotional or both. And even if
2:10
it's just dull, if it's just boring
2:13
and repetitive, then that can have
2:15
an effect on you as well, a big negative
2:17
impact. Absolutely. And okay,
2:19
so so security why is security
2:23
this type of a field, this type of field
2:25
in which we would potentially want to get robots
2:27
into. Well. Well, uh,
2:30
some some background factoids on security.
2:32
According to the U. S. Bureau of Labor Statistics,
2:34
there were about a million human
2:37
security guards employed around the
2:39
United States as of Um.
2:41
They generally make about twelve or thirteen bucks
2:43
an hour on average, patrolling and monitoring
2:46
premises for suspicious behavior and
2:48
crime. And they're important because
2:50
even even in this world of ours that is increasingly
2:53
rigged with security cameras everywhere,
2:55
security guards, you know, having
2:57
having a physical person with a physical presence
3:00
can absolutely be a deterrent to crime, right,
3:02
I mean a camera might
3:05
deter a criminal, it might not. And if
3:07
the criminal has decided to go ahead and commit
3:09
said crime, perhaps they've
3:12
got a mask on or something, they feel fairly confident
3:14
that the camera's not going to be able to
3:16
figure out their identity. What's the camera
3:18
gonna do? Not not a whole lot,
3:20
just going to record like it
3:23
usually does the same thing it does every night, pinky
3:25
yeah. Um. And furthermore, as we know
3:27
from every heist movie ever, cameras are
3:29
really easy to trick, uh,
3:31
and very entertaining in the
3:34
ways that you can come up with to do so.
3:36
So okay, um,
3:39
this job is important, but it is
3:41
not an ideal job. It is it is dangerous
3:43
by nature to put your soft and
3:45
fleshy body in the way of a person who's
3:48
desperate or determined enough to steal
3:50
or destroy property and uh
3:53
and furthermore, it can just be lonely and boring
3:55
and patrolling an area over and over again
3:58
isn't really fun. Yeah, so it's already
4:00
fitting two of the three DS
4:02
that are often cited as work that is
4:04
ideal for robots and not for
4:06
humans. That would be the dangerous and
4:09
dull. But depending
4:11
upon where you're doing your security gig, you might even
4:13
hit the third D, which is dirty.
4:15
Dirty meaning not just like, oh, that
4:17
place is poorly maintained. But it
4:20
may be that we're talking about security guards for
4:22
areas like where there might be hazardous
4:24
waste and you don't want people wandering
4:26
into that place. But then you
4:28
put a human being in danger by
4:30
placing that person in close proximity with
4:33
those hazardous materials. So
4:35
what's the solution. Why not
4:37
put some robots there? Robots?
4:40
So here's the thing, Lauren
4:43
went out and started looking at this
4:46
crazy, crazy company
4:48
that has a security robot that's that's
4:50
been active for a couple of years now, particularly
4:52
over in the Silicon Valley area.
4:55
But recently we saw
4:58
some news coming out because Uber
5:00
has started to use one or maybe
5:02
two of these robots at one
5:04
of their lots. Right. Yeah, we were inspired
5:07
by this news story, and so we wanted to
5:09
to look in to this startup called
5:12
Knightscope, which is an awesome
5:14
name. Yeah. Knight as in like the stabby like
5:17
armored human kind of not not night the
5:19
Holy Grail style. Yeah yeah, not the nighttime
5:22
but the dark dark thing. Yeah. Okay. So
5:25
so in this this startup
5:28
in Mountain View, California, home of Google,
5:30
home of Google and etcetera, UM
5:32
started designing this robot that they called
5:34
the K5, and what
5:36
they eventually brought to market is a five
5:38
ft three inch three hundred pound
5:41
shiny white doom egg. Yeah
5:43
that's a good way of describing it. It is conical,
5:47
not comical, although I
5:49
guess depending on what happens to it, that also could
5:52
be the case. But but conical
5:54
in shape and uh three
5:56
hundred pounds, I mean this is this is not a lightweight.
5:59
Yeah, I mean, I am five foot three,
6:01
y'all. For for for reference and and the
6:03
idea. Like, I'm thinking that I could
6:05
easily like ride one of these, like a like a doom
6:07
egg pony, but but I
6:10
probably shouldn't. It
6:13
might start making noise, but we'll talk about that in a
6:15
second. Yeah, yeah, okay. So,
6:17
so these things are outfitted with just a whole
6:19
lot of sensors. You've got four high
6:21
definition cameras in there that give it three
6:23
sixty degree vision. You've got a license
6:26
plate recognition camera, four microphones,
6:28
a thermal camera, a laser range
6:31
finder GPS, and
6:33
a weather sensor um or or an
6:35
environment sensor really you should say that
6:38
can monitor the temperature, the barometric
6:40
pressure, and carbon dioxide levels in
6:42
the area. So what all
6:44
of this does is help it find its way
6:46
around, watch any humans that
6:48
are in the area, detect fires very
6:51
important, uh huh um, watch for
6:53
license plates for potential troublemakers
6:55
like recently fired employees or
6:57
like a car that seems to be casing the joint something
7:00
like that, and etcetera. Um
7:02
it also has Wi Fi capacity so that it can
7:04
communicate with fellow security bots and
7:07
also with human personnel, who are either
7:09
remote or on the site, but not you
7:11
know, like poking it right, not right, not
7:13
not walking, not literally walking step
7:15
by step with the robot um
7:18
Furthermore, like like Roombas, they know
7:20
when they're running out of juice and can go
7:23
head back over to their recharge station. Now
7:25
what was interesting to me, beyond
7:27
the technology here, which is already impressive,
7:29
this idea, you know, it seems fairly
7:32
comprehensive for your basic uh
7:34
guarding type of duties. I mean you're
7:36
talking about some surveillance, some recording
7:39
patrolling. Is that Knightscope's
7:41
come up with a clever means for generating
7:44
revenue. They're not like, obviously these
7:47
robots have to be mega super
7:49
expensive, but they're not. They're
7:51
not just saying, hey, come look
7:53
at our catalog and you see this
7:56
dollar robot. How many would you like to order?
7:58
Yeah? No, they they have a rental
8:01
business model. So uh
8:03
so the official company line and how much
8:05
they cost to rent is that you should totally
8:07
speak with a sales representative to find out a plan
8:09
that works best for you. But
8:12
but news reports have listed it at about seven
8:15
dollars per robot per hour, or
8:18
about sixty thousand dollars a year for twenty four seven
8:21
surveillance. Right. So if
8:23
you think, uh, seven dollars per
8:25
hour for a robot, like that's
8:27
probably less than what you would
8:30
pay a human person, and
8:32
in fact, according to averages, it is.
8:34
So in that case, you've got Knightscope
8:37
having a sales pitch saying, well,
8:39
if if your needs don't
8:41
require an actual human being to be
8:44
there all the time, all the time, then
8:46
perhaps this robot would be a way that not
8:48
only could you maintain security at your
8:50
space, but you could save money in the process.
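That savings pitch is easy to sanity-check against the figures cited earlier in the episode, the roughly seven-dollars-an-hour rental rate from news reports and the twelve-to-thirteen-dollar average guard wage. A quick back-of-the-envelope sketch (keeping in mind that round-the-clock human coverage actually means several guards working in shifts):

```python
# Rough cost comparison: K5 rental vs. human coverage, using the
# figures cited in the episode (news reports, not official pricing).
HOURS_PER_YEAR = 24 * 365  # around-the-clock coverage

robot_rate = 7.00   # dollars per robot-hour, reported rental rate
guard_rate = 12.50  # rough midpoint of the $12-13/hour guard wage

robot_annual = robot_rate * HOURS_PER_YEAR
guard_annual = guard_rate * HOURS_PER_YEAR

print(f"Robot, 24/7: ${robot_annual:,.0f}/year")  # ~$61,320
print(f"Guard, 24/7: ${guard_annual:,.0f}/year")  # ~$109,500
print(f"Difference:  ${guard_annual - robot_annual:,.0f}/year")
```

So the reported sixty-thousand-dollars-a-year figure is consistent with the seven-dollar hourly rate running nonstop.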
8:53
Yeah, and uh and and under under this kind
8:55
of concept, they are in use by
8:57
a few tech companies around the Silicon
9:00
Valley area, like like Qualcomm and Uber
9:02
we mentioned, and they
9:04
were certainly in use for a demonstration at Microsoft.
9:07
I'm not sure if they actually Yeah.
9:10
Well, what's interesting to me is when I started doing research
9:12
on this, I keep coming up with articles
9:15
from different time periods, like two
9:17
years apart. They were like and every time
9:19
it was like people had just discovered them
9:22
right right, and I had known I had only
9:24
seen them right now. So and I read
9:26
a lot about robots. I'm
9:28
like, how did I miss this? They're glorious, to
9:30
be fair, even people in Silicon
9:33
Valley have missed it because every time
9:35
there's another report about these robots being put
9:37
in place by someone, it's like, this
9:40
is amazing, it's brand new, that
9:42
happened two years ago. There's a hashtag on Instagram.
9:45
I don't anyway, Yeah, that is hashtag
9:47
security robot. By the way, if
9:49
you'd like to look it up and see some pictures and videos
9:52
of it interacting with humans. Um, so
9:54
let's say that you rent a few of
9:56
these. How do you set it up? How
9:58
does it know where to patrol? Well,
10:01
you lead it around with a wireless
10:03
controller to help it build out a basic
10:06
map of its patrol routes. That makes sense because
10:08
it sounds adorable. It wouldn't magically
10:10
have the information stored in its
10:12
data banks. It has to learn clearly, So
10:15
yeah, it's it's kind of like taking an
10:17
employee on a tour, right exactly,
10:19
just your giant metal employee.
10:22
Well, and the other neat thing is because it's a machine,
10:24
it doesn't forget unless its memory gets
10:26
corrupted, so exactly, it will always remember
10:28
where everything is. Yeah, and so, and
10:31
by observing its surroundings, it starts to
10:33
learn, you know, where it can and cannot roll
10:36
because it is a rolling machine, not a walking
10:39
machine. So you can defeat the security
10:41
robot with a short curb or
10:43
yeah, exactly, just a just a slight ledge.
10:45
It's it's kind of got the classic
10:48
Dalek uh
10:51
weakness. Although I know, I
10:56
know that today they can fly, but
10:58
back in the day they couldn't, you know. And
11:01
and the first time that a Dalek said elevate and
11:03
went up some stairs, like, I was terrified.
11:06
I was genuinely upset. Spoiler
11:08
alert for first season of Doctor Who
11:10
reboot season nine. I guess of the
11:13
technically, but whatever at
11:15
any rate. Um, yeah, so so you so you lead
11:17
it around to to show it where it can
11:20
and cannot go. It will start building up its own
11:22
personal database of its surroundings
11:24
there. And furthermore, you can program in GPS
11:27
coordinates to give it like an outline of
11:29
like like a fence almost. Yeah, so it's like
11:31
it's a very invisible fence that the robots
11:33
not going to go beyond. All right,
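Knightscope doesn't publish how its geofencing is implemented; purely as an illustration, an "invisible fence" built from programmed GPS corner coordinates can be as simple as a ray-casting point-in-polygon check. The function name, fence shape, and coordinates below are all invented for the sketch:

```python
# Hypothetical sketch of a GPS "invisible fence": treat the programmed
# coordinates as polygon corners and test the robot's position against
# them with the classic ray-casting point-in-polygon algorithm.
def inside_fence(lat, lon, fence):
    """fence: list of (lat, lon) corners; returns True if inside."""
    inside = False
    n = len(fence)
    for i in range(n):
        lat1, lon1 = fence[i]
        lat2, lon2 = fence[(i + 1) % n]
        # Does a ray cast due east from (lat, lon) cross this edge?
        if (lat1 > lat) != (lat2 > lat):
            cross_lon = lon1 + (lat - lat1) / (lat2 - lat1) * (lon2 - lon1)
            if lon < cross_lon:
                inside = not inside
    return inside

# A made-up rectangular lot near Mountain View, CA.
lot = [(37.386, -122.084), (37.386, -122.082),
       (37.384, -122.082), (37.384, -122.084)]

print(inside_fence(37.385, -122.083, lot))  # True: keep patrolling
print(inside_fence(37.390, -122.083, lot))  # False: turn back
```

In practice a production system would use a more robust geometry library, but the idea is the same: the robot's GPS fix either falls inside the programmed outline or it doesn't.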
11:35
So all right, I've decided
11:37
to rent my security my RoboCop
11:41
and and he says, your move, creep. No wait,
11:43
no, I put him. I put him out there, uh,
11:46
patrolling the area that we have determined.
11:50
Let's say that it detects something
11:52
hinky is going on. What happens next?
11:54
Not a hanky thing, a hinky thing. Um,
11:57
it phones home, and by home, I
11:59
mean whatever humans are on security
12:01
team at that current moment um. You can have it
12:03
alert different people during different shifts, and
12:05
anyone who gets an alert can tune into
12:08
a live stream of what the bot is
12:10
seeing and hearing. Okay, So
12:12
in other words, you whether it's you've
12:14
got a team that's on site, but they
12:16
are maybe they're overseeing the entire
12:19
area. Like if you're talking about a really big
12:21
area, you might have multiple robots to like
12:23
on different floors of a building for example. Oh absolutely,
12:25
Or you might even have I imagine
12:28
like a like a larger security call center
12:31
that that's servicing many different robots
12:33
over different businesses. Right, that would make sense,
12:35
and you would have certain teams assigned to specific
12:37
robots. Therefore you would get that notification.
12:40
So all right, let's say that, uh,
12:43
we've got the security robot. Clearly,
12:46
this is something that's meant to interact with human
12:48
beings, whether it's someone
12:51
that needs help or it's someone
12:53
who's doing something they shouldn't be doing so,
12:56
how how does the robot actually perform
12:59
those interactions. Does it just like say like
13:02
please don't do that? Well,
13:05
yeah, yeah, because because on on the ground level, you know, like
13:07
it's it's going to phone home if it senses a
13:09
problem. But um, but it's
13:11
it's the humans in the in the security
13:13
team that are going to actually choose
13:17
what what happens, like basically
13:19
the robot disengages. Essentially,
13:21
the robot is programmed to not interact
13:24
with humans. Um, I mean it's it's which
13:26
is a strange feature because and I think
13:28
that the team put really quite a lot of research
13:30
and development into the physical
13:33
design and the response design
13:36
of this robot um too, to
13:38
to make it this object that can
13:41
interact when it has to. But that really
13:44
isn't meant to It's not it's not
13:46
interfering with someone. It's not it's
13:48
not intimidating someone who is
13:51
perhaps just there
13:53
under normal circumstances or
13:55
perhaps they need help and they're coming to the robot
13:57
to get help. And because the robot actually
13:59
can can communicate back to yeah,
14:02
there's there's a button for that. If you need help, you
14:04
can push a button. It's essentially an emergency
14:06
call button. Yeah, exactly. So you don't
14:08
want to design your robot to look like, you
14:10
know, like the Queens and aren't Killbot two thousand
14:12
with spinning saw blades
14:14
all over. Yeah. Yeah, You're not going to go up to that kind
14:16
of robot for help. I mean also, you gotta run
14:18
from that robot screaming help. Yes.
14:21
Furthermore, if if your if your kid wanders up
14:23
to it, that's a bad time. It's probably
14:27
you don't want so. So you know, you
14:29
wanted to make it intimidating enough to
14:31
deter crime, but you don't want to make it unfriendly,
14:34
cute enough that it doesn't scare people, but
14:37
not so cute that people are twistedly
14:40
willing, er, motivated to mess
14:42
with it, as we have seen
14:44
before in studies of human robot
14:46
interaction. Uh remember that
14:48
that hitchhiking robot, the one that successfully
14:51
got all the way across Canada and then
14:53
maybe two days into the United States before
14:55
it was stolen and disassembled. Good,
14:59
good job, Pennsylvania. That tells you
15:02
so much about the different cultures
15:04
of Canada and the United States. Or
15:07
yeah, okay. There was also a study in Japan
15:11
of how people would interact
15:13
with a robot that was designed to to help
15:15
the elderly shop Yeah, good
15:17
good robot. Kids messed with it even
15:20
and perhaps because it was saying things like
15:22
somebody help me or ouch that hurts.
15:25
But I imagine in Japanese there is something
15:29
tragically comical about a robot
15:31
crying out saying somebody help me, because
15:35
you know, it's not really feeling anything.
15:37
Yeah, it's not in pain, and that
15:39
kind of makes it funny, and that kind of makes
15:41
you want to mess with it more. I mean, I'm
15:44
a softie, so I have a feeling that I would
15:46
totally feel empathy for such a machine,
15:48
but I can also see the
15:51
perverse appeal of messing with
15:54
it further. Right, Yeah, there's there's really
15:57
interesting and probably terrible
15:59
psychology at work here.
16:01
Um So, so the team,
16:04
it seems like, was aiming to make the K5
16:06
like clearly inhuman
16:09
like, like clearly a a robot
16:12
and not attempting to to tug
16:14
on your heart strings. Right, it doesn't have big puppy
16:16
dog eyes no no, no, no like
16:18
that. Yeah, but also obviously
16:20
autonomous and obviously intelligent.
16:23
Um So, if it's approached
16:25
by a human person, it will turn
16:28
turn to watch you with all of its
16:30
big front sensors because it kind of kind
16:32
of has a front and a back. Um, if you corner
16:35
it, it will stop moving. Just I'm
16:37
pretty boring anyway, Why would you want to mess
16:39
with me? I'm three hundred pounds. Where are you gonna
16:42
come on? What are you gonna do? Stop? Um?
16:44
If you keep it cornered or if you start poking
16:46
at it, it will chirp at you and
16:48
and also phone home like like a low level alert
16:51
like hey, And then you can actually
16:53
have a person say something like hey,
16:56
knock it off, we can see
16:58
you. Yeah.
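The escalation ladder the hosts walk through, turn to watch when approached, stop when cornered, chirp plus a low-level alert when poked, full alarm with an official alert when tampered with, amounts to a small lookup from event to response. This is a hypothetical sketch of that tiered logic, not Knightscope's actual software; the event names and strings are invented:

```python
# Hypothetical sketch of the K5's tiered responses as described in the
# episode. Each tier maps to a local action plus an optional alert
# level sent back to the human security team.
RESPONSES = {
    "approached": ("turn to face the person", None),
    "cornered":   ("stop moving",             None),
    "poked":      ("chirp",                   "low-level alert"),
    "tampered":   ("ear-piercing alarm",      "official alert"),
}

def respond(event):
    """Return the robot's local action; notify humans if the tier calls for it."""
    action, alert = RESPONSES[event]
    if alert is not None:
        # "Phoning home": the on-shift team can tune into the robot's
        # live audio/video stream and talk back through it.
        print(f"[{alert}] notifying security team, live stream available")
    return action

print(respond("cornered"))  # stop moving
print(respond("tampered"))  # alarm, plus an official alert to the team
```

The key design point from the episode survives even in this toy version: the robot itself only ever observes and escalates; every real decision stays with the humans receiving the alerts.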
17:00
Yeah, and if you if you try to mess
17:02
with it, it'll sound an actual alarm
17:04
like like not like a friendly like hey, more
17:07
like a yeah, more like more
17:09
like imagine the worst fire alarm
17:12
you've heard at the loudest level,
17:14
something akin to that, because
17:16
I have seen uh it
17:18
described as ear piercing
17:21
Yes, uh so so yeah
17:23
and and and then then that will send
17:25
like an official alert to to the
17:27
team managing it, and uh and
17:29
the team managing it can I'm pretty
17:32
sure, like I like I wasn't clear, but but it sounds
17:34
like you think that this is the case that the team can
17:36
like speak to you through the robot. I mean,
17:39
if you have if you have a help button, I
17:41
would imagine you have to have some sort of communication
17:44
back or else. All I would do is have like
17:46
an elevated sense of this button
17:48
is doing nothing, and we just peeked
17:51
behind the curtain. Lauren wrote, and
17:53
I recorded an episode of BrainStuff
17:55
about placebo buttons. That's one case where you
17:57
don't want a placebo button. No, no, that's an emergency
18:00
call. You want a real, effective button. Yeah,
18:02
that would be very, very upsetting. So my
18:04
guess is, while I didn't see anything specifically,
18:08
uh yeah, common
18:11
sense would dictate there has to be some sort of speaker
18:13
through which people could could talk. And
18:16
I would imagine that you would that would
18:18
become useful both in a security feature where
18:20
you say, hi, you're under
18:22
surveillance, don't do that, go
18:25
away, or we
18:27
we heard you, we're sending police
18:29
or ambulance or whatever the emergency
18:31
might be to your location right
18:33
now, and we'll stay on the phone with you that kind
18:35
of stuff. Yeah. Um, but but there's also
18:37
a list of things that it will absolutely
18:40
not do. Yeah, it won't physically interact with a
18:42
person. It's not gonna it's not gonna roll
18:44
up and be like, hey, back off, dude. I
18:46
like, I
18:50
did the universal fronting,
18:52
like hold me back, hold me back. Yeah,
18:55
that's not going to do that. Um. The
18:57
closest you will get to any sort
19:00
of physical interaction is that alarm,
19:02
which is powerful enough for you to feel, perhaps,
19:04
and sound is technically a physical
19:07
phenomenon. So I guess by then you
19:09
could say that's how it gets physical with you. But it doesn't
19:11
like like, it doesn't have a little
19:13
zapper that comes out a la
19:15
R2-D2. No, it's not weaponized in any way.
19:18
It's it's not. It's not. More about that one in just
19:20
a moment. It's not going to mess with you back.
19:23
Um. And I'm delighted
19:25
by by the existence of this thing,
19:27
a little bit creeped out. Um. You
19:30
know, the fact that it can look me right in my beady eyes.
19:32
You know, I'm not entirely sure how I feel about that.
19:34
I also the fact that they
19:36
one of the owners of the company or one of the creators
19:39
of the robot, in one interview, said like it's the
19:41
absolute minimum height that it could
19:43
be in order to Um, to look around
19:45
a parking lot, and I was like, absolute
19:47
minimum.
19:50
You can't be any shorter, Lauren, you
19:52
will be lost in every parking lot. Um.
19:56
Well, one of the things that makes me think about is that, Well,
19:59
on the one hand, and like you
20:01
get into the discussion about, yes, this is
20:03
taking a job that is not the
20:05
greatest job for a human being, But on the other hand,
20:07
you're like, well, there are human beings who are depending upon
20:09
jobs like these in order to make a living. So there's
20:12
there's two sides of that point. You
20:14
would hope to be able to come up with a means,
20:17
and it's beyond the scope of
20:19
Knightscope. It's beyond their scope to
20:21
come up with the means, But you would hope you could
20:23
come up with some way of getting
20:26
those people who would otherwise be be out
20:28
of a job into a different line of
20:30
work or some other related field
20:33
so that they could continue to make a living and contribute
20:35
to society while also not being
20:37
put in physical harm or in
20:39
a situation that's going to be
20:41
soul sapping because it's so dull and repetitive.
20:45
The robot is not going to care, so
20:47
so it's up to us to care. Yeah, I also
20:49
I like the idea of a robot being able to
20:51
to record everything, uh
20:55
perfectly, like with video and audio.
20:58
Uh. Once upon a time, Lauren,
21:00
before I worked at How Stuff Works, by
21:02
about ten years, I had
21:05
a temp job working
21:07
for a security a private security firm.
21:09
My job was to transcribe security
21:12
officer reports, handwritten
21:15
reports, so that they could be filed in the computer
21:17
system. And that
21:20
was not the best or most
21:22
rewarding job I've ever had. The people who were
21:24
working there were perfectly earnest
21:27
and hard working, and I
21:30
give all the credit to them. They were not
21:32
writers, and sometimes deciphering
21:35
what they were writing, not just like physically deciphering
21:38
what a word was, but trying to figure out what
21:40
they meant. Um,
21:43
I'll have to tell you the story of Cookie after
21:46
this podcast. I'd tell it now, but it would be
21:48
such a huge tangent that it would be a waste of time,
21:50
so I'll tell you afterwards. But
21:52
but you wouldn't have to worry about that because the robot
21:54
would be recording everything. So that amount
21:57
of data, and having a
21:59
continual stream of that data
22:02
at a certain point becomes a little bit of a big data problem.
22:05
Um, But it's probably a good problem
22:07
that many companies would love to have, well, especially
22:09
if you have to pursue a legal claim
22:11
against someone, because you're not relying upon
22:14
uh eyewitness
22:16
account or a security person's account
22:18
of what happened, or a stationary camera's footage
22:21
right where you might not have a good view, you've got
22:23
a three sixty degree view, a mobile
22:26
view, because they can move to wherever the problem
22:28
is or somewhere that has a better
22:30
line of sight of where a problem might be and
22:33
have a recording of that thing that is
22:35
incredibly valuable, well beyond like
22:38
an eyewitness account, which we know is
22:40
not entirely or even largely
22:43
reliable, scientifically. Yeah. Absolutely,
22:45
um. But but this, this Knightscope
22:48
K5 is not the only security
22:50
bot out there. No, I found a couple of others.
22:53
There's the Avatar III security
22:55
robot from RoboteX.
22:58
That's T-E-X at the end of that
23:00
word. But it's twenty four by
23:02
fifteen point inches by six point
23:05
fourteen inches, which is about sixty two by
23:07
thirty nine by fifteen point six centimeters,
23:10
so a little bit more
23:12
than two ft long, a little bit more than
23:14
a foot tall, and bigger than
23:16
a bread box half Yeah,
23:18
yeah, and uh it looks like
23:20
a like a little tread robot, Like it's
23:22
got treads to propel it, so don't
23:24
we Well, it's got wheels, but wheels to turn the tread
23:27
like, so, two treads, kind of like
23:29
a tank. It's got a little kind of
23:31
glass dome on the top that has all the
23:33
cameras and sensors on it, and a pole
23:35
for an antenna to connect to
23:37
WiFi. This one is
23:40
under manual control. It's
23:42
not intended to be autonomous, but it
23:44
would allow someone to patrol
23:46
an area without having to leave their
23:49
nice air conditioned security room,
23:51
for example. Oh yeah, or to check out a situation
23:53
that could potentially be dangerous. Yeah, if
23:55
it's if it's something where, you know, maybe someone
23:58
spotted a box and you're like, well,
24:00
let's get a closer look before we call in
24:02
any authorities. Yeah, because
24:04
it may just be that, oh, someone accidentally
24:06
set their stuff down while loading things into a car
24:08
and then drove off, forgetting it. Or it
24:10
could be that it's clearly a package of ramen.
24:13
Yeah, then let's not
24:16
call the SWAT team. Uh. So it's
24:19
a little different, it's not it's not like the
24:21
K5, in that the K5 is semi
24:23
autonomous, mostly autonomous, really, But
24:26
then you have the AnBot
24:28
from China. Now, this
24:30
one is a little shorter and
24:33
and and about half the weight of
24:35
the K5. It's one point five meters tall,
24:37
which is about five feet, and weighs seventy
24:39
eight kilograms, or about a hundred seventy two pounds.
24:42
It's also cone shaped. In fact, it
24:45
looks a lot like the K5, to
24:47
the point where I start to wonder if this is one of
24:49
those cases where one person
24:51
maybe possibly copied
24:54
an idea that someone else had. It could very
24:56
well be that these are parallel forms of development
24:58
that have nothing in connection with each other. Happens frequently
25:01
remarkably similar designs. However,
25:03
um to the point where when I looked at the first
25:06
picture, I thought, well, why do they have a picture of the K5
25:08
on here? And then I was like, wait, that's
25:10
not the K5. That's the AnBot. But
25:13
um so uh.
25:16
It is meant to patrol areas
25:18
prone to civil unrest, according
25:21
to Newsweek, at any rate, and it is autonomous,
25:23
and that it can patrol an area, has
25:25
obstacle avoidance technology,
25:28
and it is able to go back to like a recharging
25:31
point and charge itself. But it
25:33
is also quote equipped with weapons,
25:36
although to be fair, those weapons must
25:38
be manually operated by someone.
25:40
So it's not like the robot just decides
25:43
to hit you. Yeah, that's that's a much
25:45
better option. Someone controlling the robot has decided
25:47
to hit you, I guess, depending upon who's controlling
25:49
the robot. The weapon, by the way, is an electrically
25:52
charged riot control tool, so essentially
25:54
like a stun gun or a cattle prod
25:56
something along those lines. So it is like
25:58
the zapper from R2-D2 popping out
26:01
and giving someone a jolt of electricity.
26:04
But if that ever does happen to you, if
26:06
you are ever zapped by
26:08
an AnBot, it's because
26:10
somebody controlling it doesn't like you
26:12
or doesn't like what you're doing. It's
26:15
not because the robot decided to. Yeah,
26:17
unless the robot's malfunctioning in some way. If
26:20
you see an AnBot in China and
26:22
there's a little protrusion extending
26:24
from it and sparks are coming off, just walk the other
26:26
way, yeah, don't. You
26:29
don't need to go in that direction anymore.
26:31
There are other places there. China is big, there's
26:33
a lot of stuff to see. Yeah, so anyway,
26:36
um, I thought that was interesting. But there are of course
26:39
tons of robots that sort of fall
26:41
into the spectrum of security. But
26:43
but most of them are really not meant
26:45
to be security robots. They are robots
26:48
that have features on them that can
26:51
be part of a security system. Typically
26:54
we're talking about cameras, so vacuum
26:57
robots that happen to have a camera on them. While they
26:59
don't necessarily need the camera for doing
27:01
the vacuum job, but it allows you to go
27:04
through an app and see what's going on in your house
27:06
while the vacuum robot is active
27:09
and it acts as sort of a surveillance camera.
27:11
So really it's it's not so much a security
27:13
robot. It's more like just a webcam. Well it's
27:17
it's a robot that does
27:19
Yeah, it's it's on the it's on the light
27:22
end of the spectrum. But
27:25
this is just where we are. Now, where
27:27
are we going? This is Forward Thinking. Think
27:30
about the future. Uh yeah,
27:32
well, I mean okay, like
27:35
we were saying earlier, this this could
27:37
potentially if we if we follow
27:39
this out into having full fledged
27:41
security robots, we we were gonna
27:43
have to deal with a with a with a job deficiency
27:46
UM and and Knightscope says
27:48
themselves that that robots like the K5 are
27:51
not intended to replace human
27:53
security guards certainly not yet UM,
27:56
but rather to keep them out of harm's way as much as possible,
27:58
and to arm them with better data about
28:00
the environment that they're guarding. But
28:04
but what if we do create robots that
28:06
are sufficiently advanced to replace
28:08
human staff? Like they they have the
28:11
features that will allow them to do a security
28:13
guard's job without the
28:15
need for the
28:19
team. Right, right, no, nothing to to make the decisions
28:21
or to come out and zap people themselves. It would
28:24
be tough, uh tough
28:26
for many many reasons. One
28:28
of the big ones that we've said in other
28:30
episodes. There's some things that humans can
28:33
do really well, like naturally well, you
28:35
learn how to do it as a kid and you do it for the rest
28:37
of your life, and robots are
28:39
not good at those things. So,
28:41
for example, the K5 isn't
28:44
really great at staying upright on uneven ground.
28:46
There was one article
28:49
I read where, as they were chatting
28:51
with a representative in the background,
28:53
they saw a K5 get too
28:56
close to the edge of like a curb and
28:58
tilt over, and it
29:00
doesn't have arms. It can't push itself
29:02
back out. It can't. Yeah, so then you have to pick up
29:04
a three-hundred-pound robot and put it back on its
29:06
base. Um, not
29:09
necessarily easy to do. And the DARPA
29:11
Robotics Challenge that we talked about in
29:13
a previous episode, Yeah, the best
29:15
people in the world working
29:17
on robots that can do a multitude
29:20
of tasks are still pretty
29:23
crap at making robots that can walk around.
29:26
Yeah, like that. It's hard to make a robot
29:28
that can climb stairs or open a door and
29:30
step through, things that
29:32
most people find, you
29:34
know, if not effortless,
29:38
at least not so challenging.
29:40
It's impossible, right, but
29:42
there are robots that literally, like, you can watch that
29:44
video clip of the robot fails
29:46
of the DARPA challenge, and usually
29:49
it's set to Yakety Sax, which makes it hilarious
29:51
and also sad at the same time. But
29:55
it's crazy to see the little things
29:57
that robots just find incredibly
29:59
challenging. And when
30:01
you start to think about it, you realize, yeah,
30:04
replicating what humans can
30:06
do, creating a machine that can operate
30:08
within a human world, a world
30:10
that we have shaped, that we able-bodied,
30:12
more or less average-sized
30:14
humans have shaped. Right, it
30:16
is not easy to do that. Um.
30:19
So also, we have to point out, security guards would
30:21
likely need at minimum the basic skills
30:23
of a typical, able-bodied human, and
30:26
maybe more than what just
30:28
a typical person would be capable of doing, depending
30:31
upon the type of security guard and the type
30:33
of gig, right. And plus, in addition
30:35
to that, if you don't have that team of people
30:37
that it can phone home to in order to make decisions
30:40
for it, you need the capacity to autonomously
30:43
decide when someone's a threat,
30:45
how much of a threat they are, and
30:47
then how to stop them. Um. Computers
30:50
only just learned how to identify cats.
30:52
Yeah, I'm not sure how long
30:54
it's going to take us to get to a
30:57
design of that level
30:59
of decision making and intelligence until
31:01
I can encounter a robot that not only
31:03
knows what cats are,
31:06
but can tell you, don't go see the show Cats,
31:09
it's really not that great, and furthermore,
31:11
don't scritch its belly immediately. Cats.
31:15
Yeah, there are a lot of
31:17
subtle things. I
31:19
think exactly that if a robot
31:22
isn't able to tell me what those are, I'm
31:24
not going to trust it to be guarding
31:26
something of supreme importance,
31:29
especially, at
31:32
least, in a way that's not going to hurt somebody,
31:35
Right, which brings us to the next point,
31:37
the idea that security guards
31:40
occasionally have to use some form of
31:42
force in order to protect a person
31:44
or protect property. Yeah. Yeah, we would
31:46
be talking about designing a robot
31:49
to physically incapacitate humans,
31:52
and instinctively that seems bad.
31:54
Yeah, I mean,
31:56
at the very least, like, potentially hazardous
31:59
and litigious. Yeah, absolutely,
32:02
I mean, like that goes
32:04
back to our discussion of the European Union
32:06
and liability. Who's liable?
32:09
Well, if you've got a security robot
32:11
zapping people, I mean, even if they've done
32:13
something wrong, there are
32:16
some liability issues there, and it does seem
32:18
to go against those laws of robotics we talked about
32:20
in our last episode: do not harm humans.
32:22
Yeah, that's a big one. It's right there as number one.
32:24
In fact, the rest of them
32:26
are kind of all based on number one. So yeah,
32:29
exactly
32:31
so. And again, these aren't laws
32:34
that robots are born with. But
32:36
these are the sort of ideas we've had of saying,
32:39
if we're going to go down the road of
32:41
building robots, we probably
32:43
want to make sure they're not going to hurt us. That
32:46
seems like a pretty wise thing to build
32:48
into them. Yeah, and I do think that
32:50
part of that
32:54
intrinsic, instinctive
32:54
sense that we have about creating robots
32:57
that could hurt people is based in
32:59
science fiction. Um, is based
33:01
in all of the fun
33:03
exploding movies that we see where
33:06
Arnold Schwarzenegger has a metal skeleton
33:08
and he's stomping around and terrifying
33:11
the living daylights out of everyone and all
33:13
of that, all of that kind of stuff that
33:15
is so far away from being any kind of
33:17
reality. Even if you do create a
33:19
security robot that can zap people, um,
33:22
that robot is not Arnold Schwarzenegger.
33:25
Uh, it probably doesn't even have an
33:27
Austrian accent. I can't
33:30
I can't be positive. I don't know. I don't know what they're
33:32
up to in China. Kind of want them all to have it now.
33:35
But yeah, you know, the idea
33:38
of a robot running amok is
33:40
pretty unlikely. Yeah,
33:43
really, yeah, I mean, yeah,
33:46
we have a ways to go before
33:48
that becomes, before a Paul McCartney
33:50
bot is a reality. Yeah yeah,
33:55
but okay, so I don't know, you know,
33:58
it's like, if you're really just designing
34:00
something like this to prevent crime,
34:03
which is bad. Crime is bad. And
34:06
if it's a robot that's really
34:08
good at making decisions and it definitely
34:11
super never kills or maims anyone
34:14
accidentally or on purpose, which
34:16
would also fall into
34:18
the realm of bad. Um.
34:21
If all of that is true, then
34:24
it would be a good thing to have a
34:26
security robot. Right. So we just need
34:28
a robot that can have like a whole bunch
34:30
of arms and it just very
34:32
gently hugs the criminal and
34:35
holds the criminal there, like, the
34:37
arms are all soft, so as
34:39
the criminal is struggling to get away, he or she
34:41
is not actually hurting him or herself and
34:45
yet can't get away because the robot
34:47
also weighs like seven hundred pounds,
34:49
just, like, a giant panda robot
34:52
is what I'm imagining, the kind
34:54
that can hug a criminal into
34:56
submission. Yes,
35:01
I approve of this plan. I
35:03
will pitch this series to Syfy.
35:07
Call us. Yeah, if
35:09
we can get this on, we can get this on
35:11
Syfy. I think we got a future in
35:13
uh in science fiction programming.
35:16
No, no, I mean, if we can create
35:18
this panda security bot, you know,
35:20
it means fewer humans being
35:23
exposed to risky situations, which is always
35:25
kind of one of those, like, golden
35:27
standards of how robots will do us
35:29
good in the incredible future. Yeah,
35:31
so there's definitely a place
35:34
for it. The question is to what extent. I
35:36
think the K5 is a great implementation
35:39
of this because it is taking
35:43
what appears to be a very responsible
35:45
approach to incorporating robots
35:47
into the realm of security,
35:50
um, without causing, at
35:52
least initially anyway, at least
35:55
in no obvious way, risk
35:58
or danger. Yeah. So, uh,
36:01
I can't wait to
36:04
revisit this topic in like ten
36:06
years and talk about remember our
36:09
innocent days when the robots
36:11
were being friendly and when we thought
36:13
that panda robots were a great plan.
36:15
The other day I went and bought myself
36:18
a Soylent Green bar, and it
36:20
was a little slow reaching for my money,
36:22
and um, well that's why I got the cybernetic
36:24
leg. Now coming
36:27
up next, we're gonna talk about the future
36:29
of Soylent Green. This episode
36:31
brought to you by Soylent. No, I
36:34
mean, I'm hoping that's not going to be
36:36
the way this show is in ten years, but we'll find
36:38
out. Um, maybe. I thought
36:40
that would be a really interesting accent for you to spontaneously
36:43
develop. Well, it may be one of those that's mandated
36:45
by our robotic overlords. Like
36:48
Jonathan, you have to have a weird cartoonish
36:50
accent for the rest
36:52
of your life, at least when you're on microphone.
36:55
Um. Alright, So guys, if you have any comments
36:58
about this episode, or robots
37:00
in general, or even just some other topic
37:02
that you want to know, like what is that going to be like
37:04
in the future, write to us. Let us
37:06
know what you think we want to hear from you.
37:08
You can send us an email. The address is FW
37:12
thinking at how stuff works dot com,
37:14
or you can drop us a line on Twitter or Facebook.
37:17
We are FW thinking on Twitter, or
37:19
you can search f W thinking on Facebook. Our
37:21
profile will pop up. You can leave us a message and we
37:23
will talk to you again really soon. See
37:30
For more on this topic and the future of technology,
37:33
visit forward thinking dot Com,
37:46
brought to you by Toyota. Let's
37:48
Go Places