Episode Transcript
0:00
Hey, I'm Ryan Reynolds. At Mint Mobile, we
0:02
like to do the opposite of what Big Wireless
0:04
does. They charge you a lot, we
0:06
charge you a little. So naturally, when
0:09
they announced they'd be raising their prices due to
0:11
inflation, we decided to deflate
0:13
our prices due to not hating you. That's
0:15
right! We're cutting the price of Mint Unlimited from $30
0:18
a month to just $15 a month. Give
0:21
it a try at MintMobile.com slash switch.
0:23
New activation and upfront payment for three-month plan required.
0:26
Taxes and fees extra. Additional restrictions apply. See
0:28
MintMobile.com for full terms.
0:30
Welcome
0:32
to Recode Media with Peter Kafka. That
0:35
is me. And today I've got a conversation
0:37
for you that I've wanted to have for years. Literally
0:39
years. I talked to Charlie
0:42
Brooker. He is the creator of Black Mirror, a
0:44
tremendous series you can watch on Netflix.
0:47
As you know, I don't think we need to say
0:49
much more about Black Mirror. Like we talk
0:51
about in our conversation, Black Mirror is synonymous
0:54
with the way we talk about tech and media, or at least some
0:56
kinds of tech and media. Like
0:58
I told Charlie when I was writing about the Apple Vision
1:00
Pro reveal, I referenced Black Mirror
1:02
because that's what we think about when we see
1:05
crazy dystopian tech that we're
1:07
supposed to... I don't know what we're supposed to do with it.
1:09
Anyway, you'll hear about it in the discussion.
1:12
I guess the only other thing I'd say here is that sometimes
1:14
when you really admire someone's work, you worry that
1:17
when you meet them or talk with them, you're going to be underwhelmed.
1:19
That was not the case here. So here's
1:21
me
1:22
and Charlie Brooker. I'm
1:25
talking with Charlie Brooker, the creator of Black Mirror. It is
1:27
one of the all-time great shows. I'm delighted to talk to
1:29
him. Welcome, Charlie. Hello. Hello. Nice to
1:31
be here. I appreciate
1:33
it. The newest season of Black Mirror
1:36
debuted a couple of months ago. I'm
1:38
sure you are exhausted talking
1:40
about it, but I want to use it as a jumping off point
1:43
for our discussion. If you've watched
1:46
any of the new season of Black Mirror, you realize
1:48
that it's different from other seasons. Many of the episodes
1:50
are less explicitly about
1:52
technology and its perils
1:54
or promise. Is that because you're sick
1:57
of talking about tech?
1:59
No. I... think, I mean, to be honest, so
2:03
the season... the order
2:05
of episodes that season
2:08
six appears in on Netflix is
2:10
the reverse of the order
2:12
they were written in. And
2:15
it started out, to be frank,
2:18
it started out, I was doing, I was almost doing
2:20
a different season. I wanted to take a bit of a break
2:22
from doing stuff about just technology
2:24
because I kind of felt like I
2:27
was boxing myself in a little.
2:29
And so it was a deliberate move to
2:32
set some episodes in the past and do some sort of horror
2:36
stories and to put out a
2:38
season called Red Mirror and do it that way, like
2:40
do a sort of sister reboot
2:42
in a way. Then, as
2:45
is always the case, I think when you're doing any kind of creative
2:47
endeavor, as soon as you like cut
2:49
yourself off from doing something, you're then
2:52
like slightly pining to do it immediately.
2:54
So actually, I sort of reverted
2:56
back to Black Mirror. And you can see that
2:59
through if you look
3:01
at the episodes in reverse order,
3:03
they get steadily more and more Black Mirror as they go.
3:06
So, by the time... Joan Is Awful is
3:08
the first episode in order, but
3:10
the last one to be made. There's gonna be a test on this
3:12
afterwards. Sorry, yeah, this is very, I made
3:15
this unnecessarily convoluted.
3:18
That was the most Black Mirror of them all. But
3:20
I think what it was was at the time
3:21
when I started doing
3:23
that particular season, I was aware
3:25
that I was
3:28
aware, obviously, that Black Mirror had become kind
3:30
of synonymous for sort of tech dystopia
3:33
and things of that nature. And I was getting people
3:35
saying to me, I bet you're gonna do... you're
3:37
gonna do an episode about NFTs
3:40
and you know, the blockchain
3:42
and stuff like that. And I just think, oh,
3:44
so depressing, because that
3:46
was never what
3:49
the purpose of the show was
3:51
way back when, when it first started.
3:54
The idea was not to just
3:56
satirize the tech pages, you
3:58
know. It was much more:
3:59
I want to do a show that's unlike
4:02
any other show on television and I want to do something
4:04
that speaks about now or satirizes
4:09
the media and the culture and our society,
4:12
but is not necessarily... you
4:15
know, I wasn't thinking, "but
4:18
it's all about technology." So I was slightly,
4:20
and I was aware that I felt like I
4:23
trod familiar territory quite
4:25
a few times in Black Mirror, so I wanted to really shake
4:27
things up. Also back then, that
4:30
was in the midst of the pandemic, in the grip
4:32
of the pandemic. And really it kind of felt, at the time,
4:35
like a lot of the tech
4:38
was plateauing a bit. Hence people saying
4:40
to me, are you going to do an episode about NFTs?
4:43
There wasn't that much, it didn't feel like there was that much
4:45
disruption going on so
4:47
much as...
4:49
People stretching for something new, but there
4:51
wasn't. Stretching, desperate for anything new, we'd
4:53
have loved it. But we were sitting, the newest thing was
4:55
using Zoom, like everything happening on Zoom was
4:57
basically it. And so it was also, so it kind
5:00
of felt like things had slightly plateaued at the
5:02
time. Of course, then as you start emerging
5:04
from it, now we're at a point like where
5:07
everyone's talking about AI, everyone's talking about
5:09
Elon Musk, everyone's talking about... There's all of, a lot
5:11
of those things are now
5:14
heating up again. So I kind of feel
5:16
like I got that out of my system, if you see what
5:18
I mean. What it meant was I put several classic
5:20
Black Mirror ideas on the back burner
5:22
a little.
5:23
How do you feel about the fact that people use
5:26
Black Mirror as shorthand for
5:28
tech dystopia? I was looking, I was like,
5:31
I do that all the time. I did it in print in
5:33
June when Apple rolled out its
5:35
Apple Vision Pro goggles
5:37
and it had that video of the mom wearing
5:39
the headset and her daughter is trying
5:42
to talk to her and it seems terrible,
5:44
but Apple thinks this is great. What
5:46
do you think of the idea that I see that and go, that's
5:48
Black Mirror? And everyone else, lots of other people, do
5:50
as well.
5:51
I mean, I kind of love it actually.
5:53
Like I say, when I'd sort of restricted myself
5:55
from doing something, I was pining for it and came back to it.
5:58
And so doing an episode like...
5:59
Joan Is Awful, which is like, it's
6:02
like a sort of AI generated show
6:04
is at the heart of that. That
6:07
feels so deliciously Black Mirror to
6:09
me. I kind of love that stuff. I mean, I say I
6:11
love it. I'm delighted,
6:13
obviously, that it's become shorthand
6:16
because, on the one hand,
6:18
that's kind of free
6:20
publicity for the show, frankly.
6:23
But equally, it's, it's
6:26
often depressing on a human level that
6:29
that's the stuff we're looking at and confronted
6:32
by a lot of the time. And it's interesting
6:34
because it's not just, it's not
6:36
always people, and this was a thing I was kind of leaning
6:38
into, I guess, this season, it's not always about technology.
6:41
When people say that, sometimes they're talking about just,
6:43
I don't
6:43
know if I can use... I'm not, I will use
6:46
clean language. No, no, cuss it up. Oh,
6:49
okay. A fucked up situation.
6:51
Basically, people will often sort of use it as shorthand
6:54
for a fucked up situation. If you look at our first
6:56
ever episode with the Prime Minister and the pig, that's
6:58
the very definition of a sort of fucked up situation.
7:01
And so Black Mirror simultaneously,
7:03
to me, seems to stand
7:05
in for
7:07
that, that kind
7:09
of situation. And also, yes, as you say,
7:11
if Apple launched their Vision, stupid
7:14
or whatever they are, $500,000 flipping icon
7:18
goggles.
7:19
What do you make of the fact that you've been making the show
7:21
for more than a decade? And
7:24
it's very popular. So clearly
7:26
people in Silicon Valley have seen it and you're saying,
7:28
this vision of the future that I have
7:30
is bad. This is not good.
7:33
And then they come out and say, we think this
7:35
is great. We're going to productize this,
7:38
whether it's VR goggles or
7:41
AI generated people
7:43
or whatever it is.
7:45
And you've been saying, no, this is explicitly bad. And
7:47
they go, this is great. We're going to sell it.
7:49
What do you make of that disconnect? I
7:51
would say so. One thing I would say there
7:53
is that you don't
7:55
know that. Okay. Sometimes, clearly,
7:59
in the show I'm highlighting something and
8:01
saying this is bad. Usually,
8:03
however, the technology isn't actually the villain.
8:05
We've done an episode definitely, we did an episode with autonomous
8:08
robot killer dogs going around
8:10
killing people. It's pretty positive. Not
8:13
really a positive read on that, but they were still presumably
8:15
created by a human in that story, we just
8:17
don't say. Most of the time, when
8:20
an episode is
8:21
what I would class as classically
8:24
Black Mirror, you've got something that's
8:26
actually quite miraculous that you
8:28
can immediately see as a viewer,
8:31
you can see the desirability of it. You can
8:33
see why it would be useful. You can see why
8:35
it would be transformative and in many
8:37
ways, extremely positive. It's
8:40
usually the human beings, the
8:42
messy human beings who are using this stuff in the
8:44
story, who manage to balls things
8:46
up. I guess that reflects
8:48
how I feel about a lot of things because in real life, I'm pretty
8:51
geeky and techy. I used to be a video games
8:54
journalist way back, yonder
8:56
in the 90s. I
8:59
love all this stuff. I love technology,
9:01
I love computers, but
9:04
I'm also a natural worrier.
9:07
I'm somebody who catastrophizes
9:10
at the drop of a hat. It may come as no
9:12
surprise to anyone who's watched the
9:14
show to realize that. I'm often
9:16
worried about the power
9:19
that some
9:20
new development or gizmo
9:22
will suddenly give us, and the responsibility
9:24
that comes with that, and how easy it is to
9:27
misuse that or just unintended
9:29
consequences or obvious clumsy consequences.
9:31
We see that time and time again with things, but
9:34
usually all technologies give with one
9:37
hand and I
9:39
was going to say take away... sort of slap us
9:41
around the back of the head with the other. But that's
9:43
been the case with every, I mean the printing press. It's
9:46
been the case with everything. It's like, I wouldn't want to put
9:48
this stuff, I wouldn't want to delete
9:51
this stuff from existence necessarily. We
9:53
were talking before we started taping about, you know, you started
9:55
this show, the show started airing in the UK
9:57
in 2011.
9:59
I'm remembering, I'm thinking a lot about what
10:02
that was like back then, how
10:05
we viewed tech back then and generally, especially
10:07
in the tech press, but broadly, we
10:09
were excited about consumer technology.
10:12
There was an iPhone or Facebook or Twitter.
10:14
There were serious people saying that social
10:16
media could bring democracy to the Middle
10:18
East, et cetera. Enormously optimistic
10:21
about it. Things have swung back
10:23
dramatically in the opposite direction.
10:25
Do you think that was always inevitable that we would go
10:28
from tech is great to tech
10:30
brings on catastrophe
10:32
or do you think there was another way to have
10:34
gone about this? I mean, whether
10:36
it was inevitable that we
10:39
went from... certainly we
10:42
were clearly looking at it through extremely
10:45
rose-tinted goggles at the time.
10:47
You're absolutely right. I remember that
10:49
was the thing, in a way, I was
10:52
tapping into in my head,
10:54
certainly in some of those early
10:57
episodes. It launched, for me, against
10:59
that backdrop. I would
11:01
watch adverts. There would be adverts for... like,
11:04
most Apple adverts looked to me like, have
11:06
you seen
11:07
Soylent Green? Of course. So
11:10
you know, there's a scene, who is it? Is it Ernest
11:12
Borgnine who plays the old guy? Or
11:14
is it Burgess Meredith? I think it's Ernest Borgnine,
11:17
who plays the old guy. All I remember is Charlton
11:19
Heston at the end. Right. Well,
11:21
of course, we all remember the punch line, but there's
11:23
a bit where somebody is euthanized and
11:25
an old guy is euthanized and he's
11:28
taken into a sort of euthanasia clinic and
11:31
shown the last thing he's shown
11:33
is images of the natural world, which has now
11:35
been destroyed and it sort of moves him to tears.
11:37
And then he's like
11:38
killed and turned into food,
11:41
basically. Which
11:45
someone in Silicon Valley thought would be a good brand.
11:47
Oh, yes, exactly. Yes. Which was
11:49
amazing. So I mean, well,
11:51
there you go. I mean, that's
11:53
the ultimate sort of example. But
11:56
that's the imagery there, the
11:58
sort of pleasant
11:59
imagery that this guy was shown against this
12:01
extremely dystopian black backdrop. That was
12:03
the sense I was always getting from sort of Apple ads
12:05
at the time, was they just seemed to be showing
12:08
everybody having fun and dancing and smiling
12:10
and you just think well hang on a minute, things
12:13
usually aren't this
12:15
positive and if we
12:18
suddenly have extremely powerful tools
12:20
at our disposal we will do incredible
12:22
things, we will also make incredible fuck-ups.
12:25
So that seemed to me a
12:27
well-founded concern I had that I felt
12:29
wasn't reflected, at the time
12:31
wasn't being reflected necessarily. And
12:34
I remember the Arab Spring, I was doing, my
12:37
background in the UK is in comedy and stuff like that
12:40
so I was doing stuff for
12:42
a topical comedy show at the time and
12:44
I remember the positivity around the Arab Spring
12:46
and people feeling that
12:48
Twitter was bringing exactly as
12:50
you say democracy to the Middle East and
12:52
now that all seems extremely naive. I think it
12:54
was always inevitable that we were going
12:56
to
12:57
cock things up a bit. But we
12:59
also... I wouldn't want to be just
13:02
completely cynical. I do always... the analogy
13:04
I always use is that especially something
13:06
like social media or a lot of this stuff, it's like we've suddenly
13:09
grown an extra limb,
13:11
which is amazing because it means you could
13:13
juggle and scroll through
13:15
your iPhone at the same time but
13:18
it also means that we're not really sure how to control
13:20
it yet and we're clumsy, we're knocking
13:22
things over all the time, we're still
13:24
grappling with sort of butterfly
13:27
effect consequences of this stuff all
13:29
the time. So the problem is
13:31
it probably wasn't inevitable but I think a lot
13:33
of it was probably
13:35
hard to foresee, do you
13:37
know what I mean, without necessarily putting things completely
13:39
back in the box and being a total luddite. I
13:42
do get frustrated sometimes when people
13:44
characterise the show as, you
13:47
know, the "tech is bad" show sort
13:49
of thing and I think sometimes I react to that probably too
13:51
much; there's probably something of that in what I was doing this season, there was
13:54
a deliberate sort of, like, I'm
13:56
not going to do that for a while.
13:57
But I don't know... I'm
14:00
trying to think what we could do. What
14:02
could we have done between 2011 and now to
14:05
mitigate against things? Well, I guess
14:07
we could have all been a bit more
14:09
suspicious.
14:11
We should have all been more paranoid. I don't
14:13
know how we could have. I mean, you know, so
14:15
much of the backlash was so directly
14:18
tied to Donald Trump's election and
14:22
I believe our desire to
14:24
pin that on something and someone
14:26
other than ourselves and our
14:28
citizens who said, I want to vote for Donald Trump. So
14:31
lots of us wanted to blame Mark Zuckerberg
14:33
or Facebook. Some of us wanted
14:36
to blame Fox News and they all have some responsibility.
14:38
With Brexit here, there was a
14:40
definite desire to pin it on Cambridge
14:42
Analytica and pin it on that
14:44
this is a sort of PsyOps campaign that
14:46
people have somehow been magically persuaded by
14:48
machines
14:49
to think like this. But I kind of, I mean,
14:51
essentially there were things like
14:53
Brexit. Well, actually Brexit was something I didn't
14:55
see coming. I just didn't, I didn't
14:57
see that coming. Donald Trump, I felt like
14:59
I saw that coming. Like he seemed obvious
15:02
to me really early on
15:04
that he was going to win. But there was something
15:07
clear as day to me. I don't...
15:09
I just thought, well, of course he will. He's like,
15:12
of course he will. He's a celebrity
15:14
who's interesting and is pissing people
15:16
off and is,
15:19
of course, there's a huge number of people who
15:21
are going to vote for him. Yeah. In retrospect, it
15:23
seems like an obvious culmination
15:25
of things. I know you get this question all the
15:27
time, but when you have Donald Trumps
15:29
in the world, when you have Elon Musk in the world,
15:32
these crazy, outsized
15:35
people who are characters,
15:39
what does that do for you as a creator? Do
15:42
you think, I can't touch this stuff.
15:44
There's literally nothing else you could say. These
15:46
things are beyond parody. Or do you
15:48
go, this is something I really want to mine or none
15:50
of the above?
15:51
I don't know. I mean, so it's interesting because
15:53
with, say, Trump, and
15:55
there's a lot of parallels between Trump and
15:58
Boris Johnson here in the UK. It's
16:01
just messed with my head, Boris Johnson buying
16:03
a giant mansion in the tiny village that
16:05
I grew up in, by the way. Now he's
16:08
a notable resident on the Wikipedia page.
16:10
Really messing with me. But
16:14
we kind of covered that
16:17
there was something that I didn't, I didn't feel
16:19
I got the script right. There's an episode
16:21
of Black Mirror
16:23
called The Waldo Moment in 2013, where I was
16:25
trying to get at something. That
16:27
was very much me looking at Boris Johnson,
16:31
who was
16:33
mayor of London and
16:35
had been, and
16:39
was, a TV personality. Like
16:41
he'd been on comedy satirical,
16:43
he'd been a guest on comedy shows and things like this. And he
16:45
was
16:45
widely regarded as sort of bumbling, like Paddington
16:48
Bear, like a sort of prototype Paddington
16:51
that's posh and slept
16:53
outdoors in a field for a couple of days. But
16:55
I was slightly fascinated by the fact
16:58
that he seemed bulletproof
17:00
in terms of those, like the fact that he
17:02
fucked up
17:03
and was bumbling and kept it. Built into
17:06
the character. Was built into the character
17:08
and was a definite plus. And
17:11
that he appealed to people on the basis
17:13
of him being a sort of disruptive
17:15
personality because there was
17:18
a widespread sense that all politicians
17:20
were these bland cookie-cutter types, I guess
17:22
in the UK kind of
17:25
echoing Blair, basically.
17:28
But they all seemed a bit like they
17:32
could have been from The Sims or something. Do you know what
17:34
I mean? They all seemed like a bland
17:37
neighbour in a bad drama. Yeah,
17:39
I think of them in the US as
17:41
news anchors, news presenters, you would
17:43
call them there. Yeah. And
17:47
daytime news presenters, like doing a sort
17:49
of, like, a coffee morning
17:51
type show. So he was so
17:53
clearly a different flavour. He was unique.
17:56
And so and there was
17:58
something about the
17:59
thought that a
18:01
lot of people would find that attractive.
18:04
And it was obvious why they would find that attractive
18:06
because all the politicians they had weren't really speaking
18:08
to them and didn't appear to be representing them. And
18:11
that there was something very,
18:13
I found very frightening about that situation
18:15
that hopefully we covered in that episode. And
18:18
I think, like I say, at the time, I felt like I
18:20
didn't get the script right. I didn't get the stakes in the story
18:22
right. And I could have written it a lot better. That
18:25
actually should have been a mini series weirdly, that
18:27
particular episode, I think. But
18:30
it looked more prescient
18:32
come 2016, come Trump getting elected
18:35
or Boris getting elected. It looks, it
18:37
suddenly seemed quite prescient. And I think
18:39
now it is difficult because in a way the
18:42
new cliche is for the sort of like a
18:45
tech bro boss to be an Elon Musk
18:47
type person who, probably
18:51
rides into the office on a little e-scooter and
18:54
with tattoos on his face and pissing in the corner
18:57
and like crazy. Elon Musk
18:59
posted a photo of himself with a t-shirt
19:01
yesterday. And the joke is, the t-shirt,
19:04
the way he's got his jacket over
19:06
it, it says, I love anal.
19:08
That's the joke. He put that up and
19:10
he's one of the richest men in the world. I mean, maybe
19:12
he does. I
19:15
don't judge that. I just, the idea that like, it
19:18
seems beyond like he's, and he wants
19:20
to be funny, right? But he's, so he's beyond. He really
19:22
wants to be funny. Boris Johnson
19:24
held his own actually on sort of... on
19:28
comedy panel shows, people clearly
19:31
warmed to him because they thought he was a... he came
19:33
across as bumbling. Trump is sort of, I guess,
19:35
inadvertently funny, but also
19:37
more terrifying.
19:38
Elon Musk is trying
19:40
very hard to be funny a lot of the time, isn't he? As
19:43
far as I can tell. As a professional funny person, when you see
19:45
that, what's your reaction?
19:48
It's just not very good. It's
19:51
like, it's a bit Rupert Pupkin,
19:54
isn't it? A lot of the time
19:56
in one respect. It's weird, isn't it? Because if he was
19:59
genuinely
19:59
very funny... I'm
20:01
just querying myself
20:03
here. If he was genuinely very funny, would
20:05
I like him? There's
20:07
something I think, I don't know, I don't,
20:11
I don't know. I always used to say I don't think right-wingers
20:13
are funny, except for,
20:15
oh no, it's not true that right-wingers aren't
20:17
funny. Have you ever seen that picture of Mussolini
20:19
hanging from the lamppost?
20:22
That's fucking hilarious. But
20:25
anyway. Trump actually has some,
20:29
he's not a comedian, but he's got some
20:31
of that performer in him. He's definitely, I mean,
20:33
he's a consummate performer. Yeah, he's a consummate
20:35
performer. Do
20:36
you get feedback from Elon Musk or Mark Zuckerberg
20:39
or Tim Cook or their underlings? Do they reach out
20:41
to you with fan mail? Do they reach out to you and say, hey, don't
20:43
make fun of me?
20:45
No, I haven't heard from
20:48
any of these people, actually, that
20:50
I'm aware of. I occasionally
20:52
get asked to do
20:55
like events or something like that. I'm
20:57
trying to think, but no, no, I've not, no.
21:01
We'll be right back with more Charlie Brooker, but first a
21:03
word from a sponsor.
21:05
Hey, I'm Ryan Reynolds. At Mint Mobile, we
21:07
like to do the opposite of what big wireless
21:09
does. They charge you a lot, we
21:11
charge you a little. So naturally, when
21:14
they announced they'd be raising their prices due to inflation,
21:16
we decided to deflate our prices
21:19
due to not hating you. That's right. We're
21:21
cutting the price of Mint Unlimited from $30 a month to
21:23
just $15 a month. Give it a try
21:26
at mintmobile.com.
21:28
New activation and upfront payment for three-month plan required,
21:31
taxes and fees extra. Additional restrictions apply. See
21:33
mintmobile.com for full terms.
21:35
You mentioned AI at the beginning of this
21:38
conversation. So in less than the space of a year, we went
21:40
from look at this interesting AI art,
21:42
isn't that cool or trippy to, oh, AI could
21:45
make a script with ChatGPT to
21:47
now maybe AI is going to make a whole movie
21:50
or a TV show. As
21:52
someone who makes things, how do you think about
21:55
AI as a tool and/or as a threat? I
21:57
think it's kind of both. So I do see that...
21:59
I mean, it's depressing. The
22:02
thing that actually, weirdly, depresses me almost
22:04
more than anything else is, like...
22:07
I've got two kids and one of them is like, he's nine
22:09
years old and he's getting
22:11
into drawing and he's pretty, he's good. He's
22:14
really good, especially for his age. He's like, I
22:16
used to be a cartoonist. That was my first job. He's
22:18
way better than I was as an adult.
22:22
And you know, he's proudly drawing, doodling away.
22:26
And I was looking at this thinking,
22:28
you know, and encouraging him going, well done.
22:30
And then I thought that the next thought
22:33
that arrived was, yeah,
22:35
I mean, being an illustrator, that's no career
22:37
path these days, is it? Because that's going to be,
22:39
and then our oldest, our eldest
22:41
is really into coding,
22:43
really into that. And I'm like, and I'm thinking,
22:45
yeah, but are you learning? Is this like
22:48
learning, you know, mathematics,
22:50
and now the calculator's come along and rendered that...
22:52
like, we're just going to... a machine's just
22:54
going to do the icky bits of coding for
22:56
you. So I do very
22:59
much worry about what
23:01
the impact on employment generally
23:04
is going to be. And
23:06
then when I look at like, and I toyed
23:08
around with all those things like Midjourney and stuff like
23:10
that, like anyone else. And it's telling, isn't
23:12
it, that what those things, the immediate
23:15
images that go viral with that sort of thing and the things
23:17
that are appealing are immediately like
23:20
kind of mashups, aren't they? They're all combinations
23:22
of things. So I would type in like,
23:24
you know, show me, you
23:25
know, whatever,
23:28
Jack the Ripper in the Great British Bake Off
23:30
tent or something like that. You
23:33
know, show me Boris Johnson, you
23:35
know, shaking hands with Paddington Bear on the set
23:38
of Seinfeld or something. It could
23:40
do, but because it's parasitically
23:42
hoovering up stuff
23:45
that we humans have made or
23:47
created or are. And
23:50
so
23:52
quite quickly... it's
23:54
interesting with the AI art: I feel like there's
23:56
quite quickly something generic about it. Either
24:00
it was like riffs on existing
24:03
IP effectively, or it was
24:07
somehow sort of too slick,
24:10
like an auto-tuned vocal. Right,
24:12
you can sort of, yeah, you can see it. Yeah.
24:15
You can see the auto-tune. And it's impressive, but
24:17
it's,
24:17
that said, and then things like
24:19
ChatGPT, I can see, I can totally
24:22
see the value in using
24:24
it as a sort of
24:26
a hyper-powered Google, using
24:29
it as a kind of a
24:31
quick "list 10 jobs
24:35
that somebody in Victorian England might have
24:37
done,"
24:38
if you want to make something like that. I can imagine that as a writing
24:40
tool. And I can imagine it being
24:42
a little,
24:44
well, I can imagine people using it. The
24:46
scary thing is I can imagine people using it to generate
24:50
something that they then claim to
24:52
own, which isn't good enough to
24:54
actually pass muster; you'd have to then hire a human
24:56
in cheaply to knock it
24:58
into shape.
24:59
So there's that, but I mean, I guess it's like the tool,
25:02
it should be like the tools in Photoshop,
25:05
right? Do you know what I mean? No one,
25:07
I'm not scared by most of the tools in Photoshop.
25:10
I think they're super useful for
25:13
artists, digital artists. The
25:15
interesting thing is I, hopefully, hopefully
25:18
one outcome is it makes us up
25:20
our game. It's interesting at the moment,
25:23
I think, isn't it, that we've had a lot of formulaic
25:27
movies and stuff for years,
25:29
I think. Not to slight superhero
25:31
movies, it's just that there's a lot
25:33
of them. And the audience seems to be exhausted.
25:36
The audience seems to be exhausted because I think that
25:39
it does feel like, you could say to ChatGPT,
25:41
give me,
25:43
knock out the beats of a superhero movie...
25:45
you know what the
25:46
story beats are gonna be. Do you imagine using
25:49
it? I mean, I've talked to folks who say, yeah,
25:52
it's good to make a terrible first draft because
25:54
that gives me, I'd rather look at a bad
25:56
first draft than a blank page and I can fix
25:58
something, or I can kick around
26:00
ideas and a hundred ideas will be bad, but
26:03
one will be good. And that's, that's useful for me.
26:05
Well, I, I don't know that it's... I
26:07
don't think it's at the point where it could write
26:10
an even serviceable, not an
26:12
even, not, not even a vomit draft. I
26:14
don't think it could really do
26:15
it at the moment. I mean, I, the thing
26:18
I, the first thing I did when I got it was try to get
26:20
it to make, to do unacceptable things. Right.
26:22
So I'd sort of say, okay, right:
26:24
generate a transcript of the Beatles
26:27
sitting around and coming up with a song
26:30
about going to the toilet and
26:32
use really explicit language and
26:35
this, that, and the other, and then have them actually start
26:37
going to the toilet in the studio. And then George
26:39
Martin comes in and eventually, even
26:42
though it would keep saying, I'm not going to do that as a,
26:44
I'm a, I'm a good little AI and I won't do that,
26:47
eventually you could always get, you could
26:49
trick it into doing it. God bless you, Charlie Brooker. And
26:51
it would do it, but it would keep trying to bring
26:53
the conversation back round to,
26:56
hey guys, you know, hey, Ringo, we shouldn't
26:58
really do this, should we? Now let's get
27:00
back to it. It was sort of...
27:02
but also,
27:04
so, so I don't, I kind of feel like
27:07
it's... I don't
27:09
know that I trust its... its ability
27:11
to generate an idea. Now
27:13
riffing on an idea that you've, you've got
27:16
yourself, I can
27:17
potentially see that, but because
27:20
it's hoovering up other
27:22
people's stuff... I got, I got slightly
27:24
misquoted, because there was... obviously, another thing I did
27:27
was I typed in, like, you know, okay, go
27:29
on then tell me an idea. Tell
27:31
me, give me an idea for a Black Mirror episode.
27:33
And I wasn't doing that cause I wanted one
27:36
so much as I wanted to see what the competition
27:38
is from this machine. And what it does is
27:41
it immediately came back with things that
27:43
sounded, on the
27:45
face of it... I've certainly
27:48
been pitched worse things, I think,
27:51
you know, or not worse, but things that
27:53
are similar. It came
27:55
up with things that were fairly generic because they were,
27:58
it
27:59
was what
28:02
a little bit of emulation software's idea
28:05
of a Black Mirror story is, if you see what I mean. So,
28:08
it was sort of trying to do that and that
28:10
just made me feel kind of self-parasitic,
28:13
if there's a phrase for that...
28:15
it felt a bit like, oh,
28:18
why am I doing that when it's just
28:24
leeching off me? And I'd be quite
28:26
cross if somebody else was using
28:28
it to leech off me. And it's probably
28:30
leeching off, it's probably seen
28:33
somewhere that Black Mirror is a bit like the Twilight
28:35
Zone. So it's probably leeching off Rod Serling
28:38
as well. It's probably leeching off,
28:41
I don't know, RoboCop, Starship
28:44
Troopers, all these other brilliant things
28:46
that I found very influential. And you
28:48
did. As a human, you took all that stuff and mashed
28:50
that up. Well, so yeah, I can see that argument
28:53
as well. I can see certainly there's
28:55
episodes of Black Mirror that are directly
28:57
inspired... say, we
29:01
did an episode called USS Callister, which
29:03
is a sort of Star Trek story. And it's very
29:06
directly
29:07
inspired by... there's an episode of the
29:10
Twilight Zone called It's a
29:12
Good Life, where there's an ultra powerful...
29:15
I think it's played by a kid called Billy
29:17
Mumy, or, Bill Mumy.
29:20
And yeah, he's an ultra powerful sort of six
29:22
year old boy, something like that, who can... It's
29:25
terrifying. It's one of those. And it still holds up today
29:27
as absolutely chilling and terrifying.
29:29
And at the point I had the idea of,
29:31
like, oh, well, what if you got to, you know, you got to... I
29:34
was trying to think of, weirdly, a very different story
29:37
idea, to do with people in a workplace.
29:39
Originally, I wanted them put into a musical,
29:42
like a virtual musical, like Grease, the musical,
29:44
and they wouldn't know what their roles were, if
29:46
you see what I mean. So I might be Sandy and
29:48
you might be Danny, and
29:51
we wouldn't know... the real us wouldn't know. And
29:53
we have to, sort of as a team-building exercise, we have
29:55
to do... anyway, I was sort of toying around with that idea. And then
29:57
I thought, God, you could do so many powerful things.
29:59
And as soon as I thought, well,
30:02
what if this is a story about a tyrant? I remembered
30:04
that Twilight Zone episode... which is me,
30:07
like, ultra-parasitically hoovering up
30:13
something that Rod Serling wrote, putting
30:16
it through my own little AI
30:18
in my brain. My
30:20
own... I guess you just call it
30:23
"I", don't you? Not AI. There's nothing artificial
30:25
about it, just my "I". You created
30:27
something wholly new. I watched it again
30:29
last night. It's one of your best episodes. Oh, thank
30:31
you. One of your most acclaimed episodes. I think it's
30:33
a very, I think I'm very proud of that episode. But
30:36
I
30:36
do think that's,
30:38
hopefully that's a different process
30:41
because A, I'm saying, well,
30:44
I owe Rod Serling a debt there for,
30:46
you know, like there was a heavy
30:48
influence.
30:49
I suppose this
30:53
feels like this is doing it on an industrial
30:55
scale.
30:56
Right. Yeah. And so
30:58
for someone like you, you recoil at that.
31:01
As you know, right now there's a debate
31:04
in the actor and writer strikes about, you
31:06
know, how much AI are we going to allow
31:08
into our entertainment? I'm
31:11
a little confused as to whether this is a real fear
31:13
or a negotiating point. But do you
31:15
think that that's a real fear for
31:17
writers and actors? That
31:20
studios, etc., would really want to use
31:22
AI to replace much of what they
31:25
do?
31:26
I think it's a real fear. The difficult thing is
31:28
I think, like I say, I think the fear
31:30
with writing is that it would be
31:32
used.
31:33
You could use it, a studio
31:35
could use it to generate vomit drafts
31:37
of things and then hire in
31:39
human writers to
31:41
depressingly rewrite it, you
31:43
know, and make it human.
31:46
Humanize it. And that's a very depressing
31:48
state of affairs. The
31:51
thought of that... because if that's used,
31:53
that's basically simultaneously
31:56
leeching off
31:58
existing work by others that it's hoovered
32:00
up, and is trying to... But
32:03
let me just play Devil's Advocate for a second, right? Because it's
32:05
very standard in your business to have someone
32:07
write a draft and then fire that person
32:10
and then bring in multiple people many multiple
32:12
times to come and make that draft
32:14
better or oftentimes worse. And it goes
32:16
and you know, there's, you know, it's very
32:19
cynical. Well, I don't know you see, because
32:22
I've grown up in rosy old Britain, haven't
32:24
I, where it's like quite different here. Like
32:26
the ruthless sort of Hollywood side
32:28
of writing looks terrifying
32:31
to me. I've had a very lucky
32:33
existence as a
32:35
writer. I've been exceptionally fortunate
32:37
throughout my career, that I was always allowed...
32:41
I came up through a weird route,
32:44
basically. So I haven't, I haven't
32:46
been,
32:49
if you see what I mean, through that machine.
32:52
So maybe I do have a, maybe
32:55
I've got too rosy-tinted a view. But
32:58
but I kind of feel
33:00
like there's a difference here,
33:02
isn't there, in hoovering... at least,
33:05
like, if a human executive has
33:08
an idea for something that they want to hire somebody to do,
33:10
that seems fair enough
33:12
to me. They thought of it using
33:14
their own brain; they
33:16
haven't sort of tasked a machine with
33:19
going, let's see what other humans
33:21
have done, and just automatically,
33:23
like, churn out a list of 1,000 things
33:26
that
33:27
may or may not be actionably similar
33:30
to that and then hire a human
33:32
in cheaply to knock it into shape. It
33:34
does... that still feels
33:36
like a real cheapening of that process.
33:39
No, it doesn't, it doesn't seem good. It also
33:41
seems somewhat inevitable.
33:44
They just seem to be saying, it's great, that's wonderful,
33:46
and we can't wait. Well,
33:48
they haven't made a good podcast yet.
33:50
But, I mean,
33:51
you'd think, I mean, again, I mean, surely
33:53
the first line of jobs that could get replaced is a sort
33:56
of exec role... an executive
33:58
assessing these things
33:59
would probably be one of the first. But
34:02
then, I mean, would
34:04
people want to read it, listen to it, hear
34:06
it, watch it? That's the thing I don't really...
34:10
I still doubt. I
34:13
still doubt. I can see
34:16
the value, and there's all sorts of things. An eerie...
34:18
I saw that thing that went viral the other week, which is an eerie
34:21
sort of AI generated thing. So somebody said,
34:23
I've generated a trailer for Heidi,
34:26
the movie. I don't know if you saw this. It was like David Lynch.
34:28
So it was like,
34:32
it really made you feel like your
34:35
brain was being sick on itself on the inside, looking
34:37
at this thing. It was nightmarish imagery, like hilariously
34:39
creepy.
34:41
But a human has clearly been
34:44
nudging. So there's uses for it. I
34:46
think if I was an actor, I would genuinely
34:48
be worried about, well, what
34:50
I'd be worried about, I guess, is if
34:53
I was an upcoming actor, what
34:55
if I find myself perpetually in
34:58
competition with Tom Hanks
35:01
or Marilyn Monroe? What if you get a... But
35:04
that presumes
35:08
that,
35:09
I suppose that for me to think that, that presumes
35:11
that we're going to be interested in watching a CGI,
35:14
we'd get invested in the
35:16
CGI generated Harrison
35:18
Ford. More
35:20
than we would a young whoever the next Harrison
35:23
Ford is going to be. So to
35:25
bring this all around, the first episode
35:27
of this season is Joan Is Awful. And in it, a
35:29
Netflix-style service called Streamberry
35:31
is going to do this exact
35:34
thing. What did
35:36
Ted Sarandos have to say?
35:38
Oh, I don't know what Ted Sarandos
35:40
specifically had to say. I mean, he
35:42
must have seen it. He must
35:45
have seen it. So when I wrote
35:47
that, it was interesting because
35:50
it happened slightly by degrees, in that when I wrote
35:52
it, the first script
35:55
just said that Joan
35:57
is watching a Netflix
36:00
slash Disney Plus style streaming
36:02
service is how it was described, just
36:04
a sort of generic streaming platform.
36:07
Came up with the name Streamberry.
36:09
And then it was when we came to, like
36:11
when we were doing the graphics for the show,
36:14
it was obviously, we had a really talented graphics
36:16
team. They'd say, well, what do you want this to look like?
36:20
And you realize you're asking them, it's a bit like, you know
36:22
when you're watching a drama and somebody's using Google,
36:24
but it's not Google, you know,
36:27
it's like a clearable version of Google and
36:29
they've, like... it's always best when they just... Or
36:31
the phone number is 555. Exactly,
36:33
that is like, it slightly breaks your, or
36:37
they're having to sort of make it look a bit shit.
36:40
So it doesn't look like the actual, cause actually
36:42
the front ends of most of these things are incredibly
36:44
professionally done and look wonderful and like, and
36:46
they're instantly recognizable. So anyway, we just
36:48
thought, well, why reinvent the wheel? Why
36:51
don't we just see,
36:53
rather than having to make this bright orange or something, so it
36:55
doesn't look like any existing service, why
36:57
don't we just ask Netflix if we can make it
36:59
look like this?
37:00
So we did and they said yes.
37:03
And they were like, okay. And then later on we're like,
37:05
well, actually, well, if you're doing that, I mean, what happens when she
37:07
presses play with the sort of like,
37:09
ribbons type animation, what would
37:11
it do? It would,
37:13
if it looks that much like Netflix, it would go,
37:15
right, so can we use that? So
37:17
we asked them if we could use that and they said yes.
37:20
Again, I don't quite know
37:22
how this exactly went up and down the chain
37:24
and whether that's like asking somebody whose job is
37:26
just to look at
37:28
clearance and go, well, yeah, that's clear
37:30
because it's all on Netflix's file, or
37:32
whether there was a, there didn't seem to be
37:34
any alarm or panic or anything like that
37:36
about it. It's interesting because then when you get to the
37:38
episode, when you're watching it, the final
37:41
thing,
37:42
it was sort of in a way when people said, well, my
37:44
God, he's really having a pop at Netflix here.
37:47
Do you think, oh, it does look like
37:49
that, doesn't it? But
37:52
really it was sort of like, it
37:54
was, I mean, it was no game for me doing
37:57
that. And it was, it was sort
37:59
of, I mean, again, maybe because it's slightly cartoonish.
38:02
Yeah.
38:03
Yeah. It can't be that cutting if they allow it.
38:06
Yeah. I mean, that's- If you're
38:08
watching it on Netflix, then
38:10
it can't be that- That's, it wouldn't be that
38:12
dangerous because they're like, well, that's the message. It's
38:14
like season one episode two where
38:16
somebody railing against the system gets
38:19
commoditized. Commoditized? Yeah.
38:21
And it becomes part of this sort of, becomes
38:23
another entertainment slot on the system. Um,
38:28
yeah, that's depressing, isn't it? Um,
38:31
but again, it was sort of, but I think
38:33
also timing wise,
38:35
you see, when writing that episode,
38:37
it felt a bit more fanciful, probably. It
38:40
didn't feel like, well, this is something we might be looking at
38:42
in the next three years, necessarily, at
38:45
the time, because we wrapped that
38:47
episode in like
38:48
October, 2022. And
38:52
then November, I think, was when
38:55
ChatGPT got launched. Yes.
38:57
And that's the point at which every
38:59
writer on earth had a moment
39:01
of like, someone just walked over my grave,
39:04
you know, it's sort of like, um, and
39:07
you can,
39:08
it just, it's, I think everyone
39:11
woke up and went,
39:12
you know, hang on a minute, are we all replaceable here?
39:15
Charlie Brooker, you are not replaceable.
39:17
You say that. It's interesting because there's
39:20
lots of ideas I'm thinking about.
39:21
We did an episode once before, we did an episode called
39:24
Be Right Back, um, with Hayley Atwell
39:26
and Domhnall Gleeson. That's in season two.
39:29
And, um, Hayley plays
39:31
a woman who's called Martha; her husband's played
39:33
by Domhnall. He dies and
39:35
she discovers after he's
39:37
dead that she's pregnant and she really misses him. She
39:39
wishes she could talk to him. She pays for this service
39:42
that has gone through all his,
39:45
his online footprint and generated a
39:47
sort of AI she can talk to that's based on him.
39:51
And then he becomes a sort of robotic
39:54
version of himself that she's... and
39:56
it's like a heartbreaking story. It's one
39:58
of my favorite episodes. An underrated episode.
40:01
But it's a heartbreaking story. I realise
40:04
now I missed a couple of tricks. One,
40:07
I slightly am sad that we cut
40:09
out, there was originally a scene in the script
40:11
where when it's first talking to her, it starts
40:14
asking for money.
40:15
Like she has to upgrade it. It basically starts advertising
40:18
or something like that until she pays for the full
40:21
fat version of it, basically, which
40:24
was wonderfully cynical. And, well, definitely.
40:27
And then also the other problem in a
40:29
way was that he was, he
40:33
was based on his social media output, and he was bland.
40:35
This is the problem she has: he turns up and he's bland.
40:38
And he's not as... Now, actually,
40:40
if you were doing that now, if it
40:42
was somebody based on people's
40:45
social media personalities, it
40:47
would either be like a game show host
40:50
version of them that's just walked in, or a very
40:52
angry, demonstrative
40:54
version of them. Or an insight,
40:56
yeah, it would either be
40:58
Guy Smiley, star of daytime television. Something like Vogue's
41:00
engagement. Yes, it would
41:02
be some flipping
41:06
awful exaggeration of
41:09
some aspect of them would show up. I
41:12
can't remember why I went there as soon as you said you're
41:15
not replaceable. It was very kind of you. Well,
41:19
I enjoyed that episode as well. And
41:21
again, I was watching the
41:24
USS Callister last night; we were talking about
41:26
how we would have watched the other one if we had more time as well.
41:29
I want to let you off the hook here, because we could keep talking for a
41:31
couple more hours. But I promised your
41:33
people that I would not do that to you. So I
41:35
wanted to say, like I said, I've been looking
41:37
forward to this conversation for years. I'm delighted we got
41:40
to do it. Oh, thank you. I'm going
41:42
to do it again. Thank you, sir.
41:45
Yes. Thanks again to Charlie Brooker. Pretty
41:47
great. Jelani Carter and Joly Myers
41:50
produced the show. They're great. Our
41:52
sponsors bring you the show for free.
41:55
Great sponsors.
41:56
You guys are great because you listen to the show. This
41:59
is Recode Media.
41:59
We'll see you next week.