Episode Transcript
0:00
Hey
0:01
there. I'm Osman Faruqi, culture editor
0:03
at The Age and Sydney Morning Herald.
0:05
Today, I'm pleased to say we're sharing an
0:07
episode of our sister podcast, The Drop.
0:10
It's a weekly show diving into the latest
0:12
in the world of pop culture and entertainment, featuring
0:15
culture writers and critics from The Age and the
0:17
Herald, and it's hosted by me,
0:19
Oz. AI,
0:21
it's all the rage at the moment from
0:24
business to art to journalism and
0:26
even TikTok. Artificial
0:28
intelligence is being used to create sometimes
0:30
hilarious and sometimes downright
0:33
terrifying content. But
0:35
is it something to fear or is
0:37
it just another tool to embrace, particularly
0:39
in the world of arts and culture where it can be incredibly
0:42
helpful for all sorts of different creators?
0:45
To help figure that out, I'm joined this week
0:47
by my colleague Jim Malo, a reporter
0:49
at the Age and the Sydney Morning Herald. Before
0:52
we get into the show, if you wanna hear more episodes
0:54
like this one, exploring the latest
0:56
in the world of culture and entertainment, just
0:58
search for and follow The Drop in your favorite
1:01
podcast app. Alright. Here's
1:03
the show. Jim
1:13
Malo, how you going? Welcome to The Drop.
1:15
I'm going great. Awesome. Thank you.
1:17
Good to have you on the show. So last
1:19
week, you alerted me to
1:21
this TikTok trend that I think is actually
1:24
a really interesting kind of confluence
1:26
of these different threads that are
1:28
playing out across social media and in
1:30
technology and in the arts. And the
1:32
trend basically it's gonna sound
1:34
a bit weird, but it basically involves
1:37
AI generated voices of different
1:39
US presidents being
1:41
kind of toxic to each other -- Yeah. -- in
1:43
a video game chat
1:46
room, a Discord type thing. Is that
1:48
basically right? Yep, a hundred percent. It's a really
1:50
fascinating sort of like look at
1:52
how AI voice generation is being
1:54
used currently. I mean, at the moment, it's a bit of like
1:56
a parlor trick, right? It doesn't really
1:58
seem serious and scary
2:00
yet. But I mean, yeah, that could
2:02
change.
2:07
Joe, you know I love you, but that beat was a bit mid.
2:09
That beat was so ass. You were definitely
2:11
asleep while you made that. Shut the
2:13
fuck up, Donald. What do you even know about making
2:15
sick beats? The 808 was out
2:17
of key. The melody sounds like something
2:19
a newborn baby would create
2:21
accidentally. That
2:22
clap sample makes me wanna kill you. And what is
2:24
it about these particular
2:27
clips, of these videos? It's sort of... it's
2:29
it's become its own universe in a
2:31
way. These things
2:31
are kind of ongoing. Well, I mean,
2:34
getting someone to explain how a joke
2:36
is funny is never really a fruitful task,
2:38
but there's something about this and
2:40
the way that it's... is it the people involved,
2:42
the fact that the AI is weirdly quite good,
2:44
the topics that they're talking about, all of the above,
2:47
that makes this so funny, to
2:49
the extent that it's
2:50
got, you know, millions and millions of views. Yeah.
2:52
I would say that's probably the case. I mean, like,
2:54
you're right that explaining a joke kind of ruins it,
2:56
and I feel like I've been ruining it for myself a little
2:58
bit over the past couple of days thinking about
3:00
this conversation. And,
3:02
yeah, it does sort of boil down to the fact that
3:04
it's like just kinda funny to hear like
3:07
US presidents argue with each other about
3:09
just, you know, random stuff that you might, you know, I'm
3:11
not sure if you're into gaming. I was, but
3:13
if you're, like, in a voice chat, like, you're
3:16
always, like, you know, needling each other and having a go.
3:18
And, like, it's it's funny to see these dynamics play
3:20
out with the presidents as the actors. And
3:23
I just think it's generally funny if you are
3:25
like a member of, like, a subculture to
3:28
sort of take like an established character
3:31
and then sort of insert them into your
3:33
subculture and see how they might react and
3:35
and what positions they might take. So for
3:37
example, one of the TikToks
3:39
I showed you was the
3:41
presidents playing Fortnite and
3:43
talking about who is the
3:45
best, like, rapper, right? Is it J. Cole or
3:47
is it Kendrick Lamar? J. Cole
3:50
doesn't have a single piece of work that can go toe
3:52
to toe with TPAB or good kid,
3:54
m.A.A.d city. Stay
3:55
crying. Come
3:56
back to the
3:56
chocolate factory, you delusional Oompa Loompa.
3:59
Cole is hip hop. Oh, yeah. Kendrick
4:01
puts on a dumbass voice for a song and
4:03
you call him a genius for it. And how many Pulitzer
4:05
Prizes does J. Flop have? Get
4:06
out of here. Yeah. That's what I thought. How about Macklemore,
4:08
guys? Macklemore is so cool. I love
4:10
Thrift Shop. I got it on my iPod.
4:13
And you just like it hit a number of, like, pleasure
4:15
centers for me. But, like, it just, like, sort
4:17
of, like, stuck with me because it was, like, it was interesting
4:20
to see, like, who they characterize
4:22
as having what opinion and,
4:24
you know, how they sort of, like, played
4:26
off each other. And one of the key
4:29
things that I that sort of made me wanna
4:31
write about it or or talk to you about it
4:33
was that you know, it's it's kind of fascinating
4:35
to see how they, like, choose to characterize someone
4:37
like Trump, who's, like, you know, very... So who's
4:39
his favorite rapper in the video? Yeah. He's
4:42
a Kendrick fan. Obviously, like, he calls them.
4:44
Yeah. Yeah. And it's funny because, like, they position
4:46
Trump as, like, being sort of, like, morally
4:48
and, like, depending on who you ask, like,
4:50
objectively correct in a sense. Yeah. And
4:52
then they, like, there was, like I don't know.
4:55
It's the thing that, like, I'm really interested
4:57
in is
4:57
this. The way they're sort of, like, whitewashing Trump's image.
4:59
Right? Like, I know I'm getting a little bit off course here, but,
5:02
like, I'm just sort
5:02
of There's an extra layer to it. Yeah.
5:04
It's like you're like, oh, yeah. He's right. This
5:06
is a guy arguing really coherently
5:09
about why Kendrick is the greatest rapper of all
5:11
time, but you're like, hang on. That's Donald Trump's
5:13
voice -- Yeah. -- really odd. Yeah. It's bizarre,
5:15
isn't it? Yeah. And it's I think
5:17
it, you know, it speaks to, you know, his position
5:20
in the wider political landscape at this point because, you
5:22
know, he's in some ways, like, politically
5:24
emasculated. Right? He doesn't really have the power
5:27
that he had, you know, just, like, was it three,
5:30
four years ago? Yeah. And, you know, he becomes
5:32
a bit of a punch line, but it's interesting to
5:34
see how that will sort of play out in the years
5:35
to come. Yeah. And
5:36
it's not just current contemporary political
5:38
figures. Like, Obama pops up in
5:40
these videos.
5:41
And George Bush pops up as well. Obama's characterization
5:43
is by far the funniest to me. Drake,
5:45
I just wanted to say I love your music. Michelle
5:48
and I used to get turnt to Hotline Bling while raining
5:50
drone strikes on hospitals.
5:52
Obviously, there's quite a lot of criticism of his
5:54
like track record when it comes
5:57
to wars. And in these
5:59
TikToks, they're constantly sort of characterizing him as someone who,
6:01
like, has, you know, a very sort of itchy
6:04
trigger finger. He's always, you know, bombing just
6:07
random people overseas. And, yeah,
6:09
it's it's funny to see that he isn't
6:12
off the hook for
6:12
his, you know, his, like, political record. It's really interesting.
6:14
So it feels like... I
6:17
guess this is what I'm interested in. To really
6:19
get every layer of
6:21
this joke, you have to sort of be
6:24
pretty aware of current online gaming
6:26
culture -- Yep. -- have some pretty
6:28
deep thoughts on the state of hip
6:30
hop and who the GOATs are, and
6:33
to have a pretty specific
6:35
sense of the politics of
6:37
the last twenty five
6:40
years of American presidents. And,
6:42
you know, sometimes with jokes, the specificity
6:45
is where the real sweet spot lies,
6:47
like you said, activates these pleasure centers because
6:49
if you are at that Venn diagram
6:51
of loving all of those three
6:52
things, or being interested in them, as I
6:54
think you are,
6:55
as I am, as clearly millions of people are, it's almost
6:57
like that joke has been created for
7:00
you. Yeah. And I think that's part of
7:01
the, like, the the joy in these memes.
7:03
I know, like, you know, memes like,
7:06
it may be controversial, but I do consider
7:08
them to
7:09
be, like, artistic in a way. That's
7:10
not controversial. No way. Of course. Excellent. Wonderful.
7:12
This is a safe space. I also think that
7:14
they can be you know, political cartoons in
7:17
a way as well, which political cartoonists would hate
7:19
me saying. Maybe they should get better at making memes. Yeah.
7:21
Well, they should pick up some AI generation tools.
7:24
They also probably would hate me saying that too. But, you
7:26
know, I think it sort of speaks to this ability,
7:28
like it's the democratization of
7:30
art. Like, the barrier to getting
7:32
involved in art constantly gets lowered and therefore
7:34
more people have the opportunity to become a part
7:36
of it. You know, niche Internet
7:39
communities have always been making memes, you
7:41
know, since memes were a thing and since we decided that's
7:43
what we want to call them. And I
7:45
really think that, sort of, the more the bar
7:48
gets lowered, the more people can, you know, get involved
7:50
and the more, you know, pleasure centers
7:52
you know, the memes can hit, or, you know...
7:55
That's really interesting. Yeah. To kind of make stuff
7:57
like this five, ten, twenty
7:59
years ago, required a very specific
8:01
set of tools and probably a lot of money and resources
8:03
to make something like this. But when you can use
8:06
AI and TikTok video editing,
8:08
the kinds of people who make
8:11
art
8:11
shifts. Is this basically what you're saying? Yeah. Yeah. And
8:13
which I think is fascinating, but there are,
8:15
like, some questions about how
8:18
that undermines, like, the act of creation
8:20
itself when when it comes to, like, making
8:22
art.
8:23
But so far, I mean, like, for
8:25
me scrolling through my TikTok feed, it's, you know,
8:28
an unambiguous plus. You're
8:31
right that there are bigger themes here.
8:33
Like, we're talking about how funny
8:35
and engaging and political
8:37
and interesting these tools can be. There's obviously
8:40
also some other consequences as well, but before
8:42
we get into that, I think a really interesting example
8:44
of how democratized this
8:47
is and how easy it is to make
8:49
art using
8:50
AI. Like, you've got an example, an
8:52
app that someone you know has made.
8:54
What they've created is this AI
8:56
sort of rapping bot. So effectively,
8:59
all you have to do is put in like a prompt of what you want
9:01
the song to be about. And you can also put in some
9:03
sort of, like, markers of what you
9:05
want the beat to sound
9:06
like, and it will spit out like a very short like
9:08
four bar verse.
9:09
Can we have a look at it? Can we do it now? Yeah.
9:11
Absolutely. So what do you wanna hear? What sort
9:14
of input are you gonna give for the beat? Trap
9:16
beats work well.
9:17
Cool. Let's make it a
9:17
trap beat. Okay. Is there anything
9:20
in particular you want the subject to be? Yeah.
9:22
I think I want the subject to be you.
9:24
I want it to be: is Jim
9:26
a good
9:27
journalist. Oh my
9:28
god. So
9:29
you're just typing that phrase into
9:31
this app? Yes. Against my will. "Is
9:35
Jim a good journalist." With questions,
9:37
I don't think it does so well. I did a good one the other
9:39
day that was like,
9:41
I ate too much chicken feet at yum cha, and
9:43
like that was actually pretty good, but it couldn't say MSG.
9:45
But anyway, here we go.
9:53
Trying to tell the truth, but
9:55
not enough. He's a man of
9:57
interest twenty. That's what
9:59
he knows. He's never afraid
10:01
to take a stand. No matter
10:04
the cause. He's a voice of the
10:06
people and will never be lost.
10:08
Wow. I mean, it definitely does feel like
10:10
it applies to you. Thanks, man. I appreciate
10:13
it.
10:14
Yeah. So this is in beta, I just
10:16
will add. So some of the words come out a little bit odd,
10:18
but it's pretty interesting technology. And that
10:20
was quick. You said it takes a second. That took, like,
10:22
a second. Yeah. Pretty much. It's I
10:24
would say, less than a
10:25
minute, like, about thirty seconds maybe. Yeah.
10:27
It's very interesting. And I think what is interesting about
10:30
that and the TikTok videos
10:32
we were talking about is, like -- Mhmm. -- both of these,
10:34
as we were saying, are pretty irreverent examples
10:37
of using this kind of technology, using
10:40
this kind of approach to making a meme
10:42
or making art, making a political
10:44
statement. But
10:46
They're also part of a much bigger conversation
10:48
about where we are going with
10:50
AI and tech and --
10:52
Yeah. -- this blurring of fact and
10:54
fiction. When you're watching those
10:56
videos, we're talking about the presidential ones.
10:59
And when you're seeing people make raps
11:01
like this, I mean, is there any part of
11:03
you that thinks as funny and wild as
11:05
this
11:05
is, this also has the potential to be kind
11:07
of terrifying? Yeah. Look,
11:10
I I feel at this point,
11:12
it's like it's very interesting to see like
11:14
which areas of life are being
11:17
affected by it first. Because
11:19
you would think that AI would be
11:21
a tool. Right? And not necessarily a
11:24
sort of like a play thing. We're sort of at a stage
11:26
where it's like, it sort of is seeping into
11:28
art first because it has little consequence.
11:31
Right? Like, in what is being done
11:33
with it. You know, art does have consequence,
11:35
and it matters, I would argue it matters
11:38
and, you know, when I was going on my tangent
11:40
about the characterization of Donald Trump. Right?
11:42
Yeah. It matters because, like, it does influence
11:44
how we see the world and how we think about things.
11:47
But the fact that you can just like spit out something
11:49
funny to show your mates or your
11:51
niche Internet community. It's
11:53
a fun thing you can do and, you know, you can just
11:56
sort of you know, enjoy yourself
11:57
and, you know, make others
11:58
happy with that. Yeah. It's interesting that when ChatGPT
12:01
was released, that's the kind of service
12:03
where you can sort of type in prompts and
12:06
it can create essays, and there was a little bit of
12:08
a panic about what that means for academia
12:10
or journalism. Yeah. But
12:10
everyone I know just used it to make, like, gags.
12:13
One of our colleagues actually wrote an article
12:15
using ChatGPT. And, like, you know, it's
12:18
weirdly good, but, like, at this point,
12:20
it really only, like, puts up a facade of intelligence
12:23
and it's just a mimicry of it. And,
12:25
you know, in in some cases, that's enough.
12:27
Right? But in most cases, it's not really
12:29
developed enough to be a serious tool,
12:32
I think. I think it's a good jumping off point and
12:34
can give you the frames to to build
12:36
something greater, but it's it's not really
12:38
anything more serious than that. And I think that applies to
12:40
art as well. I mean, like, memes are, you know,
12:42
low effort, low reward. So
12:44
it doesn't matter that, you know, our AI presidents
12:47
sound a bit weird or, you know, the rap I made
12:49
in thirty seconds sounds a bit weird. It can't pronounce
12:51
integrity. Right? Like, you know, that's not
12:53
the end of the world, but I feel like we're still a few
12:55
steps before we have to start worrying about it, you know,
12:57
taking our jobs or I think the concern
12:59
is, you know, propaganda. Right? Like,
13:01
it lowers the bar for people to
13:03
create
13:03
propaganda. That, I think, is the most interesting.
13:05
Like, the ChatGPT thing hasn't
13:08
really worried me. Like, academics seem
13:10
slightly stressed about kids faking
13:12
essays. I mean, like, kids have found ways to
13:14
cheat on essays forever.
13:16
Absolutely.
13:16
Who cares? Yeah. Absolutely. Who cares? University
13:19
is a joke. Sorry if you're studying while you listen
13:21
to this. It's weird.
13:23
It's funny that, like, the first thing that's freaked me
13:25
out about all of this stuff is these TikTok
13:27
videos
13:28
about Donald Trump rapping, Donald
13:30
Trump talking about his favorite rappers
13:32
-- Yeah. -- because it's so
13:34
realistic. And I guess, like, there are two different
13:36
issues here. It's worth separating these things out because we're kind
13:39
of alluding to both of them, but, like, let's tackle them
13:41
one at a time. The first one is kind of this
13:43
use of deep fakes, which obviously
13:45
has the ability to create humorous videos
13:48
but also could have political and social consequences.
13:50
Yep. And then there's a separate issue of art
13:53
and what it means to have tools
13:55
and artificial intelligence that can generate a lot
13:57
of art very, very quickly. Let's talk
13:59
about deep fakes first, like aside
14:01
from the fun stuff we've been
14:03
discussing. There's all
14:05
sorts of examples that have already gone
14:07
viral from kind of more nefarious
14:10
or weird ones. Like, this is one I remember:
14:12
Joe Rogan, the podcaster, in an
14:12
AI-generated clip is debating Ratatouille
14:14
with Ben
14:17
Shapiro, like another conservative. I
14:20
just think I would love to get Ratatouille'd.
14:22
Ratatouille'd? Like, have a little guy up there, you
14:24
know, making me cook delicious meals. No. I understand.
14:27
I've seen the film. You wouldn't want that? A little
14:29
guy pulling your hair, making you cook. It's a ridiculous
14:31
concept. You're telling me you wouldn't want that? I just
14:33
don't know why you're bringing it up as if it's something that
14:35
could actually happen. You wouldn't want a little ratatouille
14:38
guy up there doing whatever? It's insane that
14:40
a rat would even be able to cook in the first place. He doesn't
14:42
have to be cooking. In the movie, he's cooking.
14:44
Yeah. But in this scenario, though, he could
14:46
do other stuff. Yes.
14:47
It's so funny. And I think the fact that
14:49
it was so believable, it was so
14:51
credible, that was so good because,
14:53
like and, like, that's another thing. Right? Like,
14:55
the characterization of Ben Shapiro
14:58
in these AI videos is so
15:00
good. Like I'm
15:00
sorry, I know this is not talking about the issue, but it's just it's
15:02
hilarious to see how people choose to present them.
15:04
Totally. And I think that still falls into the category
15:07
of, like, more humorous than nefarious.
15:09
But
15:09
then very quickly, like, a
15:11
a quick comp to that is Emma Emma Watson
15:14
you know, from Harry Potter and various other things,
15:16
her voice was simulated to have her reading
15:19
out of Hitler's Mein Kampf, which
15:21
you can sort of see if you didn't
15:23
understand the context, or if that
15:25
originally popped up on Reddit and then
15:27
you saw it on Twitter, you would think, what the hell
15:29
is going on? Yeah. I mean,
15:32
Do you think we're heading into a
15:34
political election campaign in the states
15:37
where there's already so much concern
15:39
about fake news
15:40
and misinformation, like, what does something
15:42
like this add to that?
15:45
I think it adds, like, a layer of danger. Right? Like,
15:47
because, like, you know, as we well
15:49
know, the initial untruth often
15:51
spreads further than the actual
15:53
correction. Right. You know, imagine if these
15:55
deepfakes had existed when Donald Trump was in office.
15:58
Yeah. You
15:58
know, he was constantly saying that things he,
16:00
like, very obviously said were not true. Yeah.
16:02
How could you tell the difference? Exactly. Yeah. It gives
16:04
it gives people like him more cover to sort of, you
16:06
know, perpetuate untruths or push propaganda,
16:09
you
16:09
know, he could be reading from Mein Kampf and then
16:11
he could be like, oh, actually, that was a deepfake.
16:12
Yeah. That's really interesting. And, you know, I feel like
16:15
there is, like, a real danger of
16:17
that becoming a problem. And, like,
16:19
that's concerning, right, because, like, you
16:21
know, there are real dangers in,
16:23
you know, the policies and the
16:25
the positions. And it creates like
16:27
this sort of environment where everyone else will be
16:29
playing catch up, you know, trying
16:30
to, you know, dispel the untruths or, you know,
16:32
sort the fact from fiction. So I think
16:34
there are real, like, real issues
16:37
with it. I think the most interesting thing
16:39
here is what you are willing
16:41
to consider as possible
16:43
or viable out of this. Like, if
16:46
a video gets released of Joe
16:48
Biden reading Mein Kampf, for example,
16:50
it's, like, that is
16:52
so obviously, like, a deep fake
16:54
that it's not really much of a threat to his
16:56
political career. Yeah. But as you mentioned, someone like
16:58
Donald Trump, it's like, you don't know if that could be real
17:01
or not. So maybe the debate
17:03
over, like, deep fakes and what sort of impact
17:05
that could have on our perception of politics
17:07
and campaigns, is actually a bit of a fig leaf
17:10
for just like pointing to
17:12
the kind of deeper risks and challenges
17:14
of having certain candidates. Like, you would
17:16
imagine, even in a world with deep fakes,
17:19
that if someone says something so out there,
17:21
it's obviously fake. But then I wonder, like, what
17:23
about those instances in the middle, where it
17:25
is not Joe Biden reading from Mein Kampf,
17:27
but it is him announcing the
17:29
placement of military warships
17:32
-- Yeah. -- you know, in an area that is contentious,
17:34
so it's possible, but he
17:36
hasn't actually done it, but a video surfaces saying
17:38
he's done
17:39
it, and then another country reacts.
17:41
don't know if we're gonna get, like, a Cuban
17:43
Missile Crisis out of deepfakes. Like, is that
17:45
sort of what you... Yeah. I'm just asking the question. I'm
17:47
just asking questions here too. Not
17:50
to go full Tucker Carlson. Yeah.
17:51
Yeah. So true. But why wouldn't we get that?
17:53
Like, if
17:54
those are the voices we're talking about -- Yeah. --
17:56
if you don't look at the video, and you just
17:58
listen to Donald Trump and Joe Biden arguing
18:00
about Kendrick
18:01
Lamar. It sounds like Donald Trump and
18:03
Joe Biden arguing about Kendrick Lamar. Yeah.
18:05
Yeah. And I think that's the thing, right? It's like
18:07
the way the voices sound has
18:09
gotten so, like, scarily good in
18:11
such a short space of time. Like, there's a noticeable
18:13
difference between the early ones and
18:15
the late ones. I think the first one I
18:18
saw was Joe Biden
18:20
saying, like, a whole bunch of unprintable
18:22
things about, you know, smoking
18:24
weed. Right? Right. And he's talking about this specific
18:26
brand of, like, you know, noobie and Cush or whatever.
18:29
It's just some absolutely outlandish thing,
18:31
and he says a bunch of, like, awful and racist things.
18:33
But also, like, the idea
18:35
is to mimic, like, a specific type of
18:37
person. Right? And, like, sort of, like, put it out and,
18:39
like, it was so stilted and unclear
18:41
that, like, obviously, it was a deepfake,
18:44
but it was still weird to hear his voice saying it. But now
18:46
we're getting to the point where they actually sound like they're having
18:48
real conversations, you know, the intonations,
18:51
the way they interrupt each
18:52
other. You know, what's to say that this conversation
18:54
isn't a deepfake? Well, that's the great reveal
18:56
of this, because it turns out neither of us is real.
18:58
It's all AI generated. Yeah. And I guess
19:00
the other thing and you were alluding to this before in terms
19:03
of trust in media, like, where these things
19:05
emerge. Like if Joe Biden is gonna put
19:07
a missile cruiser
19:08
somewhere, he's not gonna announce that on
19:10
TikTok. Or on, like, Discord. No.
19:12
No. But I mean, so
19:14
the Pokémon one, which is unprintable... I sort
19:17
of keep bringing it up because I'm sure everyone's really
19:19
You can Google it if you want, but I'm not gonna talk
19:21
about that. I'm not even gonna give you the information to Google
19:24
it. That came with a video that looked like
19:26
him speaking at a rally, you know. And when
19:28
so when those videos, like, the real ones
19:30
actually make it to TikTok, like, you know, that's how they
19:32
look. Right? They look a little bit out of focus, and they look a little
19:34
bit weird. And they
19:36
still make it onto the platform. So it
19:38
wouldn't necessarily be like he's announcing it, but it'll look
19:41
like it had been announced somewhere else. Right. And
19:43
the problem is, do people actually verify things?
19:45
I mean, like I say, the TikTok says, Joe
19:47
Biden's putting, you know, warships
19:49
in, you know, say, the South China Sea. I'm
19:52
sure Xi Jinping is not going to be, like, I saw
19:54
some TikTok. I'm gonna go see it later. Like, he's
19:56
like, he's gonna, like, call the ambassador
19:58
or whatever. Like, I don't think in those situations,
20:01
it's as much of a clear problem as it is,
20:03
like, sort of this, like, lifting
20:05
the baseline of sort of, like, tension within, like,
20:07
a, like, a country is probably
20:09
more the propaganda usage of it than it
20:11
is something else. Yeah. No, fair enough. I feel
20:13
like I'm less optimistic
20:16
than you are about everything.
20:19
Yeah. I just think, like, in
20:21
Pakistan right now, there's a huge political crisis
20:23
because these audio tapes have been
20:25
leaked allegedly
20:26
with, you know, former prime minister Imran Khan
20:28
on
20:28
them talking about corruption, apparently
20:31
some stuff to do with him having affairs.
20:33
And that's the thing: you remove the video of it and it's just
20:35
audio that seems real. In an election
20:37
campaign context, like you said, it
20:39
you know, it just pops up somewhere and a million
20:41
people might share it. Yeah. They might issue a denial,
20:43
and it might even later be
20:45
publicly recognized by some independent
20:47
agency that this was a deep
20:49
fake. That's still like millions of people
20:51
who heard that thing first. Yeah. Yep. That's a good point. And
20:53
I mean, in a domestic
20:55
election campaign, that's definitely a few
20:57
notches, you know, less important than,
21:00
you know, say, like, two nuclear-
21:02
armed states going at each other. But that could be a
21:04
problem in somewhere like India and Pakistan,
21:06
where, like, they are two nuclear
21:08
armed states that are very, you know,
21:10
there's quite a lot of tension there. And, you know, something
21:13
happening there could be very, very
21:15
scary. That's a very scary and very
21:18
important thing to sort of, you know, point out
21:20
that the world is larger than our little backyard.
21:22
And, like, there could be implications out
21:24
there for sure. And I think it absolutely
21:27
would have the power to swing a domestic election
21:29
campaign. I'm not sure... or maybe not swing, but, like,
21:31
affect it in some way. Yeah. Affect it, at least.
21:33
I think it's interesting that it took
21:35
me watching these hilarious TikTok videos to
21:37
think about it. And I just wonder, like, whether
21:40
it'll take something significant happening
21:42
before everyone realizes, oh, wow. We should
21:44
really have kind of figured out how to deal with this.
21:46
Yeah. The other thing I wanna get your perspective on
21:48
before we wrap up this really interesting,
21:51
slightly terrifying conversation is
21:53
the arts. Like, there's a debate
21:55
within the arts community -- Yes. Yep.
21:57
-- about the role of artificial intelligence.
21:59
Like, a lot of artists I know, and,
22:02
you know, our colleague Elizabeth Flux, who's an arts editor
22:04
here at The Age. She's written a fair
22:06
bit about this. You know, AI
22:08
is another tool, right? It can be used
22:10
to create new work.
22:12
It can be used to help you use
22:15
technology in a certain way, in a way that all
22:17
sorts of technologies are used to create art these
22:19
days. But then there's another side
22:21
to this argument that says, well, no,
22:23
these kinds of tools are
22:26
derivative, and they
22:29
devalue art by their ability to create
22:31
so much stuff of unknown
22:33
quality so quickly. Where
22:36
do you sit on this? Or what do you think about this
22:38
debate?
22:39
Yeah. Look, I don't think artists will
22:41
ever like, you know, a human artist
22:43
will ever fully be irrelevant.
22:45
I think, like, that should be fairly, you know,
22:49
uncontroversial. There's something like beautiful
22:51
about, like, expression that springs from nothing,
22:53
that is made with your own hands or, you know, whichever
22:55
tools you have available. And sort of like,
22:57
you know, removing the steps in between
23:00
your fingers and, like, you know, like,
23:02
actually making a beat on, you know,
23:04
in Pro Tools or whatever is, like,
23:06
removing some of that magic. Right?
23:08
Some of the decisions or the choices you make, which is
23:10
like, a part of, like, the creation of art.
23:12
Right? Mhmm. But I don't necessarily
23:15
think that it will be
23:17
great for artists either because, you know, artists, as
23:19
we all know, like, generally, unless
23:21
they're, like, you know, touring the globe,
23:23
doing sold-out shows, or selling
23:26
art into, you know, visual art, where
23:28
there's weird, you know, collectors' sort of circles
23:30
like this. Unless you're one of those people, you
23:32
don't belong to the club. Yeah. You're not really well compensated.
23:35
Yeah. So there is like a bit of concern that
23:37
I share, you know, within my
23:39
circles, like, that
23:42
like, artists will just sort of no longer
23:44
be able to actually get paid for the
23:46
sort of meagre things they do get paid for. So say
23:48
you're an artist writing jingles for like,
23:51
for ads, you know, that is
23:54
a way you can make money on the side when you might
23:56
be doing something that you'd rather be doing. Right?
23:58
You know, expressing yourself more freely. If
24:00
a big company has the ability to generate
24:02
that in, you know, a few moments, or generate
24:05
hundreds of options within a few
24:07
moments, they don't really have the need
24:09
to pay, like, a suffering artist to
24:11
to create something,
24:13
you know, unless they're looking for a bespoke
24:15
solution. Right? So you know,
24:17
that's a concern. It, like, sort of closes
24:19
the avenues for artists to make a living
24:22
off their art and, you know, keep practicing. And
24:25
also, there's the question of concentrating the
24:27
ability to make art within the hands of
24:29
a few. So say you're one of these companies who have
24:31
made, you know, these AI art generators.
24:34
You then wield an enormous amount of power because
24:36
you can create as much art as you like. You
24:39
you hold the means of production, as it were,
24:41
to actually create these things.
24:43
And I I think
24:45
history will show that the the consolidation
24:48
and the monopolization of of any
24:50
sort of, you know, platform or
24:53
or tool is generally bad.
24:55
It doesn't necessarily
24:57
create an environment where there is lots of innovation
24:59
there is lots of boundary pushing, genre
25:01
pushing. Yeah.
25:02
I think that's smart. I think it's right to think
25:04
about it from the perspective of who controls
25:06
these
25:07
things. Yep. Yep. And to what end are
25:09
they using them for? Yeah. Like, there is a world in
25:11
which this kind of technology is
25:13
used to liberate people from very
25:15
boring jobs and tasks and to help them
25:18
create exciting new stuff. But
25:20
under our current sort of
25:22
structures and economic situations,
25:25
they'll overwhelmingly be used to minimize
25:27
costs by, you know, laying people off
25:29
and extract as much value and revenue
25:32
for the people who own them as
25:33
possible. Yeah. And I think, you
25:35
know, you don't even have to go far back history
25:37
to actually see an example of that happening. Right? Like,
25:39
the dawn of the computer. Exactly. Supposed to
25:41
be, like, you know, oh, we'll all work for four
25:43
day weeks, you
25:44
know, three hours a day, and that'll be it. You'll be able to
25:46
log off and go home. That's not what happened. No.
25:47
We're working twelve hours a
25:48
day, seven days a week. Yeah. And it's getting worse. Exactly.
25:51
And I think that's what
25:53
makes me very pessimistic about the application
25:56
within businesses of AI
25:58
art and AI more generally. Right?
26:00
But then there's also on
26:02
the the positive side, like it it lowers
26:05
the the bar for people to enter, you know,
26:07
artistry. Right? Like, there's no way in hell I
26:09
could make a a beat that quickly and,
26:11
you know, come up with a, like, a four bar. Right?
26:14
And I think that, you know, lowering
26:16
the bar for that, like it gives people who
26:18
have ideas, but maybe not necessarily the skills,
26:20
the ability to express themselves. And I think
26:22
that will create and
26:24
that's not even going into, like, you know, the
26:26
AI-generated rappers, which are mired in controversy.
26:29
Right? But, like, I'm just saying,
26:31
like, some disembodied and,
26:33
you know, sort of, like, anonymous voice.
26:36
You know, it gives you the opportunity to create something
26:38
that may not necessarily be, like, a top
26:40
forty hit or something that's
26:42
going to change the way we, you know, like,
26:45
the way we think about the world, like, say, like, To Pimp
26:47
a Butterfly did once.
26:49
You know, like, those things may not be possible
26:52
yet or at all with, you know, AI
26:54
generation. But it does get people
26:56
who lack the skills, like, you know, the ability
26:58
to get started or to at least express themselves
27:01
in a way that they wish to. So I don't necessarily think
27:03
that that's a terrible thing. I think that's I think it's quite
27:05
interesting. It's worth considering, but
27:07
unfortunately, we don't get to pick and choose which
27:10
social changes we get. Totally. That's a
27:12
really interesting and succinct
27:14
place to end a conversation that is
27:17
probably really just starting when you think about
27:19
how we as a society across the
27:20
globe, think about this stuff. Jim Malo,
27:22
thank you so much for your time today.
27:24
Thanks for having
27:24
me.
27:28
This episode of the drop was produced by
27:30
Chi Wang. We'll be airing episodes of
27:33
the drop here every Saturday, so you can hear
27:35
more from the age and Sydney Morning Herald's
27:37
culture team on the latest in the world of pop
27:39
culture and entertainment. And if
27:41
you enjoyed listening to today's episode, you
27:43
can also follow the drop in your favorite
27:45
podcast
27:46
app. I'm Osman Faruqi. See
27:48
you next week.