Episode Transcript
0:00
ABC Listen, podcasts,
0:02
radio, news,
0:04
music and more. Hello?
0:09
Can you hear me?
0:15
Or are you just with Optus? Yes, this week on
0:17
Download This Show, how did one of Australia's
0:19
biggest telecommunications companies
0:22
leave millions of customers with no phone
0:24
service or access to the internet for several
0:27
hours? Also on the show, the crypto
0:29
king is found guilty of fraud
0:32
and Facebook's new rules on fact-checking
0:35
AI ads. All of that and much more coming up.
0:37
This is your guide to the week in media,
0:39
technology and culture. My name
0:42
is Mark Fennell and welcome
0:43
to Download This Show.
0:46
Yes, indeed, it is
0:48
a brand new episode
0:52
of Download This Show.
0:58
And a big welcome to Jessica Sier, tech reporter
1:00
for the Australian Financial Review. Don't laugh at me, you
1:02
already put the show in and started and you're laughing
1:04
at me. Hello, welcome to the show. And a
1:07
newly de-mustachioed Alex McCauley,
1:09
the founder of the Tech Council. Welcome
1:11
back. Thank you, Mark. I thought one of the things
1:14
about radio was you didn't have to talk about your appearance.
1:16
Some of us like to objectify the guests and today
1:18
it's you. It's lovely to have you both
1:20
here in studio today. And good news that you actually
1:22
got the message to be here given
1:24
one of Australia's biggest telecommunications
1:26
companies was out not that long ago. Much
1:29
hand wringing has been done around the Optus outage
1:31
of last week. To the best
1:33
of our knowledge, and I should say that we do record
1:35
this a little bit earlier than it goes to air, so
1:37
this information may change over time. But to the best of
1:39
our knowledge, Jess,
1:41
give me the tick tock of what happened on the day. Well,
1:44
none of our phones worked. Some of the
1:46
trains in Melbourne didn't work. No
1:49
one could call triple zero for a while. Lots
1:51
of payments couldn't be accepted in cafes
1:53
and restaurants around the country. As far
1:55
as we understand, it was a network,
1:58
a technical network fault.
2:00
I don't know what that means. And I dare say
2:02
lots of people don't. Do
2:04
you know? I think you're from the tech
2:07
council. I definitely
2:09
don't know. When something like this happens,
2:11
where do they start with an investigation? Like just
2:13
in general terms? I presume Optus
2:16
knows more about what the fault is than they've said to the media,
2:18
or at least than I've read about, and they
2:20
will no doubt cooperate with
2:23
investigators because it's in their interest to do so. But
2:26
I imagine it will take some time to really get to the bottom of where,
2:29
why, and how to stop it from happening again. How
2:31
did you evaluate the communication, Alex?
2:34
From Optus?
2:35
Well, it was slow to come.
2:38
It's almost like the people that work at Optus Comms had
2:41
Optus phones. Well, exactly. If
2:43
you could receive the communication, you know, you
2:46
were doing pretty well to start with. There
2:48
have been a few times over the last year or two where
2:50
Optus has been under the spotlight for their communications,
2:53
and I don't know if they covered themselves in glory yesterday
2:55
either. What do you think, Jess? I think it was awful,
2:57
I think, to not get on the front foot
3:00
of an event like this, which affected
3:02
so many people. Like, it's so different if like your
3:05
app doesn't
3:05
work, or your cafe doesn't open,
3:07
or your gym, or whatever.
3:09
This is like a piece of critical infrastructure
3:11
that affects so many lives,
3:13
and just the operations of the country,
3:16
and to not have clear communication
3:20
so that the media can do their job and broadcast that
3:22
to more people. I think it was really
3:24
like a failing. The way that it's handled, and I
3:26
don't know how much your listeners are interested
3:29
in
3:30
how
3:32
the media sausage is made, but the idea
3:35
of like the executive going out and speaking to
3:37
individual media outlets at disparate times
3:39
during the day and giving out bits of information
3:42
is just so unhelpful.
3:44
Not just for journalists trying to do their jobs, but
3:47
for customers trying to figure out, all businesses
3:49
trying to figure out whether or not they're going
3:51
to be online in the next 24 hours
3:53
or something. I think it was a real failing. We should have learned
3:55
something a little bit from the pandemic where
3:57
actually there is benefit in, like, a communal press conference
4:01
where everyone finds out. That's all they had to do is
4:03
hold a press conference
4:04
and then all the press gets the same message at
4:06
the same time and then disseminates
4:08
it for their audiences. It's just not a
4:10
tricky thing to manage. And the thing is
4:12
because the outage went on for so long
4:15
and there was no information forthcoming,
4:17
the gap of information just gets filled
4:19
with speculation and you just kind of
4:21
want to manage that I think. When
4:23
this happens overseas, I know there was
4:25
a I won't say the same story, but a
4:27
story with some similar touchpoints in Canada.
4:30
I think it was some time ago, and
4:32
they have looked at enforcing a situation where
4:35
if a telco goes down, there's almost like a forced roaming
4:37
thing that happens where, if you go down, it's
4:39
almost
4:41
like global roaming. You get handed over to another telecommunications
4:44
company except this time it's not in another country.
4:46
It's the one in the same country. What's
4:49
stopping us from doing that?
4:51
Is there anything stopping us from doing that? I
4:53
presume just, I mean, firstly, brilliant
4:56
idea. Obviously it should be there. This
4:58
is, as we just said, critical infrastructure
5:01
that can't be allowed to just be down all
5:03
day. And I
5:05
presume the only thing that's happening is competition,
5:07
you know, unless they're forced to do it. Why
5:10
would they do it? You'd have to figure that that must
5:12
come up in the forthcoming inquiry,
5:14
right?
5:15
Absolutely. Yeah, I think that is the answer.
5:17
I think telcos in Australia have been
5:19
really reluctant to share their spectrum
5:22
or their bandwidth with each other because
5:24
of the fear that it'll eat into their market
5:26
share. But I think, I mean, the government's made
5:29
so many strides in the last 12 months, two
5:31
years to ensure that Australia's critical infrastructure
5:34
is protected, particularly from nation
5:36
states and cyber attacks and things like this. The
5:39
idea that Optus had some
5:41
kind of single point of failure like this
5:43
is just, yeah,
5:46
I mean, they deserve the scrutiny. They deserve.
5:48
And perhaps that's part of the communications thing
5:50
is like, we don't want to have someone saying,
5:52
well, why did you have such a single
5:54
point of failure that seemed to cascade throughout the entire
5:56
business and you couldn't fix it for 12 hours? I mean, don't you war
6:00
game this kind of scenario? I
6:02
mean it's going to get more and more difficult for a
6:04
company of this size to avoid that kind
6:06
of scrutiny. Did it change how you feel
6:08
about our reliance on technology, Alex? No,
6:12
but it did make me sad
6:14
that it
6:16
will give a lot of ammo to people
6:19
who don't want us to be so
6:21
reliant on technology, don't want the world to move forward in that
6:23
way. I listened to the
6:25
Great ABC this morning on a bit of TalkBack Radio and
6:27
there were lots of people calling in saying, oh, I
6:29
only use cash and this is why. I think
6:31
that will give a bit more credit
6:34
to that kind of philosophy for a bit longer,
6:37
which I just think is annoying from my perspective. Others
6:40
might disagree. Did it change how you feel about
6:42
our reliance on technology, Jess?
6:44
No, it just changed the way I
6:46
feel about Optus as a manager
6:48
of critical infrastructure.
6:50
Most of
6:52
the bandwidth of these companies or the spectrum, as
6:54
they say, is data moving
6:57
around in packets across
6:59
all of our devices. The way that we do
7:01
our jobs, the way the economy is run
7:03
is across these networks
7:05
and through these channels. If
7:07
you want to make money by providing
7:10
the channels to all of us, you've got
7:12
to make sure those channels work. Whether
7:16
or not we use technology less,
7:18
it's kind of nice. If the
7:20
phones go down for a while, sure.
7:23
If you're in the bush and you're having a nice time or whatever,
7:25
but I'm a journo, man.
7:27
Every time I tried to make a call yesterday,
7:29
I couldn't do it and I'd have to go and Signal
7:32
it. What's hilarious is some of the people I was calling
7:34
were like, Jess, why are you
7:36
calling
7:36
me on Signal? Is something secret happening?
7:38
Because I was on the Wi-Fi. Oh, right. I was like, how
7:41
are you getting Signal to work? Yeah, but I'm calling
7:43
through all these other channels and
7:46
people were really sus on why I was calling on
7:48
these, usually quite surreptitious apps. I
7:50
suddenly became hyper aware of free
7:53
Wi-Fi spots. In that way
7:55
that you only ever do when you're like, overseas
7:57
backpacking. Right. Yeah, it had a very
7:59
backpacking feel to it. It's interesting, the
8:01
thing you were saying, Alex, about people
8:04
who are, let's
8:06
say, tech-averse, beyond the
8:08
fact that you're from the tech council. Why
8:11
does it bother you? It doesn't bother me at an individual
8:13
level at all. I think the point I was making
8:15
was really just, you know, there's always
8:17
resistance to things like going cashless,
8:19
et cetera. You know, I've loved this
8:21
sort of ubiquity recently of being
8:24
able to tap your card or tap your phone. I've
8:26
sort of been proud of Australia for being out in front
8:28
on that stuff when I travel to the U.S. and people are still paying
8:30
with checks. I'm like, so I kind of I
8:32
love that. And I love that we're sort of out in front
8:34
of it. And anything like this tends
8:36
to hold that back, I think. It's also
8:39
why I never know where my wallet is anymore. Download
8:42
this show is what you're listening to. It is your guide to the week
8:44
in media, technology and culture. I should say that
8:46
we are recording this a few days before it goes to air.
8:48
So there may be new information that comes out between
8:50
when we've recorded this and when it goes to air. So
8:52
just bear that in mind as you're listening. It is your
8:54
guide to the week in media, technology and culture. Jess,
8:58
the biggest story in crypto has been unfolding
9:00
in the U.S. and you have been talking about
9:02
it. Tell me what's happened. So
9:05
Sam Bankman-Fried is the founder
9:07
of a crypto exchange called FTX, which
9:09
was very, very popular until about
9:13
November last year when it collapsed.
9:16
I think it spectacularly collapsed. Spectacularly
9:18
blew up and about
9:21
eight billion dollars of customer money was missing
9:24
from
9:26
this exchange and the
9:29
founder and CEO of
9:31
the FTX exchange, Sam Bankman-Fried, was
9:33
found guilty of seven
9:35
counts
9:36
of fraud and conspiracy
9:38
to commit fraud. And
9:40
we're just sort of hanging about waiting to
9:42
see how
9:42
much time in jail he's going to get.
9:45
Time in jail is important, but also what happened
9:48
to the money? Through
9:50
that process, did we find out? Oh,
9:52
yeah. So FTX was run by a bunch
9:54
of like 28, 29, 30 year
9:57
old crypto people and they all came from
9:59
like quant trading land. So they were all...
10:01
Quant trading is like where you use really
10:03
fast internet speeds and
10:05
information to eke out tiny
10:07
little trading gains. And so they'd
10:10
set up this exchange. And in parallel
10:12
to this crypto exchange where we could all get on there
10:14
and we could buy and sell cryptocurrencies. And
10:16
what made FTX so popular in the first instance
10:19
was that you could trade on margin,
10:21
which means you could
10:21
borrow money to maximize
10:24
your trade. So if you had $100,
10:27
you could borrow, say $100 more. And
10:29
if you double your money, you double your money.
10:31
Right. But the problem with that is that you can
10:34
be wiped out as well. And so anyway,
10:36
so FTX is really popular because you could trade on margin
10:39
and so people could amplify their bets.
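To make that margin arithmetic concrete, here is a rough sketch in Python; the leveraged_outcome helper and the 2x figures are made up purely for illustration, not a description of how FTX actually priced or margined anything.

```python
# Hypothetical illustration of trading on margin: put up $100 of your own
# money, borrow another $100, and see what is left after repaying the loan.

def leveraged_outcome(own_capital, borrowed, price_change_pct):
    """Equity remaining after a leveraged trade, once the loan is repaid."""
    position = own_capital + borrowed                     # total size of the bet
    position_after = position * (1 + price_change_pct / 100)
    return position_after - borrowed                      # pay back what you borrowed

print(leveraged_outcome(100, 100, +50))   # 200.0 -> a 50% rise doubles your $100
print(leveraged_outcome(100, 100, -50))   # 0.0   -> the same move down wipes you out
```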
10:41
They also had a quant trading
10:44
firm, a hedge fund that Sam Bankman-Fried
10:46
and his geeky friends were running on the side.
10:49
The business of that hedge fund was only
10:51
to make money. They
10:53
were not very good at making money for
10:55
a period of time. And to plug
10:58
a massive hole in their balance sheet, they took $8
11:00
billion of FTX
11:02
customer money and brought it over to
11:05
Alameda Research. That was the hedge fund's name and
11:07
kind of
11:08
lost that. So that's
11:10
what we found out in the trial was exactly
11:13
how those mechanisms worked, how that
11:15
money moved across and the fact that
11:17
Mr. Bankman-Fried knew about it the
11:19
whole time. And he also owned Alameda Research.
11:21
Yeah. 100%. Yeah. It was his hedge fund. So
11:24
we can basically clock with an egg timer
11:27
how long it'll take for somebody to turn this into one
11:29
of those docudramas. I reckon it's underway.
11:31
Michael Lewis, the financial journalist who
11:34
wrote Moneyball, the book behind that movie, has
11:36
already published a book
11:37
because he was randomly tailing
11:39
Sam Bankman-Fried, or SBF as he's known in the
11:41
industry. He was tailing this guy for the 12 months before
11:44
the collapse even happened. So like on
11:46
the day... He
11:46
knew something we didn't. Well,
11:48
I think it was just like a crazy story, right?
11:50
This 29-year-old dude from like Stanford
11:53
University or MIT, but his parents were
11:55
Stanford professors, just all of a sudden
11:57
is like worth billions and billions
11:59
of dollars in the space
12:00
of two years. Like, wild. So, like
12:03
the greatest hits of everything that was terrible about tech bros.
12:05
All right, yeah. Has this changed
12:08
how you think people will take up and
12:11
engage with cryptocurrencies? Because cryptocurrencies
12:13
are unlikely to go anywhere, but
12:15
I do think you'd have to acknowledge it's gonna
12:18
take, it takes a hit on its confidence
12:20
level. Oh yeah, I mean, well, does the verdict
12:22
change it? I don't know, probably not. I mean, the $8 billion
12:25
loss. Yeah, the $8 billion loss has changed already.
12:27
I mean, crypto is, and web three even
12:29
have been off the agenda really
12:31
for the last year or so. It's really been a huge hard
12:34
crash since about Q1 last year. Okay,
12:36
so there's a shift in the center of gravity
12:39
in the tech world where once upon a time, everyone wanted
12:41
to talk about cryptocurrencies. And
12:43
I suppose I'd probably put non
12:47
fungible tokens in there as well and web
12:49
three. And now everyone's gone AI. Well,
12:52
not everyone, but there's a shift in people's
12:54
focus and attention.
12:56
Cryptocurrency is,
12:59
for lack of a better term, still a thing.
13:01
There's still a number of them out there. Some of them are
13:03
very stable. Some of them are not. What
13:06
we've got here is a situation where it's one, you
13:08
know, spectacular crash of an exchange.
13:11
Do we see other exchanges kind
13:14
of doing things to build confidence
13:17
in it as a marketplace?
13:20
I mean, I don't have any real visibility
13:22
over any of the other exchanges doing anything particular. I
13:24
think most of the people I speak to aren't
13:27
just talking about crypto anymore. They're not investing
13:29
in it. They don't invest in companies
13:32
that are doing anything with it. So I suspect it's just in
13:34
a bit of a hiatus in between, you
13:36
know, phases. Crypto has always
13:38
been pretty cyclical. You look
13:40
at the price of Bitcoin over the last 10 or 15 years. It's
13:43
got huge ups and downs. And this is definitely a sort
13:45
of down in interest and price. But
13:47
I don't think it's done, by any means.
13:50
Is it like punk? It's not dead, it's just gone to bed.
13:52
Yeah, a
13:52
bit like that, I think. What's
13:55
remarkable about this massive collapse and Sam
13:58
Bankman-Fried being found guilty of all these charges
14:01
is like he was found guilty of fraud,
14:04
which is old as time. Like whether
14:06
you're doing it with digital assets, whether you're doing it with
14:08
whatever widget, whatever market,
14:11
what he did was fail
14:13
to have any
14:14
governance controls or quality controls
14:16
on the
14:16
processes in a business. I
14:19
think the weird wash up of all this is
14:21
like all the crypto exchanges that are still in operation
14:24
who perhaps were running things a bit loosey
14:26
goosey are tightening
14:29
it up really, really fast because
14:31
they
14:31
have seen that this guy, you can be found
14:33
guilty of fraud even if you're operating in the
14:35
wild west of crypto.
14:37
We talk a lot about
14:39
there not being enough regulation for crypto
14:41
businesses, and in law they
14:43
aren't defined, and so, sure, that's an issue
14:46
or whatever. But when you're taking customer
14:48
money to plug your losses in another
14:50
business, that's pretty straightforward,
14:52
that's pretty black and white and
14:53
the law will find you. Download
14:55
This Show is what you're listening to, and speaking
14:58
of things about AI, Meta, the
15:01
company that owns Facebook and Instagram
15:04
announced an interesting new policy around
15:06
political advertising and AI. Alex, what happened?
15:09
Yeah, this is a really interesting story. So it's basically
15:12
Meta saying if you've used AI
15:14
to generate content, you've got to say so. You've
15:17
got to label it as having been made
15:19
with AI and so they're
15:21
envisaging people particularly in the lead
15:23
up to next year's US election.
15:26
They're envisaging people making
15:29
videos and images and stories about
15:32
stuff that didn't really happen, about a future
15:34
that they're imagining but which is
15:36
AI generated and they're saying
15:38
you've got to put a label on that so that people who
15:40
are looking at it know that it's not necessarily reality. Is
15:43
it specifically just stuff that's advertising
15:45
or is it all content? I understood it to be
15:47
specifically stuff that's advertising. Okay,
15:49
so here's why I'm curious about it, right? Because
15:51
when it comes to content around a US
15:54
presidential election, which I'm sure will be very
15:57
civil, I don't think the issue is necessarily
15:59
political. I think it's a good thing conceptually. I
16:03
don't think the issue is going to be stuff that comes from the DNC
16:05
and the RNC. I think it's going to be the wild
16:08
amount of meme content that
16:11
gets out there that there is no
16:13
control over. That's the stuff that will
16:15
be the problem. Thoughts and feelings,
16:17
Jess? You know,
16:18
yeah, you've just painted like such an image
16:20
in my mind of like this wave of just
16:22
garbage internet content
16:25
that's about to like be an onslaught
16:27
into our lives. I wish that there existed
16:29
something like drug testing for this, where you could
16:31
just sort of drop a couple of drops in your powder
16:34
and it turns a color and you go, ah, fake
16:36
or like it's this. I wish that
16:38
there were more tools for running content
16:40
through so that users can start to
16:43
manage themselves.
16:44
I mean, well, it's actually interesting because there is an
16:46
idea called content credentials. I think we talked about it last week on
16:48
the show where there you should
16:50
be able to click on images and see how have they
16:53
been edited. And I know Adobe,
16:55
who are the people obviously behind Photoshop, who
16:57
are actually out there selling their ability to,
16:59
you know, have generative
17:02
AI in pictures now. I've seen all their advertising.
17:04
They've now, I think in concert with that function,
17:07
added a tool where you can actually see like, we
17:09
cropped this, we added a unicorn and those things
17:12
like that. I think there's likely to
17:14
be more of that stuff that comes out, but whether or not
17:16
people actually use it and
17:18
are aware of it. Well, I mean, this isn't a new
17:20
topic, right? I mean, we've been talking about fake news on
17:22
social media and particularly on Facebook with relation
17:24
to the US presidential election since 2016, really
17:28
since Trump won the first time. And
17:30
it's less about whether you use AI to generate the content
17:33
and more about whether the content is fake news. And
17:36
there's a continuing conversation with
17:38
all the social media providers about the level with
17:41
which they should be assessing content, not
17:43
just ads, which is what this is about, but
17:45
all content on the platform for
17:47
veracity and labeling it. I mean, I think Twitter
17:50
did it, has done it the last couple of elections. And
17:53
I think X, as it is now, is not
17:55
doing it. I think they've moved away from that. No,
17:59
this is definitely a conversation, not just for the US, not just for
18:01
meta, but for all the social media platforms.
18:03
And it's about how we receive our information.
18:06
The drug testing idea and content
18:08
credentials are kind of interesting
18:10
models for this. But I think part of the
18:12
issue is it's like the wave, right? It's the wave
18:14
of content that's going to happen,
18:17
and it's completely uncontrollable, which
18:19
is, of course, one of the things we like about the internet. It
18:21
happens at great speed. But it does
18:23
mean when you do find these issues
18:26
around fake news or AI generated imagery,
18:29
you're sort of staring at a fire hose going, I wonder
18:31
if I could put you through a colander? Like, you know me,
18:34
like, are there models out there that
18:36
you've seen or heard of that might actually be able to navigate
18:38
a problem of this scale?
18:41
Oh, I would guess so. Analog.
18:43
And I would just say critical thinking is
18:46
the main thing. If you're really
18:48
going to address the wave, I think
18:50
the responsibility has to kind
18:52
of end up being on the person, not
18:55
necessarily the publishing platform. I
18:57
don't
18:57
know if I really think that. Surely, I
19:00
think in reasonable terms, you need a combination.
19:02
You need there to be some... you can't
19:04
push all the responsibility onto the user.
19:06
But I also think we should
19:09
get smarter. Yeah. But at the same
19:11
time, companies must have a role as well. Yeah. And the truth
19:13
is, these companies all have huge teams
19:15
of content moderation people. They have the
19:17
Twitter. Twitter
19:20
doesn't have a huge one. It's not a burning room. Yeah.
19:22
So they have huge teams of content moderation people.
19:24
They have algorithms that are working on
19:26
stuff, you know, very sophisticated algorithms, particularly
19:28
on video stuff at YouTube and
19:31
Twitch and other video platforms so they can
19:33
assess content or videos because that's obviously
19:35
particularly precarious for online safety.
19:38
So this is happening at
19:40
scale at tech companies. They know they have
19:42
some kind of responsibility. It's unclear how much
19:44
we expect it to be them and how much we expect it to be users.
19:47
And I think probably, and I'm hesitant
19:49
to say this, given what I know we're going to talk about next,
19:52
but AI actually has a role to play
19:54
here on veracity checking. If you've
19:57
got software that understands what's
19:59
out there on the internet and, you know, can
20:01
kind of reason a little bit, then it can help
20:04
sort of at scale understand whether things
20:06
are just totally made up or whether they're built on
20:08
credible sources, etc. Yeah,
20:10
and I should, I'm sort of at pains
20:13
to point out that AI is not a monolithic
20:15
technology. What we're talking about is
20:17
a suite of technologies that can be both part
20:19
of the
17:21
problems and also part of the
20:23
solutions, right? And I think
20:26
maybe pushing
20:29
everything into a singular like AI good,
20:31
AI bad is not necessarily the most
20:33
constructive way of executing this. That
20:35
balancing act between how much of this
20:37
is a responsibility of tech companies, and
20:40
I suppose government and us as
20:43
internet users, is
20:45
there a good version of that balance that you
20:48
see is attainable or at least something
20:50
to aim for? Yeah,
20:52
one factor that I would put out there is if you think
20:55
it's on the users to make critical assessments,
20:57
or at least you think part of it's on users to look at
20:59
stuff and make use their critical thinking to
21:01
make an assessment about veracity, you need to help them
21:04
have the information to make that call. Ah,
21:06
that's interesting. Which is part of what
21:08
the like labeling, you know, well, this isn't
21:11
backed up by, you know, credible sources
21:13
or check this fact, you know, sort of like...
21:16
Do
21:16
you think that there is like a huge
21:18
shift in like skepticism
21:21
of stuff on the web now because
21:23
of all this? Do you think that's like this major cultural
21:26
shift that we have like kind of across the
21:28
generations even where younger
21:30
people who are so digitally native, completely
21:33
technically fluent,
21:35
can sort of say, well, if it's a picture on the web,
21:37
I'm going to assume that a machine has touched
21:39
this at some point? Yeah,
21:42
I hope so. But I don't necessarily think that's
21:44
ubiquitous. There are groups
21:47
of people who
21:49
have more of that kind of filter
21:53
and groups of people who basically have less of it. You
21:55
know, you've seen the rise of lots of conspiracy theorist
21:57
movements and the QAnon stuff
22:00
and all the rest of it where they're
22:02
almost exacerbated by a lack
22:05
of engaging critically with whatever
22:07
you're seeing. I do wonder
22:09
about that lack of trust that you're identifying, how it changes us
22:12
and how it changes the way we interact with each other.
22:14
I mean that on a human level. I
22:17
don't feel good about that future. Yeah,
22:20
I mean how we interact with each other on a human level.
22:23
Unless I saw the guy or heard him say
22:26
it or her say it,
22:28
then I'm going to be skeptical of this whatever.
22:30
You know, I mean, you can sort of extrapolate
22:32
this out into the role of the media in
22:35
this as fact
22:37
checkers of information and whether
22:40
or not like what you can trust as
22:42
a source of truth. I think
22:44
the practice of being a reporter
22:46
or something that is changing like the
22:48
skills themselves are changing, but it's almost
22:50
like the need for them to exist
22:53
to do the checking on behalf of an audience
22:56
or the masses like that's picking
22:58
up. So if there's any like grade 12
23:00
students, come be a journo. Yeah, please do. Just
23:03
to add to that point, I think one of the things that's been
23:05
really clear in the sort of changing, hugely
23:07
changing landscape of the media globally
23:09
over the last 20 years with
23:12
the internet has been first a fracturing, but then
23:14
a real coalescence around really
23:16
valuable brands that have built
23:18
a lot of trust. And so if you see a
23:20
video on the New York Times or in The Economist or something,
23:23
you're going to believe it. Whereas if you see
23:25
it floating around on Twitter or on Meta,
23:27
you might have a critical filter that
23:30
you apply to it about whether it's real or not. And
23:32
that's actually quite valuable for high quality
23:34
journalism, I think, in a way that then brings
23:37
back some of that cachet into the industry.
23:39
Just staying on the topic of AI, just
23:42
almost to look at it from another angle, an interesting story
23:45
in the last couple of days that KPMG has
23:47
lodged a complaint after AI generated material
23:49
was used to implicate them in non-existent
23:52
scandals.
23:53
I don't...
23:55
Just walk me through this one, Jess, because this one's actually
23:57
quite complex.
23:58
This is my favourite story of this week,
24:01
a group of academics have put forward a submission
24:03
into a Senate inquiry that details
24:07
all of these bad things that KPMG is
24:09
said to have done. But it turns out that their submission
24:12
was AI generated and none of these case
24:14
studies ever happened at all. And now KPMG
24:17
is up in arms because it has been
24:19
misrepresented by a group of academics
24:22
who are professional thinkers
24:25
who used AI
24:26
and it spat out a bunch of lies. How
24:30
did this happen Alex? Seriously,
24:33
how? This is pretty tough, isn't it? I mean, I
24:35
don't want to go easy on the big four, but
24:38
you shouldn't as KPMG or anyone else
24:40
have to sift through parliamentary
24:42
materials for fake news that's been
24:45
submitted by credible academics
24:47
outlining, you know, misconduct that you just
24:50
weren't involved in. I suspect
24:52
it's just teething problems around how
24:54
people use AI for important and serious
24:57
stuff. Ironically, one of the co-authors
24:59
who wasn't responsible for using AI
25:01
in this case, had recently
25:05
published a paper about the dangers of using AI
25:07
in academic research, which
25:09
is just the icing on the cake for this story.
25:12
But anyone who's used AI in a professional context
25:15
or who has thought about ways that they might be able to use
25:17
it probably thought, well, you know,
25:19
it's going to have to be the computer plus human to
25:21
make it my work, not just something that's cooked up
25:23
by, you know, an artificial intelligence. So
25:26
did they just use AI
25:29
to trawl together a bunch
25:31
of stories around like scandals
25:33
to kind of add in the middle of a submission?
25:36
Is that because I'm thinking about how it would have happened,
25:38
right?
25:39
I think they asked ChatGPT
25:41
and I don't know this.
25:42
I think it was Bard. Oh, sorry. Yeah.
25:45
They've asked Bard
25:47
when has an accounting
25:49
firm like KPMG come a cropper
25:52
in some way. And it spat out some
25:54
like case studies and no one's checked.
25:57
Like this is the
25:59
human
25:59
error part of this
26:00
whole story is no one checked. That
26:04
gives me nightmares. Every
26:07
time someone
26:07
tells me something anecdotally in
26:09
an interview in the course of my work,
26:12
I am Googling whether or not that was an actual
26:14
thing. For the academics
26:17
not to have done that is just
26:19
like you said, I'm not here to stick up for KPMG
26:21
or any of these firms,
26:22
but you've
26:25
got to fact check your stuff. Is
26:27
it worth saying that there's totally room
26:29
here for this to identify something that's really clear
26:32
about what's happening with AI right now that needs fixing? The
26:34
company or person that comes up with a solution
26:37
to this, something that flags 'this part of
26:39
this bit of work that I've produced for you is questionable,
26:41
you should go and check this', or a confidence
26:43
score as you work your way through a piece of work
26:45
that ChatGPT or Bard has produced, where it's
26:47
like this is really, really true and this stuff
26:49
is a bit less true and this stuff we just made this up. A
26:53
self-assessment by AI of its own
26:55
work and its veracity. It's got access
26:57
to lots of information. Presumably at some
27:00
point it knows which bits are not
27:02
linkable, which you couldn't go and find
27:04
on a Google search and which you could. Yeah, you could
27:06
have that as a heat map or something. Exactly. I'd
27:09
just like annotations. The
27:12
issue I find with it when I have
27:14
asked it for information and it comes
27:16
back with information that I know to be wrong, I
27:19
come back with, well, where did you find that? Where
27:21
did you get that from? I feel like I never
27:23
get an answer from ChatGPT or Bard
27:25
or any of the rest of them. I think there should
27:27
be at least a function in there to almost self-annotate.
27:30
I got this from blah. That would make it a
27:33
much more useful tool. That's part
27:35
of the black box thing though, isn't it? Oh, sure.
27:37
Then it would be Google. Then it's a system. Well,
27:40
but the people who built the AI don't know where it got
27:42
it from. There's no path to trace some of
27:44
the reasoning. That's what
27:48
neural networks often do: they put out results that you
27:50
can't actually necessarily draw a line
27:52
back through to the original source. That's
27:56
one of the scary bits about self-learning AI and
27:58
the new stuff.
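As a thought experiment on the "confidence score" and self-annotation idea floated above, here is a toy sketch; everything in it is hypothetical, and the lookup_supporting_sources function simply stands in for whatever source retrieval a chatbot vendor might actually wire up.

```python
# Toy sketch of per-claim annotation: label each claim in a generated answer
# as "supported" or "unverified" depending on whether any source was found.
# The lookup function below is a stand-in, not a real API.

def lookup_supporting_sources(claim):
    """Pretend source search: return URLs that appear to back the claim."""
    fake_index = {
        "a claim the model can point to a source for": ["https://example.org/source"],
    }
    return fake_index.get(claim, [])

def annotate_claims(claims):
    """Attach a rough confidence label and any sources to each claim."""
    annotated = []
    for claim in claims:
        sources = lookup_supporting_sources(claim)
        label = "supported" if sources else "unverified - check this yourself"
        annotated.append({"claim": claim, "confidence": label, "sources": sources})
    return annotated

for row in annotate_claims([
    "a claim the model can point to a source for",
    "a claim it made up and cannot link anywhere",
]):
    print(row)
```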
28:01
No doubt there'll be more of this to talk about in the coming
28:03
weeks but for now we are out of time.
28:05
Huge thank you to our guest this week, Alex
28:07
McCauley from the Tech Council. Thank you so much for being here.
28:10
Thanks very much for having me, Mark. And Jessica Sier from the
28:12
Australian Financial Review. See you
28:14
later. And with that I shall leave
28:16
you. My name is Mark Fennell and thank you for listening
28:18
to another episode of Download This Show.
28:29
You've been listening to an ABC Podcast.
28:31
Discover more great ABC Podcasts,
28:34
live radio and exclusives on
28:36
the ABC Listen app.