Episode Transcript
0:00
ABC Listen, podcasts,
0:02
radio,
0:03
news, music and more. Is
0:12
it time to start regulating AI?
0:15
Well, the US President Joe Biden certainly
0:17
thinks so. Yes, this week on Download
0:19
This Show, the US has made
0:22
some pretty big demands of the artificial intelligence
0:24
industry. Exactly what is that going to mean for
0:26
the future? We will find out. So
0:29
Meta is introducing an ad-free
0:31
subscription tier for things like Facebook, and
0:33
we look at why one robot taxi company
0:36
has kind of failed to launch. All of that and much
0:38
more coming up. This is your guide to the week
0:41
in media, technology
0:42
and culture. My name
0:44
is Mark Fennell, and welcome to
0:46
Download This Show.
0:57
Yes, indeed, it is a brand new episode of Download
1:00
This Show and our guest this week, the head of content
1:02
at ByteSide, Seamus Byrne. Welcome back.
1:05
Good to be back, Mark. And Dr. Erica
1:07
Mealy, lecturer in computer science at the University
1:10
of the Sunshine Coast. Definitely not USC
1:12
though, right, Erica? No, no,
1:14
no, not USC,
1:15
UniSC. Come
1:17
on, get with the program. UniSC.
1:20
So Seamus, let's start with you. There has
1:22
been some big announcements about AI coming
1:24
out of the US. Walk me through it.
1:27
Yeah, President Biden has signed a
1:29
big fancy executive
1:31
order, basically sending a signal
1:33
to the artificial intelligence industry,
1:37
which of course has exploded over the last
1:39
year, that the US government
1:41
is watching closely about what's happening right
1:43
now. And he's getting a whole range
1:45
of different departments to actually start doing
1:47
some really specific investigations into
1:50
everything from what kinds of
1:52
security risks could be attached through
1:55
to consumer privacy implications,
1:58
all this sort of stuff, really trying
1:59
to now push forward with that
2:02
idea of saying we need to get on the front foot
2:04
as a regulatory industry
2:07
and actually govern this stuff. Is there
2:09
anything Seamus in the remit
2:11
that you think they've missed that they should be putting on
2:13
the table? Look,
2:16
I think that's actually a good question and
2:18
I think that they will start to kind of get these different
2:20
departments to bring in some of
2:22
that extra expert help
2:24
rather than just trying to make it up themselves
2:27
based on not really understanding
2:29
it. Erica, your thoughts?
2:31
Well, I find it really interesting that some of the commentators
2:34
have been saying that this executive order doesn't
2:36
have enough teeth because
2:39
AI pictures can really have too many
2:41
teeth. And so the fact that they
2:43
picked the word teeth... Not enough fingers.
2:45
Too many teeth and not enough... No, not
2:47
enough fingers, too many teeth. Yeah. So
2:50
I don't see that in their executive order. We want humans
2:52
that look like real humans and
2:54
the pleasing part of it was alongside
2:57
this executive order, they've also come out and talked
2:59
about setting up a safety in
3:02
AI and an ethics in AI committee
3:04
through their commerce department. So I think
3:06
they really are going to try and put some
3:09
strength behind it and hopefully eventually some
3:11
regulation. But there are questions
3:13
if it's just a PR exercise to kind of
3:15
remind everyone that the US is still
3:18
a big player while the UK summit is going
3:20
on at the same time, while the EU are
3:22
looking at their act, and China has already beaten them out
3:24
with regulations. So there's
3:26
a few people that are a little bit cynical, but I think overall,
3:29
hopefully it's a move in the right direction. What
3:31
actually can be done in terms of regulating it at the
3:33
moment, Erica?
3:34
Because it's so worldwide
3:36
and spread out, I think part of the
3:38
problem is we don't know what we don't
3:40
know. There's no sort of
3:43
understanding of what is the data
3:45
being used for? Where is the data going? How
3:48
are they going to then use it further?
3:50
What are these training data sets? And how
3:52
are they trying to future
3:55
proof the tools? So we
3:57
actually have some kind of consistency.
4:00
So there's some interesting concerns and
4:03
hard to regulate, but I think until there
4:05
is some kind of law that says you must
4:07
or you shall not, there
4:09
won't be any kind of regulation or any
4:11
kind of holding back on what we're doing.
4:15
What do you think, Seamus?
4:16
Yeah, look, I think regulation is
4:18
also one of those really tricky points right now
4:20
where there are a few companies that have exploded
4:23
into the lead. And they're also
4:25
quite often being proponents of saying, yes,
4:27
this is dangerous and we need to regulate. A
4:29
lot of people do feel like that's partly
4:32
because they kind of feel like it's
4:34
a lot easier for them to manage regulations
4:36
now that they are gigantic
4:39
companies that have swallowed up large
4:41
percentages of the entire internet to feed
4:44
their training data right now. And
4:46
it would be really nice to stop other companies from
4:48
sort of chasing them down. So there
4:51
are a lot of those kinds of aspects
4:53
of regulatory capture that kind of get fed
4:55
into that. And there are elements in
4:57
some of these orders and codes of conduct
4:59
and different things that are coming out where they're actually trying
5:02
to say, if you're below a certain
5:04
scale, we want to let you actually
5:06
continue to run a
5:08
little bit more freely for a while in the name
5:10
of ensuring that we don't just end up with Google,
5:13
with Microsoft, with OpenAI and
5:15
Facebook basically controlling this whole
5:18
next wave of technology after they've just
5:20
controlled the last 20 years. I'm glad you brought
5:22
that up. Erica, how have the big tech companies
5:24
reacted to this announcement out of the White House?
5:26
To be honest, I actually haven't seen a great deal
5:29
of their reaction. But I think along the lines
5:31
of OpenAI and many of the others
5:33
have said, yes, we think it is very dangerous.
5:36
So I would imagine that they're going to be
5:38
supporting this, probably up until
5:40
the point where someone says, actually,
5:42
you've taken a whole pile of stuff you weren't supposed
5:45
to take; take it out. Because the way
5:47
the AI works, we don't know which
5:49
decisions have come from where. So 'take
5:52
it out' means start from scratch. And I don't think
5:54
they'll support that. But I think on the
5:56
whole, they're keen to be able to be within
5:58
regulation. Like Seamus was saying,
6:00
it's definitely something where now
6:03
that they're the big fish, they get to control
6:05
it a little bit more. On the whole, they're
6:07
keen to be part of the discussion and
6:09
maybe lead it into a direction that helps them
6:12
in the long term. Was that your experience,
6:14
Seamus? Do you think that they are going to react that
6:16
way?
6:17
Yeah, look, I think that's kind of a
6:19
good fit for what's going on. And again,
6:21
especially when we're talking about different
6:24
jurisdictions as well right now, like literally
6:27
in the same week as this announcement, we've had the
6:30
UK hold a big AI summit
6:32
with lots of countries involved and 20 something
6:35
countries all signed
6:37
an agreement as part of that. But of
6:39
course, the EU is always expected to
6:41
come in a lot stronger with these sorts of
6:43
laws. And so I think right now
6:46
the American companies would probably find
6:48
these executive orders quite
6:51
nice in that, again, it's not pushing
6:53
too hard. It's about really kind of
6:55
saying we're going to monitor harder and we're going to start studying
6:59
more of what's going on here. But
7:01
it still gives plenty of runway. Whereas
7:04
I could imagine that in the next few months, we might
7:06
see a lot stronger action
7:09
out of the EU when it comes to actually what are
7:11
you doing with people's data online.
7:14
And yeah, that could kind of change the game in a
7:17
lot of ways. Download This Show is what you're
7:19
listening to. It is your guide to the week in media
7:21
technology and culture. My guests this
7:23
week, Seamus Byrne, head of content
7:25
at ByteSide and Dr. Erica Mealy, a
7:28
lecturer in computer science at the University
7:30
of the Sunshine Coast. Mark Fennell is my name.
7:33
And we move now to Canada. Canada
7:35
has announced it will ban the hugely
7:37
popular app WeChat on government
7:40
devices, Erica. But
7:42
why?
7:42
Well, this is an interesting
7:45
one, because not only have they come out
7:47
and said we're banning WeChat, which
7:49
a lot of countries are seeing very similar to TikTok
7:52
because they're Chinese owned. So
7:54
they're seeing it as a security
7:56
risk that we need to actually control.
7:58
But they've also come out and banned Kaspersky,
8:01
which is a Russian cybersecurity company
8:04
as well. So they're not just targeting
8:06
China. They're really starting to look at,
8:08
well, what is available? What
8:11
is potentially exposing this information?
8:14
And how do we actually deal with
8:16
it? And interestingly enough,
8:19
Australia seems to be a little
8:21
bit behind, as we were the last
8:23
one of the sort of Five Eyes that we talk
8:25
about, to ban TikTok. And
8:27
despite the fact that our Senate committee actually recommended
8:30
back in August that we should also ban
8:32
WeChat, we haven't done
8:33
it yet. Seamus, good
8:35
call or overreach?
8:38
Look, I think one of the easiest answers
8:40
about this whole thing, right, is that when
8:42
it comes to, you know, what is an app for?
8:45
And then how does its security
8:47
system work related to that task? WeChat
8:50
is fundamentally a messaging app that
8:52
does not support end-to-end encryption.
8:55
It is genuinely set
8:57
up in a way that means it is
9:00
able to be comfortably monitored by the
9:02
Chinese government. And so using
9:05
it for general chats and
9:07
whatever you might sort of think, I'm just going to connect
9:09
with other people through this app, it
9:11
makes total sense to not let this be on
9:13
government devices. You know, I looked
9:16
up sort of a quick background on the safety
9:18
implications of the application. And,
9:20
you know, it is noted for the fact that
9:23
if people in China send messages to each
9:25
other on WeChat and somebody says something negative
9:27
about the government, it will be removed
9:30
from the app during that chat. Like
9:32
it is being directly monitored in those kinds
9:35
of ways within that Chinese environment.
9:37
Now, you know, they might sort
9:39
of claim that it isn't monitored in the
9:41
same way as all around the world, but it ultimately
9:44
flows through centralized WeChat
9:46
servers and doesn't have any encryption
9:48
attached at all. So it makes total
9:50
sense that this is not an app that
9:53
you should be using when it comes to thinking
9:55
you're having a private conversation with anybody. Are
9:57
there other countries around the world that have banned WeChat?
10:00
I know we've talked about TikTok, but are there other
10:02
countries that have banned or throttled WeChat
10:04
around the world? Does anyone know?
10:06
Yeah, I'm not sure, to be honest. I know TikTok has got
10:08
a lot of news around its bans,
10:10
and it's surprising that WeChat has been so
10:13
far behind TikTok in
10:15
that. They've really flown under that radar.
10:17
Why do you think that is, by the way? I'm
10:18
not really sure, but the thing that gets me is that Elon
10:21
Musk has actually said that he wants
10:24
X slash Twitter to become
10:26
like WeChat. He wants it to be the everything
10:29
app. And so, whether
10:31
it's just because it's been flying
10:33
under the radar, whether perhaps people
10:36
in those environments, I mean, perhaps people
10:38
are less likely to put it on their work devices.
10:41
I mean, I just say, why would you do that? If
10:44
you're working in a security-intense environment,
10:46
why would you put TikTok or WeChat
10:49
or Facebook or Instagram, any of those onto
10:51
your feed? I don't understand that at all. Do
10:54
you think WeChat cares about this move,
10:57
Seamus?
10:58
Look, yeah, WeChat is owned
11:00
by Tencent, and they are one of the biggest, certainly
11:02
like game developers in the world. But
11:04
as Erica was saying, that this is within
11:06
its core market. It is this everything
11:09
app these days. So I think
11:11
you can send payments to people. You
11:14
can do all sorts of really useful functions
11:16
through that app if you're in the kind of environment
11:19
where it is used by a really
11:22
large percentage of the population.
11:24
So there are lots of reasons why
11:27
it's great
11:28
for its core markets,
11:31
and its core markets happen to be the biggest
11:33
markets in the world. So yeah, I
11:35
think it's a bit of a blip by
11:37
comparison. In fact, probably the biggest issue,
11:40
and again, this is government-focused, so it's kind of not
11:42
as widespread as it might otherwise be. But
11:44
when there's often been talk about banning things like
11:47
this, there is a reasonable
11:49
problem with actually causing
11:51
problems for Chinese folks who
11:54
live outside of China and might need
11:56
to use it to continue staying
11:59
in touch with their family back home and all
12:01
that sort of stuff. So I think widespread
12:04
bans are not necessarily all that functional,
12:04
or they become a problem for people. Whereas
12:11
saying that people who work in government
12:14
can't use it, that really does sort
12:16
of feel like a far more reasonable idea
12:18
when it comes to risk management. Download This Show
12:21
is the name of the program. It is your guide to the
12:23
week in media, technology and culture. And
12:25
interestingly, Facebook and Instagram
12:28
are launching subscriptions in most of Europe. That
12:30
will remove adverts
12:33
from the platform. Now, hypothetically,
12:36
Erica, if you had an option to have
12:38
an ad free experience of
12:40
Facebook, would you take it?
12:42
Interesting question. Honestly,
12:45
I'm a bit of a cheapskate. So I probably just live
12:48
with it. Also, I, like
12:50
a number of my friends, do tend to
12:52
use pseudonyms on our Facebook accounts. So
12:54
what it's tracking may not have any
12:57
bearing on the reality. And so
12:59
in that case, they can serve me whatever they like.
13:01
But I have
13:03
a fundamental problem with 'you
13:06
pay to have no ads or you leave
13:08
the platform'. Honestly,
13:11
they're not great options. That's not what the
13:13
EU consent laws are really about. They're
13:15
really about do you choose to be part
13:17
of this and do you want to continue
13:19
to be part of it? But it is
13:22
interesting that they've come up with this idea. Seamus,
13:24
do you agree?
13:26
I mean, I feel like I would try it for at least
13:28
one month. That would be my take. Just
13:30
at least get a good look at it and see how
13:32
different does it feel. I know I'm one of those people
13:34
who pays for YouTube Premium
13:36
because it really does make a difference in
13:38
your life if you're not seeing a million Facebook
13:41
ads. I'm sorry. Yeah, YouTube ads.
13:44
But yeah, I think with this one, the
13:46
really sort of big thing here is exactly
13:48
kind of like what Erica was mentioning, it's
13:51
going to be so important for Facebook
13:53
in its battle with how is
13:55
it using the data of European
13:57
citizens that they're
14:00
trying to claim, well, if we give them the option to
14:02
pay, that is effectively the opt-in
14:05
or opt-out when it comes to being tracked
14:07
for advertising. And I think we're
14:09
pretty quickly seeing that the
14:12
EU itself is looking
14:14
to respond to that and basically say, that is
14:17
not the option at all. That is not the either-or.
14:20
And actually, people should be able to make a
14:22
much more informed choice about saying,
14:25
I don't think I should have to be tracked in order
14:27
to stay in touch with all these
14:29
other people who happen to use this same platform. So
14:32
if you are in the EU, you'll end up paying around 10
14:34
euros per month for this
14:36
ad-free experience. Erica, how do you think
14:39
their price point is going? Is it too expensive or not expensive
14:41
enough?
14:42
Well, it seems to
14:44
be less than X and Twitter.
14:47
That's always a good thing to be less than Twitter. Less than
14:49
Twitter on every front would be great. And
14:52
TikTok, though, is also
14:54
a little bit on the cheaper side. So it seems
14:56
to be about the middle of the road. Whether
14:59
that's the best price point or not, it will be
15:01
interesting to see. It's also interesting
15:04
that if you pay via the app
15:06
stores, you pay a premium because
15:08
an amount has to go back to Google or to Apple.
15:11
So they're actually saying, please don't pay via the
15:13
app store. Come and pay us directly, which
15:16
is a very interesting
15:18
way to be able to say, oh, yeah, but they're taking their
15:20
slice. It's like, yeah,
15:22
but how many people
15:23
did you pick up by being in that app store? Do
15:26
you think ultimately it will work? Because,
15:29
OK, the reason I ask, Seamus, is because it
15:31
feels like a thing that's been done in reaction
15:34
to legislation rather than a thing that's
15:36
been done for business reasons. Yeah,
15:39
that's right. And so I think in that sense,
15:42
this is probably framed around the
15:44
idea that they know they're not going to make money
15:46
from it. I think they're a very good company at getting
15:48
data from experiments like this.
15:51
They'll probably find some interesting demographic
15:53
data out of who does decide
15:55
to opt in or out of this particular
15:58
system. But I don't expect they're
16:00
thinking that it's going to replace
16:02
it in any way, particularly with the
16:04
idea that from March next year, they're even making
16:06
it more complicated by saying that if you
16:08
have a business account and your personal
16:11
account, that those accounts will
16:13
have separate fees attached
16:15
so that you'll have that first 10 euros
16:18
a month, and then you'll need to pay
16:20
an extra six euros a month for
16:23
each extra account that you run through
16:25
your account setup. And again,
16:28
that even feels like the kind of additional structures
16:31
that will probably start to make
16:33
the EU authorities even more annoyed
16:36
at the fact that this is not the solution
16:38
they were thinking should be brought
16:40
in for this kind of problem.
16:42
This is really interesting that
16:44
they're looking at doing this because after X
16:46
has added its subscription costs
16:49
alongside all the other crazy
16:51
changes that Elon has made, they've
16:53
actually found that the primary
16:56
user group in Twitter
16:58
is now sports fans. So
17:00
going from a very tech heavy platform,
17:03
suddenly it's the people cheering
17:05
on major league baseball that really
17:07
can't do without Twitter. So
17:10
it would be interesting to see, because they're talking about
17:12
the number of users of Facebook and Instagram
17:14
having increased with the death
17:16
of Twitter, how this actually
17:18
works over on that platform. So that doesn't actually surprise
17:20
me at all, because the last thing,
17:22
the last live thing that people genuinely
17:24
watch on television will be sport.
17:27
So it kind of makes sense that the thing that
17:29
you have to engage with live and in real time,
17:32
that's the last holdout for Twitter. In a sense,
17:34
it doesn't overly surprise me, Erica.
17:37
Seamus, what are your thoughts? Yeah, look,
17:39
I think there's a really important sort of additional
17:41
factor on this Facebook EU
17:46
subscription offer. And that is the fact
17:48
that within days
17:50
of this having been announced, and actually it
17:53
should kick in this week,
17:56
the European Data Protection
17:58
Board has issued a decision
18:01
that actually says that Facebook and Instagram
18:04
cannot continue to use
18:07
private data to target their
18:09
behavioral advertising. This
18:11
is going to become, I think, a very big
18:14
talking point, I'm sure, in coming weeks, on
18:17
how Meta responds to the fact that the EU
18:19
has now effectively said that you are banned
18:21
from using data
18:23
to actually target behavioral advertising
18:26
on your customers. It could have
18:28
a huge impact on their ability to sell
18:31
ads in Europe.
18:32
Download This Show is the name of the program. What
18:36
happens when a robot taxi
18:38
or robo taxi
18:40
decides to shut down
18:42
in the name of trust,
18:44
or so it seems? Seamus, talk me through what's
18:46
happened with a US company called Cruise. Most
18:50
of the headline reports have pointed out that
18:53
Cruise is voluntarily pausing
18:56
its autonomous fleet of robot
18:58
taxis, which have heavily
19:00
been driving around San Francisco over
19:02
recent months because
19:04
they want to, quote,
19:07
earn public trust and
19:09
restore some faith in their operations.
19:12
It also just happens that a couple of
19:14
days before they announced that
19:16
they would voluntarily do this, that
19:18
the California DMV suspended their robot
19:21
taxi permit effective immediately. Unrelated.
19:24
Unrelated fact, surely. Yeah. They
19:26
have operations in a couple of other cities,
19:29
I think, that have allowed this sort of testing. So
19:31
they are definitely pausing in some places
19:34
that they were allowed to continue. But
19:37
it's pretty dramatic given that it was just August
19:40
that California expanded their
19:42
allowance to run robot
19:44
taxis in San Francisco. What
19:47
does this mean for the future of, I
19:49
mean, not just driverless cars, but sort of robot taxis
19:51
in particular, because I do think that it's
19:53
one of those ideas that sounds
19:55
great, but actually in practice, there's
19:58
a whole bunch of unexpected kinks in the future.
19:59
What's the story, Erica?
20:01
It's definitely something I was genuinely
20:03
shocked when I first heard about it that San
20:06
Francisco were willingly letting
20:08
their citizens basically
20:11
be crash test dummies for these vehicles.
20:13
There's, you know, this poor woman who was dragged 20
20:16
feet apparently underneath the robo taxi.
20:19
I mean, allegedly. It's horrific,
20:22
and allegedly, though, it was caused by
20:24
another human driver but they've
20:26
hit trees, they've hit fire
20:29
engines. I think my favorite
20:31
part though of the whole shenanigans
20:34
is that there was a guerrilla protest
20:36
movement that discovered that you could disable
20:39
the taxis by putting a traffic cone
20:41
on their bonnet. So there were people riding around
20:44
San Francisco and putting traffic cones
20:46
on bonnets of cars to try and protest
20:49
the robo taxis. Look I
20:51
really think it's going to put a significant
20:53
dent in the idea and
20:56
I was shocked that they came
20:58
to 24/7 so quickly. The idea of 3
21:01
a.m. when there's no other cars on the road
21:04
seemed to be quite a sensible idea. Let's
21:06
get drunk drivers out of their cars, let's
21:08
get people into these vehicles and if they
21:11
are the only vehicles around then
21:14
you really are constraining the independent variables.
21:16
You're really locking it down from that computer-centered
21:21
design perspective but I think
21:23
it's going to take a while to be
21:26
able to recover and I don't think anyone
21:29
believes that this is more than
21:31
a PR exercise. 'Oh, we're volunteering
21:33
it for trust.'
21:35
No, you were
21:36
banned and this
21:39
is not for trust but good try,
21:41
good try marketing department.
21:42
So what would it actually take?
21:45
If we take it at face value, Seamus, what would it
21:47
actually take to rebuild trust
21:49
in a service like this? Yeah
21:51
there's definitely a bit of a going back
21:53
to square one element here of getting
21:56
back to basics on the
21:58
way in which these things are being tested, the kinds
22:01
of inputs,
22:03
like Erica was saying about controlling your variables,
22:05
it's like, what are the inputs that these are actually responding
22:08
to? Because there's
22:12
been lots of these second order effects
22:14
when it comes to the way that they've been involved
22:16
in accidents. So that
22:19
fire engine incident, it was the fact that
22:22
it was driving into an intersection where it
22:24
had a green light. So it was making all
22:26
of the normal assumptions, but it somehow
22:28
wasn't noticing or hearing
22:30
that there's a fire engine steaming through
22:33
this intersection. And that's how you know that
22:35
a robo taxi was never raised with Thomas the Tank
22:37
Engine.
22:37
That's right. But
22:40
also, is it possible to actually get to that stage
22:42
Erica without having cars on the road? Well,
22:45
there was a talk I went to many, many
22:47
years ago where the RACQ were discussing this,
22:50
and they firmly believed that
22:52
the only way to commingle
22:55
autonomous cars and people was to have them
22:57
physically divided. So in sunny
22:59
Queensland, we have our busways that are completely
23:02
removed from the road. And so if
23:04
the autonomous vehicles were able to use some
23:06
of these ones where you're limiting
23:09
the chaos that humans cause,
23:12
then possibly it could happen. But then at
23:14
the same time, if you're making them go on a busway,
23:16
why don't you get on a bus? So
23:19
there's those questions that it
23:21
raises. But you know, do we divide
23:23
the motorway so that there is an autonomous vehicle
23:26
lane and or things like
23:28
that? Take them out of city centers, perhaps
23:30
where there's lots
23:31
of people, lots of cars and lots of unpredictable
23:33
events that are going to happen. But
23:36
yeah, downtown San Francisco
23:37
has a lot to try and grapple
23:40
with to work out if they can make this work. But
23:43
one of the essential pitches of
23:46
driverless cars is that if you have enough driverless
23:48
cars on the road, they will be more efficient
23:50
because they'll understand the roads better and they'll
23:53
be able to produce less traffic in
23:55
time. Like if you start to give them their own lane,
23:58
are you?
24:06
And
26:00
I think even in the context of passengers
26:03
and pedestrians and people, there's
26:05
a lot of the city centres that are moving to no vehicles
26:08
full stop and having those kinds of concession
26:11
charges and congestion
26:13
charges and not letting the cars
26:16
intermix with people, let alone autonomous
26:18
ones. So
26:19
we have to think about that context too. Just
26:22
so we're absolutely clear, Niamh, the producer, shook her head
26:24
so hard at me when I made the wheels reference.
26:27
Seamus, do you think the idea
26:29
of, for lack of a better term, segregating
26:32
driverless cars from driver-full
26:34
cars makes sense? Yeah, certainly
26:36
in the way that we do
26:39
have bike paths and bus lanes, all these
26:41
sorts of things, I think that idea of sharing
26:44
bus lanes could be good within a certain volume
26:46
context. But the thing that really does strike
26:48
me about all of this testing that's happened
26:51
is this: I think maybe
26:53
there's a lot of lessons that have
26:55
been learned and data that could have been gained,
26:58
but then you have to stop and realise these
27:00
are private companies who are not sharing their data.
27:02
And I feel like that is perhaps
27:04
the thing that we need to revisit, is if we're going
27:07
to let these people test these
27:09
kinds of vehicles in public spaces, then
27:11
they should be doing it in a way that actually elevates
27:13
the entire autonomous
27:16
vehicle industry and that they have to share their
27:18
data publicly and create
27:20
much more of a public,
27:23
open-source sense of how
27:26
these things are working and what they've
27:28
been learning through these processes, and
27:30
especially when there's been any kind of an incident
27:32
at all, because that should be the kind of process
27:34
here. If we're going to let them do this, then they
27:37
should be getting better much faster. But
27:39
if it's just private companies being allowed
27:41
to do this and keep that data to themselves,
27:44
then actually, yeah, we will ban
27:46
them in this case and then none
27:49
of us will be better for it. And with that,
27:51
we are out of time. Huge thank you to
27:53
our panelists this week. Seamus Byrne, head of content
27:55
at ByteSide. Thanks for joining us on the show.
27:57
Thank you. And Dr. Erica Mealy, lecturer
28:00
in Computer Science at the University of the Sunshine
28:03
Coast, thank you so much. Thank you,
28:05
pleasure as always. And with that, I shall
28:07
leave you. My name is Mark Fennell, and I'll catch
28:09
you next week for another episode
28:11
of Download This Show. Thank you.