Episode Transcript
0:00
ABC Listen, podcasts, radio,
0:02
news, music and more.
0:12
You're fired. No, wait, you're rehired. Yes.
0:14
This week on Download This Show, we
0:16
are wading into one of the most
0:18
tumultuous weeks in the world of technology.
0:21
OpenAI, the company behind ChatGPT, has
0:23
reinstated their CEO, Sam Altman, after firing
0:25
him just four days ago. So how
0:27
did it happen? Why did it happen?
0:30
And where do we go from here?
0:33
All of that and much more coming
0:35
up. This is your guide to the
0:37
week in media, technology and culture. My
0:39
name is Marc Fennell and welcome to
0:41
Download This Show. Yes,
0:52
indeed, it is a brand new episode of Download
0:55
This Show and a big welcome to our guest
0:57
this week, social media strategist, who actually is quite
0:59
critical of social media, Meg Coffey. Welcome back.
1:01
Hello. And the
1:04
co-founder of Patient Notes, Sarah Moran.
1:06
Welcome back. So good to be here.
1:08
Thank you. Right. So it
1:10
has been the most famous, the most talked
1:12
about piece of AI technology in the last couple
1:14
of years. And it seems
1:17
the world of OpenAI, the home
1:19
of ChatGPT, has been
1:21
a bit of a bin fire. Meg, walk me through
1:23
what's happened in the last couple of weeks at the
1:25
top of OpenAI. The
1:27
kids have been fighting. They
1:31
they're not getting along. They really don't
1:33
like each other. And it
1:35
has captivated the world. The
1:38
amount of episodes of Succession, or the
1:40
movies, The Social Network, that could have been
1:42
written. The tech journalists did not get any
1:44
sleep in the lead-up to Thanksgiving. That
1:46
was for sure. So what happened? Well, Sam
1:48
was out and then he was in and
1:50
then he was out and then he was
1:52
in. So Sam Altman,
1:55
who is the CEO and co-founder of
1:57
OpenAI, he is the
1:59
main man, right? But whilst he is the main
2:01
man, he is beholden to a board. And
2:04
over a weekend or a Friday night, the board kind of
2:06
said, well, we don't really like you anymore, so we're gonna
2:08
get rid of you. And so
2:10
they ousted him. But the problem is, the board
2:12
wasn't really a good board. They didn't really
2:14
understand governance, and they weren't really a powerful board or
2:16
a smart board. And so they didn't really go about
2:19
it the best way, is the simplest way for me
2:21
to explain it. And so then there was this
2:23
tussling and Sam has some
2:25
really good friends, one of them being
2:27
Greg Brockman. Greg's one of
2:29
the main guys there. And Greg was like, well, Sam's out,
2:32
I'm out. And then it
2:34
started this tussle and then Microsoft stepped
2:36
in because Microsoft has invested a lot
2:38
of money. And the problem is, when they
2:40
got rid of Sam, they forgot to tell Microsoft. Kind
2:43
of bad, right? Like you tell your number one
2:45
investor things, and they forgot to tell them. What's
2:48
happening underneath the hood? Like enough time
2:50
has passed that we kind of get a sense of it. So what's
2:52
actually going on? What happened to precipitate these moves? So
2:55
you need to understand a little bit about the structure of
2:58
OpenAI itself. It is a
3:00
little bit confusing in its structure, in that
3:02
there are some companies and there are some
3:04
nonprofits and they all sort of own each
3:06
other and sit on each other's board. So
3:08
that in itself is quite complicated. But the
3:11
overarching goal is to pursue what's called artificial
3:13
general intelligence. And that is the North Star
3:15
of OpenAI. And
3:17
it's become quite public as to
3:19
why Sam Altman was fired. And
3:22
they were essentially saying that he was hiding stuff
3:24
from the board. Now, what
3:26
was he hiding has been the question that has
3:28
kept us all awake. And although
3:30
it's not been exactly confirmed, it is
3:32
about this concept of Q-Star. We've
3:35
been talking about the
3:37
dramatic leaps that OpenAI has been able
3:39
to make. Like you can't understand just how
3:42
far this tech has come so, so
3:44
quickly. And Q-Star
3:46
is this idea that we're
3:48
getting really close to
3:51
that sort of artificial general intelligence, and
3:54
that this needs to be disclosed to the
3:56
board so that they can make accurate decisions
3:58
about the future of OpenAI. What
4:00
most people know about OpenAI if you've
4:02
used ChatGPT is that up until now,
4:04
the way that the
4:07
large language models work is that there
4:09
is a statistical way of predicting
4:11
what the next word might be. And
4:13
so that is, you know, where OpenAI is good is
4:15
that kind of maths, the stats. Like the chances are
4:17
the next word you want is going to be blah.
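As an illustration of that "the chances are the next word is going to be blah" framing, here is a minimal toy sketch, assuming a tiny made-up corpus and a simple word-count table. It is illustrative only; it is not how ChatGPT or any OpenAI model is actually implemented.

```python
# Toy sketch of "statistically predict the next word" (illustrative only; not
# how ChatGPT or any OpenAI model is built). It just counts which word tends
# to follow which in a tiny made-up corpus, then picks the most likely one.
from collections import Counter, defaultdict

corpus = "the cat sat on the mat and the cat ate the fish".split()

# Bigram table: for each word, count the words that followed it.
following = defaultdict(Counter)
for current_word, next_word in zip(corpus, corpus[1:]):
    following[current_word][next_word] += 1

def predict_next(word: str) -> str:
    """Return the most frequent follower, i.e. 'the chances are the next
    word you want is going to be blah'."""
    candidates = following.get(word)
    if not candidates:
        return "<unknown>"
    return candidates.most_common(1)[0][0]

print(predict_next("the"))  # -> "cat" (it followed "the" most often above)
print(predict_next("sat"))  # -> "on"
```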
4:20
Where this changes with Q-Star is that it's
4:23
actually conquering the ability to do maths, like
4:26
real maths. And that is
4:28
going to accelerate things quite dramatically. And
4:30
then there's the question about what does that mean and
4:32
what are the implications and how could that be applied
4:35
and is that dangerous. And
4:37
this is where the tiffs start
4:39
to kick off because you
4:41
have a number of different camps, particularly in
4:43
San Francisco, about the future of technology. And
4:46
is it that you just add fuel to the fire, see
4:48
where it goes, let's see where this is going to end
4:50
up? Or do you make conscious
4:52
decisions about planning what it is
4:54
we're going to do with this general
4:56
intelligence once it's created? In the
4:59
days after all this happened, there were
5:01
reports that several staff researchers had written
5:03
to the board warning of a powerful
5:05
discovery that it is said, and this
5:07
is three degrees of hearsay, could threaten
5:09
humanity. Is that what we're talking about?
5:11
Is that where this development falls in
5:13
or could that be something else that
5:15
remains unknown and like deeply
5:17
dystopian in nature? That is definitely,
5:19
you know, that unknown thing is always the
5:22
question. And we know we weren't in the
5:24
board meetings. We don't know what the letters
5:26
were. We're not there. It's the implications. So
5:28
the fallout has been that actually OpenAI are
5:30
sitting on something that really could have massive
5:33
implications for our future. And
5:37
you know, maybe that's something they should tell their board
5:39
about when they're having these meetings and when they're planning
5:41
these things. And so you're right, this dystopian future is
5:43
up for grabs, but it always has been, really, right?
5:46
Like that has always been a risk. And now it's becoming
5:48
closer. People are like, we need to start making calls and
5:51
making decisions about that. Oh, see, as I said, the
5:53
kids aren't getting along. Right.
5:56
So Sam is let go, very...
6:01
and then it becomes quite public,
6:04
then a few key leadership people,
6:06
like Greg as mentioned, they also go. What
6:09
happens next? Because there's been a lot...
6:11
since that initial firing, a lot has
6:13
transpired. Well, yeah. Well, look, OpenAI
6:15
is a very important company, right?
6:18
We can't discount that, whether it's the nonprofit
6:21
side or the for-profit side. And there is,
6:23
you know, there's a lot of debate: is
6:25
it this Q-Star development? What is
6:27
this thing that Sam isn't communicating to
6:30
the board? Some people go, it's the fact
6:32
that he wants to make profit. It could
6:34
simply be that. OpenAI
6:36
is important to the world, and as I said,
6:38
you know, earlier Microsoft stepped in, they've invested a
6:40
lot of money. So Microsoft, over
6:43
this whole chaotic weekend, steps in and
6:45
goes, well, hey, wait a second, how
6:47
about you guys come work for us?
6:49
And there was this weirdly worded statement
6:52
about how we're excited for
6:54
the next chapter with Sam.
6:56
So this was all
6:58
done, and so Sam, all of a sudden, is working,
7:01
and OpenAI is now at Microsoft,
7:05
right? But then, within
7:07
like 12 hours, it's
7:10
not, and Sam is back as
7:12
the CEO of OpenAI, because
7:15
all of the employees... I mean, this is
7:17
the kind of founder and CEO
7:19
you want to be. Everybody has threatened to
7:21
quit if Sam's not the
7:23
boss. Yeah. Oh, and
7:26
then, and don't forget, in between all
7:28
of this there was an interim CEO,
7:30
right? Some guy named Emmett Shear, who
7:32
was like a co-founder of Twitch. So
7:34
in the span of something like a
7:37
weekend, OpenAI
7:39
has four CEOs, right?
7:42
It's just absolutely wild
7:44
what happens. I think, you
7:47
know, Microsoft's Nadella, he
7:49
gets a lot of credit for
7:51
the masterful strokes in this, because again,
7:53
OpenAI is an extremely important company and it's
7:55
not something that can just be left to
7:57
sort of fall over. And when I talked
7:59
about the board and that they're an
8:01
inexperienced board, they really are. They didn't
8:04
have a lot of experience in tech
8:06
or on boards and understanding governance. And
8:08
I think what we will see out
8:10
of this is a much stronger company
8:12
and maybe it is best,
8:14
I don't want to say it's good that this
8:16
happened, but it's better that all of this chaos
8:18
happened now, and there is some structure to this
8:21
company, than, you know, 12 months
8:23
down the line after they've discovered something
8:25
dangerous. That drama is likely to
8:27
unfold in sort of granular ways over the coming weeks,
8:29
but let's take a step back, right? So one
8:32
of the reasons why we're talking about it is because ChatGPT
8:34
is quite simply the most culturally
8:36
impactful AI of the last year or so.
8:38
It's the one that most people have interacted
8:40
with. And you know, I
8:42
think, for many people, it'll be their first experience
8:44
of, you know, what AI can do. And we know
8:47
that AI has been around, often
8:49
used on us, but it's one of the first
8:51
sort of consumer facing things that lots of people
8:53
have interacted with and really have come to
8:55
understand its power. When dramas
8:57
like this unfold, it has an effect on
8:59
how users and how investors and other companies
9:02
in the space behave. Now,
9:04
Sarah, you work in the AI space.
9:06
Have you sensed people that work in
9:08
other tech startups changing behaviour
9:11
in light of what's happened with OpenAI?
9:13
It's quite interesting. So I've been looking, or
9:16
my team and I have been looking at
9:18
what has been the impact on the tech
9:20
itself. So for example, there was an outage
9:22
last week. We've seen
9:24
high lag times on a lot of the
9:26
calls to the API, which has just sort
9:28
of been happening, which has not happened before.
9:30
So even the tech itself is a little
9:32
bit, I mean, I won't say
9:34
unstable, but there has been instability within that.
9:37
So, you know, running mission critical workflows
9:39
on OpenAI's APIs at the moment is
9:41
a bit of a crazy idea. But
9:44
as you say, it's also been the most
9:46
accessible one. So what
9:49
I mean by that is if you're
9:51
looking to build a startup, you
9:54
basically, you buy access to the API, you
9:56
plug in and away you go. I mean,
9:58
that's oversimplifying it. So
10:00
it means that there's a lot of
10:02
startups who are starting to build on
10:04
these platforms, and then the
10:07
fact that they're kind of on shaky ground themselves
10:09
really calls into question like, should you be doing
10:11
that as a business? But yeah, that uncertainty around
10:13
what you should and shouldn't be doing is definitely
10:15
a conversation that we're having. And the way people
10:17
are covering for that, at least I hope they're
10:19
covering for it in this way, is
10:21
to basically use a bit of everyone's tech, right?
10:23
So use a little bit from Amazon, use a
10:25
little bit from Meta, use a little bit from
10:28
Google, use a little bit from OpenAI,
10:30
and weave them together in such
10:32
a way that you're creating that
10:34
stability yourself in your organization.
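One way to picture Sarah's "use a bit of everyone's tech" approach is a thin wrapper that tries several model providers in turn, so one vendor's outage doesn't take your product down. This is a rough sketch under assumptions: the provider functions below are hypothetical placeholders, not any vendor's real SDK calls.

```python
# Rough sketch of diversifying across model providers (illustrative only).
# The provider functions are hypothetical placeholders, not real SDK calls;
# in practice each one would wrap that vendor's own client library.
from typing import Callable, List

ProviderFn = Callable[[str], str]  # takes a prompt, returns a completion

def hypothetical_provider_a(prompt: str) -> str:
    # Stand-in for one vendor's API; here it simulates an outage.
    raise RuntimeError("simulated outage")

def hypothetical_provider_b(prompt: str) -> str:
    # Stand-in for a second vendor's API.
    return f"[provider B's answer to: {prompt}]"

def complete_with_failover(prompt: str, providers: List[ProviderFn]) -> str:
    """Try each provider in order and return the first successful response."""
    last_error = None
    for provider in providers:
        try:
            return provider(prompt)
        except Exception as err:  # e.g. outages, timeouts, rate limits
            last_error = err
    raise RuntimeError("all providers failed") from last_error

print(complete_with_failover(
    "Summarise this week's tech news",
    [hypothetical_provider_a, hypothetical_provider_b],
))
```

In practice each placeholder would wrap that vendor's own client, with retries and timeouts tuned per provider.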
10:36
Meg, have you seen a change in people's attitude and
10:39
behavior in the tech scene in response to
10:41
OpenAI? I think one of the
10:43
biggest things is the fact that
10:45
Microsoft is their biggest investor, and all of
10:47
this went down on Google Meet. Right?
10:53
That is the biggest, funniest thing. Look,
10:56
that's the biggest funniest thing.
10:59
No, I mean, it's the nerdiest thing. No, it
11:01
is the nerdiest thing. I think it's hilarious. There
11:04
has been a shift, and I think
11:06
what Sarah just said about diversifying is really
11:09
important, but it's what we've said since the
11:11
beginning of time. You're always on rented land
11:13
in a lot of this stuff, like in
11:15
social media. If you're building your entire community
11:17
on a social media platform, that's on rented
11:20
land. That's not on your own, so you're beholden
11:22
to them. It can be taken away from you at any
11:24
time, right? So if you're building your startup
11:27
on OpenAI's API, well, then you're
11:30
beholden to them. By
11:32
diversifying where you're relying
11:34
on things, that's just smart
11:36
business in today's world. Download
11:40
This Show is what you're listening to. It
11:42
is your guide to the week in media,
11:44
technology and culture. My guests this week: Meg
11:46
Coffey, social media strategist, and the co-founder of
11:48
Patient Notes, Sarah Moran. Marc Fennell is my
11:50
name. Now we're talking about Tesla. Legal
11:52
cases have unfolded overseas where the
11:55
question has arisen: did Tesla know that
11:57
there were defects in their self-driving technology
12:00
and still let people drive anyway? Sarah,
12:02
what's happened? Well, in the
12:04
States at the moment, there's an active case
12:06
where a judge has found that there is
12:08
reasonable evidence that Elon Musk and other people
12:11
at the company knew
12:14
that the self-driving technology
12:16
was defective, but they still allowed people
12:18
to drive them anyway, and so
12:20
they didn't disclose to customers that there
12:22
was potentially an issue with their self-driving
12:24
technology. So this is, you know, the
12:26
first time this has actually come out
12:28
and they're saying that Tesla engaged in
12:31
a marketing strategy that painted the products
12:33
as autonomous when in fact the company knew
12:35
that wasn't the case,
12:37
that there were issues, there were
12:39
bugs. It's had a significant effect on
12:41
the belief about the capabilities of the
12:43
products in the eyes of consumers. So
12:46
that's where we're at. So the question
12:48
comes up, you know, is a car actually driving
12:50
itself and what are
12:53
the limits to that technology and what
12:55
are some of the other lawsuits that
12:57
may be coming afterwards? Do you get
12:59
the sense, Meg, that there's a lot of
13:01
jurisdictions around the world that are just waiting
13:03
to see how it shakes out in a
13:05
few different cities and a few different countries
13:07
legally to kind of provide us a roadmap,
13:10
pun intended, for
13:12
how to legislate? Because I will confess,
13:14
a part of me thought we would
13:16
be further along with self-driving cars by
13:18
this point, given the technology has been
13:21
sort of in development for 15, 20
13:23
years. I was sort of
13:25
expecting by this point that there'd be more of them on
13:27
the roads and it feels like it's encountered
13:29
enough legal and I
13:31
guess you would say cultural pushback that it hasn't
13:33
really developed. Do you think that there are countries
13:35
and cities that are just sort of waiting to
13:38
see what happens in a few test markets? Yeah,
13:41
definitely. I think with
13:43
any type of legislation, you need a couple
13:45
test cases to get something through. You want
13:48
to see how it goes and
13:50
then the floodgates will be open. And
13:52
I wonder how much of the pushback
13:54
recently on a lot of this
13:56
Tesla stuff is
13:58
because of
14:01
Elon. And I'm being serious when
14:03
I say that, like, is his
14:05
personal brand starting to impact the
14:08
way we feel about certain
14:11
things, like the way that we feel about autonomous
14:13
driving, the way that we feel about the Tesla
14:15
car, the Tesla brand? Now, back to
14:17
the lawsuit and your question. Yes, I
14:19
think that, you know, should we
14:21
be further along? Possibly.
14:24
I think that, you know, humans
14:29
are litigious by nature. Americans are
14:31
extremely litigious. If there's a way that they
14:33
can control something, they're
14:35
going to. And so I think that they're
14:38
going to be uber, uber
14:41
careful about this one before they
14:43
just completely let the machines take over.
14:46
Right. What do you think, Sarah? Well,
14:48
I've driven Teslas, many of
14:50
them. And I think actually the
14:52
gap is I like
14:55
the autopilot-ish features, but
14:57
I don't need a car
15:00
to drive itself. I
15:02
think we come so close to it
15:04
with, you know, all
15:06
the whizbang lane keeping and, you know,
15:08
all of the things that are close
15:10
to autopilot. Like, I'm satisfied. I think
15:13
that the jump to full
15:15
autopilot, and the risk that
15:17
a company would take on to do that, the,
15:19
you know, the mind change that
15:21
you would need in a consumer to say,
15:24
just let the car drive itself. That's
15:27
such a jump. And I think
15:29
most people, the luxury
15:31
of having a lot of the
15:33
features that are close to autopilot is
15:36
good enough. And at least in Australia, with consumers anyway,
15:38
when I do jump in, you know, if I do
15:41
jump in an electric car and I show people how
15:43
that works, people are like, wow, this is so much
15:45
better than what we've had in the past. I'm
15:47
really satisfied with that. So I do wonder if that
15:49
lag, so you're talking about that, you know, 15 year
15:51
uptake, in part is that consumers
15:53
aren't knocking down the door saying, hurry up
15:55
and give me my autopilot car. Like, we're
15:57
just not doing that. Yeah,
16:00
like so like my car can parallel park itself
16:02
and like find a parking space and do all
16:04
those kinds of things itself, right? I don't
16:07
need it to do that. I don't let it
16:09
do that. I'm good at parallel parking. I can
16:11
do it on a two-point turn, right?
16:13
So, and I don't trust it. I don't want
16:15
to, like... I don't know, I'm not there. Maybe
16:17
I'm old, right? I'm not ready to trust the
16:19
machines. I know that I'm better
16:21
than the machines at this stage,
16:23
but maybe, you know, the younger
16:25
generations are ready for it. I
16:28
don't know. Like, yeah, I'm happy with what we
16:30
have. I'm not ready to let my car take over completely.
16:32
I'm
16:35
on the other end of the spectrum, and I've driven
16:37
it with my partner. I'm happy to let it park,
16:39
I'm happy to let it do all the things. My
16:41
partner freaks out when it happens because it does it
16:43
so much more quickly, and not in a very human-
16:45
like way. And I think that that's probably the difference.
16:47
But even though you can doesn't mean you
16:50
should. And I'll be honest, half the time I just
16:52
forget to turn it on. So
16:54
I don't know. I do wonder. I mean,
16:56
it is interesting,
16:58
though, to have a company that perpetuates
17:02
the myth that this can be delivered and
17:04
then to have a judge find that, hey,
17:06
that's actually not true. So to your point,
17:10
what does that do for trust in the
17:12
brand and trust in electric vehicles? I
17:14
don't appreciate the damage that that actually
17:16
causes. I guess there's something to be said
17:18
about the fact that both of you are, you know,
17:21
people that engage with very high-level technology a lot
17:23
of the time. And I think part
17:26
of what I wonder about this is, actually, there's
17:29
going to be a very long period of
17:32
time where people, again, culturally, socially adjust to
17:34
the idea of the technology, and that
17:36
can only be built on a pretty thick layer
17:38
of trust and convenience. Right? You know, the
17:41
thing that normally makes us sign up to
17:43
technology is just, like, convenience. It's
17:46
like, oh, it's so much easier to just pay for everything on
17:48
my phone now than it is to walk around with a wallet. It's
17:50
that kind of territory, right? And I just
17:52
think that you can't even begin to
17:54
shift that culture, and I'm not, by
17:56
the way, suggesting that you should, and we should,
17:58
by the way, but you
18:01
have to build that on top of a very wide
18:05
base of trust and convenience. And I feel like
18:07
that is quite a far way away. But tell
18:09
me if you think I'm wrong. Like, yeah,
18:12
I don't know. I'm
18:14
thinking about it as we're chatting. Like,
18:16
I love driving. I love going
18:18
for road trips. I thoroughly enjoy
18:21
driving. I'm not ready to give
18:23
that up. And so then I was like, okay, autonomous
18:25
cars. But then if you're just gonna get into autonomous
18:27
cars, then just get taxis, right? Like, if you're gonna
18:30
get to the point where you're not,
18:32
I mean, for me, then I
18:34
would just get taxis if
18:36
I'm in a place where I'm not needing
18:38
a car or I'm not driving. But back
18:40
to the autonomous car. But
18:43
again, like, in a city like San Francisco, I think I
18:45
would probably be okay with that.
18:48
I would be
18:50
okay in a city like San Francisco taking
18:52
a car, an autonomous car. I
18:57
don't know about the trust, though, back to what
18:59
Sarah said. I'm not okay with, you
19:01
know, false advertising. I don't think it's good
19:03
that Tesla's out there saying one
19:05
thing and it's not the
19:07
truth. I think that when things
19:09
like that happen, it puts us back so
19:11
many years, and it is
19:13
harmful to technology. I
19:17
think it's interesting what you said about San
19:19
Francisco specifically. And I think there
19:22
are cities I would be happy in an
19:24
autonomous vehicle in, but that's because there's probably
19:26
a lot of autonomous vehicles. And I
19:28
think about the times when I've used those
19:31
auto features and it's been a long drive,
19:34
from Melbourne to Byron Bay, doing the
19:36
big road trip, and you're driving for
19:38
long stretches at a time. And
19:42
you know, so I drove from Melbourne
19:44
to Byron without the auto features and
19:46
Byron to Melbourne with the auto features,
19:48
so I could really compare the difference
19:50
in experience. And I thought it
19:52
would make bugger-all difference. But actually I found
19:55
I was less fatigued. And once
19:57
I got used to it, I found it was a much more
19:59
delightful experience. And I think
20:01
that that is where things make a difference
20:03
is I'm happy to go on the open
20:05
road and help me do those like long
20:07
stretches. But in a city, I
20:10
want all the other, like I think we
20:12
thought that autonomous vehicles meant they'd all be
20:14
autonomous. Like, you know, the Jetsons, picturing
20:16
all of those, you know, all of the
20:18
vehicles going around in a grid and they
20:21
all know where each other is. Whereas I
20:23
think at the moment it feels so independent,
20:25
like my car's autonomous, but yours isn't. So
20:27
if you do something really stupid that we
20:29
haven't programmed the AI to do, like even
20:31
think about kangaroos. How many kangaroos jumping out onto
20:34
the road has this been tested on? So
20:36
I think we have all of that doubt until
20:39
we have a critical mass where actually if all
20:41
the cars are autonomous, they can talk to each
20:43
other. And that's what I envisioned that would look
20:45
like in the future. Download This
20:47
Show is what you're listening to. It is your guide
20:49
to the week in media, technology and culture. And
20:52
it was one of the stranger websites
20:55
online where you could basically play
20:57
roulette talking to people. It
21:00
was called Omegle. Does
21:02
anyone remember Omegle? Out of curiosity? I
21:05
never used it. I never used it. Well, you're
21:07
probably not going to do it anytime soon. So the
21:09
way it would work is you'd open it up and
21:11
it's basically like playing Russian roulette with the internet and
21:13
webcams. You'd click through and you'd get surprise new person,
21:16
surprise new person and hope to God that they'll wear
21:18
clothes, which often didn't happen. But
21:21
it has been taken offline following a lawsuit.
21:24
Why though, Meg? Because
21:27
it should have been. Yeah. No,
21:29
look, so Omegle was one of those sites
21:31
that has sort of been one
21:33
of the fringe sites. It was not something
21:35
that I ever used. I'm aware of it
21:37
simply by nature of what my job is
21:39
and I've had to talk about it. And
21:42
it was a place that was actually really prevalent
21:44
among some of the younger children because it was
21:46
a place that you could go and just
21:49
meet random strangers. Right. And and sometimes when
21:51
you're a kid, that's exciting. And that's kind
21:54
of how it was originally sort of built.
21:56
It was this place that you could go and
21:58
just meet anyone across the world. and
22:00
the optimist in me is saying that, oh,
22:03
cool, I can go meet anyone across the world.
22:05
I can have pen pals and I can make
22:07
interesting friends. Well, that's not the case. And what
22:10
it turned out to be was a bunch of
22:12
pedophiles and mean evil people. So
22:14
basically what happened was there was this woman in
22:17
the UK who, very
22:19
long story short, brought a lawsuit against
22:21
Omegle that said she was groomed through the
22:23
platform. And she
22:26
had wanted to do it through court. She hadn't,
22:28
when she brought the lawsuit, she hadn't
22:30
intended for the site to be
22:32
shut down, but that was the end result. And
22:35
as she said, she couldn't have asked for a
22:37
better thing because this site wasn't a good site.
22:39
And the way that it was connecting people and
22:41
the way that it, because it didn't have the
22:43
filters, it didn't have the moderation that the, shall
22:46
we say, in air quotes, traditional sites had,
22:50
there was no way to protect people. So
22:53
there were over 220 images and videos of Alice, from
22:55
the age of 11 carrying
22:58
out sex acts under the duress of
23:00
someone on the site. And this is
23:02
what the legal case is about. It's
23:04
this idea that there were many,
23:08
many children being exploited on
23:11
the site. And this particular
23:13
abuser had done the same thing to five
23:15
other girls. So meeting them and grooming them,
23:17
all on the site. And so these people
23:19
had become very skilled at doing that. And
23:22
the site was exposed for
23:24
doing all of these horrible
23:26
things. And that's a big reason why it was
23:28
shut down. How
23:30
did this thing manage to run as long
23:33
as it did, Meg? Look,
23:36
I think that with any of these social platforms,
23:39
they do run unregulated until
23:42
enough people scream and shout,
23:44
until enough people bring suits
23:46
against them, until enough people
23:49
bring them in front of regulators
23:51
and say, there is a problem
23:53
and we need to do something
23:55
about this. You know, that
23:58
is, I think the great... some
26:00
of the many instances that were known since 2009
26:05
that these sorts of things would happen. Whereas
26:07
in Australia, we've really gone to great
26:09
lengths to really think about, well, what
26:11
do users on websites need
26:14
to be able to do? We have
26:16
the eSafety Commissioner. We have quite strong
26:18
restrictions, and even just cultural restrictions
26:20
about what young people can do online or should
26:22
do online. And we have a lot of education
26:25
for young people about how to engage or not
26:27
engage with strangers on the internet. So I was
26:29
really shocked that it was in the US. The
26:32
other thing just contextually is that this was created
26:34
by a child. So it was created by an 18-year-old
26:36
in 2009. And
26:39
I think there is this desire that young
26:41
people have to connect with other people and
26:43
to explore the world, whether that be their
26:45
friends or whether that be someone else. That
26:48
desire doesn't go away, but the
26:50
environments need to be safe. With
26:53
most social
26:55
media, actually you'd be able to speak to this
26:57
probably a bit better than I can, Meg, but
26:59
most social media platforms do have restrictions that you
27:01
need to be 13 to be able to use
27:03
them. But this one didn't. And so I'm really
27:06
confused how they were able to skirt a
27:08
lot of the laws that other social media
27:10
platforms had to abide by in order
27:12
to be able to provide a service without being cut off.
27:15
I'm just really confused how it was able to go on
27:17
for as long as it did. The
27:20
other thing just briefly to mention is its popularity
27:22
around the pandemic. I saw it take
27:24
off on other platforms. So what people would do
27:26
was they would record their screens so record your
27:29
Omegle. And so you've got, oh, I'm gonna go
27:31
meet a heap of strangers and I'm gonna record
27:33
that and then recreate that as
27:35
content. So putting it on YouTube, putting
27:37
it on TikTok. And so, you
27:39
might have someone, oh, I'm gonna randomly play
27:41
a song to a stranger and see how
27:43
they react. So it really came
27:46
at a time when there were reaction videos and
27:48
people were looking for connection during the pandemic. But
27:50
unfortunately at the same time, the
27:52
pedophiles didn't go away from that website.
27:55
And with that, we are in fact out of
27:57
time. Huge thank you to our guests this week.
28:00
Meg Coffey, social media strategist who has
28:03
a critical eye, let's say, on social media. Thank you
28:05
so much for joining us on the show. Oh,
28:08
thank you for having me. And Sarah Moran,
28:10
co-founder of Patient Notes. Thank you so much
28:12
for joining us on Download This Show. Always
28:14
fun to be here. And
28:16
with that, I shall leave you. My name is Marc Fennell
28:18
and thank you for listening to another episode of Download This
28:20
Show. You've
28:27
been listening to an ABC podcast.
28:30
Discover more great ABC podcasts, live
28:32
radio and exclusives on the ABC
28:35
Listen app.