Why were Meta and TikTok grilled by the US Senate?

Released Thursday, 8th February 2024

Episode Transcript
0:00

ABC Listen, podcasts,

0:02

radio, news, music and

0:04

more. Meta,

0:12

TikTok, Discord, Snap and X, which is

0:14

actually called Twitter because X is a

0:16

terrible name. What do they all have

0:18

in common, though? Well, this week on

0:20

Download This Show, all of their CEOs

0:22

were grilled by US Congress about the

0:24

risks their products pose to young people.

0:26

But even after two apologies from CEOs,

0:28

will any of it lead to change?

0:31

Also on the show, what are the problems that

0:33

users of Fitbit seem to be pointing out? All

0:35

of that and much more coming up. This is

0:37

your guide to the week in media technology and

0:40

culture. My name is Mark Fennell, and welcome

0:42

to Download This Show. Yes,

0:53

indeed, it is a brand new episode of Download This Show

0:55

and... I

0:57

don't quite know what to do with myself because I've actually got two

1:00

people in studio. Normally, one of

1:02

you is remote in another part of

1:04

the world or country, but actually, I have two fleshy

1:06

humans in front of me and I am freaking

1:10

out. Yep. The fleshiest. OK,

1:12

let's not make it weird. OK, so joining us here,

1:14

we have the head of content from Byteside, Seamus Byrne. Welcome.

1:16

Good to be here. And from the

1:18

Game for Anything podcast, good friend of the

1:21

show, Angharad Yeo. Flesh, flesh. OK,

1:23

enough of the flesh. OK, I understand I

1:26

started it. I understand this is my fault

1:28

and it ends now. All

1:30

right, first things first. Why are people complaining

1:32

about their Fitbits? These are the trackers you

1:35

wear on your wrist that track your heart

1:37

rate and all the other things you do

1:39

when you live an active lifestyle. But lots

1:41

of people are complaining about it at the

1:44

moment, Seamus. Why? Yeah. So this is an

1:46

age old device dilemma where you own a

1:48

device, you bought it a few years ago.

1:50

And then, of course, over time, the company

1:53

releases updates for your device. And so that

1:55

means your device is going to get maybe

1:57

a couple of new features here and there.

2:00

even security updates, things that kind of make

2:02

sure it keeps working safely. And

2:05

then we've got that situation where we've

2:07

had an update. Unfortunately, Google's Fitbit

2:09

is not necessarily saying it's totally the

2:11

fault of the firmware update, but a

2:13

firmware update for a lot of people

2:15

seemed to coincide very closely with

2:18

a seven-day battery life becoming

2:20

an hours-or-minutes battery life instead.

2:22

This is something that happens quite

2:24

often, but what is intriguing to

2:26

me about this one is Google,

2:28

who of course own Fitbit, I probably

2:31

should have said that earlier, they are

2:33

being quite reticent to acknowledge whether

2:35

the firmware from the software update

2:38

is

2:40

actually responsible for it. Why

2:42

would they not just be like, our bad, we'll

2:44

do another update and we'll fix it? Look,

2:46

I am a big believer that

2:48

correlation does not equal causation, but

2:51

I feel like in this instance,

2:53

it's a little bit too close.

2:56

But I think saying, you know,

2:58

yes, it is our firmware update, puts

3:00

all of the onus back onto Google. And

3:02

if it's something that's going to be really

3:04

difficult for them to roll back, that also

3:06

puts the onus of replacing a lot of

3:08

devices on them. So it is a lot

3:10

easier if they don't think that they're going

3:13

to get caught to just be like, oh,

3:15

no, I don't know what's happening, but we're

3:17

looking into it. It's kind of an easy

3:19

way to pass the buck back onto the

3:21

consumer because there are already a lot of

3:23

people who just go, damn, it's not working.

3:25

I guess I'll buy a new one. I'm not going

3:27

to lie. Half the reason we're doing this is because

3:29

I happen to maybe own one of these devices and

3:31

I've noticed that it stopped working. It

3:35

has updated and actually what struck me about

3:37

it is it's less about the battery life

3:39

but actually about the redesign of the app.

3:41

And the redesign of the app is infinitely

3:43

unclear. Whether or not the battery

3:45

life has been affected by the firmware, obviously,

3:47

we can't say. Google says no, users say yes. But

3:50

it did strike me that it was a worthwhile

3:52

conversation about updates, how to do

3:54

them well, how to do them badly, when they

3:56

go terribly and when they go well. I

3:58

think it's a worthwhile conversation of when you redesign something

4:00

that people use and check on a daily

4:02

basis, what are the keys to get right?

4:05

Well, it's a thing where we're stuck with

4:07

two competing kinds of ethos, right? Like, one

4:09

is users being like, if it's not broke,

4:11

why fix it? I use this every day.

4:13

I understand it. I can use it efficiently.

4:15

And that's all I care about. And then

4:18

on the other side, companies going, we need

4:20

to feel fresh and forward moving and new

4:22

and of course implement new things that are

4:24

going to draw people to spend more time

4:26

on the platform. That's always important. And we

4:28

don't want to get in the way of

4:30

that kind of innovation. But at

4:33

the same time, sometimes doing something like

4:35

back when Twitter changed its

4:37

font and introduced its own

4:39

proprietary font that was weirdly

4:42

spaced and just felt uncomfortable.

4:44

It's those moments where you really go like, what

4:47

for, other than to give

4:49

people jobs, which is great, and say that

4:51

you're making changes and moving forward. And

4:53

I think this does coincide with

4:55

the moment for Fitbit where as

4:57

a business, it is

4:59

being more and more integrated into Google

5:01

itself. The fact that over the last couple

5:04

of years, essentially the premier

5:06

device that is made by

5:09

Fitbit is actually now the Google

5:11

Pixel Watch, which has all those features.

5:13

And so in some ways, when you

5:16

talk about an app redesign, we're almost

5:18

talking about that shift where Fitbit, the

5:20

small kind of startup that emerged 15

5:23

years ago with these cool little devices,

5:25

has really now over the last few

5:27

years sort of started to become fully

5:30

Googleized and in that sense, a

5:32

new update that after hundreds of

5:34

Fitbit staff have been sort of

5:36

fired as part of wider Google

5:38

layoffs, it might be starting to be

5:40

that thing where things are just kind of going, well,

5:42

let's just kind of fit it all together so that

5:44

it's more part of this holistic ecosystem. And

5:47

when you just want your health data, that

5:49

might start to actually hide it from you.

5:52

If I can be completely cynical, you

5:54

could also see this as a situation

5:56

where Google are gobbling up a segment

5:58

of the market that overlaps with something

6:00

that they want to be doing. Even

6:02

though Fitbit were a considerably smaller company,

6:04

they were, as you said, a really

6:06

big name in the fitness tracker space.

6:08

If Google gobbled

6:10

them up, slightly undercut

6:13

the brand name of it, and

6:15

pushed their products more, then it's

6:17

just a boon for them. Not

6:19

saying that's what's happening. I'm being

6:21

fully cynical and throwing

6:24

some conjecture in there, but I just don't

6:26

trust big tech. I think

6:28

there is a market out there for people that

6:30

don't necessarily want to wear a full-blown smartwatch and

6:32

just want to wear a fitness tracker. But I

6:34

guess if you're going to mess with the core

6:36

functionality of it, which is being able to keep

6:38

track of your basic stuff, how

6:41

far into an update do you have to go

6:43

before you start hacking away at

6:45

the very essential purpose of a product? Yeah,

6:47

look, when you add in things like

6:50

the heart rate tracking, because of course

6:52

Fitbit started just essentially as a motion-based

6:55

tracker to do step counting, all that

6:57

sort of thing. All those kinds of

6:59

extensions have been great. Yeah, heart rate

7:01

monitoring, perfect fit. Some of the latest

7:03

devices even do things like they'll do

7:05

a Bluetooth connection to gym equipment. Really

7:07

clever stuff. But you're right, once they

7:09

started doing all that, and it'll show

7:12

you all of your phone notifications, there's

7:14

so many times I've actually heard people say

7:16

kind of what you're suggesting, which is, how

7:18

about you simplified it all again, made all

7:20

the other things go away, put those extra

7:23

sensors in it, but just kept it so

7:25

simple that it maybe could have 30 days

7:27

battery life at this point instead of seven,

7:29

that kind of thing. I

7:32

think consumers also don't necessarily

7:35

get as riled up about things though,

7:37

because if there is no longer that

7:39

simple product on the market, they just

7:41

get used to the extra features and

7:43

once you get used to extra features,

7:45

you often grow to like them. So

7:47

Mark, maybe one day you will find

7:50

yourself in smartwatch land, come and join us.

7:52

I do think it's pretty great. I've

7:54

been in your land. I've

7:56

been in your land and I got out by the skin of

7:58

my teeth. But would you go back? Would

8:01

I go back to Fitbit? Yeah, probably. I mean,

8:03

let's be honest, I'm a fair-weather friend at the best of times. Let's

8:06

take a step back from the Fitbit specifically and talk

8:08

more about updating software that people know and love

8:10

and how people react to it and how to

8:12

get it right and how to get it wrong.

8:15

What is an example of

8:17

an update that has been

8:19

seismic but not resulted in

8:21

backlash? Like a good

8:24

update? I don't know if

8:26

that exists. Because even when things

8:28

get better, people don't like change,

8:30

particularly for something that they use

8:33

all the time. So I remember even back

8:35

in my early, early Twitter days, you used

8:37

to have to copy and paste a tweet

8:40

and write RT in front of it when

8:42

you wanted to retweet something. And

8:45

then they introduced the retweet button. And when it

8:47

came out, people didn't like it. They

8:49

were like, this is not what Twitter is. This

8:51

is a big change. We don't like it. They

8:54

introduced 280 characters instead of 140. Again,

8:57

there was backlash. But I would argue that

8:59

those changes did in fact make the platform

9:01

better. Are there examples of

9:03

when software updates, firmware updates have

9:06

gone truly, horrendously

9:09

wrong? And are there lessons to be learned

9:12

from the mistakes? I'm not sure. I can't think of one right

9:14

off the top of my head. I mean,

9:16

any Tesla update, right? They just constantly

9:18

are rolling out updates that probably haven't

9:21

been properly tested and thus

9:23

can create issues or not deliver the

9:25

features in the way that they were

9:27

intended to be delivered and thus get

9:29

recalled and rolled back. This

9:32

is Download This Show, it is your

9:34

guide to the week in media, technology,

9:36

culture and Mark's personal grievances against smartwatches.

9:38

This week, Washington, D.C., saw some

9:40

of the biggest tech CEOs turn up.

9:43

And there were some very strange things that

9:45

happened. But I think one of the more

9:47

important things is Mark Zuckerberg essentially apologized to

9:49

a certain subsection of people, right? Yes.

9:52

So there has been a U.S. Senate hearing into child

9:55

protection on social media. And

9:57

there was a four-hour questioning session,

10:00

which as you said included Zuckerberg, as

10:02

well as the heads of Discord, Snap,

10:04

TikTok and Twitter slash X. And

10:07

during this, one of the senators

10:09

basically said to Zuckerberg, well, don't

10:11

you want to apologize to the

10:13

families who are in this room,

10:15

whose children have either self-harmed

10:17

or killed themselves as a result

10:19

of things that have happened on

10:21

social media? And obviously that's a

10:24

very, very heavy topic. And Zuckerberg

10:26

got up, turned around to the

10:28

people sitting behind him and

10:30

said, I'm sorry for everything you've all gone

10:32

through. It's terrible. No one should have to

10:34

go through the things that your families have

10:36

suffered. What's the

10:38

significance of this or is there significance

10:40

in this moment? In kind

10:43

of marketing terms, I think he did

10:45

play the situation very well. He didn't

10:47

apologize for anything they did. He

10:49

apologized for the experience they had.

10:52

And in that context then said,

10:54

yeah, this is why we're continuing

10:56

to invest — the

10:58

sort of standard spiel about we promise we're spending

11:00

lots of money to try to stop these things

11:03

from ever happening again. But in

11:05

the context of that hearing, Josh Hawley in

11:07

particular, who asked that question, then

11:10

tried to follow up saying, well, are you going to

11:12

personally put money on the table to compensate these

11:14

people? And that's where Zuckerberg started to say, that's

11:16

not really what this discussion

11:19

should be about even. Because

11:21

it wasn't just Zuckerberg, as you mentioned earlier, there's

11:23

a whole host of CEOs that appeared. Was

11:26

anything useful and actionable? Did anything useful and

11:28

actionable come out of it? No.

11:32

So here's the key thing, right?

11:34

And I think it's fascinating that

11:36

only Meta and TikTok voluntarily came

11:38

along. The CEO of Snap, the

11:40

CEO of X and the CEO

11:43

of Discord all had to be

11:45

subpoenaed to actually sort of come

11:47

along to that hearing. There is

11:49

currently legislation on the table in

11:51

the US government — a kids'

11:53

safety act to protect kids. But

11:56

so much of this, as much as they are telling

11:59

off these CEOs, there's

12:01

a lot of it where it is essentially

12:04

these politicians — who have had apparently eleven different

12:06

pieces of legislation on the table over recent

12:08

years and done nothing with any of

12:10

them — that it's about the soundbite and

12:12

the moment to raise funds and prove to

12:14

their constituents that they're gonna shout at

12:17

these people and tell them that they should

12:19

do better, but they're not actually going

12:21

to pass any laws to do anything

12:23

real about it. And I think they

12:25

are in a difficult spot

12:28

because essentially what the legislation is trying

12:30

to enact is accountability for social media

12:32

companies for what is posted on

12:35

their platforms. And that is an ongoing

12:37

extremely hairy discussion because it's a little

12:39

bit like saying that you know a

12:42

street corner is responsible for what someone

12:44

standing on it says. And obviously

12:46

it's not that simple — there's moderation, there's

12:49

a bunch of other things — but at

12:51

the end of the day, the

12:54

question is, can you hold someone

12:56

responsible for what someone else posts

12:58

on their platforms? You can certainly

13:00

hold them responsible for what moderation

13:02

they have in place. You can

13:04

hold them responsible for how they

13:06

react to things that are reported.

13:08

But can you truly hold them

13:10

responsible for the posting to begin

13:12

with. And I think that that

13:14

is a really kind of difficult

13:16

argument to sell, but it is

13:18

what this legislation has typically

13:20

been targeted towards. The legislation that

13:22

they're discussing in the US — whatever happens with

13:25

it will end up having an impact

13:27

on how you use these services, in

13:29

the same way that when legislation passes

13:31

through Europe, it fundamentally changes how the

13:34

world interacts with these services. As

13:36

the legislation moves through, what

13:38

will it change about the way that we actually

13:40

interact with these services, as far as it

13:43

is clear at this particular juncture? Well,

13:45

one of the most significant things that they're

13:47

trying to push is the idea that apps and

13:49

services should actually verify the age of people

13:52

who are opting in when they're

13:54

setting up an account on that platform.

13:57

This is where it does get pretty hairy, and

13:59

plenty of safety advocates

14:02

and certainly civil liberties advocates

14:04

are saying that this is actually

14:07

a terrible plan because they're basically

14:09

saying they should have to upload

14:11

verified ID to the service and

14:14

then you're like, well, how does that get stored and managed

14:16

and secured, and all those other issues that go along

14:18

with that? So that in itself

14:21

is quite problematic. I actually thought

14:23

Zuckerberg made a great suggestion out of

14:25

all of it, which was that

14:27

when a device is being set up, and

14:30

in that process of creating an Apple account

14:32

or a Google account, at that point, you

14:35

have to give a lot of personal details just to

14:37

create your personal account that runs

14:39

the device — that could essentially

14:42

have kind of like Face ID and things — you

14:44

could almost have a system. Yep. At

14:46

the start of someone setting up this device, we know

14:48

how old they are and we can pass a yes/no

14:51

through to apps and services. Instead of having 100

14:53

different services all being forced to try to

14:56

manage this incredibly sensitive data, you could

14:58

instead have that being run sort of

15:00

by the device itself. But then

15:03

that also then becomes scary because you're

15:05

turning over massive amounts of power and

15:07

data to essentially two companies.
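
A minimal sketch of the device-level yes/no idea Seamus describes, with invented names throughout — this is not a real Apple or Google API, just an illustration of the account holding the sensitive detail once and apps only ever receiving a boolean:

```python
# Hypothetical sketch: the OS-level account verifies age once at setup;
# apps receive a bare yes/no, never the birthdate or an ID document.
# All names here are invented for illustration.
from datetime import date

class DeviceAccount:
    """Stands in for the account (Apple/Google) created at device setup."""

    def __init__(self, birthdate: date):
        self._birthdate = birthdate  # the sensitive detail stays on the device

    def is_at_least(self, years: int) -> bool:
        """Answer an app's age query with a boolean only."""
        today = date.today()
        age = today.year - self._birthdate.year - (
            (today.month, today.day)
            < (self._birthdate.month, self._birthdate.day)
        )
        return age >= years

# A service asks the device, not the user, and never sees the date of birth.
account = DeviceAccount(birthdate=date(2012, 5, 1))
can_sign_up = account.is_at_least(13)  # the app only learns this boolean
```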

15:09

I think the other thing to think about is

15:12

the fact that I don't think any

15:14

of these companies necessarily particularly care about

15:16

being a platform for people to post

15:19

stuff, right? They're companies that are there

15:21

to make money and to get advertiser

15:23

dollars. And a lot of the way

15:25

that they do make money is by

15:28

collecting data on people and being able

15:30

to do really targeted advertising. Like that

15:32

is where all of the money is.

15:35

So you need the ecosystem of

15:37

people creating and consuming in order to

15:39

create that environment. Yeah, certainly, certainly. But

15:42

I think as people kind

15:45

of continue to be complacent about what

15:47

data is collected and how it's used,

15:50

it is possible that more of

15:52

these services move to kind of more

15:54

private messaging where data can still be

15:57

collected that they can still use for

15:59

advertising. That is sort of the space

16:01

that Discord exists in, and I was actually kind

16:04

of surprised that they were involved in this hearing

16:06

because I would think that they could fly under

16:08

the radar, so maybe I'm wrong and they

16:10

could be able to pivot to that space. But

16:12

yeah, at the end of the day

16:15

I don't think that these companies are focused on

16:17

being social media platforms and we could see them

16:19

all pivot. One thing I will say about

16:21

these events is that they're a bit of theatre.

16:24

And sometimes things go horribly wrong in the

16:26

theatre of Congress. And one thing I would

16:28

say — and this is particularly amusing because

16:30

both Angharad and I have

16:32

Singaporean heritage in our families — is that

16:35

for the head of TikTok,

16:37

this was not an easy session.

16:39

This is one of the weird

16:41

ones, where he was asked whether he was responsible for what happens.

16:43

The head of TikTok, Shou Zi Chew,

16:46

who is Singaporean, has a very strong

16:48

Singaporean accent — sounds like all my cousins, yeah.

16:50

Yeah, but you know, your classic

16:52

Han Chinese Singaporean. He was

16:54

asked several times if he was

16:57

a citizen of any country other

16:59

than Singapore, of what country he

17:01

was a citizen of, if he

17:03

had any ties to the CCP,

17:05

and he was asked, in like

17:07

three or four different ways, essentially:

17:09

are you a spy for the

17:11

Chinese government? Are you a member of

17:13

the Chinese Communist Party? Like, if you

17:15

could... It's not generally considered wise

17:18

to use the same language as

17:20

the McCarthy trials. Doesn't

17:22

strike me as like a good thing to

17:24

do, but that's what Senator Tom Cotton did,

17:26

and it was so embarrassing, the

17:29

way he was so emphatic: 'No,

17:31

I'm Singaporean' — like, how did it get to

17:33

this? It was funny because he

17:35

would say 'I'm Singaporean', and it was almost

17:37

like that wasn't understood — that he

17:40

could also not be, you know,

17:42

a Chinese citizen. And Singapore is one

17:44

of the countries that is very strict

17:46

about citizenship. You can't have dual

17:48

citizenship. Well... I do know

17:50

of people who just didn't tick the

17:53

box either way — they didn't make

17:55

the declaration — and did manage to get

17:57

dual citizenship with Singapore.

18:00

I'm willing to bet Senator Tom Cotton just

18:02

doesn't know what Singapore is. Yeah. Just

18:05

quietly. Yeah, and look, he

18:07

also, I thought it was a great example where

18:10

they asked him, do your kids use TikTok? And

18:12

he's like, no, because in Singapore, people under the

18:14

age of 13, it's illegal to use these platforms.

18:16

It was almost like such a good

18:18

throwaway line where you're like, well, some countries

18:20

have got on with passing laws

18:23

about these kinds of things. Exactly. And

18:25

I think, obviously, I'm not for a minute suggesting

18:27

there aren't issues with TikTok in terms of where

18:29

its data goes; in terms of ownership,

18:31

it is a company that does go back to

18:34

China. I understand that. But the lack of understanding

18:36

of an American senator — that China

18:38

and Singapore are different countries — is, yeah,

18:40

on the one hand,

18:43

mind-blowing. On the other hand, deeply

18:45

unsurprising. There's just such a

18:47

desire to use China as a

18:49

gotcha, because it is a culture that's

18:52

very different to US culture and has

18:54

a lot of values that don't align with

18:56

American values. And so he saw his chance

18:58

and he took several swings. The thing that

19:01

kills me —

19:03

and it's something that comes up a lot when

19:05

tech companies do these appearances at congressional hearings — is

19:07

that there are real issues.

19:09

There are real issues that do need to be

19:11

navigated in terms of like, where does your data

19:13

go? And who does

19:16

have access to it in China? And

19:19

these forums, when they are prosecuted

19:21

by people that don't know what

19:23

they're talking about, which is what

19:25

I'm prepared to say, they are

19:27

missed opportunities to really seriously examine

19:29

what are real issues. Download

19:31

This Show is what you're listening to. It

19:34

is your guide to the week in media, technology

19:36

and culture. And there's a particularly interesting story

19:38

out of the UK where the UK Labour

19:40

Party are examining an idea where they would

19:42

force AI firms, artificial intelligence firms to share

19:44

their test data. But why? Yeah.

19:48

So what they want to sort of move to

19:50

is right now there's actually a like

19:52

a voluntary code in place in Britain. And

19:55

that sort of came off the back of the UK,

19:57

kind of really getting on the front foot late last

19:59

year, holding a global AI

20:01

summit. A lot of the

20:03

major players turned out, they participated, and

20:06

companies like Microsoft, OpenAI — those sorts

20:08

of scale of players — decided to agree

20:10

to a voluntary code where they would

20:12

try to be more transparent, share

20:15

more data with the governments, to help

20:17

them, sort of, yeah, be able to,

20:19

I guess, be a little bit more

20:22

transparent, understand exactly how these

20:24

systems are working. What UK Labour

20:26

is saying they want to do is

20:28

move from a voluntary code to a

20:31

statutory code — to actually pass laws that

20:33

will require these companies to not just

20:35

say 'we promise we'll do it' but

20:37

actually have to do it when they

20:39

reach a certain scale of AI

20:41

model creation. So, yeah, I dare

20:43

say if you're a small player you

20:45

can continue to experiment, test ideas,

20:48

come up with new ways

20:50

of using AI, but once you pass

20:52

certain compute thresholds, where you are now

20:54

getting to making really large compute models —

20:56

the kind of things that sit

20:58

behind the chatbots, as well as

21:00

some of the image models, those kinds

21:02

of things where they're really quite large

21:04

scale in terms of how

21:07

intensive they are to create — then you start

21:09

to have to report on what you're

21:11

doing, the way in which you're doing

21:13

it: the test documents, what

21:15

often comes out of things like...

21:17

the challenges, what these

21:19

things are capable of, and

21:21

the potential downsides of these models' capabilities.

21:24

They would also be required to give

21:26

that tech over to the governments

21:28

so that they can conduct

21:30

safety tests with independent oversight. Now,

21:32

this actually isn't terribly different in

21:34

some ways to what they already

21:36

signed and agreed to at the

21:39

AI Safety Summit I

21:41

mentioned, and what they signed was

21:43

called the Bletchley Declaration, and within

21:45

it they also acknowledged the significant

21:47

risk that AI presents to

21:49

humanity — big,

21:51

heady language — while

21:53

at the same

21:55

time, of course, because they're AI firms, saying

21:58

this is also a technology that contains potential for

22:00

good and that's what we want and so we're signing

22:02

this to try and make sure that that happens and

22:05

essentially as well they talked about

22:08

offering up their tech so that it could

22:10

be tested but the difference

22:13

between having it be voluntary and

22:15

having it be law means that

22:17

there are actually consequences if they

22:19

don't do so whereas by just

22:21

shaking hands with each other and saying we're

22:23

definitely going to do it there's not really

22:26

any ramifications that could be enacted were they

22:28

to not hand over the tech. I'm

22:30

trying to wrap my head a little bit around

22:32

what transparency actually means in this context, right?

22:34

so are we talking about a situation where

22:36

they will actually have to show their working? Because

22:40

I'm kind of shocked that you can't do that

22:42

now, frankly. I would think it's less

22:44

about showing its working and more about showing

22:47

its data sets, because all

22:49

of these large language models work

22:51

in pretty much the same way:

22:53

it's all got to do with

22:55

statistics, so it's literally just guessing

22:57

what the most likely next word

22:59

in a sentence would be. And

23:01

when you, you know,

23:04

conceptualize it like that, it's actually quite

23:06

insane — it's literally just guessing. Or,

23:08

well, not 'guessing', it's, you know... All of

23:10

that means it does this by

23:14

reading thousands and thousands, hundreds

23:16

of thousands, of documents, books,

23:18

whatever, to learn how language

23:20

is most likely to be

23:22

constructed, and then that's what

23:25

it spits out. So in

23:27

that way, the data that you

23:29

feed it is truly the most important

23:31

thing. If you're feeding it data that

23:34

continues to say 'Mark sucks', that

23:36

is what it is most likely to say if

23:38

you were to say, tell me something about Mark. It's

23:41

true, he does, you know. It

23:43

was an easy example, I'm sorry.
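
As a rough illustration of that statistics point, here is a toy sketch — a bigram counter over an invented corpus, nothing like a production LLM — showing how whatever the training data says most often becomes the 'most likely next word':

```python
# Toy next-word predictor: count which word follows which in a tiny,
# made-up corpus, then emit the most common follower. Real LLMs use
# neural networks over tokens, but the data-driven idea is the same.
from collections import Counter, defaultdict

corpus = "mark hosts the show . mark sucks . mark sucks . the show is great ."

follow_counts: defaultdict = defaultdict(Counter)
words = corpus.split()
for current, nxt in zip(words, words[1:]):
    follow_counts[current][nxt] += 1

def next_word(word: str) -> str:
    """Return the most frequent next word seen in the training text."""
    return follow_counts[word].most_common(1)[0][0]

print(next_word("mark"))  # 'sucks' -- purely because the data said so
```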

23:46

Yeah... and yeah, I'm not sure, I'm not

23:48

sure. Look, yeah, there's a lot of

23:50

kind of... yeah. There's a

23:52

few more variables attached to it as well. Things

23:54

like actually telling it, things like, you

23:56

know, 'be a little bit more random', then, you

23:59

know, sort of... with your production model,

24:01

there are ways in which they

24:03

can weight the model to try to make

24:05

it seem a bit more creative or a

24:07

bit more, yeah, kind of staid.

24:09

Yeah, if you wanted to use it in

24:11

a law firm, you'd probably want a

24:13

far more stable model, and if you were

24:16

trying to use it in some kind of

24:18

creative practice — there are ways you can weight

24:20

those models.
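
The usual knob for that creative-versus-staid weighting is sampling temperature. Here's a small self-contained sketch with invented probabilities (not any particular vendor's API): low temperature sharpens the next-word distribution, high temperature flattens it.

```python
# Temperature rescales a model's next-word probabilities before sampling.
# Low temperature -> sharper, more predictable ("law firm") output;
# high temperature -> flatter, more random ("creative") output.
import math
import random

def apply_temperature(probs: dict, temperature: float) -> dict:
    scaled = {w: math.exp(math.log(p) / temperature) for w, p in probs.items()}
    total = sum(scaled.values())
    return {w: s / total for w, s in scaled.items()}

next_word_probs = {"contract": 0.6, "banana": 0.3, "kraken": 0.1}  # invented

staid = apply_temperature(next_word_probs, temperature=0.3)  # ~{0.91, 0.09, 0.00}
wild = apply_temperature(next_word_probs, temperature=2.0)   # ~{0.47, 0.33, 0.19}

# Sampling then just draws from the reweighted distribution:
word = random.choices(list(wild), weights=list(wild.values()))[0]
```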

24:22

But also, internally, they have their own sort of red teams, which,

24:24

you know, will try to make it do

24:26

the worst possible thing that it could do and

24:28

see how easy it is to be

24:30

abused. That's where these

24:33

kinds of reports can start to

24:35

reveal a bit more of that

24:37

side of things, which is what they're

24:39

after with things like independent

24:41

researchers. If the government is able

24:43

to say, okay, red team, go

24:46

nuts and tell us how bad

24:48

this thing could potentially be, you start to have

24:50

a bit more — not just relying on their

24:52

own internal testing, but starting to have other

24:54

people be able to put their hand up

24:57

and say, actually, we'd ask questions in this

24:59

way. Which, of course, companies like

25:01

OpenAI will have to say, oh, well,

25:03

that breaks our terms of service if you

25:05

ask it a thing that actually starts to

25:07

let it do terrible things. And, well,

25:09

that doesn't mean people aren't going to do

25:11

that. So yeah, being able to test that

25:13

appropriately is really important. I like gaming

25:15

them myself. Like, my

25:17

daughter really wanted to rewrite the

25:19

lyrics of a song from a

25:21

movie, except make it about Sven from

25:24

Frozen, and I was like,

25:26

ChatGPT, rewrite the lyrics to this, and

25:28

ChatGPT was like, no, that's copyright, I

25:30

couldn't possibly. But if you just went

25:32

off and copied and pasted the lyrics

25:34

yourself, and then you

25:36

copy and paste that into ChatGPT

25:39

and say, now make it about an imaginary reindeer,

25:41

then it's tentatively like, yeah, here's your song.

25:43

There's the classic one that several people did:

25:45

pretend you're a pirate who doesn't care

25:47

about laws and rules and give me,

25:50

you know, this information that you previously

25:52

denied. And ChatGPT

25:54

is happy to tell you, as long

25:56

as it's in character. If this comes to pass

25:58

and they must share their data,

26:00

surely that creates challenges to commercial

26:03

models. It's the same as the

26:05

big argument to get social media companies

26:07

to unveil their algorithms. No one wants

26:09

to do it and I think they

26:11

are going to fight tooth and nail

26:13

to not have to. I

26:15

also think that they're going to find

26:17

ways to sidestep and delay as much

26:19

as possible and I also think one

26:22

of the biggest barriers is going to

26:24

be whatever human talent the independent bodies

26:26

have to actually be

26:28

able to effectively test. I think it's a

26:30

real, look I always say this, it's a

26:32

real wait-and-see, Mark. I

26:35

think what you meant is time will tell. Yeah,

26:41

I feel like the

26:43

difference between voluntary and in

26:45

law is exactly that where

26:47

right now they

26:49

have agreed to this thing and

26:51

who knows when they will make

26:54

their first submissions to these sorts

26:56

of voluntary transparency systems,

27:00

but then we will reach a point

27:02

where it goes into law, and then they'll

27:04

start to actually stipulate: here's exactly what

27:06

we expect in the reports

27:09

from you and it can start to

27:11

be a bit more codified. The one

27:14

person that leaps to mind is

27:16

when Elon Musk's Grok AI passes a certain

27:18

point where he's trying to say it's going

27:21

to be the anti-woke, you

27:23

know large language model and you're like well that's

27:25

exactly the kind of thing that could potentially

27:27

run afoul of all sorts of legal

27:29

frameworks. And if it's voluntary, he would

27:32

probably go I'm not going to participate. Unfortunately

27:34

we are out of time but

27:37

a very big thank you to our guest this week

27:39

from the Game for Anything podcast, Angharad

27:41

Yeo. Thanks, Mark. And the head of

27:43

content for Byteside, Seamus Byrne. Stop

27:45

generating. And a big thank

27:47

you to everybody for listening particularly if you're listening

27:49

on the ABC Listen app if you're not listening

27:51

on the ABC Listen app you should download it

27:53

now, as there's a whole host of great content from

27:56

right across your beloved national broadcaster. My name is

27:58

Mark Fennell and thank you for listening. I'm

28:00

going to go make another episode. I'm going to

28:03

do that. You've

28:28

been listening to an ABC podcast.

28:31

Discover more great ABC podcasts,

28:33

live radio and exclusives on

28:36

the ABC Listen app.
