Is your data really that private?

Released Thursday, 19th October 2023

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:00

ABC Listen, podcasts,

0:02

radio, news, music and

0:05

more. Do

0:13

you ever wonder just how private your scrolling

0:15

is on the internet? Is your browsing really

0:17

that secure? Yes, this week on Download

0:19

This Show, the government is set to bring in new privacy regulations

0:22

to strengthen data protection. So exactly

0:25

what will it look like? Next on the show, how

0:27

social media companies can tackle disinformation

0:29

during a time of war, all of that and much

0:31

more coming up. This is your guide to the week in

0:34

media, technology and culture. My

0:36

name is Mark Fennell and welcome

0:38

to Download This Show.

0:50

Yes indeed, it is a brand new episode of

0:52

Download This Show and in a weird

0:54

twist of fate, I actually have

0:57

two guests in the studio. Because

0:59

you're listening to this on podcast, you have no idea that normally everyone's

1:01

spread across the country, but the amazing

1:03

Meg Coffey, digital strategist

1:05

and managing director of Coffey & Tea, you're

1:07

actually in the studio. I know,

1:09

I'm so excited. So real.

1:12

I am real for the first time face

1:14

to face with you. Not

1:17

AI, finally confirmed. I know,

1:19

hard to believe.

1:20

And Daniel Van Boom, technology correspondent at Capital

1:22

Brief. Welcome back. Thank you. It

1:25

feels like it's 2019. The

1:28

good old days.

1:29

Magical magical times. All right.

1:31

Nothing more fun than new privacy

1:33

legislation. There had to be a better throw than that,

1:36

but that's what I'm committing to. We are talking

1:38

about potential changes in privacy

1:40

legislation. It has been broached

1:42

by the government. Daniel, what has been discussed?

1:45

So much. So this, so

1:47

the original privacy act. Really dropped the ball on that intro.

1:50

Everyone knows it. That's right. I'll

1:52

drop the ball on the answer. The original privacy act

1:54

came in in 1988, which was like

1:57

before the internet became a thing properly. So

1:59

in, like,

1:59

2019, the government was like, we should probably

2:02

give that a look-in. Early this year,

2:04

the Attorney-General's office handed

2:06

down its report on potential things that they

2:09

could do to reform that act. Key among

2:11

them are things like the ability

2:13

to ask companies

2:16

to delete the data they have on

2:18

you, to opt out of companies

2:20

like Facebook and Google targeting

2:23

advertisements at you, things of that nature. And

2:25

now we have the government's

2:27

response to the responses to that report.

2:30

God, I love bureaucracies. Yeah, the wheels move

2:32

slowly, man, but we're getting there. But they're methodical.

2:35

So of the things that have been tabled, Meg,

2:38

what do you think is the most urgent that needs to

2:40

be dealt with, ASAP?

2:42

Oh, what needs... ooh, that's a good one. There's

2:44

a lot of them. I think it's probably the fact that we sort

2:46

of can have control of our own

2:49

data now. I think that was the one that was the most interesting

2:52

to me, is that we actually can

2:54

control it, and we can say, look, you need to delete my

2:56

data, that I have the right to request that

2:59

deletion now. That was the one

3:01

that was the most important to me. But yes, there

3:04

are things: we're getting closer to GDPR, we're

3:06

getting closer to, you know, more regulations

3:08

around what small businesses need to do. There is

3:10

a lot in that that I think a lot of people need

3:12

to pay attention to. But from the

3:15

human, personal point of view, the

3:17

thing that I like most about it is that I

3:19

can say, hey,

3:19

stop, stop tracking me. What's

3:22

missing from it? And because I know you have lots

3:24

of thoughts and feelings on privacy in general and

3:26

you're, like, the social media strategist that has the most,

3:30

like, complex opinions about social media

3:33

that I've ever met, and it's one of the many reasons why we

3:35

love having you. What

3:37

do you feel like isn't being discussed

3:39

that you would like to see discussed? Is there anything that

3:40

stands out? Look,

3:42

I think, I mean, it is a double-edged

3:42

sword, right? I love personalization in

3:44

that I want ads that are

3:46

the right ads for me. I don't want ads of

3:48

nonsense. So I like that

3:50

I can be targeted but at the same

3:53

time I don't want all my data being sold

3:55

and it's wrong that

3:57

I'm not in control of the things, that

4:00

I'm not in control of my own data.

4:03

And so I think that we do need more regulation

4:05

around that about controlling your own

4:07

data and who has access to it. And

4:09

then what happens when there are breaches?

4:11

Because

4:13

I don't think that we have enough enforcement

4:15

around that. And I think that

4:17

every business, no matter the size, should

4:20

be responsible for data. If you are collecting

4:22

someone's personal information and

4:25

things that are personal, whatever that

4:27

is, whatever that data point is, you need

4:29

to have strict precautions

4:32

in place to protect that. It's

4:34

horrible to say that data is the new oil. But

4:37

these things are important. And all of these data

4:39

points together are an important

4:42

profile. And that's really valuable. And I

4:44

don't think businesses are paying enough attention to

4:46

that. And so we do need more attention paid

4:48

to the protection of personal data.

4:51

There is a bit of a balancing act here, though, isn't there? Because

4:53

often we talk about the responsibility

4:55

shifting onto the people that manage our data. And we

4:58

instinctively think of the Facebooks and Googles of

5:00

this world. But I suppose if we're talking about legislation

5:02

that exists in Australia, that is going to affect

5:05

smaller businesses, maybe businesses that don't have as

5:07

much infrastructure. And I suppose there is a bit of a balancing

5:09

act here, Daniel, for doing

5:12

something that puts the data of users at the forefront

5:14

but isn't so onerous that companies

5:16

and businesses can't keep up. Has there

5:18

been any consideration for how that's being managed?

5:21

Yeah, there is. So as part of the response, there's

5:23

a bunch of things that the government said that they

5:25

will put into legislation next year. And

5:28

a bunch of things they said they've agreed to in

5:30

principle but need to flesh out more. And one of those

5:32

is the data responsibilities

5:34

of small companies. They are companies under $3 million

5:37

in revenue, I believe. And yeah, to

5:39

that exact point, it does bring

5:41

to mind the big behemoth companies. But data

5:44

is everywhere. And so small companies collect

5:46

a lot of data, sometimes maybe unknowingly. And

5:49

now it is likely that they will

5:51

have to be responsible. Yes, thank you for

5:53

saying that. That was teamwork.

5:56

I like teamwork. It's hard,

5:57

right? I get it. And if you are a

5:59

small business, you probably don't have the infrastructure.

6:01

You're not thinking about it as much: the

6:03

security of things and the security

6:05

of your website

6:06

or the security of your database or

6:08

your

6:09

POS system, right, your point of sale system.

6:11

But if you are doing these things, if you are

6:14

interacting with customers, you need

6:16

to think about that end to end of how

6:18

you are protecting your customers' data.

6:20

Meg, you brought up GDPR, which

6:22

is an acronym that basically speaks

6:24

to the privacy legislation that exists in the EU.

6:27

And of course, it's basically why every time you

6:29

go to a website, there's a little thing that says, we

6:32

use cookies. Are you okay with that? If

6:34

you're not, go away. How would

6:36

these kinds of proposed

6:38

changes, you know, differ

6:40

or be similar to what GDPR has put in

6:42

place? Does anyone know?

6:44

I think it's pulling us in line with GDPR. It's

6:46

bringing us closer to that. I

6:48

don't know that necessarily GDPR is the perfect

6:52

solution, but I think it is a

6:54

good solution. And I think that globally

6:56

we

6:56

should all be more aligned.

6:59

Well, I think the other thing to say is that while

7:01

GDPR is the big one, Europe hasn't

7:03

stayed still. It's moved on since then. And so for instance,

7:06

one of the things that the government is very unlikely

7:08

to do, one of the recommendations from the

7:10

report that it's not likely to legislate is

7:13

to allow you to opt out of getting

7:15

targeted advertisements, which is something

7:18

that the Digital Services Act, which

7:20

in Europe went into effect in August, allows

7:22

people to do. So obviously,

7:25

Meta makes most of its money, 97% of

7:29

its money from targeted advertising.

7:32

And so as a result, it's been reported that they

7:34

are looking into doing ad-free subscriptions,

7:37

like 10 euros a month subscriptions for Facebook

7:39

and Instagram. That's- Wait,

7:42

wait, wait, ad-free Facebook. Yeah, yeah, ad-free

7:44

Facebook, ad-free Instagram. This is reported. It's

7:46

not official yet. But yeah,

7:48

so actually, Norway in

7:50

July started fining Facebook $150,000

7:53

per day, for as

7:55

long as it does targeted advertising. So it's

7:58

not just the EU, Norway as well.

9:29

data

10:00

of people to data brokers. So data brokers

10:02

could start to advertise as being like, hey, Daniel's

10:05

walked by that ice cream shop three times today.

10:07

I think you should advertise some ice cream. So the

10:09

amount of data that they have on Australians

10:11

is, yes, quite concerning. And

10:13

I just want ice cream. You want to get

10:15

ice cream? So let

10:18

me talk to my phone. Hey, phone, tell

10:20

all the advertisers I want ice cream. Download This

10:22

Show is what you're listening to. It is your guide to the week

10:24

in media, technology and culture. Mark

10:27

Fennell is my name. Ice cream is what I want. And our guests

10:29

this week are Meg Coffey, digital strategist

10:31

and the managing director of Coffey and Tea, and

10:34

Daniel Van Boom, technology correspondent

10:36

at Capital Brief. It's been a pretty awful

10:38

couple of weeks in the news. When you combine

10:40

not just the conflict in the Middle East, there's also

10:43

the conflict in Ukraine. There's been a lot

10:45

of really horrific violence in the news.

10:47

But combined with that is also misinformation. It

10:50

seems that social media platforms have been swamped

10:52

with fake news about the conflicts unfolding

10:54

around the world. And in fact, it's become much worse

10:56

than anyone really predicted, Daniel.

10:59

Yes, it sure has. So obviously in the

11:01

last 10-ish days, there's been a pretty

11:03

harrowing conflict in the Middle East, Israel and

11:05

Hamas. And in the days following that conflict,

11:08

social media in general, but I would say X, formerly Twitter,

11:11

in particular, was swamped with some

11:13

pretty insane disinformation, what

11:15

we maybe used to call fake news. Among

11:19

them are videos purporting

11:21

to be attacks from Hamas

11:23

on Israel. They were actually from a video game from 2015.

11:26

Other things like, again, purported

11:29

attacks from Israel on Hamas that were

11:31

actually attacks from earlier this year or

11:33

from years ago. Yeah, there's

11:35

a real don't believe what you see vibe happening

11:38

on social media right now. And I say X, formerly Twitter,

11:40

in particular because some of

11:42

the changes that Elon Musk made to the platform

11:45

after buying it. Well, most of them. People

11:48

often talk about the content moderation thing. He slashed

11:51

a lot of the team, which is probably

11:53

a problem. But I would say the bigger

11:55

problem is the verification badges. So

11:58

in the past, journalists and... official

12:00

people of some description were given the blue tick.

12:03

Now if you pay $8, you get the blue tick. And journalists

12:06

and official people don't automatically have it. In

12:08

addition to that, people with that $8 membership

12:11

can get paid if they get enough engagement. And

12:13

so now you have this kind of twin incentive.

12:16

So you have the blue tick, so

12:18

you look official, and you want to get as much engagement

12:20

as possible. And so now

12:22

it's not just trolling, people have an incentive

12:25

to spread that misinformation, and it can

12:27

be hard, certainly at a glance, to discern

12:30

who is who, really. We'll

12:32

get into, I guess, the ways in which we've

12:34

arrived at this situation and the ways in which we could

12:37

potentially get out of it. But I'm

12:39

sort of more interested in this on a psychological

12:41

level, which is as it becomes

12:44

more ubiquitously understood that the

12:46

things that you see on the internet may be misinformation,

12:48

may be disinformation, may just be a mistake,

12:51

right? Does it change the way

12:54

in which you interact with the internet? Do you

12:56

become more distrustful of what you

12:58

see, Meg? Yes.

13:00

And as someone who does social

13:02

media for a living, I shouldn't

13:05

say that traditional media is more

13:07

important than ever. But

13:09

that's exactly where we're at. I think

13:11

that

13:11

we are at the point where we need to be relying

13:14

on our trusted traditional

13:17

sources that we know are real

13:21

more than what we see online, because

13:24

we just... online is

13:26

a fantasy world these days. And

13:28

it is very much anybody

13:31

can make anything online. And that's a great thing. I mean,

13:34

you know, I am I'm ever the optimist,

13:36

I always want to see the good in it, right?

13:38

And I think that the internet is a fantastic place.

13:40

And I think that technology is incredible. And what we

13:43

can create with it is incredible.

13:45

And what AI has allowed us to do is amazing.

13:48

The art we can create is amazing. But

13:50

I think that we have to believe, we have to be at

13:52

the point now, that nothing that we

13:55

see online is real, just like we

13:57

used to be with magazine covers when you

13:59

see a magazine cover, that is not what that person

14:02

looks like. And you just have to

14:04

come to terms and understand that. But

14:06

there's a media literacy that needs to be happening.

14:09

And I think it needs to be happening at the lowest

14:11

levels,

14:11

the earliest ages possible. I

14:14

suppose the part about that

14:16

I find maybe not

14:18

challenging, but difficult to navigate is at the moment

14:21

social media has become such a, it's

14:23

a powerful activist tool as well. Honestly,

14:26

Daniel, what is the impact on us as humans? Really,

14:29

really bad, I would think, because you kind

14:31

of have a, like you said, you look

14:34

askance, like, at any information

14:36

that you feel like could be designed

14:38

to target your heartstrings or manipulate

14:41

you in some way, you kind of look at it and think like, well,

14:43

maybe that is the case. I mean, one of the

14:45

issues I had over the weekend was I saw Ben

14:48

Shapiro was trending, who's a very, very famous

14:50

conservative pundit. And I

14:52

thought like, Oh, what's going on here? What did he say now? And

14:54

he had shared a photo that I will not describe

14:57

because it's really, really intense, but to

14:59

put it mildly, it was of a dead person,

15:01

let's say. And then he was trending

15:03

because it turned out that that photo was apparently

15:06

generated by AI. And so I thought, all

15:08

right, well, that's interesting because it had like 50 million

15:11

impressions kind of thing. Um, and I thought

15:13

like that might be the most shared AI image maybe

15:15

ever. And then it turned out that the people

15:18

saying that it was AI were themselves

15:20

trolling: they had made fake, fake images

15:22

of it, of it being detected in an AI detector, like,

15:25

is this a real image or AI? And

15:27

so the images actually came from, I

15:29

think, America's security department. But so

15:32

they were real. That happened over a series of

15:34

days. Like the, the narrative was like, this was fake.

15:36

Actually, no, it's not fake. Oh, it's from the government. And it

15:38

took me, a digitally literate person, like,

15:41

after days of it happening, five or 10 minutes of, like,

15:43

explicitly looking at it to find

15:45

the veracity. And we're talking about a very

15:48

famous person. So, and

15:49

that's the thing, it took you, a digitally

15:51

literate person, days to

15:53

figure it out. What hope

15:56

do we have for the average person on the

15:58

street who is just, who's just, ah, the average

16:00

person living their lives consuming media,

16:02

the average person, right? I think

16:05

your question to, to Daniel a few minutes ago, what

16:07

does this mean for humans? We're

16:09

in a really bad place.

16:11

I think that this is, again...

16:13

social media is what I do, but I think

16:15

social media is really bad and really

16:17

detrimental

16:18

to,

16:19

to humans at the moment. I think that

16:21

it has pushed us to a point where...

16:23

we are not meant to be this connected, we are not

16:26

meant to be this eternally online, and

16:28

we have lost that ability to communicate

16:30

with our friends. We've lost... that's, that's...

16:32

I think back to the Arab Spring, when Twitter

16:35

was so incredible a few years ago, and

16:37

what it did for people. We

16:38

are so far past that.

16:40

It's something I've struggled with over the last couple of weeks,

16:43

and what I have noticed in the last seven

16:46

days is that I am accidentally

16:49

encountering really

16:51

ferocious, visually

16:54

full-on violence and

16:58

images. And I guess

17:00

I have a conflict within me, because part of me is like,

17:02

we can't turn away, there is something awful

17:04

happening right now, and

17:06

as awful as it is, we should bear witness to the moment.

17:09

And part of me is like, we shouldn't be turning

17:11

away from this, we shouldn't be looking at a sanitized

17:13

version of it. But then there's the part of me going, every

17:15

time I open up any of the platforms...

17:18

it's as if, before I open up any

17:21

platform, I'm worried about what

17:23

I'm about to see, and I've kind of navigated, I've

17:25

come to navigate them in a really different way. And I do

17:27

wonder if it is changing me.

17:29

I wonder if it's desensitizing me, but I also wonder if it's changing

17:32

the way I approach those services now. The

17:34

thing that in some instances makes

17:37

Twitter good is how you can

17:39

track the minute, moment-to-moment happenings.

17:42

But I think when something is a flashpoint

17:44

for all the feelings that this

17:46

conflict brings

17:48

up, ah, I think distance

17:51

is probably a better thing, and that's what I'm

17:53

doing. I think that noise-to-

17:55

signal ratio of moment-to-moment in something

17:57

like this is just, like, so out of proportion.

18:00

I think, you know, waiting a day or two or three

18:02

to kind of see the things

18:05

that have been verified come to the surface is

18:07

probably a better approach. And do you worry that

18:09

you... No, I agree with you, but then the other

18:11

concern I have with that is, am I putting my head in the sand

18:13

as this horrific humanitarian crisis

18:16

is unfolding? Is that the right thing to do

18:18

as well? Because that would be the flip side of that.

18:20

No, I don't think you're putting your head in the sand because you're acknowledging

18:22

it. You're aware that it is. It's just you're

18:25

making sure that you're only consuming the verified

18:27

news. We were not meant to be

18:30

connected and plugged in like this.

18:32

We were not meant to have video

18:35

footage of this stuff put in our faces 24 hours

18:37

a day. That is not... I

18:40

mean, I'm not a psychiatrist, right? I'm not... So

18:42

I don't know, but I would assume that that is not

18:44

good for our brains and that is not good for us as humans

18:46

to have that. So you're making the

18:49

choice for your own personal health to distance

18:51

yourself from that. You're not putting your head in the sand. You're

18:54

still getting the information. You're just waiting

18:56

until it's verified information and you're distancing

18:58

yourself from it. One of the things that we saw, I think, in

19:00

the last week is schools,

19:03

not just in Australia, but actually around the world started to warn

19:05

parents about kids and what

19:07

they can see in the coming days. I

19:10

know in the EU and other places, there's

19:13

actually been more of a push to the social media companies

19:15

going, hey, this isn't actually... This is a thing that you

19:17

guys need to manage as well. For

19:20

lack of a better term, is there a smoking

19:22

gun here or is it going to come down to a combination

19:24

of us managing our own

19:27

digital hygiene whilst also expecting

19:29

more from our tech companies, Daniel? I think

19:32

unfortunately, TBC on that. But we'll actually

19:34

maybe see progress on that question

19:37

soon, because the EU, as part of the Digital

19:39

Services Act legislation I mentioned before,

19:42

is investigating Elon Musk, basically to say, have

19:45

you done enough to rid your platform of this

19:47

illegal content, deeming

19:50

the misinformation illegal? I

19:52

suspect that they will pursue a

19:55

similar thing in terms of violent content

19:58

which young people can see. I

20:00

think there is actually movement on that. I know it's something we've

20:02

talked about forever, and it's always been like, well, maybe

20:04

some government somewhere will do something about it, or maybe

20:06

some tech company will do something about it. Neither of which

20:08

has happened. But I think we are actually beginning to get movement

20:10

on that.

20:11

Meg, what do you think? I think it'll come from both sides.

20:13

I

20:13

think this conflict, or whichever conflict,

20:16

but I think what's currently happening now might

20:18

be an impetus to push it a little

20:20

bit more, mixed with the

20:22

way that Elon is handling Twitter, might

20:25

be an impetus for some of the other tech companies

20:27

and people just to sort of come together and

20:29

go, we actually, we need to

20:31

do something about this. Twitter is

20:35

off the rails, and we

20:37

don't want our platforms to be like

20:39

that. As TikTok, as Meta, as whatever,

20:42

we actually, we need to do something, and

20:44

the consumers are going, yep, we

20:47

need to fix this. I think,

20:50

I mean, you'll always be fighting disinformation, I think,

20:53

until we can come back with a proper verification.

20:55

I mean, I don't

20:57

have a solution for it. I think a lot of it does

21:00

rely on us as humans to measure

21:02

our own consumption,

21:04

but there is pressure on the companies like maybe

21:07

there hasn't been before. Download This Show is

21:09

what you're listening to. It is your guide to the week in media,

21:11

technology, and culture. Mark Fennell

21:13

is my name, and I want you to visualize

21:15

yourself. You're in your lounge room, you're sitting down,

21:17

you turn on the screen, you grab a controller,

21:20

and you're destroying the planet,

21:22

and apparently, you're

21:24

making the Earth worse for everyone, maybe.

21:27

There's been a new article, it's actually on the ABC science

21:30

page, about whether or not video

21:32

gaming produces more carbon emissions

21:34

than other kinds of interaction with technology.

21:37

Walk me through it, Daniel. Yeah, so firstly, it is

21:39

your fault. Note taken. But

21:42

it isn't, because I haven't played a game since, oh wait, I

21:44

did Mario Kart with my kids on the weekend. Dammit! Yeah.

21:47

So there's actually an Australian researcher that

21:49

tracks how much carbon

21:51

the global gaming industry produces every

21:54

year. And so the tech

21:56

industry is about 80 million tons of

21:58

carbon emitted into the atmosphere,

22:01

and gaming is about 15 million of that. But

22:03

the paradox is, well not the paradox, the

22:05

catch is, you probably

22:07

burn... the usage, as

22:09

in, you burn more carbon playing the game

22:11

than you would using your iPhone.

22:14

But the reason why the tech sector makes

22:16

more carbon is because of the manufacturing

22:18

of the tech products you buy and also the

22:20

servers that run big tech, that

22:23

produces more carbon emissions in the

22:25

air. I think what you've done is you've highlighted

22:28

there's a number of paradoxes at play here.

22:30

So it's a question of how do you

22:32

measure the environmental impact of technology?

22:34

Do you measure it in the act itself? Because

22:37

for years we've talked about the fact that, for example,

22:39

mining cryptocurrencies. The number

22:42

one thing I can guarantee you as soon as somebody

22:44

brings up a cryptocurrency, firstly I walk

22:46

away. Secondly, there'll be a person there

22:48

going, well it's very bad for the environment and why didn't

22:50

you talk about that? Every time. And

22:52

if you're the person prepared to write me that email right now, don't,

22:55

I'm across it. But there's a debate around

22:57

certain kinds of technology about the environmental impact.

23:00

But then we sort of transfer it every

23:02

time we discuss it, right? So as you say, gaming,

23:04

yes, because of presumably the amount

23:07

of computing power required, it probably burns

23:09

more energy. But then you compare

23:11

it to the manufacturing of these old things, he says

23:13

as he waves around a mobile phone, forgetting that it's not TV, they

23:16

produce a different kind of environmental impact.

23:18

I guess the thing I'm curious about is

23:21

do they compare?

23:22

Everything we do, everything we do is bad

23:24

for the environment. Everything we

23:26

consume. Can you put that on a t-shirt? This

23:29

was made in an unethical factory, don't get me

23:31

started. Everything we touch electronically,

23:33

everything we consume is bad

23:35

for the environment in one way or the other.

23:38

I think we just have to measure how

23:40

bad it is, pick your battles,

23:42

right? Like is... Surely

23:45

choose your fighters is the best analogy there. Can

23:48

you tell I'm not a gamer? But

23:51

the thing is with the usage, it's kind of unfair

23:53

to be like, you, the person playing that game, are

23:55

at fault, because if the grid was

23:58

green, then you wouldn't necessarily have it.

25:25

So

26:00

if you have that pressure from the

26:02

market, then mixed with the transparency,

26:05

then you have an incentive for these companies to

26:07

develop more efficiently. I will say that

26:10

they have got a metric here. If you want, you can

26:12

check out this article. It is on the ABC News site. What

26:15

they do is they measure the total carbon emissions

26:17

per year per employee. That

26:20

does take into account that some companies are big

26:22

and some companies are small. It is noticeable that

26:25

Ubisoft, one of the biggest

26:27

makers of AAA games, who are famous for, Daniel?

26:29

Assassin's Creed, Far Cry, Tom

26:32

Clancy games. Thank you for doing my job for

26:34

me. They rank really low.

26:37

Their total carbon emissions are quite low, compared

26:39

to something like a Sony or an

26:41

Apple or a Nintendo that is at

26:43

the other end. Nintendo of course being famous for? Mario,

26:47

Zelda, Donkey Kong. God, I am

26:49

surplus to requirement. Anyway, it is an interesting

26:51

comparison nonetheless. Well, can

26:53

I just say for that one, it is hard because

26:55

Nintendo and Sony make consoles and

26:57

Ubisoft doesn't, which is probably why they

27:00

are higher. Microsoft is sort

27:02

of ranking somewhere in the middle of that ranking and they do. They

27:05

do? They do. They make consoles. But

27:07

I think that, like I said before, they have... Well, is

27:09

that net? Because

27:13

total carbon emissions per year per employee. That

27:15

is very interesting. I retract everything I

27:17

said. I apologize. I don't think I need to retract.

27:21

I wasn't here. It's

27:23

an official apology going from Daniel Van Boom.

27:27

And with that, we are out of time. Huge

27:30

thank you to our guests in person this week.

27:32

Meg Coffey, Digital Strategist extraordinaire. Thank you

27:34

for joining us on the show. Thank you so much for having

27:36

me. And Daniel Van Boom, Technology Correspondent

27:39

at Capital Brief. Thank you so

27:41

much. Thank you for having me. If you enjoyed this show,

27:43

please

27:44

leave a review on whichever one of

27:46

those podcasty

27:47

apps you use. Maybe you're

27:49

a Pocket Cast person. Maybe you're an Apple person.

27:51

You do you. I'll leave a review. And

27:53

with that, I shall leave you. My name is Mark Fennell. My voice

27:55

was very high pitched then. Catch you next week for

27:57

another episode of Download This Show.

28:27

You've been listening to an ABC

28:29

podcast.

28:29

Discover more great ABC podcasts,

28:32

live radio and exclusives on

28:34

the ABC Listen app.
