Protecting people over tech platforms with Anne Ikiara of Digital Action [Ep. 26]

Released Friday, 1st September 2023

Episode Transcript

0:10

Laura May: Hello and welcome to the Conflict Tipping podcast from Mediate.com,

0:15

the podcast that explores social conflict and what we can do about it.

0:19

I'm your host, Laura May, and today I have with me Anne Ikiara.

0:24

She's the executive director of the nonprofit Digital Action and has

0:28

a wealth of experience directing and working with social enterprises

0:32

in Global Majority countries. She speaks six languages and has the entrancing LinkedIn tagline

0:38

of "author, poet, speaker, gender consultant, and social advocate".

0:43

So I'm excited to dig into all of those identities.

0:46

Welcome, Anne. Anne Ikiara: Thank you, thank you. Hi Laura, thanks so much for having me.

0:52

Laura May: No, I'm so excited to have you here because you know, the, the

0:55

work you've been doing with Digital Action these last few months since

0:58

you've started has already been so interesting and fascinating to me as

1:02

someone who stalks you on social media. So I'm really glad to have you here with me today to talk about it and

1:07

learn a bit about you and learn a bit about the organization as well.

1:11

So, I understand that Digital Action protects democracy and

1:16

human rights from digital threats. But before we dig into that, I actually wanna know about you.

1:22

So what led you there?

1:24

What sort of piques your interest in this kind of work?

1:29

Anne Ikiara: Thank you. Thank you, Laura. I have lived experience of the effects of disinformation, misinformation,

1:36

hate speech that is propagated online.

1:40

In 2007 we had elections in Kenya and, owing to disinformation,

1:48

misinformation and hate speech, we had post-election violence.

1:52

At that time I was running a small national organization

1:57

called Nairobits, and as you may anticipate, the worst of the violence

2:02

was in the informal settlements. So I came face to face with young people whose livelihoods had

2:09

been destroyed, their houses had been burnt, they had lost relatives.

2:14

And even some of them had physical injuries.

2:18

And I remember one day a young person coming to me and telling

2:22

me that their home had been burnt. As a result of that violence, more than a thousand people lost

2:28

their lives and more than two hundred others were displaced.

2:33

So I understand from a lived experience perspective what this

2:38

could mean at the personal level.

2:40

That is why when I saw the role at Digital Action, I got very interested because

2:46

I wanted to have this kind of impact globally and contribute to elections

2:54

and protect democracy from threats.

2:58

Laura May: Absolutely. And so, I mean, I know very little about Kenyan politics, and I have a hunch that

3:03

maybe quite a few of the listeners don't know much about Kenyan politics either.

3:07

So can you give me just a little bit more information about what this

3:11

disinformation and misinformation was?

3:14

Like what actually led to these outbreaks of violence and displacement

3:19

in Kenya around the elections? Anne Ikiara: Well, actually it is something that

3:23

perpetually happens in Kenya. It happened in 2007, and again in 2012.

3:30

And even last year, in 2022, it happened.

3:36

What we have in Kenya is ethnicity.

3:39

There are different tribes in Kenya, and much of our politics follows

3:43

ethnic lines, so it's very easy to have disinformation and hate

3:48

speech especially now with the digital media along those lines.

3:54

So that's exactly what happened in 2007, where we actually used phone SMSes

4:01

to send hate messages and disinformation against other communities, which

4:07

is how the violence happened.

4:10

And in the same way, it has now become even worse, because Kenyans have embraced

4:16

digital media and social media more than most other African countries.

4:22

So it has escalated. Disinformation and misinformation have always been there in the context of

4:27

elections. But now they are very easy to spread because of the tools

4:32

that we have in social media.

4:35

So that's what happened in 2007, 2008.

4:39

And that is how the violence happened, because the information

4:44

pitted communities against each other. And then it went to offline violence where we physically fought each other.

4:53

Laura May: Awful. Yeah, and thank you so much for shedding light on that and I mean,

4:57

it does sound really difficult, because along those ethnic lines

5:00

I guess there's such a visible cleavage for people to use and

5:05

to exploit for their own political ends.

5:07

So it sounds really, really difficult.

5:10

So tell me then about Digital Action.

5:14

What does the organization actually do? Anne Ikiara: So Digital Action is a small but mighty organization.

5:19

It started in 2019 to protect democracy from digital threats.

5:26

We are a fiscally sponsored organization that is funded by the SCO Foundation,

5:31

Luminate, the MacArthur Foundation, Open Society, and the Ford Foundation.

5:37

And our work really is to hold tech companies to account, to protect

5:42

democracy from the threats that are propagated on their platforms.

5:47

So big tech companies such as Meta, Twitter, and YouTube have under-invested

5:53

in the Global Majority countries.

5:55

So much of their investment in protecting citizens is spent in the Global Minority.

6:03

But then you and I know that much of the harm happens in

6:07

the Global Majority countries.

6:10

So Digital Action is trying to hold tech companies to account: to invest as much

6:19

in protecting the Global Majority as they do in protecting the Global Minority.

6:26

Laura May: Absolutely. And for those listeners who haven't encountered these phrases before, Global

6:30

Minority and Global Majority, it aligns more or less with sort of this idea

6:35

of West and non-West, or Global North and Global South, but stresses that in

6:40

fact what had previously been described as the Global South is the majority

6:44

of the population, the majority of countries, the majority of land area, and yet

6:48

is not getting the majority of resources. And so, yeah, for those who are listening, that's what we're talking about.

6:54

I understand that in the EU, for instance, there's a lot

6:58

of talk about the Digital Services Act and things like that, which will

7:01

help, as I understand it, to regulate some of these social media platforms.

7:07

Are there similar initiatives and legislation underway in Africa?

7:12

I mean, I guess I wanna know, like this divide in resources, is it related to

7:18

local legislation or is it related to the biases of the tech companies, or is it

7:23

related to something else, do you think? Anne Ikiara: It is related to the biases of tech companies.

7:28

You understand that tech is very versatile.

7:31

It usually moves faster than regulation in specific countries.

7:35

And it's a very complex legal situation because most of the servers, of course,

7:40

are not based in the Global Majority.

7:43

They're in the Global Minority. So it's very easy for a big tech company to sidestep local regulation.

7:51

So what Digital Action is trying to do is to hold them to account:

7:56

to provide safeguards based not on the level of resources that they

8:01

get or the business model, but on the level of harm that could happen.

8:07

So if in Kenya, for example, or in Brazil or in any other country, the

8:13

level of harm is huge, then they should invest more in that context.

8:18

As much as they invest in the US, where they get most of their business from.

8:23

So that is what Digital Action is trying to do.

8:26

Because right now, the model follows the money.

8:31

Where they get advertisements and where their revenue is

8:34

coming from is where they invest, ignoring the Global Majority, where,

8:40

of course, much of their platforms have been taken up by citizens.

8:46

And the effect is even worse for obvious reasons because of lack of resources

8:51

to mitigate some of the challenges that are occasioned by that situation.

8:57

Laura May: No, it absolutely makes sense. And something I was really struck by when I was reading Chris Wylie's

9:03

book about Cambridge Analytica, as one of the whistleblowers, is that he

9:07

talked about how this organization had started off experimenting in Africa

9:13

and trying to influence elections there and trying to like stir up different

9:17

types of partisan violence there. And so it was almost like this testing ground, I suppose, for this

9:22

Global Minority based organization. And the consequences have just gone untalked about, right, because

9:28

we, I mean we heard about Brexit, we heard about like, you know, obviously

9:31

Trump's election in the US as well. Like, oh yeah, this is all because of media manipulation, whatever.

9:36

But what we don't hear about is the harm in Global Majority

9:40

countries, as you've just flagged. Anne Ikiara: That is what Digital Action is trying to amplify.

9:48

Because we are working with partners. We are a frontier organization.

9:52

We don't necessarily do the work ourselves, but we like to front

9:57

organizations in the Global Majority that are doing different things

10:02

to make that environment safe. So there are different people doing different things.

10:08

There are researchers, there are civic educators.

10:11

There are other people working at the intersection of policy and regulation.

10:17

But Digital Action is the convener. And at the moment we have more than a hundred and forty

10:21

organizations across the world.

10:24

And we are having a campaign to make 2024, the Year of Democracy, safe for elections.

10:33

And in 2024, over 65 countries are having elections.

10:38

And that is the first time in a century that so many people will be

10:44

having elections and also the level of threats then is heightened.

10:48

Because if there is no regulation and if there are no safeguards

10:53

in that space, then you can see the level of harm in 2024.

10:58

So we are having a campaign that is being launched on the 15th of September.

11:03

And we are calling it "protect people and elections, not big

11:07

tech", and there are organizations in the space, partnering with us to

11:13

really make sure that the campaign is very strong and that the big tech

11:19

companies listen and pay attention to some of the asks that we have.

11:24

Laura May: It actually sounds really scary. 'cause I mean, you've just highlighted that misinformation, disinformation,

11:29

hate speech had this profound and in fact physical effect in Kenya.

11:35

And yet now we're talking about 65 different countries which are gonna

11:39

have elections, which could be affected in similar ways and by similar means.

11:43

It sounds like we could be hearing about violence, about co-optation

11:48

of democracy in countries. Like it's, it's quite scary what you're talking about.

11:53

Anne Ikiara: Yes, it is. And that is why our campaign, the Global Coalition for Tech Justice, is convening

12:01

to really protect people and not the big tech companies, and to call the Metas,

12:06

the YouTubes, and the Googles to account.

12:10

To protect, to mitigate that situation in much the same way as they would

12:15

mitigate in the Global Minority, to make sure we are all safe in 2024.

12:21

It's really a big test, and it's also a big opportunity for them to show concern and

12:28

responsibility in protecting democracy.

12:32

Laura May: And so when you talk about protecting people and not big tech, and

12:35

you've mentioned safeguards a few times. What are the asks?

12:39

What are the safeguards? What could actually protect us?

12:42

Anne Ikiara: Okay, what could protect us? Some of their policies are aligned to the West, you know, they are

12:48

specific to the English-speaking context especially. But in other countries, like

12:54

in Kenya, for example: you just said at the beginning that I speak six languages.

12:59

I could write in any of those languages, you know, hate speech on Twitter, and

13:04

it'll not be flagged unless they have found somebody or they have context

13:09

specific safeguards so that content moderators really understand that

13:14

language and the challenges that are specific to the Kenyan context.

13:19

So, one ask is for them to make sure that content moderation and

13:24

safeguards are context specific.

13:28

And then the other is they should be transparent.

13:33

Because right now we really don't know what safeguards are in place.

13:37

We don't know how much money is being spent, where; we really don't know.

13:43

So we ask them to be transparent. You know, we are using this amount of money in the US, for example, and we

13:50

are using this amount of money to protect Kenyans as well, for example.

13:55

So they should be transparent and the resources should match the level of

14:00

harm anticipated, and not the revenue.

14:04

That is another one. And then they should also operate throughout the election period.

14:09

Because like you saw in the, in the US and like I've given you the example

14:14

of Kenya, they let their guard down as soon as the election happened.

14:20

And then we are talking about post-election violence.

14:24

So they stop moderating. So they should put in measures before, during, and after the elections.

14:33

They should offer a comprehensive range of tools and measures, adapting to the local

14:38

context that we have talked about. And they should also involve governments.

14:45

Not in the way of buying them off so that they're silent about the

14:49

harm, but partnering with them to make sure that the elections are safe.

14:54

And not just governments but also election bodies, civil society.

14:58

They should partner with us, because we are on the ground and we can point out

15:03

areas of concern that they can invest in.

15:06

So in a nutshell, those are some of our asks.

15:09

Laura May: Yeah, I have so many questions about the asks.

15:13

The first one that comes to mind is you mentioned that they need to put resources

15:18

into protection, not just during election campaigns, but also afterwards, because

15:24

as you mentioned, there can be post-election violence.

15:26

And something that strikes me, 'cause you know, before we started having this

15:31

call, before we started recording, we were talking a lot about gender and racism.

15:34

And so I guess when I think about this, I think, oh well, yeah,

15:37

post-election violence is bad, domestic violence is also bad.

15:42

And so maybe they should have these safeguards and this moderation always.

15:48

Like, why not dedicate resources to protecting people, not just in the context

15:53

of elections, but to protecting people from misogyny online or racism online,

15:58

which also lead to violence, right? Anne Ikiara: Exactly.

16:02

Misinformation, disinformation is a very wide subject and covers

16:07

different kinds of concerns. And this is just one of them, but that is what we focus on.

16:13

But even in the context of elections, it's not gender blind.

16:17

Women candidates, even women election officials, have been targeted with hate

16:24

speech that really removes their agency and integrity as election officials,

16:32

and some of that has moved from online abuse to physical harm.

16:39

Because really the way they're portrayed in media, in social media,

16:45

can sometimes expose them to harm.

16:48

And it has happened in several places where women have really

16:53

been targeted and sometimes even physically harmed.

16:57

Some of the harm has even extended to their families.

16:59

So it is not gender blind.

17:01

It's a very gendered concept. Mm.

17:04

Yes. So that is also something that should concern them, and that is

17:08

why it should be context specific. Because in the context of Africa, for example, and other

17:14

Global Majority countries, women are just now getting into elected positions.

17:22

Competing for positions in the electoral space.

17:25

And it's not yet a very accepted concept in some areas,

17:30

especially in Africa.

17:33

So, women are really targeted, candidates especially, and it's

17:39

gonna appear very interesting and very annoying because for the men

17:45

nobody talks about their private lives.

17:48

Laura May: What they're wearing? Anne Ikiara: Or what they are wearing.

17:52

But for women, somebody will talk about what they wear, who they are married

17:58

to, and how many children they have.

18:00

I don't know, who they ever dated. And there are all these things that are really not relevant to the electoral

18:06

position that they're looking for. So that should also be a concern.

18:10

But most of these things are context specific.

18:13

That is why we insist that they should enable accountability at the

18:17

level of the context. Laura May: This actually makes me really curious about Rwanda, of all

18:25

places, because as far as I'm aware, they're the only country in the world

18:29

that has a majority-female government.

18:32

And so I'm really curious, especially given the context of their history.

18:36

You know, it's what, 30 years nearly since the genocide.

18:39

I'm like, I wonder for myself like what hate speech looks like in

18:44

Rwanda nowadays around the electoral cycle and around the role of women.

18:48

And if it's somehow different. Very curious.

18:50

I mean, I don't know if you know this, I'm just like, ooh, that's so interesting.

18:55

Anne Ikiara: Rwanda is a very progressive country, and the

18:59

rule of law is followed.

19:02

I'm not very familiar with that context, but my estimation is that

19:06

there will always be subtle gender issues in this context,

19:11

but it may not be as pronounced in Rwanda as it is in other places.

19:15

Like, for example, compared to other African countries, Rwanda might

19:20

be a little bit ahead, but that doesn't mean it's completely absent.

19:25

It might be there, but it might be more subtle

19:28

than it is in other countries. Laura May: Hmm.

19:33

No, I would be really curious because yeah, when I think about the Australian

19:38

context, and obviously we had Julia Gillard as a woman Prime Minister,

19:42

and she was just shredded in the media for, yeah, what she was wearing and

19:47

her, inverted commas, "lifestyle choices" and all of these other things.

19:50

And it was brutal. You know, the sheer misogyny she faced on a day-in, day-out basis.

19:56

And yet, people think about Australia as this, you know, developed country,

20:00

it should be progressive, right? Like women get into government so it can't be sexist.

20:04

I was like, well, I've got news for you, buddy. Like, that's not how it works.

20:08

That's not how it works. Oh my goodness.

20:11

I'm gonna leave that, that alone.

20:14

There's actually something else I wanted to talk about, which is also

20:16

difficult to measure because you referred to this idea of aligning

20:22

funding and resources to the level of harm done on social media platforms.

20:28

So how do we measure levels of harm and particularly potential harm for

20:33

an election that hasn't happened yet?

20:37

Anne Ikiara: Yeah, good question. That is difficult.

20:41

And one of the things that we are struggling with is that there's

20:44

no baseline data, but given the past elections, and given

20:53

the heat before elections, it is possible to anticipate that indeed

20:58

we need to invest heavily here. If I give the example of Kenya, because that is where I come from,

21:05

elections are usually hotly contested, and it's very clear who the proponents are.

21:13

So it is possible to estimate that people will be posting comments in their

21:19

native languages or probably sometimes in Kiswahili, and I think the level of

21:26

investment should follow that trajectory.

21:31

And it should be properly monitored so that as it escalates, then also

21:36

the level of protection follows. Because once you have a service that is potentially dangerous, then I think you

21:45

have also the responsibility of mitigating that risk, however big it might be.

21:53

Yeah, so it's a grey area, admittedly, but big tech companies should

21:58

have the resources to do their own research and be able to anticipate the

22:03

level of investment that is required on their platforms to mitigate harm.

22:09

And I don't think it is impossible, because many of the factors are known

22:13

long before the election takes place. And if they're willing to have partnerships with local civil

22:19

society organizations that are invested in the local context, and

22:24

governments and electoral bodies, then it should be possible to really understand and single out

22:30

the factors that constitute risk, so that they're better able to mitigate,

22:35

long before the harm happens. Laura May: It's beautifully put.

22:40

I mean, yeah, you have the concrete need for local language moderators,

22:43

but as you've just highlighted yourself as well, people already on

22:47

the ground in civil society already know what the danger zones are.

22:52

They already know if something's gonna blow up.

22:54

And yeah, by partnering with those organizations, social media platforms

22:58

can say, "oh, we do actually need to to allocate some resources here, we do need

23:02

to do a better risk assessment here." Like, it absolutely makes sense.

23:06

Anne Ikiara: Yes. Exactly. Yeah. Risk assessment should happen in every context, and that's actually one

23:14

of our requirements, one of our asks.

23:18

For it to be context specific, risk assessment must take

23:23

place in that particular context.

23:27

Laura May: So, tell me a bit more then about this campaign you're launching.

23:30

How do people get involved? Like how does the campaign work?

23:34

Anne Ikiara: Okay. The way it works is that over the past year, we have been researching and

23:40

trying to find out the best method to coordinate and cooperate with people.

23:46

And it's been a very consultative process in which we have talked

23:50

to different people globally. So in June we launched our website, on which different people and organizations,

24:01

both individuals and organizations across the globe, could sign on and

24:07

agree to our regulations, because for any organization, any

24:13

coalition, there has to be something that is bringing you together.

24:17

So they needed to agree to our campaign asks.

24:20

Since then, 140 organizations and individuals have signed on.

24:26

We call it the Global Coalition for Tech Justice.

24:29

That's the name that we have given it. And then together we are having different activities.

24:36

We are having the official launch on September 15, the International

24:41

Day of Democracy. That's when we are having the launch.

24:45

After that, then there'll be different activities to highlight what we are

24:49

doing, by different organizations that have already decided to partner with us.

24:56

And of course tied to that is what I talked about earlier, writing to

25:01

big tech companies specifically to ask them to make the environment

25:07

safe, equitably across the board.

25:11

And different organizations will have different activities.

25:14

Even individuals will have different activities, all to

25:17

create a lot of visibility around the issue of digital harm.

25:23

And we shall monitor different elections that are happening in 2024, and make

25:30

sure that we understand the level of harm and the safeguards that are being

25:35

put in place, so that in 2025 we'll have some data to

25:42

hold big tech companies to account, and to ask for policy direction, based on the hard data that we will have

25:52

collected from monitoring the elections.

25:55

Laura May: Amazing. Anne Ikiara: Yeah. Laura May: And a huge project, a huge campaign.

25:58

Anne Ikiara: Yes. Yes. And you'll be surprised, the campaign is run by four people within Digital Action.

26:06

Our team is small, but we have the bigger network to front our case.

26:11

Laura May: Amazing. I love that. Anne Ikiara: Yes.

26:14

And so, something else I'm curious about is the sheer confusion of the social

26:21

media landscape at this point in time. Obviously we've seen Twitter become X, with some horrible-looking branding,

26:29

and it's obviously sort of falling apart. You know, people talking about the dark days of Twitter, the fall of Twitter.

26:34

We've seen similar things happen with Reddit in terms of, there was

26:37

a lot of fuss about the APIs being cut off and apps no longer working.

26:42

We've seen migrations to Mastodon servers, to Lemmy, to kbin; we've seen as well the

26:50

launch of Threads, which, after the first three days, I heard nothing about.

26:54

Who can we even talk to in this environment?

26:57

Like who, who are the people? What are the correct platforms?

27:01

This sounds like such a confusing, huge puzzle.

27:05

Anne Ikiara: Digital Action has written letters to specific people that are responsible

27:11

for exactly what you have described.

27:14

Through our own networks, we have identified people who are responsible for

27:20

making the platform safe at Twitter, at Google, at YouTube, and at TikTok, and

27:30

we have written specifically to them, and actually the deadline for them to

27:34

respond to us is the 4th of September, in anticipation of our launch on the 15th.

27:41

It is not as faceless as it might look, because there are

27:44

people running those offices. There are people who report to that office every day, and their

27:48

task is to make the platform safe.

27:52

So we have written specifically to those people to make sure that they

27:57

tell us exactly what they're going to do for the more than 65 countries

28:02

that are having elections in 2024.

28:06

Laura May: And what does that look like for the decentralized platforms?

28:10

Like Mastodon for instance, like Lemmy and kbin, where there's not one person

28:15

in control or one company in control.

28:18

For example, for Mastodon, I'm on a server for social scientists.

28:23

Which is managed by social scientists and you have to like be

28:26

a social scientist to be accepted. But I mean, there's heaps and heaps of different servers and

28:31

they all have their own rules. I mean, Truth Social, like Trump's network, is a Mastodon server.

28:37

And I mean, I'm assuming you can't really write to that server, or whoever's

28:41

running it, and say, hey, would you mind just not using hate speech?

28:44

Is that cool with you? Like, so how do you deal with this decentralization issue?

28:51

Anne Ikiara: We have to find strategies to deal with it, because it's evolving.

28:55

It's an evolving threat. Every day there is something different.

28:59

So it's something that we should anticipate.

29:02

As we continue, because there are evolving threats every day.

29:06

That's why this field is very challenging, and it's also very

29:10

exciting, because you see different things every day which you might either

29:16

anticipate or respond to as they evolve.

29:20

So that's another new challenge. Laura May: Yeah, it sounds really difficult in the era of decentralization

29:26

and sort of fragmentation of the social media landscape, at least with targets

29:31

like Meta and Google, you do have, as far as I'm aware, the majority

29:35

of the world's population on there. So they're pretty good targets for reducing harm in the interim.

29:41

Anne Ikiara: Yes. They also have a very, very wide reach.

29:44

And in Global Majority countries quite a sizeable chunk of the population

29:50

has access to those platforms.

29:52

So I mean, you put your resources where the harm is greatest and

29:58

where you can score big wins. Yeah, that's part of our thinking.

30:04

Laura May: Absolutely. Anne Ikiara: Yeah. Laura May: Okay.

30:07

There's actually something else I wanna ask you about, because

30:10

I've been so curious about your identity as an author and a poet

30:14

since I saw that on your LinkedIn. I love that that's your LinkedIn tagline.

30:18

I love that it's there, you know, you have this creative

30:20

component to your personality, you've got your soul on display.

30:24

So tell me, what kind of things do you write?

30:27

What is your poetry about? Anne Ikiara: My poetry is about justice and equity.

30:34

That's what I write about. You're not surprised, no?

30:38

Laura May: I'm not surprised at all. Anne Ikiara: I write about equity and human rights and democracy.

30:47

I'm a child, as you may assume, of parents that experienced colonialism.

30:54

So there's a bit of that in my poetry.

30:58

My book, actually, is a manuscript that is currently going

31:02

through editing. It is about my experiences in the nonprofit sector,

31:07

and the inequalities that exist in that space for people of color.

31:14

And the colonial aid system structures that follow the same

31:19

trajectory as colonialism did.

31:21

I also write about women's rights and gender issues.

31:27

That's my passion. Laura May: Incredible.

31:30

Absolutely incredible. I mean, to me it sounds like we would be better off having your

31:35

writing circulating on social media than disinformation for sure.

31:38

Anne Ikiara: Yes, yes.

31:41

Yes. I hope it'll circulate

31:44

soon. It usually happens that disinformation spreads faster than positive messages.

31:51

I think that's the way human beings are. Laura May: It's true.

31:55

I mean, we've got that negativity bias, right?

31:57

And threats are more immediate and more important and pressing than

32:01

things that make us feel good. Absolutely.

32:04

Yes. So tell me, if you had a magic wand and you could use it to change one

32:10

thing about the digital landscape.

32:14

What would you do with your magic wand? If you could do any one thing?

32:18

Anne Ikiara: I would make all platforms safe for the 2 billion people that

32:23

are going to have elections in 2024.

32:26

I would just wave my magic wand and 2 billion people would be safe.

32:31

There'll be no disinformation, no misinformation, no hate speech,

32:35

so democracy would thrive.

32:38

People would have their agency, because disinformation robs people of their agency

32:43

because these spaces target messages at you that skew your thinking, and that's of

32:49

course taking your agency away from you.

32:52

So citizens in those countries would have their agency, would have the best

32:56

leadership, would have the best democracy.

32:59

There would not be any hate speech. There would be serenity in all of the world, and we would

33:05

interact on those platforms to, you know, share messages of hope and peace and progression,

33:12

not to hate on each other and make life difficult for

33:17

each other, but rather to progress. And we'd discuss things that take us forward,

33:25

rather than things that divide us. Laura May: I love that answer.

33:30

You know, sometimes, sometimes when I ask people a question like this,

33:33

they'll be like, hmm, I would change this program to be something different.

33:36

And you're like, no, no. With my magic wand, I'm gonna cause world peace in the next year.

33:43

I love that. I love that. And why not, right?

33:45

I mean, that's what you're doing with Digital Action.

33:47

That's, that's the whole end goal, so good on you.

33:51

And on the more personal level, do you have any recommendations for

33:56

us as individuals and as listeners?

33:58

Like what should we do if we think something is disinformation?

34:03

Anne Ikiara: I would ask citizens, private citizens, not to spread misinformation,

34:08

disinformation, hate speech. Verify information before you pass it on, because the platforms are

34:16

powerless without us participating in spreading

34:20

disinformation and hate speech. So don't spread hate speech.

34:24

Verify the information before you spread it, and instead of

34:29

spreading disinformation, spread the right information that brings

34:34

peace and democracy and promotes human rights around the world,

34:39

both in the Global Minority and the Global Majority.

34:44

All people are the same. We are all human, and I think that's the way we should see ourselves.

34:52

Laura May: Amazing. So look Anne, thank you so much for joining me today.

34:55

For those who are interested in learning more about your work, whether

34:58

as a poet or on behalf of Digital Action, where can they find you?

35:04

Anne Ikiara: Okay, our work is on www.digitalaction.co.

35:10

That's where you can find our campaign materials.

35:13

You can find our asks, and you can find our coalition partners.

35:18

Thank you so much for having me. I really appreciate the opportunity to talk about Digital Action and the Global

35:26

Coalition for Tech Justice and how people can partner with us to protect people

35:32

and elections and not big tech companies.

35:35

Laura May: Absolutely. Thank you so much again Anne, and for everyone else, until next time,

35:40

this is Laura May with the Conflict Tipping Podcast from Mediate.com.
