Tech, Privacy, and Liberty: Unpacking the Future with Sean Patrick Tario

Tech, Privacy, and Liberty: Unpacking the Future with Sean Patrick Tario

Released Tuesday, 23rd April 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements may have changed.


0:17

Well, hello, faithful politics listeners. And if you're joining us on YouTube, our viewers, thanks for coming and being with us today. We're excited to welcome onto the show today Sean Patrick Theriault. Sean's an innovative entrepreneur and a passionate community builder. He founded Open Spectrum Incorporated, a consulting firm that offers crucial insights in the data industry, the data center industry, rather. He's also the driving force behind mark37.com, which curates products and services for a sovereign lifestyle, and a co-founder of Intelligence On Demand, a team dedicated to combating misinformation. Sean's work reflects his deep commitment to decentralization, privacy, and empowering communities through technology. Welcome to the show. Thanks so much for being with us here, Sean, and I hope I pronounced your last name correctly. It's... yeah.

1:21

Yep. But most of our last names are all bastardized over the years, right? So I mean, it used to be T-H-E-R-I-A-U-L-T, French, the French way of spelling it.

1:31

There you go.

1:33

I mean, mine, yeah, mine was Bertram, and it began as Buttram. So think about that one for a little bit, dude. That's how it was in the old English. So anyway, hey, man, thanks for joining us today. Tell us a little bit about yourself. I know I gave a brief bio there, but talk to us a little bit about yourself and what's gotten you to where you are today, where you're so interested in decentralization, privacy, and empowering communities through technology.

2:01

Yeah, so I was born and raised on the North Shore of Chicago. If you've seen the movie Home Alone, or Risky Business, that's basically where I grew up. I spent most of my teenage years there, then had an opportunity to go to California for university in the Bay Area during the peak of the boom in '98. So I was at Santa Clara University in Santa Clara, California, right in Silicon Valley, from '98 to '02, and then I effectively moved to the beach from there, coming from Chicago. I figured if I'm going to be living in California, I want to live on the beach. So I moved to Santa Cruz, California, which, if anyone has ever known or been around Santa Cruz, the hippiest of the hippies live in Santa Cruz. I think you guys are near North Carolina, right?

2:47

You're in Virginia. So not too far from Asheville.

2:50

Asheville is basically considered the Santa Cruz of the East Coast. So I spent a lot of time with some very free-thinking, open-minded, you know, crazy hippies, libertarians. In fact, I got very involved in politics around the Ron Paul campaign.

3:10

So that's really where I started digging into how the sausage is made in DC, even though I was an econ and poli-sci double major. But I've been following the money my whole life, and really trying to teach and educate people on what is money, what is currency, what has value. I was very grateful and thankful to have a father who started sitting me down at an early age to teach me these things, and made me read.

3:33

Have you guys ever heard of the book, or watched the series, The Creature from Jekyll Island? No? I highly recommend it. It's all about the history of money, the history of currency, the Federal Reserve Bank, and really just educating people on how our monetary system became what it is today. It's essential learning, I think, for all humans, let alone citizens. I think you guys would greatly appreciate digging into that.

3:58

But long story short, as a result of that, I've been called a conspiracy theorist my whole life. And so I've thankfully been proven correct on pretty much everything that has played out over the last couple of months and years.

4:12

We have the World Economic Forum. I don't know if you guys have heard of the World Economic Forum. So out in Davos, they're all openly speaking about globalization, openly speaking about the destruction of states, openly speaking about depopulation, all these things that I've been trying to warn about: central bank digital currencies, social credit scores. All of these things that they are now openly saying, "we want this, we need this, we're going to be working towards this," are what I've been trying to warn people about for decades, and now it's all starting to play out.

4:50

So for better or worse, it is what it is. But for me, I've been in the IT industry my whole life, and I've followed the money. I saw that all the money was coming from these same people that are in Davos; they are the ones funding the venture capital and the private equity firms that are funding the technology companies.

5:08

And so I realized what was going on, and started really realizing after 9/11. If you can imagine this, I actually read the Patriot Act. Very few people have actually read this document, but it is an absolutely terrifying document if you read it, because what it does is essentially give carte blanche to our military and to our government to spy on its own citizens under the guise of national security. So when this came out, I was warning all of my friends and family that this was going to be leveraged against the American people.

5:41

And they said, that'll never happen. This is just going to be used for the very specific use case of these terrorists in Iraq and Afghanistan and wherever, Syria, Russia. And I said, no, all this does is give legal cover for what our intelligence agencies have already been doing: basically stripping away our privacy through data collection. And I lived through that in Silicon Valley.

6:06

So if you went into a VC meeting or a private equity meeting and you didn't have a data story to your pitch, you were not getting funding. You had to in some way incorporate how you were going to be aggregating and pulling data from your customers in order to get funding, because they knew how valuable that data was going to be. So here we are in, you know, what is it, 2024, in a dynamic where the surveillance state is everywhere.

6:34

This is not a left or right, conservative or liberal topic. This is a simple reality. The device that you carry around in your pocket is listening to you all day, every day. Everyone knows it. This is not a "no, it's not" situation; no, it is. Everyone knows it is, because they talk about something and then they see ads for it. And they're like, how the heck did that happen? So the simple question is, how do you feel about that?

7:00

And literally, 90-plus percent of the people that I ask that question to, and again, it doesn't matter where on the political spectrum you are, they're not a fan. They don't like it, right? So the real question is, what can we do about it?

7:13

So I spend my time traveling the country, educating and training organizations and groups and individuals on the reality of all the different ways that big tech has gone about invading our lives, infiltrating our lives, making things so convenient that we're oblivious to exactly all the ways that they're sucking and pulling our data, and then walking people through: well, what can we do about it? How can we get these companies out of our way?

7:39

You know, that's really cool. I have a friend in Washington state who will love everything that you just said. He's an old army buddy of mine. And he was really reluctant to get a smartphone for years and years, you know. And anybody who's been part of the military will have a little clique of military friends, and there's always the one person that, you know, says some really weird stuff, and you're just like, okay, that's Johnny or whatever. But then you see all these things actually come to fruition, and you're like, man, maybe Johnny was right all along. He was way ahead of us.

8:24

I was thinking about this going into this interview: the amount of data that's available to Josh and me, just as a result of having a podcast, from our hosting provider to YouTube, is almost scary. Like, we're not smart enough to know what to do with it, to market in a way that makes sense, but big businesses are smart enough to know how to market and use that data. So how can one protect their data? Or is that even possible?

9:03

Yeah, they totally can. So it's just a matter of how much data you're putting out there. So think of it from this perspective: Facebook, Google, Microsoft, Amazon, Apple, all these big tech companies are constantly harvesting and collecting data.

9:59

So if you're using a Google operating system, an Apple operating system, a Microsoft Windows operating system, an Amazon operating system, if you're using Alexa, if you're using Siri, right, you're constantly feeding them data, and these devices are just data-harvesting machines.

10:16

That's what they're intended to do. That's why they want you to continue to buy newer and newer things, right? It's not because of the marginal, tiny little difference in the picture quality, right? It's because they've added more processing power, more RAM, more CPU, so they can harvest more data. They can process that data on the device that you're paying them for, before it gets shipped off into their cloud environment.

10:43

So we all have our own profiles that are sitting in this cloud that all these companies share, and there's a dollar amount that's tagged to your profile, each one of your profiles, based on your demographics, where you went to school, what your income bracket is, where you live, yada, yada, yada, right?

11:00

So the more data you feed into this Borg system, the more value is going to be attached to your profile, right? So from a marketing perspective, that makes perfect sense. It's brilliant, right? But here's where the scary part comes in.

11:16

It's when we start to realize that this is psychological warfare. Because if I know what someone's buying, I know where they're going and where they've been, I know their friends, I know how they communicate with their friends and the language that they use, I know what's triggering them, then I can start to shape a narrative that I want them to believe. So by pushing information or withholding information, you can start to shape the minds of the masses.

11:48

And this is what our government has been doing, not just overseas in other countries to manipulate elections and whatnot, but in our own country. So if I'm a global corporation, and I couldn't care less about nation-states, right, and I simply want people to move in a certain direction,

12:08

I can certainly use that data, because I can say: I want to go after this specific demographic. I want every African American male between the ages of 40 and 50 who makes between $200,000 and $400,000 and has at least two kids; you can narrow it down. They say, okay, that's about 25,000 people, whatever it might be. And your cost to address that population is going to be X, right?
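The audience-narrowing Sean describes is, mechanically, just filtering a table of user profiles on demographic fields. A minimal sketch of that kind of filter, with made-up records (the field names, numbers, and function are illustrative only, not any real ad platform's API):

```python
# Illustrative sketch of demographic audience targeting.
# The profile fields and records below are invented for this example;
# real ad platforms expose similar narrowing through their own tools.
from dataclasses import dataclass

@dataclass
class Profile:
    age: int
    income: int       # annual income in dollars
    num_kids: int

def target_audience(profiles, min_age, max_age, min_income, max_income, min_kids):
    """Return only the profiles matching every targeting criterion."""
    return [
        p for p in profiles
        if min_age <= p.age <= max_age
        and min_income <= p.income <= max_income
        and p.num_kids >= min_kids
    ]

profiles = [
    Profile(age=45, income=250_000, num_kids=2),
    Profile(age=45, income=250_000, num_kids=0),  # too few kids
    Profile(age=33, income=250_000, num_kids=3),  # outside age range
    Profile(age=48, income=90_000, num_kids=2),   # outside income range
]

matches = target_audience(profiles, 40, 50, 200_000, 400_000, 2)
print(len(matches))  # 1
```

At real-platform scale the same intersection of filters runs over hundreds of millions of profiles, which is what makes a narrowly defined audience of "about 25,000 people" addressable at a quoted price.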

12:33

And they don't care what you're going to push to them. They do not care. You can be sending text messages to that population. You could be doing ads on Facebook or Instagram or Twitter or whatever it is. They don't care. So for people to think, well, this information is only going to be used for marketing purposes, it's not going to be used to try to influence an election or to influence popular opinion in any respect: you're living with your head buried in the sand. You're living in some kind of clown-world universe.

13:03

Right. Right.

13:06

100%. So for us, I decided I'm not okay with that, and I need to do something about it, and people need to do something about it. And it's really pretty simple. Have you guys heard of Linux? Linux is an open-source operating system for laptops and desktops. What Linux does is give you control of your device.

13:27

You no longer have Microsoft, you no longer have Apple, running your device on the back end: performing updates whenever it wants, reinstalling applications that you may have deleted, turning on certain settings that you may have turned off, doing things that you have no idea it's doing, because it lives in a closed-source code environment. That means you can't see the code, right?

13:52

In an open-source environment, everybody has access to the code. It's like having the full blueprints to your house, giving them to everybody, and asking: where is there a threat in my house? Can someone come in underground? Can someone come in through a window? And if that's the case and I need to patch it, I can create a patch and push it out to the whole environment and ask: does this patch make sense? Is this going to solve the problem? And the community can say yes or no or whatever.

14:19

Hey, you also left out that by patching it this way, now someone else can come in through another route, right? So with an open-source environment, you know if something's been fixed or not, and you have a community of people that have eyeballs on it. In a closed-source environment, to your point about trust, you're supposed to just trust Microsoft, or trust Apple, or trust Google: don't worry about it, we've taken care of it. It shouldn't be a concern of yours anymore.

14:45

But what I learned in technology, working in Silicon Valley, working in DC, having had conversations with intelligence folks, is that, for national security reasons, every one of these major technology companies has provided backdoor access to those environments. And that is in part why these environments are kept closed-source: because they don't want companies and individuals and organizations to know exactly how they're accessing your devices.

15:18

So from a comms perspective, Will, you said you were in the military, right? Comms are absolutely imperative in the military. You have to protect and control your comms, 100%. If I have control of my enemy's comms, where I have access to my enemy's comms, do I tell the enemy, hey guys, I've got access to your comms? No, you keep your mouth shut. You want them to be as ignorant as possible about the fact that you have full access to and control of their comms. Because then you see what they see, you hear what they hear, right?

15:53

That's exactly what's playing out. So we have these three-letter agencies and these big global corporations that have full access to all of our comms all the time. And they've kept people completely ignorant about that fact.

16:09

And it's no wonder why, for those who are fighting for sovereignty, for digital freedom and privacy, or election integrity, or whatever it might be, the enemy knows what's happening before it's happening: because they continue to use these devices that are controlled by the enemy. So, Josh and Will, a simple question that I have for you guys.

16:31

If we're at war, right? I believe that we're at war right now, but whatever. If we're at war, and I give you a weapon, and I say: just so you know, there's a GPS tracking device on this, there's a camera and a microphone, and all of the information from that is actually being fed to the enemy. And I'm not going to explain to you how to use this weapon. I'm just going to tell you that if you pull this trigger, it shoots, it fires. That's all I'm going to tell you. Would you use that weapon in combat?

17:04

I mean, do I have access to another weapon? I've got some follow-ups. I've got some follow-up questions.

17:12

You want to find access to another weapon, right? Because you're just walking into a losing battle if they know exactly where you are. And by the way, they can turn that weapon on and off at will, right? And via that weapon, they know all the other weapons that are connected near you and around you, right? Like, that's exactly what this is. This is a psychological warfare weapon that they have spent trillions of dollars inventing and investing in, to control the population.

17:46

And it's not just companies, because as you guys probably know, the DOD and DARPA have put a lot of money into these companies and into these businesses, helping to develop this technology to figure out how they can influence people through this tech. So that's the paradigm that we live in today.

18:06

And my whole shtick is: we just need to get educated and informed and be aware of this. Because the more they encroach across this line and get into everything in our lives, we will be in, and I think we're already in, a 1984 state, you know, the book 1984, where Big Brother is living on the wall and they say: well, you have to have this in your house for national security reasons, and we're only going to use it to push out updates from the president, or only push out national news that is relevant for you. With the reality that we know it's looking and listening and spying all the time, so that if you commit wrongthink, you get sent off to the authorities. That's where we're going. That's exactly how things play out in North Korea, that's exactly how things play out in China, and in other communist countries around the world that have this level of sophistication. And for me, I don't want to live in that paradigm. Do you guys want to live in that paradigm?

19:08

Now, I'm curious; two questions, actually. Maybe you can talk to us about your cell phone. Is your cell phone different than ours?

19:21

Yes. So the only difference is that this phone runs an open-source operating system called GrapheneOS. It is the most private and secure open-source mobile operating system on the market today. So, every application that is downloaded onto this device... I can download Facebook if I want. People always ask, well, what about this app? What about that app? You can use any app. The difference is, when I download that app and I click "I'm okay with the terms of service," I'm instantly going to start to get notifications from this phone that say: this application wants access to your GPS, wants access to your contacts, wants access to whatever. So if I download a flashlight app and it starts spitting off, hey, this flashlight app wants access to your contacts, wants access to your GPS, I can be like: nope, I don't want to give that information to this flashlight app.
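The flow described here (an app requests capabilities one at a time, and nothing is granted until the user explicitly approves each request) can be sketched as a toy model. This is illustrative Python only, not GrapheneOS or Android code; every name in it is invented:

```python
# Toy model of per-permission, deny-by-default grants, as described above.
# Purely illustrative; real mobile OSes implement this inside the platform.

class Device:
    def __init__(self, ask_user):
        # ask_user(app, permission) -> bool simulates the on-screen prompt
        self.ask_user = ask_user
        self.grants = {}          # (app, permission) -> bool

    def request(self, app, permission):
        """An app asks for a capability; nothing is granted until approved."""
        allowed = self.ask_user(app, permission)
        self.grants[(app, permission)] = allowed
        return allowed

    def can_access(self, app, permission):
        # Anything never granted is denied by default.
        return self.grants.get((app, permission), False)

# Simulate a user who denies everything the flashlight app asks for:
device = Device(ask_user=lambda app, perm: app != "flashlight")
device.request("flashlight", "contacts")   # user taps "deny"
device.request("flashlight", "gps")        # user taps "deny"
device.request("maps", "gps")              # user taps "allow"

print(device.can_access("flashlight", "gps"))   # False
print(device.can_access("maps", "gps"))         # True
print(device.can_access("maps", "contacts"))    # False: never requested
```

The design point is the default in `can_access`: a capability an app never asked for, or that the user declined, stays off, rather than being bundled into one blanket terms-of-service click.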

20:13

Versus what happens now, when you download that free app and you click: yep, okay, okay. No one reads the terms of service. They don't realize that when they clicked okay, they just gave that app full access to their device. It now knows everything that's going on, knows all the different applications you're using and when you're using them, and it just starts sucking all that data and pushing it out, because that's how these companies make their money. So with this, you now gain control of and access to everything happening on that device.

20:39

Apple's not involved, Google's not involved, Microsoft's not involved. None of these big tech companies are involved. But if you want them to be involved, that's your choice. I can download the Google Maps app, and then I can give Google Maps full access to my GPS and all this information. I can choose to do that if I want to.

20:59

Now, what I coach people on is that there are other ways to use these applications on your mobile devices that don't involve downloading the application. You can use Google Maps through a browser. You can do it using a VPN. You can use a VPN through your browser in a private session, so that Google has no idea who you are or who's actually accessing that application through the browser, which is what I do every now and again when I have to.

21:27

But there are other tools out there. Have you guys heard of the company Garmin? Really cool company, Garmin. Garmin is a mapping company. A lot of people use it when they're backpacking, going on trips and whatnot. But what's cool about Garmin is they've always had a philosophy that everything you're doing is on the device. And they've wanted it that way because they knew that some of their customers, who are going to be out in remote locations, are still going to need to be able to navigate without access to a data plan or a data connection.

21:57

So everything happens on the device. You type in a location, it routes you. That routing isn't happening in a cloud, where they then know where you're going and where you've been and try to sell you things along the way; it happens on the device. So that data is stored on the device and kept on the device.

22:14

It's not fed to anybody else. So that's just a shift in mentality, in thinking: the question is not "what's your data strategy," it's "how do we keep that data sovereign to you and protected for you," right? Not in a cloud somewhere else, owned by people that are going to use that data to try to harm us.

22:33

Yeah. So the second question I had was about TikTok. I hear a lot of talk about TikTok, the dangers of TikTok. My wife uses TikTok; I don't use TikTok. Although we have a TikTok for this podcast, I don't manage it; Josh does. So that's all his data. So tell me about TikTok: why is it so bad?

22:56

So I would not say that TikTok is so bad. I would say that it is doing nothing different than what Facebook is doing, what Instagram is doing, what WhatsApp is doing, what any of these other applications are doing. I think part of what is happening, and I've been saying this for a long time, is that they're going after social media companies and platforms. They're trying to stigmatize them as bastions of wrongthink, saying we can't have these applications available to the general public because it's not safe.

23:26

Right. That gives them the latitude to then be able to control what is and is not on all of these applications, right? To control the content that is and is not allowed on all these applications. So for me, this is the government once again trying a creative maneuver to create legislation that gives them the authority to say what content is and is not allowed on the Internet.

23:53

Right. That's not okay. This is all going towards a digital ID system. The entire end game here is the Hegelian dialectic, if you guys are familiar with that, right? You create the problem already knowing what the solution is that you're going to steer people towards, right? They're creating this problem of TikTok, saying this is a Chinese-owned company, they're harvesting your data, and this is being used against you.

24:16

All the other companies are doing it. They're doing the exact same thing, right? So they're using this as a means to say: well, now we need control, you cannot be anonymous online. You cannot have an account anywhere online unless we know exactly who you are, because if you push out any nefarious activity, we want to know exactly who did it and where you were, so that we can find you and hold you accountable. That's where this is all leading, which is also leading towards the social credit score system. That's where it's all leading, folks.

24:47

Man, this is all like drinking through a fire hose. And I'm so fascinated by it. And I really appreciate, one, your passion, and two, your knowledgeability on everything. I think one of my questions is: you had mentioned "follow the money." How is it that someone like me, with a MacBook Pro, using their OS, you know, whatever the version is now, how can I take this hardware that I have and follow the money? How can I figure it out? Is it possible, or do I have to trust experts like you to tell me that this is happening? How can I verify it, if that makes sense?

25:33

So, follow the money is pretty straightforward, right? Apple is a publicly traded company. You can just do the research on the company: who are the investors in Apple? Well, who are the investors in Apple and Microsoft and Intel and AMD and all these big tech companies, right? Well, funny enough, or not surprisingly for some of us who have been doing this type of research: State Street, Vanguard, Berkshire Hathaway, BlackRock, right? There's only a handful of big equity investors that have board seats and controlling interest in all of these companies.
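The "do the research" step here is mechanical: pull each company's list of large institutional holders and intersect them. A minimal sketch with made-up holder lists (the data below is illustrative only; real holdings would come from SEC 13F filings or a market-data provider):

```python
# Illustrative: find institutional holders common to several companies.
# The holder sets are invented for this sketch; real data would come from
# SEC 13F filings or a financial data service.

holders = {
    "AAPL": {"Vanguard", "BlackRock", "State Street", "Berkshire Hathaway"},
    "MSFT": {"Vanguard", "BlackRock", "State Street", "Fidelity"},
    "INTC": {"Vanguard", "BlackRock", "State Street", "Capital Group"},
}

# Intersect the holder sets across every ticker to find the overlap.
common = set.intersection(*holders.values())
print(sorted(common))  # ['BlackRock', 'State Street', 'Vanguard']
```

The same intersection run over real filings is how one would check, rather than take on faith, the claim that a small set of asset managers recurs across many large companies.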

26:07

And it's not just tech, right? It's not just tech; it's healthcare, it's our food, it's our cosmetics. Everything that we are consuming is owned and controlled by very few people. And again, this is not conspiracy theory. You can just do the research.

26:25

And like I said earlier, Will, I'm just spitting facts. You just do the research and you'll see who actually owns and controls these companies. And it's the same groups of people.

26:35

So then you have to ask yourself: well, what is the ethos of the people that own these companies? Well, I've heard the ethos of these people, because they're the ones sponsoring and attending things like the Bilderberg Group, the World Economic Forum, the United Nations, and other organizations that are very transparent about what they're trying to manifest and what they're trying to create. And it is a totalitarian state environment; that's not the world I want to live in. But that's the world wanted by those who have full control and the board seats that control these businesses across the board.

27:15

So you can't look at it like, well, the CEO is this guy and he thinks this way. Well, the CEO is a stooge. In most cases he's just a pawn at the company. He does whatever his board tells him to do so he doesn't lose his job, right? He might be a great guy, and I hear this all the time: well, so-and-so is a great guy, right? It's like many politicians. This politician is a great guy. I'm sure he's a great guy. I'm sure he kisses babies really great, right? But does he vote in accordance with your ideals?

27:46

90% of the time, when you do the research and look at the data, the person who's a great guy, who goes to your church, isn't voting the way that he said he was going to vote. And they bank on the fact that you're not doing your homework and you're not doing your research, which is the same with these corporations.

28:04

So as a Christian conservative, which is where I come from, I go insane when I hear people talking about, well, we have to boycott Bud Light. We have to boycott Target, we have to boycott Liberty Safes, we have to boycott Ben & Jerry's. I'm like, why are we not talking about boycotting Big Tech, when they are exponentially more responsible for the paradigm that we live in today? Exponentially more responsible, to the tune of trillions of dollars more.

28:33

If we're talking about boycotting, why are

28:33

the talking heads in our movement not

28:38

having this conversation? It's either because they're ignorant,

28:39

they're uninformed, or they are aware,

28:43

they just want to shut up about it because

28:43

they don't want to be deplatformed.

28:46

Because if they're deplatformed, they'll

28:46

lose their monetary streams.

28:50

But you guys are Christians, right?

28:54

So, simple question, right?

28:57

If we're Christians, and Jesus Christ is

28:57

our Lord and Savior, which he is for me, right?

29:03

Do I serve money, or do I serve God?

29:08

I serve God. Right?

29:10

So for me it's a very simple equation

29:10

because I've been down this road, I was in

29:13

a lucrative career doing what I was doing,

29:13

making a lot of money, but the Lord made

29:17

it very clear to me, and he said, Sean,

29:17

this is your mission now.

29:21

You have to go educate and train people

29:21

and use all the experiences that I've

29:25

given you and the knowledge that I've

29:25

given you to start waking people up

29:30

so that they start making better decisions

29:30

because our dollars are our ammunition in

29:35

this war. And if we are called to be angels and

29:36

saints, which we are called to be, Christ

29:40

calls us to be angels and saints.

29:43

And we're saying, sorry God, I can't do

29:43

what you've called me to do, speaking

29:47

truth in love to my friends and family

29:47

and everyone else around me in business,

29:51

doesn't matter where, because I'm afraid I

29:51

might get deplatformed, I might lose my

29:56

job, I might whatever, right?

29:58

Then you're specifically saying that the

29:58

Lord of Lords, our God of Gods, who moves

30:03

mountains and raises the dead is not more

30:03

powerful than mammon.

30:08

That is what you are saying by your

30:08

behavior.

30:11

So I'm livid, obviously, because I have

30:11

this conversation every single day with

30:17

pastors and Christians who have excuses as

30:17

to why they continue to serve mammon and

30:24

money and these false idols and not Christ

30:24

the King.

30:30

You know, what's interesting is like

30:30

how...

30:40

I want to make sure I phrase this

30:40

correctly, because it is something that I

30:43

think about. I'm not doubting one bit that our data is

30:44

being used in ways that aren't necessarily

30:50

beneficial to anybody.

30:53

My question is, I'm a dad.

30:56

I've got a job. I've got kids.

30:59

I got one with some pretty severe medical

30:59

issues.

31:02

I don't necessarily spend much time

31:02

thinking about how my data is being used,

31:07

because I'm just like, man, I just got all

31:07

these other things to kind of...

31:09

be focused on. So like, can you make an argument for why

31:11

I should care and how that data could

31:21

potentially be used against me, even

31:21

without my knowledge?

31:25

So it's already being used against you, is

31:25

what my point has been, right?

31:30

Why you think what you think and believe

31:30

what you believe.

31:33

In large part, if you're on these devices

31:33

on a regular basis, using Google for

31:38

search, using Microsoft Bing for search,

31:38

using any of Apple's tools for search,

31:43

they are literally crafting the message.

31:45

We go to these tools and we put them on

31:45

this pedestal as being this authority for

31:52

knowledge, right? When that knowledge is literally being

31:54

manipulated.

31:56

And you guys probably saw the news last

31:56

week, there was a big to-do about

32:00

Google's AI image generation system, and

32:00

it was clearly skewing things in a certain

32:06

way. This is how all these systems work.

32:09

They're controlling narrative, right?

32:11

So once you start to realize, Will, that

32:11

maybe a lot of the things that I believe

32:16

to be true may not actually be rooted in

32:16

truth,

32:21

but rooted in a narrative that I've been

32:21

sold and pushed to me over the last God

32:25

knows how many years because I truly

32:25

believe the vast majority of what we've

32:29

been taught, even in school, having gone

32:29

through public school is a lie, is a joke,

32:34

and it's been all leading towards pushing

32:34

a specific agenda throughout this whole

32:38

period. And if you follow the money of

32:39

who started our education system, who

32:44

started our CDC system, who started our

32:44

National Institutes of Health system,

32:49

These are moneyed interests that had a

32:49

very specific agenda.

32:52

You just have to do the homework and do

32:52

the research.

32:54

So for me, Will, the line in the sand has

32:54

to be crossed for certain people in

32:59

different places. For COVID, like I don't know where you

33:00

guys stood on COVID, but from day one, I

33:04

knew that this was gonna be a massive

33:04

cluster.

33:06

I knew that they were gonna be lying

33:06

through their teeth.

33:08

And it blew my mind that in Santa Cruz,

33:08

California, where I used to live, where

33:13

people, they were anti-authoritarian.

33:16

They hated Big Pharma. They hated big government and yet when

33:18

COVID came out and big pharma and big

33:22

government were spending hundreds of

33:22

billions of dollars to push out a

33:27

narrative that you needed to take this

33:27

vaccine that was untested and just brand

33:32

new into the market. They had no idea what was gonna happen and

33:32

making all these outlandish claims like

33:36

you wouldn't spread COVID if you got the

33:36

vaccine.

33:38

And then they said, 95% efficacy.

33:41

Well, 90%. Well, 85%.

33:43

well, we don't really know. You still might get COVID.

33:46

Like, they were making this up. And the entire Santa Cruz community just

33:47

bought it.

33:50

And I would be like, this doesn't make any sense to me. Why?

33:56

It's because the narrative that they

33:56

consumed, NPR, CNN, MSNBC, and the doctors

34:03

in that community were just feeding them

34:03

this information, so that's what they

34:07

believed to be truth. So we have this paradigm where we have

34:09

people walking down the street in a major

34:13

city being yelled at and screamed at for

34:13

not wearing a mask outdoors, you know, in

34:19

open air, you know.

34:23

And then you drive 20 minutes south, and

34:23

this was my reality in Raleigh, North

34:27

Carolina where I used to live. Drive 20 minutes south, everything's open.

34:32

Bars are open, they're open until 2 a.m.

34:32

Restaurants are open, you can walk into a

34:37

grocery store, no one's screaming and

34:37

yelling at you, very few people wearing

34:40

masks, right? So we have this paradigm reality where

34:41

people are like, well, why should I care?

34:45

Why is it a big deal? I don't understand.

34:48

Why don't I just go along and play along

34:48

and not stir the pot?

34:51

And then other people are like, wait, I

34:51

have to think critically over here about

34:54

what I'm doing and why I'm doing it and

34:54

what I'm feeding.

34:58

What are my dollars feeding in this

34:58

process?

35:01

So for me, again, our dollars are our

35:01

ammo.

35:04

I believe that Google, Apple, Microsoft,

35:04

Facebook, Amazon, all these companies

35:10

collectively that make over a trillion

35:10

dollars a year between these five

35:13

companies alone are evil companies.

35:16

I believe that they are part of this

35:16

global apparatus that is trying to destroy

35:20

nation states. I believe that they are trying to bring

35:21

forth a dystopian world view.

35:26

I believe that they are trying to push a

35:26

depopulation agenda, not because I'm

35:32

crazy, but because they tell me that's

35:32

exactly what they're doing.

35:36

Because the CEOs of these companies are

35:36

literally at these events saying, yep,

35:40

that's what we're trying to accomplish.

35:43

So, well, if you're okay,

35:45

giving your data and all of your

35:45

information and that of your families to

35:49

people who are literally proactively

35:49

trying to kill you, that's your decision.

35:53

That's your choice. You can continue to do that.

35:56

And you can say, well, Sean, I don't

35:56

really believe that they're trying to kill

35:58

me. And I can say, just watch any of the Davos

35:58

forums, literally, just watch any of them

36:03

and listen to the people that are on stage

36:03

who are from Google, Microsoft, Apple,

36:08

Amazon, our FBI, our CIA, all the other

36:08

intelligence agencies.

36:14

They're all there being like, yep, yep,

36:14

that's what we're trying to do.

36:18

People need to wake up that this is

36:18

happening, right?

36:21

We're at war right now with people trying

36:21

to kill us and we're giving them free rein

36:26

access to all of our information and all

36:26

of our data.

36:29

That's a problem. But that line in the sand for you has to

36:30

be crossed at some point.

36:33

And I brought up COVID because for some

36:33

people it was wearing the mask where they

36:36

were like, screw that, you're not gonna

36:36

make me wear a mask.

36:39

That was the line in the sand. Other people were, well, you have to get

36:41

the jab if you want to continue to work

36:44

here. And they said, nope, I'm not doing that.

36:46

That's where people started to wake up and

36:46

be like, wait a second, can my company do

36:49

that? Can this business force me to do that?

36:52

Is this constitutional?

36:56

So that line in the sand has to be crossed

36:56

for you.

36:58

It may not be crossed yet. You might still be in a place where you're

36:59

like, eh, no big deal.

37:02

I don't think these companies are evil. And that's fine for you.

37:06

I know for a fact that they are. I've met some of these people that work at

37:08

that level at these companies.

37:11

I know what they're trying to accomplish.

37:13

They have generational wealth. They are evil people.

37:16

And they are trying to cause a

37:16

depopulation event.

37:19

And I'm not OK feeding my information, my

37:19

data into that system.

37:23

So I have two choices. I can go full Amish

37:25

and just fully step away.

37:28

Or I can figure out how do I best go about

37:28

building a parallel economy, a parallel

37:33

system that I can still operate within, I

37:33

can still do business and commerce, we can

37:38

still have these types of calls,

37:38

communicate with people and not have to

37:43

just step out entirely from the system.

37:46

I will just add that the Amish do use

37:46

electronics.

37:52

My wife is from Northern Indiana, and she

37:52

lives right in the heart of Amish country.

37:57

And I remember the first time I saw some

37:57

Amish in one of their stores, because they

38:01

make great furniture. They're on cell phones and computers, and

38:03

I'm like, what the what?

38:06

My idea of an Amish was way different.

38:09

They're like, yeah, as long as they're

38:09

using it for work, it's OK.

38:12

They were

38:14

into that for that reason. My wife was raised around it, and she's

38:15

always been someone who used technology.

38:21

But anyway, that's a different show. Yeah, I mean, dude, this is like, I mean,

38:22

one thing is like, you know, I'm just

38:28

again, like, taking it in, thinking about,

38:28

okay, so what does that mean?

38:34

Again, for me in terms of like, so I would

38:34

need to if I wanted to get off of, you

38:42

know, I wanted to be able to continue in

38:42

my commerce, buying and selling, and being

38:47

able to get my kids in education, you

38:47

know,

38:52

doing the kinds of things that you kind of

38:52

feel like has, whether it's just been like

39:00

this is what, they've just defined what a

39:00

good life is now.

39:04

And so this is like, hey, you have to have

39:04

a good life.

39:07

If you're gonna have a good life, it means

39:07

you need a college education.

39:10

It means you need this, this, or this.

39:13

And so what would my steps be if I'm like,

39:13

okay, I totally believe...

39:20

what Patrick's saying, what Sean's saying.

39:23

What are my next steps to download

39:23

the different operating system

39:32

and then, like... Sure, go ahead.

39:37

So what I'm talking about, digital privacy

39:37

and security is no different than physical

39:42

privacy and security. So what do I mean by that?

39:46

You can't just walk into a karate studio

39:46

and say, hey, I've got an hour.

39:50

Can you just sell me something so that I

39:50

can be a black belt?

39:53

That's not how it works. You can't just be like, well, I've got an

39:55

hour. Can you just train me for an hour and then

39:56

make me a jiu-jitsu master?

40:00

That's not how it works, right? You have to...

40:03

I used to train jiu-jitsu, but anyways, go

40:03

ahead.

40:06

So you have to commit to this, right? And it takes some rewiring of your brain

40:08

and how you think and how you operate and

40:12

how you do things. You can't, well, you can't train someone

40:13

to be a jiu-jitsu master overnight.

40:16

You can't because there's a lot of muscle

40:16

memory that goes into this, right?

40:21

It has to be something that you learn over

40:21

time.

40:24

You don't, you don't become a master

40:24

gardener overnight.

40:28

You don't just buy a raised bed and some

40:28

seeds and then all of a sudden you're

40:31

growing your own food and you know, you

40:31

don't have to go to the grocery store

40:34

anymore. That's not how that works.

40:36

It takes time. You have to learn things.

40:39

You have to learn the process. You have to learn why and how and who and

40:41

where.

40:44

You have to learn things. So the first thing is we have to just

40:45

acknowledge, raise our hand.

40:49

I don't know if you guys know anyone who's

40:49

gone through addiction programs or

40:52

whatnot, I definitely do. But we have to just acknowledge, we're

40:54

addicted.

40:56

I'm an addict, I'm addicted, right?

40:59

I'm addicted to this thing. I have a problem and I'm willing and

41:00

wanting to take the steps to step away

41:06

from this addiction, right? Once you make that step and you

41:08

acknowledge and you admit, okay, let's now

41:12

surround you with not only some people who

41:12

can help coach you through the process of

41:16

which there's a huge community online, if

41:16

you go to

41:19

If you go to Truth Social or Gab, or on

41:19

Twitter, find mark37.com. Or Jeffrey

41:26

Peterson's got a great program. There are so many options that are available

41:28

right now. Literally

41:31

tens of millions of people around the

41:31

world that are starting to wake up and

41:33

help each other through this program. So you're not alone, you can find a

41:35

community to help you through the process.

41:38

That's one. Two is we provide you the tools.

41:40

So whether it's the phones or the laptops,

41:40

you can learn how to do all this yourself.

41:44

You can learn how to rip Google off of

41:44

your device and install an open source

41:48

operating system. You can learn how to rip macOS off of

41:49

your MacBook.

41:53

You can learn how to rip Windows off of

41:53

your PC and install Linux, an open source

41:58

operating system. You can learn how to do all this stuff

41:59

yourself. But in many cases, people don't have that

42:01

technical acumen, so they would rather

42:05

someone else just do it for them. So that's where we have tools.

42:08

You can buy them on our website, you can

42:08

buy the laptop, you can buy the phone.

42:13

It's already preloaded with those open

42:13

source operating systems.

42:16

It's already preloaded with two dozen

42:16

apps.

42:20

that we've vetted that are owned by good

42:20

people in the same fight as us that are

42:24

protecting your data and your privacy with

42:24

those apps.

42:28

So like a mapping application that does

42:28

what I told you guys about earlier.

42:32

Browsers, whatever, VPN tools, all these

42:32

different tools that we can coach and

42:36

train you on how to use, but they're

42:36

already there, ready to go.

42:39

You don't have to figure it out. So that's an option.
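[Editor's illustration, not part of the conversation: the do-it-yourself route Sean describes above, wiping a stock OS and installing an open source one, usually starts with verifying the installer image you downloaded against the checksum the project publishes. This is a generic, hypothetical sketch; the file name and checksum handling are placeholders, not anything Mark37 ships.]

```shell
# Hypothetical first step of a DIY OS switch: verify the installer
# image before writing it to a USB stick. A throwaway file stands in
# for the real ISO, and the "published" checksum is computed locally
# here purely so the example is self-contained; in practice you would
# copy that value from the distribution's download page.
printf 'demo image contents' > demo.iso

PUBLISHED=$(sha256sum demo.iso | awk '{print $1}')  # normally from the website
ACTUAL=$(sha256sum demo.iso | awk '{print $1}')     # what you actually downloaded

if [ "$ACTUAL" = "$PUBLISHED" ]; then
  echo "checksum OK"
  # Only at this point would you write the image out, for example:
  # sudo dd if=demo.iso of=/dev/sdX bs=4M status=progress conv=fsync
else
  echo "checksum mismatch -- do not flash this image" >&2
fi
```

The point of the sketch is the habit, not the commands: trust the image only after it matches what the project says it should be.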

42:42

But then you have to say, okay, my dad, my

42:42

70-some-odd-year-old dad, his threat

42:49

vector is different than mine.

42:51

He stays in one place. He really only uses his computer for

42:52

research and to talk with his grandkids.

42:57

Right. Whereas I'm traveling all the time

42:57

everywhere.

43:00

And you better believe there are people

43:00

who don't want me sharing the message that

43:03

I'm sharing with you right now. So they're trying to shut me up and make

43:04

my life as difficult as possible.

43:08

I have a different threat. I need to be trained in different tactics

43:09

than my dad needs to be trained as a

43:15

result. Right. So.

43:17

Josh, you have to figure out where am I in

43:17

this process and where do I want to start?

43:23

Where's the foot in the door? For a lot of people, it's grabbing a

43:25

laptop. For a lot of people, it's grabbing a

43:27

phone. And you don't just do a hard switch over.

43:31

Like you don't just say, okay, we have a

43:31

raised bed and we have some good soil and

43:36

some seeds. I'm gonna stop going to the grocery store.

43:38

You don't do that, right? You wait until your garden's starting to

43:40

produce and it's productive and you have

43:44

produce coming from it. And then you can start saying, okay, now

43:45

we're gonna start augmenting.

43:48

And then you start saying, okay, well, I'm

43:48

gonna get some chickens and maybe a

43:51

head of cattle, whatever. And then you can start.

43:55

Right? So the same thing happens with

43:56

technology.

43:58

You start transitioning what makes sense

43:58

as you get comfortable.

44:01

And it's a lot easier than most people

44:01

think.

44:04

So once you get a device, you keep your

44:04

primary device, you use the other device

44:08

and set it up and get everything going the

44:08

way you want, start playing around with

44:11

the applications, get comfortable using

44:11

those applications, figuring out what they

44:14

do and how they do it. And then the last step is you take your

44:15

SIM card out of the phone that you use

44:19

today and you put it in the new phone. And then all of your messages and whatnot

44:21

get directed to the new phone.

44:25

So that's there's a process. It's a journey.

44:27

It's my long-winded way, Josh, of saying it's a process.

44:31

It's a journey. But you have to start somewhere.

44:33

You have to say this line was crossed way

44:33

long ago and it's time for me to start

44:38

taking this seriously. I need to start making this migration

44:40

happen.

44:43

Yeah, I really appreciate your metaphors

44:43

because I, you know, I've heard this

44:49

argument a number of different ways.

44:51

I'm not sure if you're familiar with Clay

44:51

Clark from the ReAwaken America Tour.

44:55

We've had him on the show and he kind of

44:55

explained something pretty similar, but it

45:00

was sort of kind of all over the place.

45:03

That's not necessarily his fault, but

45:03

probably just mine in understanding it.

45:06

But I think really kind of understanding

45:06

the gardening metaphor, the jujitsu,

45:11

whatever, like, I think that that really

45:11

kind of resonates because I think what I

45:16

was trying to get to in my earlier

45:16

question was like, this is just too

45:19

overwhelming. You know, like, like,

45:21

Like I don't doubt what you're saying. Like I don't because I'm human and I'm not

45:23

one to necessarily attribute like very

45:31

altruistic behaviors to the CEOs and board

45:31

members of all these major companies

45:36

because, like, you know, money corrupts,

45:36

right?

45:39

So, but I wanna ask you, are you, my notes

45:39

here say that you're a co-founder of

45:46

Rightforge. Is that correct?

45:49

Yeah, so Rightforge, I was a co-founder

45:49

of Rightforge, it was actually my baby. I

45:54

really saw the writing on the wall when

45:54

Parler got deplatformed from Amazon Web

45:59

Services. And I had been warning all sorts of

46:00

companies and people that stuff like this

46:05

was going to happen and it was going to

46:05

start accelerating exponentially.

46:09

And sure enough, when Parler got given 24

46:09

hour notice that they needed to get off of

46:13

Amazon Web Services, which...

46:16

It's pretty shocking that their team at

46:16

Parler chose to go to AWS because they

46:20

were kicked off of Azure not long prior to

46:20

moving to AWS.

46:26

So how they thought somehow that Amazon as

46:26

a corporation would be different than

46:31

Microsoft as a corporation is just beyond

46:31

me.

46:33

But I digress. So I knew that this was going to be

46:35

coming.

46:37

I had all kinds of customers calling me

46:37

saying, Sean, you were right.

46:41

They're coming after us. They're trying to deplatform us from

46:42

GoDaddy or from Salesforce or from all

46:47

these different Silicon Valley companies

46:47

that were basically given orders from

46:51

their board, that you need to get rid of conservative

46:52

content and, you know, 2A, 1A,

46:57

anyone that was talking about anything

46:57

about COVID, anyone that was talking

47:00

anything about election integrity, not

47:00

allowed to be on the platform; you're

47:05

deplatformed, because it's a violation of our

47:05

terms of service.

47:08

So I started getting these phone calls.

47:11

I started building this network of

47:11

infrastructure providers who I knew

47:14

personally, as I said I think before the

47:14

show started, I had a podcast called I

47:18

Love Data Centers that you can actually

47:18

still find.

47:20

So I knew the owner-operators, I knew the

47:20

players in the space, I wrote the book on

47:24

the industry, I literally wrote the book

47:24

on the data center industry, it's called I

47:27

Love Data Centers, it's basically data

47:27

centers for dummies, and I traveled the world

47:31

training and teaching people about this

47:31

industry.

47:34

So I knew the players, I built this

47:34

network, we were the original hosting

47:39

providers for Project Veritas, for Truth Social, for a lot of conservative

47:41

content, pro -life stuff that was getting

47:46

kicked off of various different hosting

47:46

websites.

47:50

And yeah, so that's my long-winded answer

47:50

of saying yes, I was co-founder of

47:57

Rightforge. But I left that company after the first

47:58

few months.

48:00

And that's a long story that we can get

48:00

into for another time.

48:03

But I started networking with other

48:03

different infrastructure hosting providers

48:07

that were owned and operated by

48:07

libertarian-minded people who wanted to

48:10

protect and defend their liberties and the

48:10

constitutional rights that we have.

48:14

And so once that was established, that

48:14

ground layer, which I think this is

48:18

important, that base infrastructure

48:18

layer that most people think is magic,

48:22

right? They don't quite understand how the cloud

48:22

works or how the Internet really works.

48:26

It's just servers at the end of the day, sitting in these

48:28

facilities that are designed to never lose

48:32

power, never go down, right?

48:34

So I became a specialist in that space and

48:34

I knew that you could create any kind of

48:38

app you wanted, all this encryption stuff

48:38

and all these cool things, right?

48:43

But if that data is sitting on servers

48:43

that are owned by people that want to turn

48:47

you off, they can flip the switch and turn

48:47

you off.

48:50

So once that core base layer was

48:50

established and I knew it was owned by

48:54

good people, then I can move on to the next problem

48:55

set. And I'm an entrepreneur, so I started

48:57

saying, okay, what's the next major tip of

49:02

the spear that we need to go after? And it was these things, hands down these

49:04

things.

49:08

So that's what I moved on to after that.

49:11

You know, I'm probably a little bit

49:11

different than maybe some of my other

49:16

fellow liberals.

49:18

I do care about free speech and, you know,

49:18

I'm probably one of the few, you know,

49:26

anti-Trump liberals that actually didn't

49:26

really want him to be kicked off of

49:30

Twitter, to be honest, just because, like,

49:30

people are going to say horrible things,

49:35

ridiculous things, offensive things.

49:39

But if you believe in social capitalism,

49:39

those voices will get drowned out

49:46

eventually and go away by themselves kind

49:46

of thing.

49:51

But I'd love to get your take on why you

49:51

think conservative voices are being

49:58

suppressed. And is it all conservatives?

50:02

What's your take on that?

50:05

So it's to control the narrative.

50:08

It's to control the narrative. You do not want people who are exposing

50:10

others to concepts and ideas that are

50:16

contrary to your narrative, right?

50:18

So if I'm a totalitarian dictator, I'm not

50:18

going to give my opposition any say, any

50:26

voice at all. I'm going to attack them,

50:30

deplatform them and prevent them from

50:30

speaking.

50:32

So this is what blows my mind, Will, and

50:32

it's got to blow your mind, where you

50:36

have, you call yourself a liberal, right?

50:38

But now liberals of today are really not

50:38

liberals.

50:40

They're like straight up communists, like

50:40

unapologetic communists.

50:45

They claim to be communists. They are communists, and they claim to be

50:46

liberals.

50:48

And I'm like, but that's totally not

50:48

aligned with, you know, the Democratic

50:53

Party of RFK, of JFK, of...

50:56

You know, even 10 years ago, it's just not

50:56

in alignment.

51:00

So your movement, which I would be pissed

51:00

off about, has been hijacked by people

51:05

that truly aren't even American.

51:07

They don't want to believe in a

51:07

constitution.

51:10

They want to get rid of the constitution.

51:12

They don't believe in having sound and

51:12

secure borders.

51:16

They want to get rid of sound and secure

51:16

borders.

51:19

But... when you're trying to control a narrative,

51:20

you cannot have the dissenting view and

51:24

the dissenting voice. So that is, I mean, that makes sense,

51:25

right?

51:28

That's pretty basic linear logical

51:28

thinking.

51:31

If I don't want a certain point of view to

51:31

gain traction, I'm going to limit that

51:36

point of view. And that's where Elon Musk, when he gained

51:37

ahold of Twitter, he said, I just bought

51:42

basically a massive crime scene.

51:44

And he did, when you started looking at

51:44

what was happening in that company: the

51:50

degree to which their algorithms that were

51:50

put into the system by people that worked

51:54

there were sowing dissent and creating

51:54

division, and/or ignoring entire

52:00

populations of people, deplatforming

52:00

people, and the algorithms were making it

52:05

so that certain content was not even being

52:05

seen.

52:09

So that, I mean, that makes logical sense,

52:09

right?

52:12

If I'm a totalitarian dictator, Will, and

52:12

I had no ethos, I would do the exact same

52:18

thing. This is not crazy talk. This is how power operates.

52:22

Power politics works. Yeah, yeah, I mean, so there's so much of

52:26

what you're saying that I feel like very,

52:32

like definitely sympathetic to and

52:32

definitely like, man, like I don't, again,

52:42

like I don't necessarily, I don't really

52:42

trust the government.

52:47

Like I don't really trust, you know.

52:51

major CEOs, like, and I don't know why I

52:51

don't trust them except to say that I mean

52:57

I could say the things that you've told me

52:57

and look in and check in on those and

53:02

things like that, but it's more like I

53:02

just don't trust them because they have a

53:06

lot of power and have a lot of money but

53:06

then again I kind of want more power and

53:12

more money. I mean, I feel like I'm a

53:12

Christian, I think I am, and of course I

53:18

would say, if someone was like, hey, are you going to choose

53:20

money?

53:23

Are you going to choose, you know, God?

53:26

My hope and my prayers, I'm going to

53:26

choose God.

53:29

That's what I want to do. I want to, I don't want to not choose God.

53:35

And then the question, well, do you have to? I know you can't serve both God and money,

53:36

and yet we need money to operate within

53:41

this thing. And even to your own point, you know,

53:42

money is our ammunition in this war,

53:46

right? The metaphor that you use, the way that we

53:46

spend.

53:49

our dollars is the ammunition.

53:52

And it's like, so when I'm thinking about

53:52

like, let's say, I wanna go in, I

54:03

wanna look at the World Economic Forum, like

54:03

they're very open about this.

54:08

Where would I go to find like the

54:08

documents or the, you know, the evidence

54:16

or whatever it is? It's on their website.

54:19

But all you have to do is go into your

54:19

browser and type in World Economic Forum.

54:23

You'll go straight to their website.

54:26

And just, they have their talks all up

54:26

there.

54:29

I mean, they literally don't hide any of

54:29

this.

54:34

And then, so go check it out and go see

54:34

what they're saying and then make the

54:44

decision. That's basically it.

54:48

Yep. I mean, there are a number of what I call

54:49

talking heads, you know, on all sides who

54:55

have been given this platform, who can

54:55

espouse, you know...

54:59

whatever. Tucker Carlson as an example,

54:59

or Elon Musk, all these different voices. You

55:04

have to pick and choose who do you think

55:04

actually has a perspective that is based,

55:10

that is based on truth, that has a good

55:10

heart and that is not simply trying to

55:16

follow an agenda. As for me, I have been someone who's

55:17

always done the homework.

55:22

I don't always just take someone's word for it.

55:26

You know, I'm the doubting Thomas, right? I wanna see the holes, right?

55:30

So I've been digging and I've been reading

55:30

and researching, asking questions and

55:35

meeting people and putting it out there

55:35

and asking God.

55:38

I know you guys probably both understand

55:38

this.

55:40

People say, well, I don't understand, Sean. You say that God talks to you and you've

55:42

had this conversation with God.

55:45

I just don't know what God's voice sounds

55:45

like, right?

55:48

Well, my question to those people every

55:48

time is how often are you in the word?

55:54

How often are you reading scripture,

55:54

listening to God's voice?

55:57

How often are you in prayer, meditating,

55:57

and asking God to hear his voice?

56:03

Asking God specifically to have a

56:03

conversation with him.

56:06

How often are you doing that? Or do you only go to God when you have

56:07

something going on in your life and you

56:12

need answers, right? What kind of a two-way relationship is

56:13

that?

56:16

Do I really want to interface and dialogue

56:16

with someone who only comes to me with

56:19

problems and issues? And am I really getting a two-sided...

56:23

perspective of that person, you're not,

56:23

right?

56:26

So unless you're in the Word on a regular

56:26

basis, on your knees on a regular basis

56:32

praying and listening and proactively

56:32

engaging in that conversation with God and

56:38

with Jesus and His Word, you're not gonna

56:38

really understand what His voice says.

56:44

So with that in mind, I have been spending

56:44

my life on my knees and praying and I'm

56:50

not always perfect. You know, my life is a roller coaster.

56:53

I get it. The last couple of years, I've spent a lot of

56:54

time, I've dedicated at least an hour and

56:58

a half, if not two hours of my day,

56:58

dedicated in the morning to just focus on

57:03

doing that so that I know what that voice

57:03

sounds like.

57:06

It's for discernment. And I pray for discernment.

57:09

Discernment. So that as I'm listening to these

57:11

different talking heads and I'm asking

57:14

questions like, wonder what this guy's

57:14

motivating factor is, I can do some quick

57:18

research and try to figure out where's the

57:18

money coming from, right?

57:22

Where's this guy's money coming from?

57:24

If he's beholden to what master, right,

57:24

that's gonna be puppeteering, pulling the

57:29

strings, or is he authentically out

57:29

speaking truth in love no matter what,

57:34

right? No matter what might come out of him. That doesn't mean that you can't make

57:36

money in the process, right?

57:40

I have a business, I need to make money, I

57:40

have people I need to pay for, but at the

57:45

end of the day, if someone comes to me and

57:45

says, Sean, you need to compromise your

57:49

values and your integrity and we'll... to the tune of 50 million bucks, I'm gonna

57:51

say no thank you.

57:56

Right? Not interested. Right?

57:58

Because I don't want to have that

57:58

conversation with my lord and savior when

58:01

I die. When he says, you knew exactly what you

58:01

were supposed to do and you chose money

58:06

over me. Sayonara sucker.

58:09

Right? That's not the conversation I want to

58:09

have.

58:13

So I don't know if that answers your

58:13

question, man, but that's...

58:15

Are we going to have to put an E, an explicit

58:15

rating, because you did this, the hand

58:20

signal? I'm not really sure what the rules are for

58:20

podcasts.

58:23

Just fuzz me out when I do it.

58:25

Fuzz out the face. That's good.

58:28

You know, this is my last question, Sean.

58:30

This has been a fascinating conversation,

58:30

a challenging conversation.

58:35

I really appreciate you coming on and

58:35

talking with us about it.

58:38

But, and you kind of, it's going to be a

58:38

little bit like of repetition, to be

58:42

honest, because, but I don't think it will

58:42

be redundant because, you know, you've

58:48

basically been talking about this the

58:48

whole time.

58:52

And, but the question is,

58:54

What do you feel like, let me back up for

58:54

a second and give some context.

58:59

We have a lot of different people that

58:59

listen to the podcast, right?

59:03

One of the reasons that I really, I was

59:03

like, and I was even talking to Will,

59:07

like, I really want to have, like, Sean

59:07

when I got the email is because we don't

59:13

tend to get as many conservative voices

59:13

that, you know, just for whatever reason,

59:21

like a lot of times they just haven't.

59:23

gotten back to me, people like when I've

59:23

contacted them and things like that.

59:29

And I have friends that are very much

59:29

going to enjoy this conversation and are

59:35

very much going to be like, yes, I agree

59:35

100 % with what Sean is saying.

59:40

And I'm all in on what he's saying and

59:40

checking out your stuff.

59:45

I feel pretty confident about that.

59:47

We also have a lot of people that would be

59:47

like, my gosh, I can't believe.

59:51

what he's saying. I can't believe they would let that guy

59:53

come on the podcast.

59:56

What's wrong? And we have tried to never let that be a

59:56

factor in who we have on.

1:00:03

I mean, you've been on and we've also had

1:00:03

like the first transgender bishop on.

1:00:08

We've had Clay Clark on. So we've had some, a fairly wide range of

1:00:09

people.

1:00:14

And my question is, if you're thinking

1:00:14

about, not that you would change it, but

1:00:19

if you're thinking about, addressing liberals to conservatives,

1:00:21

Christians, atheists, Muslims, Hindus,

1:00:27

everyone. What's the most important thing you feel

1:00:28

like from your message that you want to

1:00:33

get out? And actually kind of tacking on to that,

1:00:35

what are you, I mean, again, you've

1:00:39

already said it, but what are you most

1:00:39

concerned about?

1:00:41

Let's even just say for this next upcoming

1:00:41

year moving into this election season,

1:00:46

what are your biggest concerns?

1:00:49

So apathy, apathy.

1:00:52

People who basically are either so

1:00:52

disconnected to what's actually happening

1:00:57

in the world that they just don't care,

1:00:57

they're obliviously just continue to go on

1:01:01

and become zombies in the system.

1:01:04

And/or those who are so black-pilled,

1:01:04

right?

1:01:08

They just think, well, it's so

1:01:08

overwhelming, there's nothing I can do

1:01:11

about it, so screw it, I'm just gonna keep

1:01:11

doing what I'm doing, right?

1:01:15

But again, as Christians, I believe we are

1:01:15

called.

1:01:19

to be angels and saints.

1:01:21

We're not called to be bystanders in this

1:01:21

process.

1:01:24

We are called to the front lines of this

1:01:24

war.

1:01:28

I am no longer in the business, which I

1:01:28

used to do for years and years, of trying

1:01:32

to convince people of what's happening in

1:01:32

the world.

1:01:34

I truly believe that if you haven't

1:01:34

started to question what you're being fed

1:01:39

all day every day from the media, that

1:01:39

there's nothing I can say that's going to

1:01:43

convince you. There's no documentary you can watch that

1:01:44

I can present to you.

1:01:47

There's no research I can show you. where you're gonna say, okay, I get it

1:01:48

now.

1:01:51

It's going to take God working in your

1:01:51

life in a very real personal way to make

1:01:57

you start to rewire your brain and rethink

1:01:57

and relearn that everything you've learned

1:02:02

has been a lie, right? And that you need to restart and reengage,

1:02:03

which is hard for people to do.

1:02:07

I get it. When the pillars of your reality start to

1:02:08

crumble and you're like, holy crap, you're

1:02:13

telling me that this isn't real, this

1:02:13

illusion, this matrix I'm living in isn't

1:02:17

real, right? That's a tough pill for people to swallow.

1:02:20

But people have to realize that they are

1:02:20

called to the front lines of this war.

1:02:23

And that's my job these days, is to arm

1:02:23

those who are awake and are aware with the

1:02:28

tools that they need to be successful in

1:02:28

fighting this war and to reclaim their

1:02:33

digital privacy and their digital

1:02:33

sovereignty.

1:02:36

So that's what my primary, you know, I

1:02:36

have zero fear actually, to be totally

1:02:42

honest with you. I don't fear anything because I know it's

1:02:42

all in God's hand.

1:02:46

And... We may not see justice, a lot of people

1:02:46

want to see justice and Fauci hung and all

1:02:51

these people held accountable. I don't care about accountability right

1:02:52

now because that's not the battle that I'm

1:02:56

in. And I know God is gonna get his vengeance

1:02:57

and God is gonna hold people accountable

1:03:00

at some point, either in this life or the

1:03:00

next, right?

1:03:03

So that's not my job to worry and be

1:03:03

concerned about.

1:03:07

So that's my long -winded answer to your

1:03:07

question is I'm really worried about

1:03:12

apathy in people saying, eh.

1:03:14

Not my problem. Someone else will take care of it.

1:03:17

Someone else will deal with it. When I'm saying, look man, if you claim to

1:03:18

be a Christian, you are called to be on

1:03:23

the front lines of this battle right now

1:03:23

and to become awake and aware and to spend

1:03:27

time in the Word, spend time praying,

1:03:27

spend time listening, communicating with

1:03:32

other people, not just online, but get to

1:03:32

know your neighbors.

1:03:37

Have some coffee, break some bread, figure

1:03:37

out where they stand, where they're at.

1:03:40

Because if things get complicated,

1:03:40

which...

1:03:44

more than they already are, you're going to

1:03:44

want to know who your neighbors are and

1:03:48

who you can trust at the end of the day.

1:03:50

Yeah, yeah. So what would you say to like non-Christians

1:03:51

that are listening and people

1:03:56

that wouldn't say, you know, I'm a

1:03:56

Christian, I totally get what you're

1:04:00

saying. What would you, what's your appeal?

1:04:02

The same question that I asked you: How do you feel about the fact that your

1:04:03

devices and everything around you is

1:04:06

listening to you and watching you all the

1:04:06

time?

1:04:09

And if someone's like, I don't care, no

1:04:09

big deal, that's fine, that's their

1:04:13

prerogative. That's their deal, right?

1:04:17

For those who are like, yeah, I'm not okay

1:04:17

with it, I say great.

1:04:20

Do you know that there's options and tools

1:04:20

that are available to get these companies

1:04:24

out of your life? And they're like, I had no idea.

1:04:27

So I can walk them through those tools,

1:04:27

right?

1:04:31

That's awesome. So how can people follow you, kind of,

1:04:31

what social media platforms are you a part

1:04:37

of, how can they follow you, and how can

1:04:37

they, where can they go to get some of

1:04:42

these things that you're talking about? Yeah, so I used to be on LinkedIn until

1:04:43

April of 22 and they kicked me off because

1:04:48

they didn't like the narrative that I was

1:04:48

pushing on that platform.

1:04:52

They, you know, canceled me even though I

1:04:52

had 10,000 followers on there and I was

1:04:56

sorry, there's something flying in my face

1:04:56

over here.

1:05:01

Right, that's probably what it is.

1:05:04

So I had 10,000 followers. I was one of their first customers, you

1:05:05

know, at LinkedIn, you know, being in

1:05:08

Silicon Valley. They totally purged me.

1:05:12

So I'm not on LinkedIn is my long story

1:05:12

short.

1:05:15

We are on Truth Social. We are on Gab.

1:05:17

We are on Twitter. We are on Rumble.

1:05:20

You just look at Mark37. You'll see our little shield logo with the

1:05:22

Betsy Ross flag on it.

1:05:27

That's our company. Or you could go to mark37.com.

1:05:30

We have literally a mountain of content

1:05:30

available for you to dig through in the

1:05:35

resources page, in the blog page.

1:05:39

I've got tons of articles on simple ways

1:05:39

to start your migration to digital

1:05:45

freedom. I've got the whole video of the online or

1:05:46

the on and offline training that I do.

1:05:50

So I travel the country right now to

1:05:50

different parts, different communities who

1:05:55

are interested in this conversation and

1:05:55

want to learn these tools.

1:05:58

So I'm traveling the country. I do about three- to four-hour seminars.

1:06:03

That's all online. That's all on our website.

1:06:05

You can find it there. So if you're interested in learning more,

1:06:06

I would highly recommend you check that

1:06:08

out. Dig around. Check out our products and services.

1:06:11

If you have any questions, feel free to

1:06:11

email me.

1:06:15

Support, S-U-P-P-O-R-T at mark37.com

1:06:15

is the best way to get a hold of us

1:06:20

and our team. We'll get back to you probably within 24

1:06:21

hours tops.

1:06:25

Awesome. Well, go ahead.

1:06:29

If anyone listening has experience

1:06:29

migrating people off of Windows and Mac

1:06:35

environments to Linux, and you have

1:06:35

experience doing the same from a mobile

1:06:39

operating system perspective as well, I

1:06:39

want to talk to you.

1:06:42

We are growing a team nationally right

1:06:42

now, and we want to have people in all the

1:06:46

different regions of the country so that

1:06:46

as people come to us with needs to make

1:06:50

this migration happen, we can point them

1:06:50

to you.

1:06:53

You can help assist them through that

1:06:53

migration process.

1:06:55

So please reach out to us at support at

1:06:55

mark37.com.

1:06:58

And again, this is a nonpartisan issue.

1:07:01

It should be a nonpartisan issue. If you believe in the First Amendment, if

1:07:03

you believe in digital privacy, digital

1:07:08

security, digital sovereignty, this should

1:07:08

be important to you.

1:07:10

And for those that say, well, I'm not

1:07:10

saying or doing anything that's, you know,

1:07:14

outrageous online, so my digital privacy

1:07:14

doesn't matter.

1:07:17

That is no different than saying that

1:07:17

because you're not planning on saying

1:07:21

anything offensive to anybody, that your

1:07:21

First Amendment does not matter.

1:07:24

That your freedom of speech doesn't matter.

1:07:26

Or as we were talking about before the

1:07:26

episode, like saying, well, I'm not

1:07:31

planning on shooting anybody or I'm not

1:07:31

planning on killing anybody and I don't

1:07:34

think anybody's going to come try to rob

1:07:34

me.

1:07:36

So my second amendment shouldn't matter

1:07:36

either.

1:07:39

Right. The reality is our digital privacy and our

1:07:39

digital security does matter and we have

1:07:44

to start acting like it. There you go.

1:07:47

You guys have heard it from Sean Patrick

1:07:47

Theriault.

1:07:51

And Sean, really appreciate you coming on

1:07:51

the show and talking with us.

1:07:56

It was a fascinating, challenging

1:07:56

conversation.

1:07:58

And people, again, can go to mark37.com

1:07:58

and you can follow some of the resources

1:08:03

and recommendations that Sean has made.

1:08:06

And until next time, guys, don't keep your

1:08:06

conversations left or right, but up.

1:08:10

God bless you guys. Have a great day. Thank you.

1:08:13

See you.


From The Podcast

Faithful Politics

Dive into the profound world of Faithful Politics, a compelling podcast where the spheres of faith and politics converge in meaningful dialogues. Guided by Pastor Josh Burtram (Faithful Host) and Will Wright (Political Host), this unique platform invites listeners to delve into the complex impact of political choices on both the faithful and faithless.

Join our hosts, Josh and Will, as they engage with world-renowned experts, scholars, theologians, politicians, journalists, and ordinary folks. Their objective? To deepen our collective understanding of the intersection between faith and politics.

Faithful Politics sets itself apart by refusing to subscribe to any single political ideology or religious conviction. This approach is mirrored in the diverse backgrounds of our hosts. Will Wright, a disabled Veteran and African-Asian American, is a former atheist and a liberal progressive with a lifelong intrigue in politics. On the other hand, Josh Burtram, a Conservative Republican and devoted Pastor, brings a passion for theology that resonates throughout the discourse.

Yet, in the face of their contrasting outlooks, Josh and Will display a remarkable ability to facilitate respectful and civil dialogue on challenging topics. This opens up a space where listeners of various political and religious leanings can find value and deepen their understanding.

So, regardless if you're a Democrat or Republican, a believer or an atheist, we assure you that Faithful Politics has insightful conversations that will appeal to you and stimulate your intellectual curiosity. Come join us in this enthralling exploration of the intricate nexus of faith and politics. Add us to your regular podcast stream and don't forget to subscribe to our YouTube Channel. Let's navigate this fascinating realm together! Not Right. Not Left. UP.
