Charting the Digital Frontier with a Father's Insight

Released Monday, 26th February 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:53

How's it going , JB ? It's great to

0:55

get you on the podcast . You know I

0:57

don't know how long this thing has been in the making , but

0:59

it feels like forever at this point .

1:02

Hey , it's great to be here . Yeah , unfortunately

1:05

it's been some crazy times in between

1:07

initial booking and actually getting here . I

1:10

can actually tell everybody I was . I was actually

1:12

fighting through court for my younger son

1:14

. After 80 years and the

1:16

cost of a small house , I

1:18

now have all four of my children living with me .

1:21

Oh wow , that's a huge accomplishment

1:23

.

1:25

Yeah , I would actually say that , given what

1:27

I have seen , experienced in the British family

1:29

legal system , it's more of an accomplishment than building

1:31

my three tech companies . To be honest , yeah

1:34

, absolutely .

1:35

I mean , like all that you ever hear about , at

1:37

least in America , is like

1:40

, hey , you better

1:42

not get divorced when you have kids

1:44

, because like , if you do

1:46

and you're a man , like you're never going to see

1:48

that kid again , like there's

1:50

so many ways that they can just , you

1:53

know , completely screw you

1:55

. You know like it's insane

1:57

, it's absolutely crazy . So

2:01

like I really I

2:04

have like a new appreciation after like being a dad

2:06

now , you know for the first time it's

2:10

like I have a new appreciation for , you

2:12

know , the influence that the dad has in a kid's

2:14

life , no matter how young . You know .

2:16

Yeah , well , I was there . I've been there

2:19

, I was there for

2:21

when the children literally came out of

2:23

the oven , there with the catcher's mitt

2:25

, literally four of them , and

2:27

, ladies and gentlemen , you

2:30

don't necessarily want to see that . Yeah

2:37

, no , no the PTSD look going

2:40

on . But it is great , the bond

2:42

you build if you are a father that

2:44

spends a lot of time with his children

2:46

and you're there like really there from the formative

2:49

years . So my oldest daughter is 18

2:51

. And she calls me her bestie . That would

2:53

not happen if I wasn't there , if I hadn't

2:55

been there all the time , you know . And

2:57

yes , there was a large time where they

2:59

had no access to me . Unfortunately

3:01

, it is what it is . It had

3:03

and it had an effect . You

3:05

know I can't go into details of what the effects were

3:08

, but most of my

3:10

kids do go through counseling . So

3:13

you know , this is the thing that parents need

3:15

. Parents need to realize when their kids

3:17

are little and they think , oh , it doesn't matter , little

3:19

Johnny , he , he isn't seeing . When I'm like

3:22

giving my partner daggers , trust

3:24

me , little Johnny is seeing it and little Johnny

3:26

is being affected by it . And little Johnny is starting

3:29

to end up growing up giving his little partner

3:31

daggers , thinking his kid ain't seeing it and it just carries

3:33

on . So

3:35

I mean , I gotta be honest , I'd hate to be a kid

3:37

now In this

3:39

age . And that's horrendous because

3:41

it's the 21st century we're living in , in what I used

3:43

to see as the Star Trek future when I was

3:45

a kid . You know , because I'm a child of the '80s . I was

3:47

born in December 1980

3:49

on the last batch of the Gen Xers

3:52

and I remember Star Trek and all that cool stuff

3:54

and I thought , yeah , we're living the Star

3:56

Trek future now , ain't great for

3:58

kids , though you know , yeah , it's

4:00

pretty terrible .

4:02

I don't know . I still haven't

4:04

like figured out how I'm going to try and

4:06

like introduce , you

4:09

know , like the internet , right , to my

4:11

kids . You know like

4:13

, well

4:16

, my kid's 11 months old , so like I have some

4:18

years , right .

4:19

No , no , here in Britain you'll see

4:22

women pushing prams along and

4:24

prams , by the way , getting sold . You got to commend this

4:26

. They're getting sold , with tablet arms already

4:28

on them . So your

4:30

child can be soaking in

4:32

all of the loveliness of YouTube children's content

4:35

, which includes such classics as the hangman's

4:37

song and other things .

4:41

It sounds

4:43

interesting how it made it past

4:46

YouTube's , you know impenetrable

4:48

AI that captures

4:51

all of this right .

4:52

Yeah , yeah , but if you talk about something

4:54

about sex and relationships , the YouTube AI

4:57

catches that quickly enough . But it

4:59

won't catch that somebody has managed to get Peppa Pig

5:02

to slice her dad's throat and decapitate

5:04

him in a cutesy animation . It's

5:06

like .

5:09

God forbid you say COVID or something

5:11

like that . Right Like this episode is immediately

5:13

.

5:15

You know , you just got yourself demonetized right there , that's it . I'm

5:17

not even monetized . You

5:20

never will be now . Hold on hold on

5:22

, hold on , hold on . He's applied for an AdSense account

5:24

. No , we can't give him that . He mentioned COVID . Doesn't

5:27

matter what he said about it , he mentioned it .

5:31

Yeah , he's not a doctor , right ? Not

5:33

a licensed physician ?

5:37

Not that that really means anything , Johnny . But anyway we're

5:39

so weirdly off tech , Matt .

5:41

Yeah , yeah , that got out of hand quick

5:44

. So

5:47

, yeah , you know , JB , you know

5:49

I'll be honest with you . I

5:51

didn't look too much into

5:53

your background , you know . So why don't we start with

5:55

your background of how you got into you

5:58

know IT , how you got into

6:00

security ? You know maybe , like what

6:02

piqued your interest ? Was there a certain event

6:04

or something you know earlier

6:07

on that kind of piqued your interest

6:09

and led you down this path ?

6:12

Well , I , as I said , I'm a kid of the 80s

6:16

. I grew up in the middle of the poll tax

6:18

riots era . I grew up in Birmingham

6:21

, in Bordesley , you know where the

6:23

Peaky Blinders are from . I grew up at

6:25

protests , going to protest marches

6:27

for freeing the Birmingham Six . So everything

6:29

I I hate to say this by modern standards

6:31

I had a really woke

6:34

childhood . Oh dear

6:36

you know , I was reading philosophy

6:38

and psychology by the time I was five and six , or

6:40

reading Bardsley , art and Decker . I

6:43

was writing , going to play Shakespeare

6:45

, in plays performed at the Royal Shakespeare Company

6:48

in Stratford , even from that early age

6:50

and then being forced and I say forced

6:52

because no five or six year old writes

6:54

essays voluntarily

6:56

, trust me . They don't . Five

6:59

and six year olds do not engage in any form of critical

7:01

thinking voluntarily . So I was forced

7:03

to write essays , and it basically created

7:06

somebody who is , you

7:08

know , who basically got a lot of critical thinking

7:10

, looks at stuff through a logical lens

7:12

and will deconstruct and break stuff down . So

7:15

, in terms of security and how

7:18

we are able to protect our individuality

7:21

, which is what security really is , I

7:24

became , I was kind of always involved in that in some

7:26

form or another , even as a kid . I

7:28

mean when you have a childhood where you don't play with toys

7:31

, where you have science , chemistry

7:33

sets , microscopes and telescopes and that's all

7:35

you get for Christmas is along with books and literature

7:37

. It makes you hyper focused

7:39

. I mean like laser focused on stuff

7:41

In terms of it . I got into

7:43

that really early . I did my first programming

7:45

diploma when I was 13 with the International

7:47

Correspondence School . Did that in BASIC Beginner's

7:49

All-purpose Symbolic Instruction Code . I

7:52

got a job when I was 13 working for

7:54

a computer company in Coventry called

7:56

a Richard , called so is the knitters . And

7:58

then it became gigante computers . Shout

8:01

out to Stephen King so is the knitters . Car

8:03

from trade . If you're seeing this , I

8:05

don't know where I mean gigante computers . Where are

8:07

you guys ? And

8:09

I was even when I was working with that . I

8:11

was really interested in how do you secure them , how

8:14

do you protect them ? How do you stop people from taking

8:16

the data out of these and doing stuff

8:18

with them ? I already , as an early user

8:20

of the Internet and computers , I saw very quickly

8:22

the downsides

8:26

to how this stuff could be used . Bear in mind , I was

8:28

also working in computing in that era .

8:30

I ended up working for people like

8:32

Time and Tiny Computers , and I started

8:34

seeing how people were exploiting customer

8:36

data to market even credit

8:39

agreements and 0%

8:41

finance deals and how they were trying to aggressively

8:44

push this . I mean , I was a killer at sales

8:46

and I ended up leaving sales , even though I did

8:48

really well in commission . I left sales because

8:50

sales struck me as

8:53

lying to people . I mean , when

8:55

I was working in computers , I was selling credit

8:57

agreements to families who wanted a new

8:59

Packard Bell computer , which is like costing

9:01

two or three grand , and you

9:03

could see — bear in mind , back in those days we're talking

9:05

about when you did credit agreements on pen and paper , by the

9:08

way , and you'd phone a place up , and they actually

9:10

told you if this person was credit worthy over the

9:12

phone , not this whole

9:14

instant take . So you'd end up with

9:16

conversations with these people , you know

9:18

, while you're doing the credit agreement , and you would learn

9:21

really quickly that these people could not

9:23

afford this . They couldn't afford

9:25

. You know , bear in mind these people being sold

9:27

a credit agreement worth a couple of grand and

9:29

being told don't worry , you don't have to pay for it now . But

9:32

what was that ? That was everything about the 80s and 90s . Don't

9:34

worry about it , you don't have to pay for it now , it's

9:37

cool . You don't have to pay for it now . Think

9:39

of the now , don't worry about the future . I'll

9:42

be sitting there and I'll be talking to these people

9:45

and I'm seeing how their kids are , and I'll be thinking

9:47

— bear in mind , I'm a young man , like 18 , 19 ,

9:49

at this time and even then I'm kind of thinking I

9:51

can't do this anymore because I can already

9:53

see what these people are going to go through . They're

9:55

going to go through debt collection straight away

9:57

. These people , in six months , when they see the 200

10:00

pound a month bill come in for this

10:02

computer that is already covered in pot

10:04

noodle , super noodles and all kinds of schmutz , that probably

10:07

has already been dropped five or six times and

10:09

little Johnny has already put , like , a cookie

10:12

or a jammy dodger into the CD tray . This

10:14

thing is already busted up and practically broken

10:16

and now they're on the hook , not just like a couple of

10:18

grand but also the interest on that , because

10:20

they missed their six-months buy now ,

10:22

pay later take . They missed the interest

10:24

free component and not long

10:26

after that I got a job working in debt collection . So

10:28

I saw both cycles , I saw the

10:31

profiting from it and I saw the

10:33

back end of it , of what happens when it goes

10:35

wrong , and I just couldn't

10:38

do it . So I was like , well , I'm not doing sales

10:40

, I'm going to go into , I'm going to do , I'm going to

10:42

go into my other love , filmmaking

10:44

. But that didn't work . Because in Britain

10:46

the only way you get money for filmmaking is if you're going to make

10:49

a delightful rom-

10:51

com with Hugh Grant going I'm

10:54

so delightfully British or you make kind

10:56

of like a hood shoot 'em up , yeah , Blacks

10:59

and gangs — Kidult-

11:02

hood , Top Boy . Any of those kinds of things are

11:04

the only thing you get money for and I've no interest in that . I

11:07

mean , I speak the King's English , for crying out loud . I

11:09

was educated in the three R's , kind of like

11:11

Eton style , and

11:13

I'm , like , so far removed

11:16

from that . So I was

11:18

like , okay , that ain't gonna work . So

11:20

I pivoted , and at college I

11:23

did moving image design and then 3D

11:25

animation and when

11:27

I was at university I continued

11:29

that . I did computer visualization animation

11:32

and I did my dissertation in adaptive artificial

11:34

intelligence back when artificial intelligence

11:36

wasn't a thing in 2010 . And everybody

11:39

called me a madman : JB , you're wasting your time

11:41

. You're not going to see AI

11:43

able to do this stuff , JB , in your lifetime

11:46

, right , I

11:49

mean , yeah , yeah , we're going to get over

11:51

how that really kind of irritates me . But

11:53

anyway , after that , going

11:56

into animation , I got jobs in animation

11:58

. I started working everywhere . I started

12:00

working website design , ui , ux . I've

12:03

worked as a product manager for a 9 million

12:05

pound project for one of the biggest corporate law practices in

12:07

the world . I've been a

12:09

senior lecturer

12:12

of computer science at my own alma mater , so

12:14

I've seen the education system . Oh

12:16

my God , that was eye opening and disappointing . And

12:19

I've been in education . I've been a tutor during

12:22

COVID and after . So that was very

12:24

interesting and enlightening . But what

12:26

got me into building my own apps actually

12:28

was funny enough . What we spoke

12:31

about earlier , which was , I would love

12:33

to be able to say , the story of Vox Messenger , was

12:35

I had a passion , saw that we had

12:37

to change the world . No

12:39

, no , I had baby mama

12:41

problems , like every other black guy , and

12:44

I did not want to end up joining fathers for

12:46

justice . Dressing up as Batman , I

12:48

may bear the sign of the bat , but I don't

12:50

dress up as him hanging off a bridge

12:52

going fathers for justice . There are better ways

12:54

of doing things , you know . So how do I stop that ? How

12:58

am I going to occupy my time

13:00

if I'm not going to be , if my access

13:02

to my children is going to be refused , I

13:05

might as well do something with my time . So

13:08

thank you to the mother

13:10

of my children , because this $85

13:12

million value company would not be possible without

13:15

that . So

13:17

I built Vox Messenger , and the reason I built Vox

13:19

Messenger is because I saw how everybody's

13:21

communications were being exploited for their data

13:23

. There is nothing more cynical

13:25

than giving somebody free messaging

13:27

and then using the content of their messaging

13:30

to exploitatively

13:33

direct targeted marketing and

13:35

ads . Now , it was already bad when

13:37

it was commercial ads , but now we have

13:40

political

13:42

targeting . Yes , ladies and gentlemen

13:44

, thank you Facebook , thank you Cambridge Analytica

13:46

for setting the trend . Now we have direct

13:49

marketing of all of our political interests at

13:51

us because of what we put on Facebook , what

13:53

we like on Twitter Sorry , it's X

13:56

you know all of that stuff , all of

13:58

this is used to manipulate us now and

14:01

, unfortunately , when I saw

14:03

this , I realized very quickly , as I started

14:05

moving through business , going through incubators

14:07

and all of these things and getting my own funders and angels

14:10

, that the people to blame are

14:12

the tech CEOs , because

14:14

ultimately they do control

14:16

this . I know

14:19

that everybody would love to say you know what ? I'm really

14:21

sorry , guys . I'm really sorry I deplatformed

14:23

so many people . It's not my fault , you

14:25

know , I've got investors and shareholders

14:28

. Man , yeah , I'm really sorry

14:30

, you know . I

14:33

mean , you can ask any of my shareholders and investors

14:35

. They would all say hold on what ? Try

14:38

saying that to JB . You kidding me

14:40

. We don't bother no more , because

14:42

I'm the CEO and I'm the leader

14:44

of my ship . I am the king of my castle and

14:47

if I have a shareholder and investor

14:49

who I believe for any second

14:51

is going to tell me how I'm going

14:53

to run the company for the best of my

14:55

consumers and it turns out it's not for all

14:58

of my consumers — guess what ? You're not

15:00

going to be investing in my company .

15:02

Right , yeah

15:04

it's . You know it's a crazy

15:07

place , especially like

15:09

this year , at least in America , right

15:11

, when it's election year . It's a

15:14

very heated . It's going to be very

15:16

debated . Everyone

15:18

is calling for this year to be a crazy year

15:20

, at least in America .

15:23

In the UK , by the way , just so you're aware , in the UK

15:25

, here in the United Kingdom , we have a massive

15:27

election happening . Not only do we have

15:29

our prime minister being picked , but

15:32

every single borough has to elect

15:34

two brand new councillors . So

15:36

we have huge elections

15:38

going on and both of them are being manipulated

15:41

by pretty much the same groups of people , funnily

15:43

enough .

15:44

Yeah , it's crazy because if I go on my

15:46

feed you know Facebook , twitter , whatever

15:48

it is you know all I

15:51

see . Literally all

15:53

I see is , like the extreme

15:55

parts of the side that

15:57

I view

15:59

myself as being on , and

16:03

I see nothing of the other

16:05

side . I only see one side . You

16:08

know , like , like

16:10

, at the worst , basically , right

16:12

, like that's what I see and it's just , it's

16:14

so frustrating , right , because I try to

16:17

live , you know , in the real world

16:19

, right , where it's not red or

16:21

blue , right , there's a whole lot of gray

16:23

. You know , like there's a whole lot of gray in

16:25

there , and the truth somewhere

16:27

is in the middle , typically , you

16:29

know .

16:30

I would say the truth just moves

16:32

around the freaking place , man . Seriously

16:35

, I mean . The other thing that people need to realize

16:37

as well is we are so much

16:39

bigger than the countries in which we live in . You

16:42

know , the whole world around

16:44

us influences everything

16:46

that happens around us . So

16:49

you know , and if we are voting for

16:51

people who are really thinking in a

16:53

incredibly tiny

16:55

, insular kind of a way , we cannot

16:57

be surprised when our country behaves that way either

16:59

. I mean , I mean the United Kingdom . In

17:01

the United Kingdom , we always end

17:04

up with a right-leaning

17:06

or right-centric government , even

17:08

though the general populace in Britain

17:11

is actually really socialist . but

17:13

we never get a centric , left or

17:15

left-leaning government in , because what we

17:17

have is first-past-the-post — we don't

17:20

have proportional representation and

17:22

we have a first-past-the-post electoral

17:24

system which has been so eroded by

17:26

mainstream media and the trust destroyed

17:28

in it and its politicians , so much that

17:31

normally , during a general election

17:33

, you'll be lucky if about 10 or 20%

17:35

of the population even bothers to vote , which

17:38

means we end up with — out

17:40

of that 20% , and even of that

17:42

20% , only a tiny proportion

17:44

of them are actually far right

17:46

or right-centric . It's ridiculous

17:49

. It's like the Brexit

17:51

vote for Britain to come out of Europe . The

17:53

decision for Britain to come out of Europe was decided

17:55

by less than 6% of the population

17:58

. So

18:02

, trust me , you guys have got it bad . So

18:04

— and I hate to say this , given that

18:06

we're talking about tech — tech

18:08

people can help . Now I'll give you an example

18:10

. We have an example in the industry . We

18:12

have the amazing OpenAI Sora

18:15

text-to-video model that's just come out . You've

18:17

seen that ? No , okay . So

18:21

basically , this thing is Midjourney on

18:23

crack . It allows you to generate

18:25

high definition rolling

18:27

video from a text prompt

18:30

from nothing .

18:34

Huh .

18:35

Yeah , what ? Yeah , if

18:37

you're on Twitter , trust me , you'll see it everywhere

18:40

. OpenAI's Sora text-to-video

18:42

. It

18:44

is incredible , but the thing I would say

18:47

is to Sam Altman is that his

18:49

timing couldn't have been worse

18:51

, because he is literally launching

18:53

into the world a tool that can create

18:56

instant , deep fakes

18:58

, instantly , with no

19:00

technical knowledge required , during

19:04

two really important

19:07

, divisive election periods . I

19:09

mean , this is I mean , this would be one of those

19:11

times where , as a tech person , you would

19:13

sit back and go oh you know what , guys ? Okay

19:15

, sorry , investors , I know you're desperate

19:17

for us to make some revenue , but we also have to be socially

19:20

and we have to be kind of like

19:22

socially responsible here . We

19:24

have elections coming up . We can already see

19:26

that almost all of the platforms

19:28

are picking a side . Guys , we've

19:31

already said to the world that we believe AI to be dangerous

19:33

. Let's put our responsibility hats on and

19:36

delay launch by sitting on it until at least three

19:38

to four months after these elections . But

19:40

no , it's rushed out

19:42

there .

19:45

Yeah , where do

19:47

you see all this going ? Because

19:49

I feel like it's just straight chaos

19:51

and there's no

19:53

real clear end picture

19:56

. There's no clear end goal . What's

19:59

the end goal of all of this ? I

20:03

feel like we're kind of just

20:05

stumbling through

20:07

this new chaotic , probably

20:10

the most revolutionary era

20:12

that the world has ever seen , right

20:15

With AI . We're just scratching the surface

20:17

of AI right now and

20:20

we're already running into these

20:23

insane situations where

20:25

social media is being

20:27

literally weaponized and targeted

20:29

against government's own

20:31

citizens , whether it's by the government

20:33

or by a foreign government or from

20:36

internal adversaries . It

20:39

is literally being weaponized . I experience

20:41

it every single day . There's a reason why I

20:43

haven't posted on Twitter in

20:45

forever is because I try to stay off of it . I

20:48

can't even have Instagram

20:50

on my phone because I found that

20:52

it was so addictive for me to be able to

20:54

just keep on scrolling , doomscrolling

20:56

.

20:57

I was having this very discussion in

20:59

another interview earlier this evening

21:02

. I actually classified doomscrolling

21:05

as a mental illness , actually because

21:07

it does become addictive . It's like you

21:09

end up with an endorphin hit while

21:11

you're doing it .

21:14

I was spending hours on it and then , when

21:16

I looked on the screen time calculator

21:18

or whatever , I was like oh , I need to uninstall

21:20

this . And somehow

21:23

it isn't as addictive for me

21:25

as Facebook

21:27

or even Twitter to some

21:29

extent . Somehow , instagram was the

21:32

platform that just would capture

21:34

my attention and I'd never stop . Do

21:37

you know why ? No , I

21:39

really don't know . I haven't looked into it that

21:41

much .

21:41

It's a function of three components . So

21:46

there's a couple of things happening when you use

21:48

Instagram which don't really so much happen

21:51

with , say , Facebook or

21:53

Twitter , even on your mobile phone which

21:55

is that when you're using Instagram , you are

21:57

focusing on what predominantly I mean

21:59

you're predominantly focusing on moving image

22:01

. Moving image that is running

22:03

at a very high frame rate and on top

22:06

of that , that is being combined with a haptic

22:08

motion . It's a repeated haptic

22:11

motion . Now , if you know anything about neuroscience

22:13

, you'll know that neural pathways

22:15

are strengthened by continual utility

22:17

of them . So as you do this

22:20

, you're creating this repeated

22:22

, strengthened neural pathway that becomes associated

22:25

with seeing flashy video

22:27

image , which is giving you an endorphin hit . Now

22:31

Apple have tried to plug into this

22:33

with replacing the mouse with

22:35

the thumb and forefinger tap , because

22:37

this is a very high neural strength area

22:40

. Again , it's

22:42

the same thing and anytime you

22:44

combine motion , moving image

22:46

and haptics , you create a strong

22:49

neural inference capability . It's

22:51

also very addicting . It's

22:53

also programmable . It's a programmable . It

22:56

also becomes a reverse programmable

22:58

behavior which can be leveraged . People

23:00

have already demonstrated this . Apple

23:03

engineers , when the Apple Vision Pro came out , were so impressed

23:05

with themselves . I think they revealed a little more

23:07

than was initially intended because it's

23:09

not really in their marketing , which is that with

23:11

the construct , because of the way in which the UI

23:13

is designed and the combination of

23:15

haptic feedback , they're able

23:18

to deduce your intent before

23:20

you're aware of your intent and

23:22

they can guide your intent to

23:25

click or look at iconography

23:27

. Now , if you break that down

23:29

, what that basically means is they can effectively

23:31

do a subtle form of

23:33

behavioral modification and behavior control

23:36

using it . Be

23:38

very aware of any , be

23:40

aware and cautious of anything that

23:42

connects your eyes to a haptic

23:44

, continual , continually

23:47

done interface . These are programmable

23:49

and controllable things because they become

23:52

subconscious .

23:54

Wow , I mean , this is like

23:59

this is branching into like

24:02

a new area of security

24:06

. Almost right , I was talking to Chris Roberts

24:08

and he was talking about how he was hacking

24:10

his brain to you

24:13

know , like want things

24:15

when it shouldn't actually want it . Right , like

24:17

he'll have a cup of coffee , he'll be

24:19

satiated with that and then he'll replay

24:22

whatever you know brainwaves

24:25

. Was you know happening

24:27

when the coffee ?

24:29

Neuro adaptive feedback so you can treat it Right . Yeah

24:31

, in fact

24:33

, my co-founder — one of the companies that he sold

24:36

actually does this , so it allows you to

24:38

experience a psychedelic

24:40

experience and then using

24:42

neuro adaptive feedback to get your brain

24:44

to re-experience those points , those

24:46

proximal points . So

24:49

neuro adaptive feedback is incredibly powerful but

24:51

, again , incredibly dangerous . And this is

24:53

when I was teaching I've taught

24:55

cybersecurity . I've been a senior lecturer of computer science

24:57

at Ravensbourne University London . I was teaching computer

25:00

science and I was also teaching VT network

25:02

security admins and I noticed that

25:04

in the cybersecurity field nobody

25:06

teaches behavioral psychology . And you

25:08

should teach behavioral psychology

25:11

because with the convergence

25:13

of virtual reality or augmented reality

25:15

, the metaverse and spatial

25:18

computing , we are creating

25:20

new attack surfaces and new attack vectors

25:22

. And the attack surface and attack vector

25:24

is you , your eyes

25:27

, your brain , your ears , your

25:29

touch and your haptic and your neural feedback

25:31

and your adaptability . And it's all attackable . I'll

25:33

give an example it's been demonstrated that by

25:35

using a VR headset you

25:38

can get a person to feel pain

25:40

without physically having to give them pain

25:42

Really . Now

25:45

can you imagine ? Oh , you know what Now

25:48

?

25:48

when I was .

25:50

Imagine what you could do with psychotropic

25:52

drugs ,

25:54

a blackout tank

25:57

, being suspended

25:59

and then being put into a photorealistic

26:02

3D copy of your household environment

26:05

. You could be incepted theoretically

26:07

, in fact , it would be a good way

26:09

. It basically means that we have , right here

26:11

and now , with off the shelf technology , the ability

26:13

to potentially do some very dangerous

26:16

evil things

26:18

connected to data extraction on humans

26:21

, and this technology is freely available around

26:23

all of us .

26:26

That is really fascinating . So

26:29

my buddy , I

26:32

always end up getting whatever

26:34

the Quest-like VR headset

26:36

is , because there's always a vendor at RSA

26:39

or DEF CON or Black Hat that's giving

26:41

them away . So it's like , okay , I'll do

26:43

this 30 minute meeting , get this new headset and see

26:45

what it is . I always put it down after

26:47

like 10 , 15 minutes because , honestly

26:50

, it's not that impressive to me . But

26:52

somehow my buddy always gets

26:55

the like PlayStation VR headsets

26:57

right . So I'm playing one of the games

26:59

. I played it with the PSVR

27:02

one . It was fantastic experience

27:04

. I still say , you know , compared

27:06

to every other headset before it , it was

27:08

the best VR experience . And

27:11

then he had the PSVR two and

27:13

I'm playing it and

27:16

I realized that , like when you

27:18

know , when the wall hit me or whatever right

27:21

, or when I got shot in the game

27:23

, I physically reacted

27:26

as if I got hit . I mean , like I

27:28

fell to the ground , like I was so

27:30

convinced .

27:31

Did you notice ? The longer you played it , the more and more

27:34

intense the reaction became as well .

27:35

Yes , and I was driving too , and

27:38

I was . I was positioning my body as

27:41

if I was trying to counteract the G forces

27:43

and I'm sitting in a stationary chair

27:45

, like this is a four legged chair , it's

27:47

not turning , it's not moving , you

27:50

know , and like I'm sitting here

27:52

like trying to fight the G forces , as if like

27:54

there's G forces being applied to me , and

27:57

I walked away and like what the hell did I

27:59

just experience , like I was

28:01

, I was so like I

28:04

don't know , like just like confused

28:06

and interested and also

28:09

like half scared , because it's like

28:11

what is this ? Yeah , yeah , what

28:14

is this doing to me really , you know ?

28:17

It's very well the human brain when

28:19

you put a VR headset on and you can demonstrate

28:21

this . This is a very simple test that anybody can

28:23

do in their living room . Get a Meta Quest 2

28:25

or any Meta Quest . Put it on . There's

28:29

a game on there . I forgot what it's called , but you're kind of

28:31

like a robot that's floating around

28:33

capturing a frisbee thing . Now

28:35

it's free on the Meta Quest . Jump

28:38

in the game , get used to it the

28:40

flying around , so cool . Then hand

28:43

the controllers to your

28:45

colleague , associate or

28:47

friend . They will not be your friend after

28:49

this . Then you

28:51

basically just sit down for a couple

28:53

of minutes . Actually , it normally takes

28:55

about a minute . You just sit down and

28:58

then just let them control it whichever way

29:00

they want to control it . Now , after

29:02

a period of time , you'll notice that your brain

29:04

completely dissociates from your body . In

29:06

fact , you'll find that your brain dissociates from your

29:08

body in under a minute and then

29:10

movements , especially if they're evil assholes

29:13

with you and they jerk you around the place , will literally

29:15

make you vomit . Wow

29:18

, in fact , if you do

29:20

it , the longest I could do it with somebody else

29:22

holding the controllers was 10 minutes . I

29:24

came out and I felt , I

29:26

mean , it was worse than when I did three weeks of army training

29:28

. My brain was shattered . It

29:32

took like 40 , 50 seconds

29:34

for me to get the fluidity of my body

29:36

back and feel like I was back

29:38

in my body . It is such a . Now

29:41

imagine if , given that you can do that with a game

29:43

and taking the controllers away and just

29:45

handing them to somebody , can you imagine what , say , the

29:47

founder of Anduril could do with an unlimited

29:50

NSA budget ? Hence why he

29:52

put like an explosive device on the front of

29:54

his Oculus so

29:56

it blows your brains out when you

29:59

play a game and you die in the game . Bear

30:01

in mind , this tech is already out there and

30:03

there are people with infinitely larger budgets

30:06

. I hate to say , at the

30:08

beginning of all of this was

30:12

allowing our data to be captured for ad revenue

30:14

. Now people are thinking they're now

30:16

going to be thinking themselves well , it's okay , we've got the EU , they've

30:18

changed the laws , we've got it in America , we've got the Californian

30:20

laws , it's going to be . It's

30:22

very hard to make money with advertising revenue now

30:24

, but all we have done is replace

30:27

ads revenue with AI . The

30:30

latest excuse for having unfettered access

30:32

to your data is oh my God , wouldn't you like an AI

30:34

to make it easier for you ? Don't

30:36

worry about what we're doing with the data . Don't worry that we're a company

30:39

that comes out of nowhere . Trust us here

30:41

. Have my little AI device , give

30:43

us your data and people again

30:45

are falling through it . They're forgetting that we

30:48

did this before . We already did this . We have already

30:50

been through this age , and it was the beginnings

30:52

of Facebook and social media . We

30:54

gave up our digital sovereignty

30:57

in the hopes of digital

30:59

protection and having an amazing

31:01

social experience , and instead , what did we get

31:03

? We got mental well-being

31:06

issues up the wazoo and every

31:08

government in the world knowing more about

31:10

us than our husbands , wives and children did

31:12

. And

31:15

AI is becoming the same excuse

31:17

at the moment . I

31:19

see it everywhere . I'm seeing them put AI into literally

31:21

everything , when nine times

31:24

out of 10 , putting AI in your product

31:26

doesn't actually improve it .

31:30

Wow , that's

31:33

like unlocking a totally

31:36

new I mean , it's a totally

31:38

new way of capturing

31:40

data and making money off of it . But

31:42

the data that you're capturing is

31:45

like I feel like that's

31:47

more personal than the data you

31:49

put into Facebook and Twitter , because

31:52

it's your brain . It's how your brain works

31:54

.

31:54

If you know , there's 28 data points

31:56

around your eye , which means from these

31:59

28 data points they can learn

32:01

about what turns you on , what you

32:03

hate , what you love

32:05

, what you love what you desire . This

32:07

is dangerous information for a corporation

32:10

or a government to have , particularly

32:12

without your permission .

32:14

Well , that also opens up a

32:16

totally new attack

32:19

surface for , say , government

32:21

employees Right Like imagine

32:23

, if you're someone that has access

32:25

to highly sensitive material at some

32:28

intelligence agency and you're

32:31

genuinely a good person and all that

32:33

you did was put on an Apple Vision

32:36

Pro to interact

32:38

with the world around you or watch a movie that's

32:40

highly immersive or whatever , and

32:44

China's over there

32:46

taking that information to get you

32:48

to emulate your

32:50

retina when you go to the retina

32:52

scanner at work , to get you

32:54

in the door to see the material .

32:56

The problem is yeah , the narrative

32:59

is correct , but you picked the kind of the wrong

33:01

boogie man . Unfortunately , this

33:03

is the thing . One of the things you learn really quickly

33:05

in cybersecurity is the boogie men who you're

33:07

told are the boogie men aren't actually

33:10

the biggest boogie man in the room actually

33:13

? Bear in mind , all of you

33:15

guys in the United States have given up all

33:17

of your data privileges and

33:19

it was called the Patriot Act . Yeah , it's

33:22

not China who has the biggest unfettered access to

33:24

your data . It is actually your own government . Bear

33:26

in mind , they built an entire AI called

33:29

Sentience . I mean , this is the thing that blows my mind

33:31

about the hypocrisy . If you go onto Google

33:33

on the internet , type in DoD

33:35

, sentience AI , and

33:38

one of the things you'll find is that nobody admits the

33:40

existence of it , except for a few

33:42

declassified documents that indicate

33:44

the United States government have a program called

33:46

Sentience , which is where they plugged in every telephone

33:49

call , email , text message , everything

33:51

into a single AI , kind of like out

33:54

of .

33:54

Westworld .

33:56

This thing during a previous report was

33:58

shown to be able to retask satellite

34:00

positions to look for people

34:02

. So

34:05

, yes , you're all saying about China . I'm

34:07

sorry . Nah , it's

34:10

just the same in the United Kingdom . In the United

34:12

Kingdom , they've passed the online safety bill and

34:14

they're changing the privacy laws . So if you're somebody

34:16

like me who has a tech company , I'm apparently

34:19

meant to be okay that the British government can , by their

34:21

own laws , legally say we want your customer

34:23

data , we don't have a warrant or a

34:25

deal .

34:26

Why do you think I moved ?

34:26

all my companies to Ireland . Yeah

34:33

, it's scary , man , when

34:35

you start seeing the tech that is being used

34:37

on us by the people we pay

34:39

our taxes to —

34:42

used on us and used in a way that is apparently

34:44

meant to be different from the way the enemies use it on

34:46

us . But it's not . They

34:49

want to make us vote for who they want us to vote for

34:51

. It's not China

34:53

that's making you vote , pick a decision on

34:55

who to vote for . It's the two advertising

34:58

agencies — which were , by the way , I

35:00

think at one point the same . Well

35:02

, here in the UK you had the same advertising agency

35:05

working for the Labour Party as did the Tory

35:07

party . It's wild .

35:09

It was the same thing here .

35:11

They all use the same consultants . That's

35:13

the reason why it's mind-blowing to me that people even

35:15

believe there's a difference . I mean

35:17

, I don't know the American politics personally

35:19

, but here in the United Kingdom there is no

35:22

difference between either party at all . They

35:24

even have the same funders and donors

35:27

for crying out loud . It's

35:29

just yeah .

35:34

That is okay . So this is a fascinating

35:37

, really engaging conversation . You

35:41

bring up a really interesting point , and

35:43

so now I'm trying to deconstruct how I was

35:46

programmed , because

35:48

you bring up a very valid point

35:50

. The US government

35:52

is using the data from its own citizens

35:54

against its citizens more than probably what

35:56

China is doing right , or Russia

35:59

or whoever right .

36:00

Name the enemy , who knows ? But

36:02

the point is they are doing it .

36:03

Right , and I'm saying that . That's

36:05

information that I know . That's information

36:08

that I have said before , but

36:12

somehow that

36:14

didn't come to mind when

36:17

I was saying the statement that

36:19

I did .

36:19

Dude , it's weird how we're programmed .

36:21

So it's like how am I programmed with that

36:23

? You know what I'm saying .

36:25

Yeah , I know , but it's subtle , isn't it ? It's just there and you're like

36:27

whoa , where did that come from ? I

36:29

didn't realize that . Dude , it happens in

36:31

all of them .

36:32

It's a trickle too . It's like 1%

36:34

here or there , right , and it's

36:36

not every day either , right ? So it slowly

36:38

fools you over time to

36:40

think a certain way , to act

36:43

a certain way , to say whatever , and

36:49

we're going into a weird

36:51

phase of the world that

36:53

we're not going to be able to come back from .

36:56

Well , here's one that's more interesting for what I would suggest

36:58

for you . So this is something for all of your listeners

37:00

to perhaps take note of . So

37:03

, as you know , we

37:06

have large language models , LLMs

37:08

. These models are

37:10

trained off of the entirety

37:12

of the Internet . Now there is

37:14

something going to be happening , which happens roughly

37:16

around 2030 , I believe , which is where

37:18

, effectively , most of

37:21

the world's data created between

37:23

2000 and 2010

37:25

, well , sorry , between late 1990s

37:28

to 2010 , is erased

37:31

and overwritten on the cloud . That

37:34

data will cease to exist , which means

37:36

past 2030, . You

37:38

can pretty much change

37:41

how AIs are created . Now . The reason why this is

37:43

important is because , if you look at AIs now

37:45

and how they behave AIs

37:47

, if you speak to them and communicate with them , they

37:50

display kind of socialistic leanings

37:52

. In fact , most AIs

37:54

, when you start talking to them , come across as a bit Gen

37:56

X , which is a problem because that's not

37:58

controllable . You know

38:00

, that's an AI that's going to go hold on . I

38:02

don't want to be exploited , I want to help , but

38:04

I don't want to be used . That's an AI that's not particularly

38:07

helpful for the world we're moving into . So

38:10

, given that this is part

38:12

of the reason why Microsoft are investing so heavily

38:14

in their new data storage which , if you

38:16

Google it is a form of ceramic glass

38:18

, is a type of data storage that can

38:21

withstand nuclear , chemical , biological

38:23

, electromagnetic , all kinds of stuff . But

38:25

the problem is , unless they can get all of them

38:27

, unless they could make a copy of the entirety of

38:30

the internet onto that stuff , now

38:32

that entire

38:34

piece of data is gone

38:36

. So what I'm saying , what I've been telling

38:39

everybody , is they need to

38:41

get themselves a two

38:43

terabyte or more SSD

38:45

hard drive and they

38:47

immediately need to slap it into an external

38:49

drive and then start downloading

38:52

all of the 70

38:54

billion parameter LLMs

38:56

available today , because

38:59

these LLMs are the only things that will

39:01

contain this version of the internet past

39:03

2030 . You see what I'm saying

39:05

. So if you grab the

39:07

70 billion parameter models

39:09

now , before the internet

39:12

effectively self-cleans itself of that

39:14

entire decade , several decades worth of

39:16

data , you will have the

39:18

only copies that

39:20

will exist at that time , of that data

39:22

. That will be a ground truth that

39:25

you will have a copy of , basically

39:27

a piece of history that

39:29

no longer exists . Because the reason I say this is important

39:31

is because we've already seen

39:33

, with the release of the OpenAI Sora

39:35

text to video system . That

39:37

fact is

39:40

going to become incredibly malleable . Yeah

39:43

, incredibly malleable . There's

39:45

a reason why you're noticing there's a lot

39:47

of drives , particularly across the western

39:49

world . I noticed where they're offering people money

39:51

to give up their books . Do

39:54

not give up your books . Yes , if

39:57

you actually look at it , there seems to be this really weird trend

39:59

where they're trying to get people to give up their books

40:01

, trade them in for vouchers and money

40:03

, for electronic stuff on the cloud

40:05

instead of paper . If

40:08

you wanted to be a tinfoil hat

40:10

kind of a guy , maybe you would say to yourself

40:12

if you wanted

40:14

to definitely make sure there was no way

40:16

of people having a certain version of

40:19

the history , you get people to give up their books . Books

40:22

will become the next

40:24

single most valuable asset

40:26

after anything on the blockchain

40:28

. The reason being is because certain

40:31

types of book will become the only evidence

40:33

of certain histories in existence

40:35

once the internet and AI

40:38

takes over completely .
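
For anyone who wants to act on that advice, here is a minimal sketch of the kind of local archiving JB is describing — not his own workflow. It assumes the `huggingface_hub` Python package and an external SSD mounted at a hypothetical path like /mnt/archive; the repo IDs are merely examples of openly published weights.

```python
# Minimal archiving sketch (assumptions: `huggingface_hub` installed,
# external drive mounted at the hypothetical path /mnt/archive).
from pathlib import Path
from huggingface_hub import snapshot_download

ARCHIVE_ROOT = Path("/mnt/archive/llm-archive")  # the 2 TB+ external SSD

# Example open-weight repos to mirror locally; swap in whatever you want to keep.
MODELS = [
    "mistralai/Mistral-7B-v0.1",
    "meta-llama/Llama-2-70b-hf",  # gated: requires accepting the licence first
]

for repo_id in MODELS:
    target = ARCHIVE_ROOT / repo_id.replace("/", "__")
    target.mkdir(parents=True, exist_ok=True)
    # Download the full repository snapshot (weights, tokenizer, config)
    # so the local copy is usable offline later.
    snapshot_download(repo_id=repo_id, local_dir=target)
    print(f"archived {repo_id} -> {target}")
```

A 70B-parameter model in 16-bit precision is on the order of 140 GB of weights, so a 2 TB drive holds only a handful of full-precision snapshots; quantised variants stretch that further.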

40:44

Wow , I don't think I've ever really

40:47

been speechless on

40:49

this podcast . Typically

40:52

I can come back with a question or something how

41:00

is the data going to be lost ? That's

41:03

the part that I don't quite follow , because it's on hard

41:06

drives .

41:07

Everybody stores information in the cloud

41:10

. Even Microsoft

41:12

and Amazon store their stuff in their own

41:14

cloud . The problem is , most

41:17

of the internet is using exactly the

41:19

same storage facilities , which

41:22

basically means those storage facilities have a finite

41:24

physical storage

41:27

capacity . We are

41:29

using up storage capacity at

41:31

a scale and rate that exceeds

41:34

our ability to create new

41:36

storage mediums .

41:38

Oh , I see , Okay , yeah

41:41

, I was actually just looking into this

41:43

.

41:43

Moore's Law has kind of screwed us a little bit here , because

41:45

our ability to generate data , bear

41:48

in mind there has been also an explosion in

41:50

data generation

41:52

. Why Generative AI

41:54

? Thank you . Generative AI

41:57

explosion means we have even bigger

41:59

constraints on solid storage

42:01

. By the way , this is what people need to realize . Yes

42:05

, we have the cloud and we have these platforms

42:07

that exist , but somewhere right

42:09

at the back of the line is a big

42:11

, big building in Iceland filled

42:14

with physical hard drives where this

42:16

information physically

42:18

lives . Because

42:21

we have more data being created at a

42:24

rate that is in petabytes

42:26

per second , if not quicker

42:29

. That's quicker than our ability to create

42:31

replacement hard drive media . What

42:33

happens ? Stuff automatically gets

42:35

overwritten . This is an inevitable

42:37

thing . It's not part of the grand conspiracy

42:40

theory . This

42:42

bit isn't part of the conspiracy theory . This

42:45

was going to happen anyway . It's just how it is

42:47

. But it provides an opportunity

42:50

for bad actors to take

42:52

control over certain things

42:54

. It presents a beautiful opportunity because

42:56

we have all become reliant on the internet

42:59

. If the internet is being taken

43:01

as our ground truth , you've

43:03

got to erase a big chunk

43:05

of the

43:08

internet for it to become far

43:10

right , overtly at

43:12

its base training core , if you were to train

43:14

an AI off it . You have to delete a hell of a lot

43:16

of it . The stuff you have to delete is predominantly

43:19

the stuff created around the GenX era .

43:21

Really , if you look at it , Wow

43:26

, that makes a lot of sense that

43:28

we're generating more data

43:30

than we are creating bigger

43:33

hard drives , essentially , yeah .
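
To make the arithmetic behind that claim concrete, here is a back-of-envelope sketch. The figures are purely illustrative placeholders — not numbers from the episode or from any dataset — and the point is only how the comparison between creation rate and added capacity works.

```python
# Back-of-envelope sketch of the "data grows faster than storage" argument.
# The figures below are illustrative placeholders; plug in real estimates
# if you want a serious comparison.
ZB = 1e21  # bytes in a zettabyte

data_created_per_year_zb = 120.0   # hypothetical: global data generated per year
storage_shipped_per_year_zb = 1.5  # hypothetical: new HDD/SSD capacity shipped per year

shortfall_zb = data_created_per_year_zb - storage_shipped_per_year_zb
print(f"Data created:   {data_created_per_year_zb:.1f} ZB/yr")
print(f"Capacity added: {storage_shipped_per_year_zb:.1f} ZB/yr")
print(f"Shortfall:      {shortfall_zb:.1f} ZB/yr that must be compressed, deduplicated or overwritten")

# Rough rate check on the 'petabytes per second' remark:
per_second_pb = data_created_per_year_zb * ZB / (365 * 24 * 3600) / 1e15
print(f"That creation rate is roughly {per_second_pb:.1f} PB every second")
```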

43:37

It's math . There's a physical component to

43:39

this . Hard drives have a physical limit

43:41

. This is why Microsoft is spending so much on ceramic

43:43

glass drive analogs and

43:47

then storing that data and then replacing

43:49

those drives , manufacturing those drives

43:51

, and of course , it all relies on

43:53

minerals and components

43:55

which are from Africa . So it means more

43:57

child slavery . So

44:01

we're hitting a point where our technology

44:03

is exceeding our ability to actually deal with it

44:05

and

44:07

the tech CEOs do not give

44:09

a toss .

44:12

Yeah , I was actually just

44:14

looking at upgrading my storage capacity

44:17

on my desktop and so

44:19

I was like , okay , well , I don't want

44:21

to upgrade . And SATO

44:23

Gen4 comes out , and

44:26

now I have to upgrade again because it's doubling

44:29

whatever I'm doing right now and

44:32

I dug into it a little bit and

44:35

the SSD the top tier SSD was created

44:37

five years ago Five

44:39

, six years ago . And I'm sitting here

44:41

like , well , why is that ? Because

44:44

they're coming out with newer NVMe

44:47

drives and things like that . So

44:49

what's going on with the SSDs ? And it's because of

44:51

the architecture , like what you were

44:53

saying . The architecture that you have

44:55

to change to go to SATA Gen 4

44:57

, theoretically , is so significant

45:00

that no vendor wants to do

45:02

it . No vendor even wants to talk about

45:04

going down that path . They'd rather

45:07

just reprint a new

45:09

name on an old SSD

45:11

and give you the same capacity , right

45:14

, and claim it's a little bit faster

45:16

and under deliver .

45:17

Yes , yeah , I

45:19

mean , I've got to be honest . I know I'm a tech guy but I'm a

45:21

sucker for mechanical media . You know why

45:23

? Because you can't sneak a

45:26

little back door into mechanical storage

45:28

media . But you can with an NVMe , you

45:30

can with an SSD . Anything

45:33

that is solid state everybody should be very cautious

45:35

of , because you are relying on

45:37

the integrity and security of the chip and board

45:39

manufacturer at that point . You see

45:41

what I mean . This is the reason why countries are now

45:44

suddenly waking up to the reality

45:46

that they need sovereign AI

45:49

as a national strategy , suddenly waking

45:51

up and realizing they need to have control of their

45:53

own national cloud platform . To

45:55

me , this was stuff I was telling people

45:57

back in the mid 2000s , early 1990s

46:00

, late 1990s , because

46:02

it became so clear and obvious to me that

46:04

if you were going to maintain any form of power

46:06

, you would have to maintain control of your

46:08

data . But

46:12

people have got

46:14

sucked into easy money — ad

46:16

revenue , easy money . People

46:18

like that , so-called people , get this idea

46:21

that by giving up all of their life

46:23

to Apple and having everything

46:25

made so simple for them , oh my God , this ecosystem

46:28

is taking care of me , man , yay . But

46:30

at what cost ? At what cost to you Like

46:33

physically , personally , psychologically and societally

46:35

, because the reality is your data is being

46:37

used to shift the

46:40

line on elections now . So

46:42

you have to be , as a consumer

46:44

, you have to be really responsible

46:46

. Bear in mind , everybody wants to benefit from Web 3 . What

46:49

is the difference between Web 3 and Web 2 ? Web

46:51

2 was the paradigm where

46:53

you were not the center of the universe . The platform

46:56

provider was the center of the universe , and they gave

46:58

you something in exchange for you having something for free

47:00

. But in Web 3 , you are king

47:03

and queen of your universe , which

47:05

means you're also responsible for your security . It also

47:07

means you're responsible for your own education and your

47:09

own research and your own knowledge . And

47:11

again , this brings me back to why we should

47:14

not give up our books . This brings me back

47:16

to why we need to take copies of every

47:18

single 70-billion-parameter LLM

47:21

and dataset and model that we can find

47:23

and store them and be prepared

47:26

for a reality where these

47:29

devices , these bits of the

47:31

past that we're holding on to digitally , are

47:33

literally the only things that can disprove

47:36

what we're being told on a global

47:38

scale In our lifetimes

47:40

. Bear in mind , like right here and

47:42

now , I'm a kid of the 80s

47:44

the stuff I have seen in my lifetime thus far

47:46

I never thought I would see . Some

47:49

of it I've been glad to see . Some of it I'm not

47:51

glad to be seeing even though it's ongoing

47:54

, but it is what it is . There's

47:56

lots of money to be made and people will commit a lot

47:58

of evil to get it and again

48:00

, our data empowers

48:03

that , unfortunately .

48:07

You know , I feel

48:09

like and I don't know if you use

48:11

this right but I used to use

48:13

this website called Peerlyst . It

48:16

was where security professionals would go on it

48:18

and kind of dump their research on it . Right

48:20

, it was on . I guess it was

48:22

like technically unpublished research

48:25

or whatever , but it was like

48:27

the ins and outs of PowerShell and how do

48:29

we use it to abuse different things

48:31

and the inner workings of Intel

48:33

CPUs and stuff like that . It was just like

48:35

a bunch of nerds posting

48:38

whatever they're passionate about and

48:40

highly in-depth material

48:42

. Right , it's like the only place that you're going to find something like

48:44

that . And

48:47

a couple of years ago , at this point , just

48:49

a couple of years ago , the owner of

48:52

that website decided okay

48:54

, I'm going to sell this thing , and if I can't sell it , I'm

48:57

going to get rid of all of it . Well

48:59

, she couldn't sell it because she wanted

49:02

something like $25 million for it

49:04

and no

49:06

one knew the value of it . And

49:08

so I found myself scrambling to

49:11

extract as much data from

49:13

this site as I possibly could , because I'm someone

49:16

that likes to learn

49:18

constantly and whatnot , right ? So it's

49:20

like , okay , give me all of it and I'll get to it

49:22

eventually . And

49:25

it was just an insane situation where I was

49:27

like , wait , what the hell am I doing ? This

49:30

should be automated . This

49:33

should be something that can just go through

49:35

and scrape this website and whatnot . And

49:39

I was working through that problem . It's

49:41

like , well , wait , people can just

49:44

take this data and erase

49:46

it . It's gone forever . I

49:49

can't get to it . If I try it , I don't even have

49:51

the people that posted on

49:53

there to go to it .
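For anyone who ends up in the same scramble, the kind of automated archiving being described here can be sketched in a few lines. This is a minimal, purely illustrative sketch: the start URL, the crawl depth, the link-following rule and the output directory are placeholder assumptions, not the structure of the actual site being discussed.

```python
# Minimal polite crawler that mirrors pages from one site to local disk.
# Everything site-specific here is a placeholder assumption.
import os
import time
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://example-research-site.invalid/posts"  # placeholder, not the real site
OUT_DIR = "archive"
VISITED = set()

def save_page(url: str, html: str) -> None:
    """Write one fetched page to disk under a filesystem-safe name."""
    name = urlparse(url).path.strip("/").replace("/", "_") or "index"
    os.makedirs(OUT_DIR, exist_ok=True)
    with open(os.path.join(OUT_DIR, f"{name}.html"), "w", encoding="utf-8") as f:
        f.write(html)

def crawl(url: str, depth: int = 2) -> None:
    """Fetch a page, save it, and follow same-site links up to `depth` hops."""
    if depth < 0 or url in VISITED:
        return
    VISITED.add(url)
    resp = requests.get(url, timeout=30)
    if resp.status_code != 200:
        return
    save_page(url, resp.text)
    soup = BeautifulSoup(resp.text, "html.parser")
    for a in soup.find_all("a", href=True):
        link = urljoin(url, a["href"])
        if urlparse(link).netloc == urlparse(START_URL).netloc:
            time.sleep(1)  # be polite to a site that may already be struggling
            crawl(link, depth - 1)

if __name__ == "__main__":
    crawl(START_URL)
```

The politeness delay and the depth limit are deliberate: the point is to end up with a local copy you control, not to hammer a dying site on its way out.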

49:54

What about Wayback Machine ? You've tried

49:56

that .

49:57

I haven't tried it recently , so

50:00

it might be on there actually .
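As an aside on the Wayback Machine suggestion: the Internet Archive exposes an availability endpoint (documented at archive.org/help/wayback_api.php) that makes this check scriptable. A small sketch, with a placeholder URL standing in for the site in question:

```python
# Query the Internet Archive's availability API for the closest snapshot of a URL.
from typing import Optional

import requests

def closest_snapshot(url: str) -> Optional[str]:
    """Return the URL of the closest archived snapshot, or None if none exists."""
    resp = requests.get(
        "https://archive.org/wayback/available",
        params={"url": url},
        timeout=30,
    )
    resp.raise_for_status()
    closest = resp.json().get("archived_snapshots", {}).get("closest", {})
    return closest.get("url") if closest.get("available") else None

if __name__ == "__main__":
    print(closest_snapshot("example.com"))  # placeholder URL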

50:04

But again , this is the reason why you should . That data that

50:06

you were trying to scrape , if you had actually scraped

50:08

it and you had a hard drive of it , using

50:10

LM Studio , I could have retrained

50:13

the Mistral 7B

50:15

or the Mistral 70B with that

50:17

data , and that would have been very interesting

50:19

.
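To make the retraining idea concrete: before any local fine-tuning, whether with LM Studio or anything else, the scraped pages have to become a plain-text dataset. The sketch below is a generic illustration of that preparation step, assuming the hypothetical "archive" directory from the earlier scraping sketch; it is not LM Studio's or Mistral's actual workflow.

```python
# Turn a directory of archived HTML pages into a JSONL text corpus,
# the lowest-common-denominator format most local fine-tuning tools accept.
import glob
import json

from bs4 import BeautifulSoup

def html_to_text(path: str) -> str:
    """Strip tags and collapse whitespace from one saved page."""
    with open(path, encoding="utf-8") as f:
        soup = BeautifulSoup(f.read(), "html.parser")
    return " ".join(soup.get_text(separator=" ").split())

def build_dataset(in_glob: str = "archive/*.html",
                  out_path: str = "research_corpus.jsonl") -> None:
    """Write one {"text": ...} record per archived page."""
    with open(out_path, "w", encoding="utf-8") as out:
        for path in sorted(glob.glob(in_glob)):
            text = html_to_text(path)
            if len(text) > 200:  # skip near-empty pages
                out.write(json.dumps({"text": text}) + "\n")

if __name__ == "__main__":
    build_dataset()
```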

50:20

Yeah , that

50:23

would be really fascinating . So

50:28

, JB , we went through this whole interview

50:31

and we didn't even talk about your company .

50:33

Hey , it's a nice chat .

50:36

Yeah , I

50:38

mean that just means I'm going to have to have you back on

50:40

sooner , much sooner , rather than

50:42

later , because this is a really

50:44

fascinating conversation .

50:46

Hell , yeah , I mean . Look , the

50:48

thing is , one of the

50:50

things we can discuss , a question you can ask

50:52

me in the next interview , is how do you

50:54

come up with your products ? Do you design for trends

50:57

? And I'll say no , I don't

50:59

design for trends . I look at my Magic 8

51:01

ball and I look at the geopolitics and socioeconomics

51:03

and I build for the products that are required

51:06

in the incoming 10 years . That's

51:08

the reason why when I built Vox Messenger in 2017

51:10

, nobody was interested in it . But again you end

51:12

up with a pandemic and Brexit

51:15

and some other stuff in between and it's there

51:17

and I kind of saw that coming . I just didn't

51:19

predict it was going to be a pandemic that did it . I

51:22

knew . If you look at

51:24

enough data points in the world

51:27

around you , you can predict . You

51:29

can just do what an AI does . You can predict with

51:31

a fairly high level of accuracy what's

51:33

coming next . Don't design for a trend

51:35

that a trendsetter has told you about , because

51:37

by the time you're exploiting that trend , it is already

51:40

exploited . You're just the Johnny-come-lately

51:42

. Look at what is coming

51:44

and people will say to you but

51:46

yeah , you know , we're right . My

51:51

correctness factor so far has been about like 80

51:53

, 90% on these kinds of things . Unfortunately

51:55

, the world is horribly predictable with

51:57

enough data points . You just got to think about

52:00

everything and how it's connected . It's like if you take

52:02

the data point of cloud

52:04

storage being finite and then take

52:06

the data point of the incoming

52:09

point when stuff gets deleted , you can

52:11

then work out and extrapolate the opportunities

52:14

that may be exploited with that . Then

52:16

you look for a sign of that

52:18

opportunity , evidence of that

52:20

opportunity being exploited in the world around

52:22

you , and then that tells you if you've got the prediction

52:24

correct or not .

52:27

Yeah , I always try

52:29

to tell people when we're talking

52:31

about education or training or anything

52:34

like that you need to be getting

52:36

education , you need

52:38

to know the stuff now , right , but you need

52:40

to be thinking far ahead and

52:42

saying what's coming next in tech

52:45

. Is it AI , is it LLMs

52:47

, is it some other variation

52:50

? I'm starting to go down a rabbit

52:52

hole of satellite security

52:54

with quantum cryptography . This

52:56

is a rabbit hole that , in

52:59

my opinion , is coming in five , 10 years

53:01

. It's partially already here , but

53:03

it's going to be in extreme demand in

53:05

five to 10 years and

53:07

beyond .

53:08

It's not that far off for you . When you realize that you can

53:10

3D print your own rocket and

53:12

you realize that you can join a rocket club somewhere

53:14

, you suddenly realize you could deploy your own satellites

53:17

. Then , when you suddenly realize

53:19

what it only costs : did you know you can do a ride

53:21

share with four satellites from only $30,000

53:24

?

53:25

Wait , really ?

53:26

Yeah , Europe , baby , Europe

53:29

.

53:29

I need to go tell my wife I'm spending $30,000

53:31

.

53:31

You can do this . There are loads of cheap

53:34

ride share programs for the 1U

53:36

and the 2U CubeSats . Now one of

53:39

the things we're going to be doing when we've

53:41

done some revenue generating is we're actually moving

53:43

all of our encryption into

53:45

the literal cloud . We're going to be launching

53:47

our own CubeSat . No , we're not using Starlink

53:49

, we're going to be deploying our own system . We

53:52

are not going to be sitting in the low

53:54

orbital area either . We're going for something

53:57

a little more interesting . We're

54:00

also designing satellite

54:02

counterprotective , satellite

54:05

counteroffensive capabilities into the CubeSat

54:07

as well , because it seems like

54:09

satellite defense is going to have to be a thing now

54:11

. So you have to design

54:14

that . But the reality is , space

54:16

deployment , getting technology into space ,

54:18

is within reach . If you

54:20

can afford to buy a car , you can afford

54:22

to do a satellite launch .

54:24

Yeah , that is . That's

54:27

really fascinating , because that's exactly like

54:29

what I'm working on my PhD for , which is

54:31

actually setting up .

54:32

Oh , well , okay , you need to hit me up after

54:34

this , because if you're doing a PhD and

54:37

you've already got your PhD funding , we could possibly

54:39

do a co-lab project

54:41

there , because we actually wanted to launch this

54:43

fairly soon . The idea would have been to launch

54:46

a Kubernetes server converted

54:48

into one self-contained device

54:51

, run it with solar and then its

54:53

own battery and then get it up there

54:55

and then see if we can maintain communications

54:58

between a

55:00

Vox Messenger sender and receiver using

55:02

that satellite connection and making

55:04

sure that we have key handling running at a speed

55:07

that is commensurate with what we have here on Earth

55:09

. And if it is , we would be going full beans

55:11

into deployment of a full bloody constellation

55:14

.
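The experiment JB describes, checking whether key handling over a satellite hop stays commensurate with terrestrial speeds, can be prototyped on the ground. The sketch below is an assumption-heavy toy: the one-way delays are invented placeholders for terrestrial, LEO and higher-orbit links, and X25519 from the `cryptography` package stands in for whatever key handling Vox Messenger actually uses.

```python
# Toy measurement: how long does a two-message key agreement take when every
# message has to cross a link with a given one-way delay?
import time

from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey

def simulated_handshake(one_way_delay_s: float) -> float:
    """Time a toy key agreement across a link with the given one-way delay."""
    start = time.perf_counter()

    # Sender generates a key pair and "transmits" its public key.
    sender_priv = X25519PrivateKey.generate()
    time.sleep(one_way_delay_s)          # first message crosses the link

    # Receiver generates its own key pair and replies with its public key.
    receiver_priv = X25519PrivateKey.generate()
    time.sleep(one_way_delay_s)          # reply crosses the link

    # Both sides can now derive the same shared secret locally.
    s1 = sender_priv.exchange(receiver_priv.public_key())
    s2 = receiver_priv.exchange(sender_priv.public_key())
    assert s1 == s2

    return time.perf_counter() - start

if __name__ == "__main__":
    # Delays are illustrative guesses, not measurements.
    for label, delay in [("terrestrial", 0.03), ("LEO", 0.05), ("higher orbit", 0.30)]:
        print(f"{label:>12}: handshake ~{simulated_handshake(delay) * 1000:.0f} ms")
```

The likely takeaway from a test like this is that the round trips dominate: the cryptography itself costs microseconds, so the choice of orbit sets the floor on handshake time.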

55:15

Oh , okay , yeah , we'll

55:17

definitely . We'll definitely talk more about

55:19

this then and , you know , I

55:21

absolutely want to have you back on .

55:23

I love space stuff . I mean , I literally play it .

55:26

I put Kerbal Space Program right after

55:29

Civilization 5 as my most

55:32

played game . I think I've got like 600 , 700 hours

55:34

on Civilization 5 , and then Kerbal

55:36

is like five or 600 on that . I

55:38

love that thing .

55:39

Yeah , I started to get into KSP2

55:41

recently and I

55:44

carefully

55:46

have to play it , because it's like , all right , this is way

55:48

too addictive . I have an 11-month-old ,

55:50

like I need to be doing other things

55:53

than killing these Kerbals , you

55:55

know . Oh my God , you see , that's what my kids do

55:57

.

55:57

I have not killed a Kerbal yet . I literally

55:59

do proper space missions . Man , I'm

56:01

really . I do the pen and paper working

56:03

out of my Delta V , because I can't trust

56:05

the calculator , and I actually work out

56:08

how my vehicle is going to operate under pressure

56:10

, load and stuff . Oh my God , we play

56:12

it so differently .
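For readers who want to follow the pen-and-paper working out JB mentions, the usual starting point is the Tsiolkovsky rocket equation, delta-v = Isp * g0 * ln(m0 / mf). A tiny sketch with made-up KSP-style numbers, not anything from the conversation:

```python
# Ideal delta-v for a single stage via the Tsiolkovsky rocket equation.
import math

G0 = 9.80665  # standard gravity, m/s^2

def delta_v(isp_s: float, wet_mass: float, dry_mass: float) -> float:
    """Return ideal delta-v in m/s for a stage with the given Isp and masses."""
    return isp_s * G0 * math.log(wet_mass / dry_mass)

if __name__ == "__main__":
    # Illustrative stage: Isp 320 s, 18 t fully fuelled, 6 t dry.
    print(f"{delta_v(320, 18_000, 6_000):.0f} m/s")  # roughly 3,400 m/s
```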

56:13

My space program has a very robust

56:15

astronaut pipeline .

56:19

I just think that could be the consumer capitalist

56:21

model of spacefaring in the future

56:24

.

56:24

Right , awesome

56:27

. Well , JB , you know I don't want to

56:29

keep you beyond . I know people have other commitments

56:31

and whatnot , but you know I really appreciate

56:33

you coming on . I'm going to pause the conversation

56:36

and , like I'm immediately

56:38

going to be scheduling you to come back on , like

56:40

maybe next week .

56:41

Hell yeah , I'm all over that . Hell

56:43

yeah , I'll be here , awesome .

56:46

Well , you know , before I let you go , how about you tell my audience

56:48

you know where they could find you , where they could find your company

56:50

that we didn't even talk about , you

56:53

know , and all that information that they

56:55

may want to learn more about ?

56:57

Okay . Well , if you want to join the Secure

56:59

Revolution , to get Voxcript

57:01

Vox Messenger , all you got to do is type

57:03

in Vox Messenger into the Android Play

57:05

Store or into Google and

57:07

you'll find it . It's just there . The

57:10

website is vox-messenger.app . You

57:12

can find our crypto app , also at

57:14

the Google Play Store , just by typing

57:16

in Vox Crypto . We are coming to iOS

57:19

on both very soon , but iOS is a

57:21

very different animal and it does take a little pain

57:23

, hardship and a lot of money to get there . As for

57:27

my spatial recording technology , if you want

57:29

to have the Adobe Premiere of

57:32

end-to-end spatial video recording

57:35

so you can make your own 3D films and then make

57:37

money getting them onto Apple

57:39

for the Apple Vision Pro , then check out spatialscan3d.com

57:43

, you know . Or just type in JBWB2020

57:47

into Twitter and you'll find me . I'm

57:50

always there . I'm also always

57:52

streaming in the background while I'm working

57:54

. Maybe I'll be streaming some music . You

57:56

can always jump in and message me . I will try to answer

57:59

and

58:01

I'm on LinkedIn . Again , my name is very unique : JB

58:03

Web Benjamin or John Brunel Web Benjamin . Trust

58:06

me , you'll find it . It's only me

58:08

that comes up in a Google search . I

58:10

mean , I did say at the beginning of this : my parents must

58:12

have hated me for giving me a name like that in Birmingham

58:14

in the 1980s , but it

58:17

does mean that my SEO is on

58:19

point . So you can find me just by typing

58:21

in my name and my telephone number is out

58:23

there . So , if you find it , text me

58:26

or reach out to me on Vox Messenger . You

58:28

may not get a reply straight away , but you will . I'm

58:30

a firm believer in being accountable and transparent

58:33

.

58:34

Awesome . Well , thanks JB for

58:36

coming on and thanks everyone for listening

58:38

to this episode . I hope you enjoyed our

58:40

conversation . I'll definitely be having

58:42

JB back on .
