Future of [energy, content, food] (Friends)

Released Friday, 9th February 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.


0:14

Welcome to Changelog and friends, a

0:17

weekly talk show about the way of

0:19

the future. Thanks

0:21

as always to our partners

0:24

at fly.io, the home of

0:26

changelog.com. Push your app

0:28

close to your users, find out

0:30

how at fly.io. Okay,

0:33

let's talk. Hello

0:41

friends, Jared here. Today we're taking

0:43

you back to the hallway track

0:45

at THAT Conf one more time.

0:48

We have three fun conversations I think you're

0:50

going to enjoy. One note

0:52

on the audio for these hallway track episodes.

0:55

We leave the background noise in on purpose

0:57

to bring some of the vibe of actually

1:00

being there. We understand this

1:02

is distracting to a few folks, but the trade off

1:04

is worth it in our opinion. If

1:06

you strongly disagree, let us know. We

1:08

appreciate you for listening and we're listening too.

1:11

Okay, first up it's Sam Goff

1:13

from Minneapolis, Minnesota, who spent the

1:15

last three years at Hewlett Packard

1:17

Enterprise, but left recently to found

1:19

a startup that's still in stealth

1:21

mode. Okay

1:30

so these are like SM57s or something? Yes.

1:34

Okay. Are you an audio nerd? I

1:36

have a recording studio. Oh.

1:39

Yeah. These are pretty prime for

1:41

recording studios. Yeah. I write and record

1:43

music. I play five instruments. Nice.

1:45

Which ones? Piano, cello, guitar, bass

1:47

and drums. Not

1:49

at the same time though. It's

1:51

difficult. One man

1:54

band, right? I am ambidextrous though. Okay, so you

1:56

can go two at once. Yeah.

2:00

That's cool For

2:02

you, as you can do the fretting. Yeah.

2:04

Yeah. No, I when I was younger I

2:07

traveled to New Orleans and you see the

2:09

one-man band and they get they

2:11

got the you know The foot thing the

2:13

foot thing. What are they called the tambourine?

2:15

The tambourine? Yeah horns and stuff like that

2:17

Yeah, all right. Is this just for a

2:19

level check? Is that what's going on? Okay,

2:22

this is what we do is we just talk to

2:24

people. Okay, so it's very relaxed. Okay It's not gonna

2:26

feel like an interview. It's gonna feel like okay, three

2:28

of us are just talking. Like, I'm interested about your

2:32

musical aspirations or, okay, life.

2:35

Okay, so life. So

2:37

I had an offer to come

2:40

out to the Twin Cities and be

2:42

professionally produced. Okay. Some of the people

2:44

that I was working with had worked

2:46

with Janet Jackson. Nice. Yeah so

2:48

but I got out there and discovered that

2:51

the music business was not how I

2:53

would like to make my living. Okay.

2:55

And I pivoted to the other thing

2:57

that I was pretty decent at at

2:59

the time, and that was computers.

3:03

So I started with typography.

3:05

I couldn't afford a book on QuarkXPress,

3:08

so... and I

3:10

couldn't afford extra gas

3:12

to drive to the library to discover

3:14

that they didn't have anything modern,

3:16

right? Right. So my girlfriend had

3:18

a job at the Mall

3:21

of America and so I carpooled

3:23

with her. All right, and then I went to

3:25

B. Dalton and walked in

3:27

there and memorized as much of the book as I

3:29

could over the weekend. Really? And

3:31

then I aced the interview on Monday, got

3:34

the job and then people who

3:36

were using QuarkXPress, you know, eight,

3:38

ten hours a day, 40, 50

3:41

hours a week for years, they were asking

3:43

me, hey, what's the keyboard command to do

3:45

blah and blah blah, and I knew it

3:47

because I memorized as much of the book

3:49

as I could in one weekend. So

3:52

that's how I got my start. An

3:54

interesting way to ace an interview. Yeah. And

3:56

then when this new thing called the web

3:58

came out. I think it was 93

4:00

or 94. I saw a presentation

4:03

by Guy Kawasaki, who

4:05

was an evangelist for Apple at

4:07

the time. And he was demonstrating,

4:09

I think it was Claris HomeSight

4:12

or HomeSight. It was a

4:15

Claris product, but basically the thing

4:17

that blew my mind is they had

4:19

a WYSIWYG, and you could toggle

4:21

between that and a code editor, and

4:24

you could do like a split screen. And so

4:26

if you have a large enough screen, then you

4:28

can actually see your code and you can see

4:30

the effect on those

4:32

changes in real time. Kind of like with

4:35

Hot Module Reloading and stuff like that these days.

4:38

But yeah, back then it was, like

4:40

I said, it was probably 93

4:43

or 94. That was kind of revolutionary. But

4:46

yeah, we were using tables and clear GIFs

4:50

and all kinds of unnatural things

4:52

back then. I became obsessed

4:54

with performance back then. Oh, do you have

4:56

to worry about stray sounds or

4:58

anything? Not necessarily. It's new for the show.

5:01

We've got a baby filter. Do they have

5:03

a filter for that somehow? Sure. We'll gate

5:05

that baby out. They make

5:07

baby gates, don't they? I recently

5:09

uncovered there's a, was it RipX DAW?

5:12

Basically it allows you to

5:14

take any audio source,

5:18

throw it in there, and you can

5:20

actually see how all the different audio

5:23

sources are spread out across

5:25

the spectrum. And you can select them, you can

5:27

interact with them, you can take a flat

5:30

note, you can take a

5:32

RipX DAW. Try the baby gate then.

5:34

Because it allows you

5:37

to, let's say, take an

5:39

existing recording and you can take out

5:41

all the instrumentals or you can just

5:43

take out certain, like the

5:45

vocals and put in your own.
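
As a rough aside on how that kind of source separation can be done outside a commercial tool, here is a minimal sketch using the open-source Spleeter library. It only illustrates the general technique, not RipX DAW itself, and the package, model name, and file paths are assumptions for the example rather than anything named in the episode.

```python
# A minimal sketch of ML-based stem separation with the open-source Spleeter
# library (a stand-in for tools like RipX DAW, not the same product).
# Assumes `pip install spleeter` and a local file "song.mp3" -- both are
# placeholders, not anything referenced in the episode.
from spleeter.separator import Separator

# The pretrained 4-stem model splits a mix into vocals, drums, bass, and other.
separator = Separator("spleeter:4stems")

# Writes vocals.wav, drums.wav, bass.wav, and other.wav under output/song/;
# mixing every stem except drums.wav gives the "everything minus the drums"
# backing track discussed a little later in the conversation.
separator.separate_to_file("song.mp3", "output/")
```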

5:47

I've always wondered how that works because there's a drummer

5:50

I pay attention to, I'm going to jack his name

5:52

a little. L.S. Brino, I believe, he's a phenomenal drummer

5:54

on YouTube. And he's like,

5:56

I saw him on TikTok at first and he's

5:58

just phenomenal. But he's playing to music,

6:01

but he's the drum, right? So

6:03

how do you get that whole track? Everything

6:05

minus the drums. Exactly. So it must

6:07

be something like that. I don't know. Exactly. And

6:10

so I discovered this fairly recently, and of course

6:12

I used Logic and I used a bunch of

6:14

other stuff, but this is exciting

6:17

to me because it would

6:19

allow me to take an experiment

6:21

because if I

6:23

want to experiment with music, yes, I

6:26

can perform everything from scratch, but

6:28

it would be really awesome to be able to isolate, let's

6:30

say, just a Billy Cobham beat

6:32

from an old recording, you know, or

6:34

a little, you know, funk bass or

6:36

whatever, you know, be able to feed

6:39

that in as a seed. Here's

6:41

my frustration about, okay, I'm

6:44

so deprived of caffeine I'm going

6:46

to just jump around topics. Pop,

6:49

pick a drink. Well, it's too hot to

6:51

choke right now. I'm

6:53

barely avoiding burning my hands. He keeps glancing at it. He keeps

6:55

thinking about it. Will it cool off enough to drink? Is it

6:57

too hot for your hand? Because you

6:59

can set it over there. I feel as if having

7:02

my hand on it is drawing some of the heat

7:04

out of it, right? Because

7:06

of the thermal mass. Okay. So

7:09

I'm accelerating. Very intense. Yeah.

7:11

I'm accelerating the time frame.

7:13

You're actively cooling it. It's active

7:16

cooling. Yes. Well, you're

7:18

a conduit for the heat exchange, right? Exactly.

7:21

I'm a heat sink. I'm a heat sink

7:23

for the coffee. Okay, we're short-fin over here. We're

7:25

here at the heat sink talking about his

7:27

frustrations and whatnot. Okay, so we're talking about

7:30

AI. And it was... Really?

7:33

I was talking about AI. Well, you hadn't said that part

7:35

yet. Yeah. All right. So

7:37

generative AI as it pertains to

7:39

music is so frustrating right now.

7:42

The landscape... I was reading

7:44

Ken Wheeler was apparently doing

7:46

a talk about this subject

7:49

and he did a bunch of research on it. He's like...

7:52

He was disappointed. A lot of us who

7:54

are involved in music and

7:57

AI are highly frustrated

7:59

with... It's like there's a

8:01

huge blind spot around the

8:03

types of solutions that are being pursued.

8:07

They can generate audio of

8:10

a particular style. They can

8:12

give you subject matter, whatever.

8:15

But what I want to do as a musician,

8:17

as a recording artist, is I

8:19

want to be able to seed

8:21

an AI with my

8:23

own sample. With

8:26

Midjourney, you can take two images and you

8:28

can blend them together and get amazing

8:30

results. Now, I want

8:32

to be able to do that with my

8:35

own music plus anything that I can hear,

8:37

be able to isolate it and say, okay,

8:39

I want my loop

8:42

plus this other loop to have a love

8:44

child together. I

8:46

want it to be expressible as

8:49

audio, but I also want the MIDI

8:51

file so that I can pipe that

8:53

to any instrument and

8:56

I can get any kind of sound and

8:58

then I can isolate it, I can process

9:00

it, I can do whatever I want with

9:02

it. That's

9:04

what I want to be able to do. Like

9:07

a really talented session artist that you

9:09

would have to pay not a small amount of money to,

9:11

you'd be able to give the

9:14

AI an idea of what

9:16

you're going for and it would elaborate on

9:18

it. And

9:20

then you'd be able to give it feedback in the

9:23

same way you can talk to ChatGPT or even

9:25

Midjourney where you can vary regions. This

9:28

is great, but I want you to change that one

9:30

thing and you can have that interactive feedback

9:33

and then quickly iterate on it. You

9:35

know, v0.dev that Vercel created,

9:38

they have the same thing but for UI

9:40

where you start with, hey, this is what

9:42

I'm going for. Okay, this is cool, but

9:44

I need you to tweak the one thing

9:46

over here. To

9:49

be able to have that kind of a

9:51

conversation the way you would with a peer

9:53

or a colleague and collaborate with them. What

9:56

do you think explains the gap between what

9:58

you wish existed and what actually existed? What

10:00

explains that? Audio content is

10:02

not as popular, at least on Twitter

10:04

and a lot of other social media

10:07

platforms. I think the visual catches people's

10:09

eye. It's eye candy.

10:12

Video and still image format is

10:15

easiest to gobble up attention.

10:19

Right now, I think we're seeing a distortion

10:21

in the pursuit of different

10:23

areas of AI based on what

10:26

gets engagement on social media.

10:30

Whatever is most popular is going

10:32

to be incentivized. I

10:35

think eventually we will get

10:37

there with the generative AI for musicians

10:40

and for recording artists. I

10:42

was going to say, music is pretty popular. I don't disagree

10:44

with you. It is. Man, you

10:46

think that music will be coming up next or high

10:49

on the hit list for what gets attention. Well,

10:52

and that's the interesting thing though because I think

10:55

professional recording artists are probably

10:58

scared in a lot of

11:01

situations because they now have a way

11:03

to train a model on

11:05

let's say Taylor Swift or Beyonce.

11:08

It will basically re-perform

11:11

a song in

11:14

the style of the artist who actually recorded

11:16

it. Because it is

11:18

not an actual sample

11:21

and I'm holding up my fingers for

11:23

an air quote, scare quote, it's

11:26

not an actual recording

11:29

of Beyonce or Taylor

11:31

Swift. It's a reinterpretation. It's

11:33

a reinterpretation. Right.

11:36

It doesn't set off DMCA. Doesn't

11:39

trigger any takedown notices or anything else

11:41

because it's AI generated. Yeah. It

11:45

certainly is scary, I think. To

11:48

a certain extent. Yeah. Especially if

11:50

you're one of these session artists you're talking about, right? Exactly.

11:53

Who used to be able to take their skills, which

11:55

are very valuable skills, and hire

11:57

them out for a fee that was... sustainable

12:00

for their lives. Now all of a sudden, you

12:02

know, I want you to add Carlos Santana

12:05

playing the guitar onto my... Exactly.

12:07

And just as good, or close enough,

12:09

right? Exactly. Who is it gonna

12:11

lose work for? Probably the ones lower down

12:13

in the spectrum, like the session artists,

12:16

the non-Taylor Swifts. Exactly. It's like,

12:18

if I don't know like

12:20

does Taylor Swift really lose if like would

12:22

somebody be able to make a version of

12:24

Taylor Swift through AI through

12:27

their own work? Oh yeah. You

12:29

know, I mean, yeah. Would that even take her

12:31

down? Yeah, would it have changed her lifestyle, would it have

12:33

changed her fame? Well, here's, here's the

12:35

dirty little secret. Probably not. Here's the dirty

12:37

little secret when it comes to the music

12:39

industry. It used to be that

12:41

you could be a recording artist who, you know,

12:44

sometimes tours, and you

12:46

could make a decent living off of that. Right. Now you

12:49

have to... But now, because of

12:52

the realignment of the

12:54

incentives and the pricing model

12:56

of the streaming business you

12:58

know Apple Spotify all those guys

13:01

basically what you have now is

13:03

it is almost impossible to get paid

13:05

for your streams right unless you are

13:07

a Taylor Swift you get into this

13:09

hundreds of millions yeah you get into

13:11

the situation where if you're

13:13

not in like the top you know

13:16

10 or however many artists you

13:18

get you know pennies everything

13:21

in the music industry right now comes

13:23

down to touring so this

13:25

is how artists actually make

13:27

their money because that's the only way

13:29

that they can actually control the flow

13:31

of revenue yeah compared to their recordings

13:33

and recordings are just a way for

13:35

them to get to the, to the

13:37

tour. Yeah. In order to sell your

13:39

CD. Exactly. Now you have a CD,

13:41

that's, that's music, exactly, for those in

13:43

the back. Yeah, they're no longer

13:45

compact discs. Now you write music in

13:47

order to bring people to your tour.

13:50

Exactly the opposite. Exactly. It's not a

13:52

fun lifestyle, right? It is, but you

13:54

also talk about not sustainable. I mean,

13:56

some people can pull it off, like the Rolling Stones, and

13:58

yeah, for 20, 40 years, but very

14:00

few people. Like, talk about burnout. Yeah,

14:03

I mean, I was... the road, man, the road.

14:05

I was reading this thing about Taylor Swift's

14:07

workout program preparing for her Eras Tour.

14:09

Yeah. It's insane, because if you think

14:11

about what she has to do every night roughly every night

14:14

it's three and a half hours of Song

14:16

and dance and being here and

14:18

changing your clothes and all these things Yeah And the way

14:21

that she actually had to train for that was she would

14:23

get on a treadmill For three and a

14:25

half hours and she'd sing her entire

14:27

set every day On the treadmill

14:29

as she ran and walked ran and walked and she had

14:31

to sing through it. I mean, she's

14:34

healthy and of an age she can do that, 30s,

14:36

right? I mean, you just... it's

14:39

not a very sustainable lifestyle. Dolly Parton cannot do

14:41

that. I'm trying to think of that. Maybe she could,

14:43

I don't know, she's hanging in there. She's... Could

14:46

she run though? I know what you're saying. Yeah, I

14:48

agree with your point like at some point You just

14:50

can't do that. You can't be

14:52

the same artist and age and maintain

14:54

the requirement to Kind of

14:57

like an open source maintainer, right? It's

14:59

like I write software that people value

15:01

and I give it away and

15:04

They take huge value from it and

15:06

in order for me to do that Just like

15:09

people give their music away pretty much now I

15:11

have to become an influencer and a marketer and

15:13

all these other things, a business person, and or

15:15

they do that and that's How it is with

15:17

music now like you have to be it you have to be

15:19

touring you have to be a person And to a certain extent

15:21

it's kind of always been like that, but it's just getting harder

15:24

and weirder Yeah, and you

15:26

have to have the name Recognition to

15:28

bring people out exactly. Yeah,

15:30

so that's just what occurred to me when it

15:32

comes to Generative AI

15:34

as it applies to audio things

15:37

are about to get really interesting because We're

15:40

heading into an election season the

15:42

ability to use generative AI for

15:45

deep fakes and To

15:47

be able to create a world where you

15:49

can't trust anything that you see online I mean

15:52

a lot of us haven't been trusting what we

15:54

see online for a while But

15:57

there's still I think a part of

15:59

our animal or lizard brain that looks

16:01

at a video something that

16:03

confirms our biases and

16:06

we're just looking for an excuse to feel a certain

16:08

way. And we're

16:10

about to enter into a situation where

16:13

we can actively reinforce

16:15

our prejudices and actively reinforce

16:17

our biases or we can

16:20

take a step back and go, okay, what do

16:22

I actually know? How do

16:24

I know it? How do I trust these

16:27

information sources? Because I think it's going to

16:29

be very interesting over

16:31

the next year or so. Well, random person on

16:33

Facebook may not be the best person to pay

16:36

attention to for deep fakes. You

16:38

know, like if you see your friend post a video, I

16:41

would say like that's kind of where

16:43

in the way mainstream media does

16:46

lend at least a name

16:48

recognition reliability. But then

16:50

you also have like indie outlets you could pay

16:52

attention to. Yep. I suppose where are

16:55

they getting those videos at? If it's not, your friend

16:57

could get duped and now you're duped by your friend.

16:59

Right. That happens. Exactly.

17:02

So what does society look like when everybody's

17:04

always skeptical of everything? Because that's

17:06

kind of where I'm getting to where I'm like, I

17:08

don't believe much of anything. And like, that's not a

17:10

really healthy way of viewing life. Exactly.

17:13

So I am a very optimistic person.

17:16

I recognize and I see the danger.

17:19

And in fact, I was off of

17:21

social media for over a decade because

17:24

I didn't like the impact that

17:26

it had on me. You know,

17:28

the self-reinforcing pleasure

17:31

cycle in your brain that

17:33

get that dopamine hit. It's

17:35

just not healthy in the long term.

17:38

So you have to at least what I had to

17:40

do is I had to figure out a way to find

17:43

balance because I have a highly

17:45

addictive personality. So I put in

17:47

safeguards so I don't get sucked

17:49

all the way in. But

17:52

basically in the

17:54

process, I watched a lot of

17:56

people get influenced by things that

17:59

later turned out to be just

18:01

a scam or a deep

18:04

fake or something, a very

18:06

well photoshopped thing. I

18:09

think I have faith that

18:11

people are going to have to evolve

18:13

past where they are right now in

18:15

terms of their sophistication of their media

18:17

consumption. We're going to have to

18:19

get to a place where we're more

18:22

intentional about the diet of the

18:24

mind that we feed

18:26

ourselves. Because right now, everybody

18:29

is competing for everybody's

18:31

attention. It's all about manipulating

18:33

and bludgeoning people into

18:36

giving up their attention, whether it's for a minute

18:38

or an hour, and then

18:41

reinforcing that habit and

18:43

then monetizing it. So, something

18:46

has to change. This is not sustainable in the

18:48

long term. And that's the

18:50

beauty of how human beings and

18:52

living systems evolve. You

18:55

push things beyond a breaking

18:57

point. You push things into

18:59

a problem state and something

19:01

changes. Either it changes

19:04

in a way that you hope

19:06

for or it changes and you're

19:08

just going to be left

19:11

with the repercussions. It might

19:13

not be something that you would hope for. In

19:15

fact, it might be the very thing that you fear.

19:18

But that's the thing about equilibrium.

19:22

And when

19:24

you stress people, when you put people

19:26

into a situation where it's unsustainable, something

19:29

has to change. Or break. Or

19:32

break. Or break. Breaking changes. But

19:35

when they break, it still causes a change.

19:37

For sure. Exactly. So,

19:40

it's going to be interesting to see what happens

19:42

with automation as it applies to our industry in

19:44

the near future. So here,

19:46

I'm a pessimist, but here's something

19:48

hopeful based on what you're saying. That

19:51

the conversations that we've been having amongst

19:54

techies over the last 18 months, almost

19:57

all of them have some form of this conversation that

19:59

we're having with you in

20:01

it, like inevitably. In

20:03

fact, we started to have chapters, I'd call

20:05

it like the obligatory AI chapter. It's

20:08

going to work its way into all of our

20:10

conversations. Even when ostensibly the

20:12

conversation's about an entirely different thing, here

20:15

it comes. And so the reason why that's hopeful is

20:17

because I feel like this is permeating

20:20

our zeitgeist and we're all thinking about it

20:22

and we're all concerned about it. And

20:24

we are being, like you said, more

20:26

mindful of our media diet and what's

20:29

changing around us. And I think that we

20:31

are also well positioned in a place where we can affect

20:33

change in that space. There's

20:35

my hopefulness. That is hopeful. I'm

20:38

being very mindful of my media diet. Yeah,

20:40

to the point where I don't

20:43

really have, I suppose my wife might

20:45

disagree that I have an addictive personality.

20:47

I like healthy obsession more than

20:50

addictive. This is a

20:52

marketer here. I like healthy obsession.

20:54

Healthy obsession. Right? And

20:56

I cannot even

20:58

allow myself to go on TikTok.

21:01

It's just too much. There's just too much things that

21:04

I want. It really is just

21:06

too good. There's a lot of things I want to know and I

21:08

have a curious mind, so I'm just naturally curious. And

21:11

about things I don't even really necessarily care about,

21:13

like the main thing, so to speak. And

21:15

so I will find myself, I'm

21:18

bored, let me give myself permission.

21:21

10 minutes. Well, that won't be 10 minutes.

21:23

It will be an hour. Oh yeah. Now

21:25

I just don't even allow it. Don't

21:27

even allow it. I just know that I'm

21:29

gonna go on there and find interesting things

21:32

and be entertained or be educated

21:34

or whatever. And I

21:36

just don't allow it anymore. What's

21:39

healthier for you? I'm a teetotaler. It's

21:42

a zero. Sure, but I mean like then what do you, when

21:44

you have that 10 minutes, when you wanna decompress, when you

21:46

would- Go to sleep. Yeah, sleep

21:48

it out. 10 minute nap. Not

21:52

necessarily. I would go on

21:54

TikTok like during, like when I should be going

21:56

to bed. Before bed. It's like

21:58

a reading thing. You know, so. What

22:00

about like you're on lunch hour never having

22:02

a sandwich? What do you do? Something

22:05

productive probably check email organize

22:07

my view work through your oh not necessarily I

22:09

mean, this is an example of what I do

22:12

sure I would just be present in the moment

22:14

I do have books I listen to a lot so

22:17

I'll listen to I'll re-listen the books Like

22:19

if I like that book, I'll re-listen to it again. I

22:21

might give it six months I'll re-listen to it again, but

22:23

I go back to books or read

22:26

books or Catch up with

22:28

someone via the phone or something like that. I'm

22:30

just not Diving into social

22:32

media that is really not that important to

22:34

me Yeah, doesn't really need to feed the

22:37

like you said, dopamine beast basically. I

22:40

don't need it I don't want to do that

22:42

anymore That's not how I want to operate and

22:44

I know that I'm less healthy mentally When

22:47

I allow myself to be in that zone,

22:49

right? And so the more I'm mindful

22:51

of, I

22:53

guess, my present state of mind and my present

22:55

feeling about whatever's around me is the better for

22:58

me Yeah, cuz there's an easy way you can

23:00

go on there Just like just lose your time like

23:02

in a lot of cases social media really is about Losing

23:05

your time to something else that's in

23:07

full control. And in TikTok's case in

23:09

particular very much at the

23:12

whim of the algorithm All right, whatever

23:14

it's gonna give to you next and

23:16

it's all designed on swipe engage swipe

23:18

engage You know exactly so yeah, they've

23:20

really got that feedback loop boiled.

23:23

Well-oiled machine. It's also a good thing

23:25

too. Like, I'm not gonna say that TikTok's

23:27

bad. I just needed to

23:29

be more mindful of how I use it and

23:33

When I allow it to be used by me like I'll still

23:35

hop on here and there But

23:37

just tell everybody they should start a TikTok. Yeah.

23:39

Yeah, totally. I still think it's like a great

23:42

platform But I think

23:44

I think it begins really even more so with

23:47

a smartphone being in my pocket I have access

23:49

to anything I want I can be on Chat

23:52

GPT way more because it's got an iOS app

23:54

and the iPhone does a great job of taking my

23:57

voice, turning it to prose. Yep.

23:59

I've said this before on podcasts where I

24:01

would just talk to ChatGPT via the

24:03

iOS app versus typing it out. So

24:05

I could do the same thing there. It's just at

24:08

a state in humanity today, we have access

24:10

to literally a lot of

24:12

communication, whether it's positive or negative,

24:14

in our pocket for the most part. Most

24:17

modern society folks have access to that.

24:19

And I think there's a, when you talk about balance,

24:22

I think we could all exercise

24:24

a healthy balance with that.

24:27

So it's interesting that you bring

24:29

that up because somebody

24:31

posted a meme the other day, I saw

24:33

it, I think just yesterday, and

24:35

said, you know, at one point we used

24:37

to think that

24:40

the problem with the world is

24:43

that people just don't have access to

24:45

information. Well, now everybody

24:47

has access to all the information, whether

24:50

it's true or false. But

24:52

I mean, essentially that theory has been disproven,

24:54

right? But what I would say is now

24:57

people have access to their

24:59

own alternate facts, their own

25:01

alternate version of reality. Because

25:05

in the case of Google, let's

25:07

say I'm going to conduct a

25:09

Google search on, let's say something

25:11

that borders on the political, the

25:14

results I'm going to get

25:16

are completely different than

25:18

an uncle or cousin or

25:20

somebody on the opposite side of the political

25:23

spectrum. So that's going to

25:25

reinforce their version

25:27

of reality or their existing biases.

25:30

Right, and yours as well. Exactly.

25:32

They feed us ourselves. Tucker Carlson, love him

25:35

or hate him, he was

25:37

on a debate with

25:39

one of the Young Turks hosts

25:41

recently and I saw it. And I

25:44

think the headline was like Tucker Carlson beats this

25:47

guy, like dunks on him basically, demolishes

25:49

from an argument standpoint. But one thing

25:51

he said was about

25:54

government and this is kind of getting political to some

25:56

degree, but he was saying about government and

25:58

he was saying, you know, So in no time

26:00

in history have we ever had a private

26:03

company be more financially

26:05

stable and well funded

26:07

than our government is, and how much

26:09

power Google, and particularly like you mentioned,

26:11

has. We've never had the

26:14

algorithm, the search algorithm be so in

26:16

control of society that we

26:18

can be, like you just said, your

26:20

results are different than your uncle or your aunt or whoever.

26:23

That we all have this sort of,

26:26

they have just so much power over what we

26:28

can see and what our filters

26:30

are and what our bubbles are and our spectrums

26:32

are. These are private

26:35

companies, they're for profit, and

26:37

they could be good or they could be

26:39

evil, but that's something that we've never really

26:41

experienced in all of human history, having so

26:43

much power. And

26:46

yet, when I go into Safari on

26:48

my iPhone and I search something, whether

26:50

it's a product, or looking for something at Home Depot

26:52

or whatever, it begins with a Google

26:55

search in a lot of cases. Or

26:57

you're asking ChatGPT, which is a different

26:59

company, but in the same scenario, right? For sure, for

27:02

sure. When that thing answers all

27:04

your questions, pretty much. And you

27:06

can't trust them all, but you can trust them for the most part. But

27:09

his point was, was how much control that

27:11

company not only has over what we see,

27:13

but how it sways politics

27:16

with lobbying, with all

27:18

the money. Well, it's everything.

27:20

Yeah, exactly, like so much power, not

27:22

only as a utility that

27:25

is very much for good or very much

27:27

for evil if you wanna go that route,

27:29

but that it also has so much power over our government.

27:32

And you multiply that by Google, Microsoft,

27:35

Apple, Amazon. It's not just

27:37

one, it's many. And they all

27:40

have similar sway over our government. And

27:43

back to the whole pessimist, I don't

27:45

trust anybody, what does that even do

27:47

if you're already there? If you're already

27:49

like, I can't trust anybody, yet

27:51

this is true. Pull the plug, pull the

27:53

plug. So here's the interesting thing. I

27:56

think at one point, if you go back

27:58

enough decades ago, probably have

28:00

to go back more than 40 years ago. But

28:03

basically, I believe that

28:05

even if you were of a

28:07

completely different political persuasion, we

28:10

at least had a commonly

28:12

accepted set of facts that we

28:15

could all agree about. We

28:17

might disagree about what to do

28:19

about that reality, but at least we could find

28:22

common ground on, okay, the

28:25

earth is not flat. Okay. It's

28:27

not flat. And unless

28:29

you're of a particular persuasion, you

28:32

believe that the earth is a

28:34

little bit older than, say, a few thousand

28:36

years, right? So starting

28:39

with certain fundamental,

28:42

foundational common

28:45

points of agreement, the

28:47

range of possibilities that you could arrive

28:50

at about what should we

28:52

do given this reality that we can

28:54

all agree upon isn't so

28:56

night and day compared to when

28:58

you have a completely separate set of

29:00

facts and a completely different understanding

29:03

of reality. And that reality,

29:06

the Venn diagram is broken because there's no

29:09

overlap between, you know,

29:11

one political reality and another political

29:13

reality. And it's

29:16

causing people to arrive at vastly

29:18

different, but I want

29:20

to say by design, irreconcilable

29:23

conclusions about what to do with

29:25

two completely different realities. Right?

29:29

So I think the solution is

29:31

for us to trust

29:34

our eyes and ears less and

29:37

start going back to original

29:40

sources and get

29:42

to a place where we can actually trust

29:44

the information that we see and hear and

29:46

consume. And we

29:48

need immediacy of that in our

29:51

media and in every

29:53

other source, where as you're consuming

29:56

something, there's a way

29:58

for you to evaluate how true it is. Hey,

30:00

almost like a real-time fact checker. And

30:03

I feel as if this is where AI

30:06

could be unleashed, to give people

30:09

contextualization of the information. Because

30:11

you look at Twitter, somebody

30:13

says something outrageous, eventually

30:15

community notes will

30:18

come in there and say, oh, by the way, you're

30:20

missing this crucial context which

30:23

basically flips the narrative. So

30:27

the thing that you agreed with and liked

30:29

and shared got completely

30:31

turned upside down. Imagine

30:33

if you could have that kind of

30:35

accountability to a broader context

30:38

in real time. And

30:40

up until now, it required

30:42

people who are experts or people who

30:44

are freaks of nature who

30:47

just happen to have certain subjects

30:49

memorized so that they

30:51

could provide that accountability. And they had

30:53

to be available and their attention had

30:55

to be trained on that. But

30:58

imagine if you had AIs that

31:00

could provide that for you and

31:03

provide that accountability in real

31:05

time and actually reduce disinformation

31:08

before the disinformation has

31:11

actually had a chance to do its damage. Because

31:14

right now, all of the incentives are

31:16

aligned around the people

31:19

who are doing the worst in

31:22

our society, the people who are saying

31:24

things that are clearly not

31:26

true. They receive benefit of it

31:28

because it spreads across the internet

31:31

immediately. What was it?

31:34

A lie can spread around the world before

31:36

the truth has even laced up its shoes.

31:39

So imagine being able to flip

31:42

that economic incentive so

31:45

it's actually painful

31:48

for somebody to tell a lie in a

31:50

public space. Yeah, community notes is

31:52

interesting. Do you know how it works exactly? I know

31:54

a little bit about how it works. So there have

31:56

to be enough people who

31:59

have this. disagreements on other

32:01

topics. Exactly. And so these

32:03

are people who the community

32:05

has decided that their feedback

32:07

is worth upvoting. Right.

32:10

So they're considered a trustworthy

32:13

source of information so

32:15

that, and the problem with that

32:17

is if you get enough people acting in

32:19

bad faith you can completely

32:21

distort that kind of an algorithm.

32:23

And it relies on, it relies

32:26

on people acting in good faith. So

32:29

to me that seems like a critical vulnerability.
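
As a rough sketch of the bridging idea described above, here is a toy Python version: a note is only surfaced when raters who usually sit in different viewpoint clusters both mark it helpful. The names, clusters, and ratings are made-up placeholders, and the real Community Notes scoring is considerably more involved (it factorizes full rating histories rather than using hard-coded groups).

```python
# Toy sketch of the "bridging" idea: a note is shown only if raters from
# different viewpoint clusters agree it is helpful. All data here is invented.
from collections import defaultdict

# In the real system, clusters are inferred from each rater's past ratings;
# here they are simply hard-coded for illustration.
rater_cluster = {"alice": "A", "bob": "A", "carol": "B", "dave": "B"}

# note id -> list of (rater, rated_helpful)
ratings = {
    "note_1": [("alice", True), ("bob", True), ("carol", False)],   # one-sided
    "note_2": [("alice", True), ("carol", True), ("dave", True)],   # cross-cluster
}

def is_shown(note_ratings, min_per_cluster=1):
    """Surface a note only when raters from at least two clusters found it helpful."""
    helpful_by_cluster = defaultdict(int)
    for rater, helpful in note_ratings:
        if helpful:
            helpful_by_cluster[rater_cluster[rater]] += 1
    return len(helpful_by_cluster) >= 2 and all(
        count >= min_per_cluster for count in helpful_by_cluster.values()
    )

for note, note_ratings in ratings.items():
    # note_1 -> not shown (only cluster A liked it); note_2 -> shown
    print(note, "shown" if is_shown(note_ratings) else "not shown")
```

It also makes the vulnerability just mentioned concrete: if enough coordinated accounts can masquerade as belonging to different clusters, the cross-cluster check stops meaning much.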

32:31

If I were, if I

32:34

were trying to hijack a democracy I would

32:36

take advantage of that. Wouldn't we also have

32:38

to assume that the AI is acting in

32:40

good faith and isn't there ultimately a public

32:42

master for any piece of software? Well

32:46

that's, yeah, that's a really good point. Somebody's training

32:48

the model, right? So you

32:50

have, you're kind of rearranging the furniture

32:52

but it's faster. I agree that's better.

32:54

Yeah. I like the way community notes

32:57

works because people have to

32:59

have disagreed on other topics. Yeah. Like they're kind

33:01

of saying these are not all the same persuasion

33:03

but about this thing they all agree that that

33:05

thing's wrong. Yeah. Right. And so there's a,

33:07

there's a counterbalance there. I'm not sure how well

33:09

it works in practice. It seems like when I

33:11

read a community notes it seems

33:13

pretty good most of the time but it's

33:15

late. Like you said, it probably is gameable.

33:17

Yeah. It takes a day or two or

33:19

three. Right. You know. And by that point

33:21

the people who read the

33:24

original message and either

33:26

liked or shared it, retweeted it.

33:28

Yeah. They eventually get

33:30

the notification. Oh hey that

33:32

thing that you liked, that thing that you

33:34

shared, that thing that you retweeted, a community

33:36

note was added. Okay great but the

33:38

damage has already been done. Same problem

33:40

in our, in our publications as well.

33:42

Like the correction comes days later. Yeah.

33:44

At the bottom of the page or

33:46

it's definitely not a headline. You know

33:49

and like the headline has been consumed

33:51

and moved on from already. Yeah. And

33:53

the correction gets a tenth, a hundredth

33:55

of the viewership as the wrong thing

33:57

was. So that also is maybe

33:59

AI. can help with that as well. It

34:02

gets even more interesting when you look

34:04

at the reproducibility issue in scientific publications.

34:09

Because that's where the economic

34:12

incentives are even more distorted

34:14

because it's publish or perish in

34:17

that world. So if you're not publishing, then

34:20

you're not going to get funded. You're not going to have

34:23

any means to be able to

34:25

actually continue your research, right? The

34:28

incentive is to actually continue to publish, publish,

34:30

publish. If there's a retraction,

34:33

it's an afterthought. The

34:35

thing is, the thing I'll never forget is

34:38

there's a gentleman, I think he was out

34:40

of Stanford and he founded

34:42

something called the Retraction

34:45

Watch. That's what it's called.

34:47

But basically, what they discovered is over

34:50

50% of the

34:52

landmark studies that have been done on the

34:54

subject of cancer are not

34:56

reproducible. So you have huge

34:58

companies spending billions of

35:01

dollars on trying

35:04

to cure cancer or treat

35:06

cancer more effectively, but

35:08

over half of what they thought

35:10

they could count on is

35:13

actually not reproducible and

35:15

it's kind of frightening when you think about it. Trust

35:18

nothing. Well,

35:22

it'd be awesome if we could get to a place where you

35:24

could establish trust

35:27

more quickly and more

35:30

genuinely. Question everything is

35:32

maybe another version of that. Trust nothing,

35:34

question everything. I heard an

35:36

adage that I can't recall the

35:38

source necessarily, but that anytime

35:42

a civilization creates social media

35:44

soon after it implodes on

35:46

itself. That's what

35:48

happened with the Romans, obviously. They

35:52

started scrolling graffiti on the

35:54

walls and it

35:56

was a straight line. Assuming

35:58

the multiverse or assuming... Whatever.

36:00

In the multiverse. I'm just worried about possible science fiction books

36:02

you're reading. I really, I really wish I could

36:04

recall the source. But it was, it was interesting

36:07

to think that, you know... if

36:09

it may have been a book, it may have been science fiction,

36:12

I don't know, but it sounds interesting, that

36:14

we're, like, very predictable. That's the question, maybe

36:16

not, but it's at least plausible or understandable

36:19

in the fact that when social media

36:22

is introduced into a society soon after

36:24

it begins to overexpose

36:26

itself to itself. Interesting.

36:28

And therefore begins to see

36:30

the differences in the biases, and

36:33

hate becomes the primary versus

36:35

the love. Interesting. You know,

36:37

I think it's somewhat true. We've

36:40

experienced it just based on our

36:42

need for a diet. There's something there that

36:45

is not healthy or normal. Yeah,

36:47

it's definitely something worth noting about.

36:49

Yeah, and is it social media's

36:52

fault? I wouldn't say

36:54

necessarily but like the internet is

36:56

a very fantastic Thing

36:58

but we've also layered on this social

37:01

fabric onto the internet the

37:03

information highway superhighway That

37:06

now allows us to, you know, add knowledge,

37:08

you just share you know and consume

37:11

True or not true things. Yeah, and so

37:13

that cannot Be

37:15

sustainable long term and if

37:17

we have to say trust nothing at

37:19

the end of a podcast Is

37:21

that a good thing not a good thing? I don't think you have

37:23

to agree with that at least even

37:26

if the adage of Every

37:28

time a civilization invents social media

37:30

X happened that may be

37:32

the science fiction. I'm gonna Community

37:34

Notes that quote though. I don't

37:38

trust it. I agree

37:42

with it, but I don't trust it. Interesting.

37:45

yeah, so I'm optimistic because

37:47

I see the potential

37:49

for damage right now and I

37:52

see it getting worse before it gets better.

37:55

Yeah, but at the same time We're

37:57

going to reach a breaking point. The

37:59

current conditions cannot continue.

38:02

And accelerate. What do you think

38:04

would break, or what would that

38:06

potentially look like? [unintelligible]

38:08

So, oh, that was my... I can

38:10

look at the recent past. I remember what

38:12

happened on January 6th, [unintelligible]

38:14

an attempted [unintelligible] post-

38:16

election [unintelligible].

38:18

A breaking point, right? So

38:21

if we have the world's oldest

38:23

democracy come to an end because

38:25

enough people believe that the election

38:28

is rigged, there are gonna be changes.

38:30

And they're not going to be

38:32

necessarily changes that the vast majority

38:34

of people want to live through.

38:37

So something has to change. And

38:39

here's how I see that as optimistic:

38:42

Like a Vampire America less if you are

38:44

about that because when there's a safety valve,

38:46

when there's a, a minor pressure relief,

38:48

people are like, oh well, this

38:50

is not that big a

38:53

deal. It's okay right? And

38:55

I feel as if you

38:57

know, but the world of

38:59

automation, the world of AI,

39:01

Up until now we've been

39:03

distracted by a lot of

39:05

little pressure release valves, lots

39:07

and lots of little things

39:09

that have prevented it from

39:11

becoming an apocalypse. Right? Something

39:13

will change, either by our

39:15

design or because we have

39:17

no choice. Because something has

39:19

broken. My theory is that

39:21

in the near future we're

39:24

going to have better accountability, we'd

39:26

have better ability to establish

39:28

trust and that the economic

39:30

incentive to lie and act

39:32

in bad faith in public

39:34

is gonna get undermined. Dishonesty

39:36

could be neutralized.

39:39

Let me mention an ad,

39:41

if you don't mind. It's,

39:43

it's... [unintelligible].

39:45

We have this ad for Read Write Own from Chris

39:47

Dixon, and I read the copy for it.

39:49

I haven't read the book, but yeah, the

39:51

premise is pretty interesting. It says this about elites in

39:53

the ad. For us, this is not an ad;

39:55

we were paid to say this elsewhere, not in

39:58

this context and series. Now we did. And

40:00

it says: Read Write Own is a

40:02

call to action for a more open,

40:04

transparent, and democratic internet, one that opens

40:06

the black box of AI, tracks

40:09

the origins, as you're saying, tracks

40:11

the origins of what we see online, and

40:13

much more. There's our chance to

40:15

reimagine the world-changing technologies we

40:17

have and to build an internet we

40:19

want, not the one we've inherited, essentially. So

40:21

I don't know what the content of the book is

40:23

but there's this similarity like I think

40:25

he even mentioned like one way to track

40:27

some of this AI stuff is And

40:30

watching why was blockchain always this

40:32

is literally a track, but it

40:34

is verifiable. I play yeah Oh,

40:36

you can't alter it after it's

40:38

been written. Community...

40:40

It's immutable. It's trustless when

40:43

done correctly, obviously. Well, that's the thing, it depends

40:45

on whether it's a proof of work or

40:48

proof of stake. There's a

40:50

lot of ifs there, or "it depends,"

40:52

but the math seems sound on

40:54

it being able to have a trustless

40:56

public Chain of events right

40:58

well, that's the thing though it you're

41:01

still trusting something you have to have

41:03

enough No, it goes back to math

41:05

right it goes back to having enough

41:07

nodes on the system that concur With

41:10

your accounting so just

41:13

like with the onion router, Tor,

41:16

at one point I think it was

41:18

the NSA who actually owned a number

41:20

of nodes. They had compromised enough nodes

41:22

that they could do a timing-based

41:24

attack where they would analyze the

41:27

amount of time that it takes between the different

41:29

layers, and they were able

41:31

to figure out where somebody was in

41:34

the world, and they were able to basically

41:36

figure out who they were based

41:39

on that timing. The NSA...

41:41

Trust nothing. But

41:44

theoretically you could do that with

41:47

with a, a blockchain network. Yeah,

41:50

if you own enough nodes, then you

41:52

can skew the results totally.

41:55

So you can fake consensus, essentially.
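
To make the timing-correlation idea above a bit more concrete, here is a toy sketch with purely synthetic data: an observer who sees packet timing at both a compromised entry point and a compromised exit point can link flows by comparing inter-packet gaps, even though the content in between is encrypted. Nothing here reflects real Tor internals; it only illustrates why owning enough nodes is dangerous.

```python
# Toy illustration of timing correlation (synthetic data only): the same flow
# seen at an entry point and an exit point keeps almost the same inter-packet
# gaps, so an adversary watching both ends can match them up.
import numpy as np

rng = np.random.default_rng(0)

def packet_times(n=200, rate=50.0):
    """Random packet arrival times for one flow."""
    return np.cumsum(rng.exponential(1.0 / rate, n))

# Flows as seen near the senders, keyed by a client the adversary can identify.
entry_flows = {f"client_{i}": packet_times() for i in range(5)}

# The same flows as seen near the destinations: delayed and jittered, keyed
# only by destination. The adversary wants to link each back to a client.
exit_flows = {
    f"dest_{i}": times + 0.08 + rng.normal(0.0, 0.002, times.size)
    for i, times in enumerate(entry_flows.values())
}

def timing_similarity(a, b):
    """Correlate inter-packet gaps; the same flow gives a near-perfect match."""
    n = min(a.size, b.size)
    return float(np.corrcoef(np.diff(a[:n]), np.diff(b[:n]))[0, 1])

for dest, seen_at_exit in exit_flows.items():
    guess = max(entry_flows, key=lambda c: timing_similarity(entry_flows[c], seen_at_exit))
    print(f"{dest} correlates best with {guess}")  # dest_i matches client_i
```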

41:57

yeah, well, that's why the Bitcoiners go

41:59

back Bitcoin they say it's

42:01

the longest standing most diverse

42:04

secure blockchain there

42:06

is because of how much value

42:08

is there and how long it's been not

42:10

broken now maybe there's

42:13

somebody with 51% of that you know of

42:15

course Satoshi has a whole bunch of Bitcoin

42:18

that has never moved in it yeah

42:20

honestly the moment that moves

42:22

I think the network falls

42:24

really yeah falls I

42:27

mean okay yeah

42:30

okay yeah I mean maybe not yeah

42:32

I think so something changes so trust

42:34

trust erodes so a couple

42:36

things first of all when it comes

42:38

to blockchain specifically Bitcoin and others that

42:40

are based on proof of work not

42:43

proof of stake yeah I have a

42:45

real issue with them because the energy

42:47

intensity of the

42:49

computation is so high that

42:52

you could power multiple countries now

42:55

right and so what's happening

42:57

is a lot of dirty

43:01

energy sources such as natural

43:03

gas coal oil

43:05

things like that where

43:07

increasingly it is economically

43:09

unsustainable because grid powered

43:11

solar and wind have

43:13

completely leapfrogged them in

43:15

terms of cost efficiency

43:18

right if you're building brand new energy

43:20

sources right now by far

43:22

you can build anywhere from six

43:25

to nine times more grid scale

43:27

solar or wind compared to nuclear

43:30

which is the second cheapest or

43:32

third cheapest I should say but

43:34

everything else is more expensive by comparison

43:37

but the problem is you know you

43:39

go to an oil field where they

43:41

have these toxic gases and

43:44

they set fire to those toxic

43:46

gases as a way of making

43:48

it safe well that's wasted energy

43:50

so what's happening

43:53

here in Texas is a bunch

43:55

of Bitcoin miners set up in

43:57

the oil fields and they've

43:59

harvested that heat energy, what

44:01

would normally be just waste heat, and

44:04

that becomes an additional revenue

44:06

stream for those oil extractors,

44:09

those oil companies. It

44:13

basically takes what was

44:15

a dying revenue model, a

44:17

dying industry, and it

44:20

breathes new life into it, and it slows

44:22

down our adoption of renewables. Basically,

44:27

it vastly increases our risk

44:30

of transforming our environment to

44:32

a place where it's

44:35

going to kill people to go outside,

44:37

whether it's due to the extreme heat

44:39

and humidity or the extreme cold, because

44:42

the natural conveyor belt that

44:44

used to exist is kind of collapsed, or

44:47

it's sometimes collapsing. Yeah,

44:50

potentially. I've also seen a lot of

44:52

proponents of innovation around clean energy sources

44:55

because of the value in

44:57

the network and the people that are willing to invest in.

44:59

I think it's probably a mixed bag. I don't know enough

45:01

about it to speak better

45:03

than that. I'm just speaking to the

45:05

security of it, which is what they... I don't even know if

45:07

that's... I trust nothing. I

45:09

don't even know if that's fair. It

45:13

seems to be what they're saying, but how

45:15

do you go back to origins with a chain of

45:18

trust that you can actually prove out? So

45:21

far, that seems to be the one

45:23

good use case of blockchain. Sure. Do

45:27

you watch Netflix? Yeah. Do

45:29

you know how much power they use? It's not

45:31

small. It's not small. But you watch

45:33

Netflix. Yeah, I do. This is where I

45:35

have a challenge because there's such high value. I'm with Jared

45:37

on this point that it's such high value that

45:40

we need innovation around... We have

45:42

a sun, right? Right. It's

45:45

out there right now. The only reason we're here right

45:47

now is because it's there, right? We

45:49

need to better harness the most available

45:52

energy source ever. And that's solar.

45:54

That's all the things that the earth provides as

45:56

its natural ways. I'm for

45:59

Bitcoin. I understand... the whole, it sucks

46:01

a bunch of energy, but let's be humans

46:03

and innovate and find ways around the dirty

46:06

ways. And again, to Jared's

46:08

point, I don't know a ton about energy

46:10

necessarily, but at the same time, the world,

46:12

I feel is more hybrid. There's

46:15

places where something diesel

46:17

powered will trump for the moment,

46:20

the output of something that is electric

46:22

powered or LiPo battery powered or whatever

46:24

it might be powered. The

46:26

world is more hybrid. I think we need balance

46:29

rather than cut off. At

46:32

some point, yes, maybe those things

46:34

need to be less available, but

46:36

there's so many, if you just cut off the

46:38

dirty ways, I suppose, you'll see a crippled earth.

46:41

There's just so much reliance on

46:44

diesel powered, gas powered, natural powered,

46:46

clean powered. We need a more

46:48

balanced process of it rather than

46:50

saying it's only this way or

46:52

only that way. Like,

46:55

why not both? Is my best off when

46:57

it comes to hard choices? Why

46:59

not both? Can we do both? And

47:01

when it comes to Bitcoin and powering it,

47:04

let's find ways to use things that are

47:06

more renewable, things that are not overly

47:08

draining the system it's on. I'm

47:11

all for that. If it's clean or even this,

47:14

what an awesome thing to reuse those off

47:16

gases though. It was once waste and now

47:18

it's not waste. It

47:21

may stump the opportunity for renewables, but

47:23

it also is a reuse of

47:25

something that was previously just waste,

47:28

which is always a positive. There's

47:31

so much opportunity with decentralized,

47:33

non-owned by government entity

47:36

currency for the world that

47:38

it scares government. Just

47:41

that not right alone, it's almost worth exploring is

47:44

trust nothing. It's

47:47

almost worth exploring because then the

47:49

trust becomes the network itself rather

47:52

than simply like I trust my government or

47:54

my government trusts that government and so therefore

47:56

it is trusted. A

47:59

couple things. First of all, look at

48:01

who is the primary beneficiary of Bitcoin.

48:04

And a lot of it comes down to

48:06

these ransomware gangs and

48:09

people who want to be able to move illicit

48:13

substances on the dark web and things

48:15

like that. So believe it or not,

48:18

if you map out the parts

48:21

of Bitcoin that are in

48:24

other cryptocurrencies that are actually profiting

48:26

and benefiting different parties, it's

48:30

particularly nefarious when

48:32

it comes to Bitcoin and other currencies

48:35

that are based on that. So the other

48:37

thing is I've actually invented a blockchain-based

48:40

technology that

48:42

generates a ledger of

48:44

renewable energy. In fact, part of the idea behind

48:47

it, and I came up with this over a

48:49

decade ago, is the idea that

48:51

when you walk into a coffee

48:53

shop, you don't have to get

48:55

approvals. You don't have to go

48:57

through a multi-step process to connect

48:59

to the Wi-Fi, right? It just

49:02

automatically discovers. It's zero config. Super

49:05

simple. Well, right now, if you

49:07

want to add solar to your roof or

49:09

wind or anything to the local

49:11

grid, you have to go through a month-long

49:13

process. You have to get all kinds of

49:15

different people involved, get different

49:17

stamps of approval because our grid

49:20

was designed over 100 years ago,

49:22

and it hasn't been updated since

49:24

then. So the concept was

49:26

to basically take some of the brains

49:28

of the internet where it's self-configuring. It's

49:31

able to automatically discover the capabilities

49:34

of all the different devices around

49:36

it, and it can

49:38

also self-heal. So

49:40

take that resiliency and put it into an

49:43

electrical grid. So when you hook up your

49:45

solar panels, you don't have to go through

49:47

all of the government-red tape, and

49:50

the solar panels communicate with the

49:52

local electrical grid using

49:55

blockchain as a way to

49:57

actually preserve a

49:59

house. historical record of the capabilities

50:01

of the production of that system so

50:04

first of all the right people get paid but

50:06

also that's more valuable to the

50:08

grid to know what that system is capable

50:10

of historically so

50:13

and then also to move more intelligence

50:16

away from a command and control structure

50:19

and move more intelligence into the nodes

50:22

around the electrical grid so if

50:24

there's a terrorist attack or if there's a

50:26

natural disaster it becomes far

50:28

more resilient more secure

50:31

more fault tolerant and

50:33

it's able to respond much faster

50:35

than one person like Homer Simpson

50:37

you know watching a dial and

50:40

adjusting levers and knobs, you

50:42

can actually automate a lot of that stuff

50:45

in the way that a lot of the

50:47

the internet the backbone of the internet is

50:49

actually capable of uh... self-healing

50:51

and rerouting traffic uh...

50:54

you can do the same with electricity so tell

50:57

us more tell us where we can learn more about that

51:00

I, like, licensed the technology, like,

51:02

a decade ago. [unintelligible]

51:05

I'm sure it's being

51:07

used by somebody somewhere, I have

51:09

no idea. But the

51:11

the interesting thing about that is the

51:13

intersection of the blockchain from

51:16

my perspective blockchain has

51:18

a lot of potential to establish trust and

51:22

to basically provided historically

51:24

accurate verifiable information

51:27

in a way that cannot be forged

51:29

after the fact right so then

51:31

you can start to establish trust and

51:35

make use of historical information in

51:37

a way that benefits everybody i

51:40

don't know how pertinent it is but in my small town

51:42

dripping springs we have a co-op our

51:44

energy provider is a co-op okay and i'm

51:46

still learning exactly what that means but basically

51:49

they are for the grid itself and

51:52

if there's money to go back if they overcharge

51:54

me i get money back yeah that's actually better

51:56

if I don't get money back, they're doing

51:58

their job. Exactly. But it's an energy grid that

52:00

is for the community, it's powered by the

52:02

community, it employs people, and

52:05

that's how it works. And two

52:08

books that I've read in my life, The World

52:10

is Flat and Hot, Flat and Crowded, those are

52:12

a little dated. I think Hot, Flat and Crowded

52:14

was kind of like predictive to a lot of

52:16

stuff, whereas The World is Flat was

52:18

about the workforce of the world

52:20

being flattened. You could be in Dubai and

52:22

work for the change log, producing podcasts kind

52:25

of thing. And The

52:27

Hot, Flat and Crowded was this prediction that we would,

52:29

because of the energy grid and it needs to be

52:31

smart, like you talked about, we need to have more intelligence

52:33

in the grid. I totally agree with that, but

52:36

at the same time, it wasn't the

52:38

world stopping it because you see deregulation in energy,

52:40

you see randos being able to

52:43

essentially hedge energy, make a lot

52:45

of money. It's such a weird kind

52:47

of wild, wild west in a way, where

52:49

it, I don't know if it needs to be government

52:51

source, because I mean, who can trust their government as

52:53

much? Maybe it needs to be

52:56

regulated by something, but like how every

52:58

state is different in the United States in terms of how their

53:00

energy works. I don't know how it works in Nebraska, but like

53:02

everybody's got a little different. And who's

53:04

in charge of like literally upgrading the

53:07

energy grid and how do we need it

53:09

now, not like incrementally,

53:11

iteratively over the next decade. We need

53:13

it almost immediately, but who's in charge

53:15

of that? What's the consortium making that happen?

53:17

Who's agreeing on making that happen? Yeah.

53:20

So the federal government actually, here's a fun

53:22

fact. I don't know if you were familiar

53:24

with what happened a few years

53:26

ago when we had the cold snaps. I

53:29

used to work at Hewlett Packard Enterprise, HPE.

53:32

A ton of my coworkers are

53:36

out of Texas. And so,

53:38

you know, Austin, Houston, that whole

53:40

area. And a

53:42

lot of them were without power for

53:44

days and weeks. I live here. I

53:47

was one of them. Yeah. So you

53:49

know what it's like. The interesting thing about

53:51

it though is that the ERCOT

53:53

was built by the same people who

53:56

built Enron. Okay.

54:00

People whose theory

54:02

is, well, let's just

54:04

turn it into a free market experiment.

54:07

Let's see what happens. Well, the thing is,

54:09

the free market, free market pixie

54:11

dust is not a cure for every

54:13

ill. Okay? So

54:16

just because you sprinkle

54:18

free market pixie dust on

54:20

an energy grid doesn't

54:23

mean that it's always going to produce

54:25

better results, right? And in the case

54:27

of the winter, you know, a couple, few

54:29

years ago, it has disastrous

54:31

impacts, right? But the interesting thing

54:33

is, at the same

54:35

time that ERCOT was collapsing,

54:39

you have the panhandle of Texas,

54:41

you got El Paso, which are

54:44

on neighboring grids. Both

54:46

of them were sustainable, and

54:48

they survived all of those cold snaps

54:50

because they had a

54:52

larger grid, and the fundamentals of

54:54

the grid are completely different than

54:57

how ERCOT operates. So

54:59

here's the thing. The thing that

55:01

a lot of people use as a

55:03

reason not to switch over faster to

55:05

renewables, they say, oh, well, the sun

55:07

doesn't always shine, the wind doesn't always

55:09

blow. Well, here's the thing. That

55:12

only applies if you have tunnel vision.

55:14

If you're zoomed in so close to

55:16

your point of reference, that

55:19

that's the only thing you're looking at. To

55:21

zoom out, all of a sudden, you notice

55:23

that if you could look at

55:25

the entire globe at once, the sun

55:27

is always shining somewhere, and it is

55:29

always blowing somewhere. And if

55:31

you can basically generate

55:33

energy on a large enough

55:36

scale, and then ship that

55:38

energy using the magic of

55:41

science and three-phase electricity, you

55:43

can ship that energy

55:45

anywhere, I mean, within reason. For

55:48

sure. So on a large enough scale,

55:50

there is more than enough energy. There

55:54

was a study probably 20-plus years ago where

55:56

they looked at all the different wind sites

55:58

that had ever been examined

56:00

at the time they said oh

56:02

well we have more than enough energy wind

56:06

energy to power the entire

56:08

earth five times over just

56:10

on the existing sites that

56:12

we've actually surveyed and

56:14

since then you know it's just gotten even

56:16

better and more efficient so the thing is

56:18

there's a way of measuring how

56:21

much this stuff costs it's called a

56:23

levelized cost of energy LCOE right and

56:26

it's basically a survey of the

56:28

actual projects that are being built

56:30

every year and how much it's

56:33

costing and how much it's generating

56:36

and all things being

56:38

equal you can compare these things

56:40

and go oh well let me

56:42

see new nuclear is

56:44

anywhere from six times to

56:47

nine times more expensive than

56:49

new grid solar or

56:51

wind natural gas

56:54

coal oil all of these

56:56

things are many times more expensive than

56:59

even new nuclear so I

57:02

understand you know the idea of

57:04

well we can't just throw a

57:06

switch and cut it off however

57:08

every time somebody spends billions of

57:10

dollars building new nuclear I

57:12

look at that and go yeah well we could do that

57:14

in order to get X amount

57:16

of generation capacity or

57:19

if you took all that money that

57:21

was spent on nuclear or a natural

57:23

gas plant or anything else switching

57:25

it over to grid scale solar and

57:27

wind with just a little bit of

57:30

storage we could build five

57:33

ten times as much well if

57:35

we had five or ten times as

57:37

much as we actually need it that's

57:40

called over provisioning here's the

57:42

thing if you generate to five ten

57:44

times more energy than you actually need

57:47

you don't need to store nearly as much of it so

57:50

that your costs of storage actually go

57:52

down vastly but with just

57:54

a little bit of storage and a little bit of

57:56

over provisioning we could be switching

57:59

over to renewables in

58:01

a matter of just a few years. Like

58:03

literally, we have all the technology, we

58:05

don't have to have more research. Literally, we could

58:08

take off the shelf solutions right now and

58:10

it would actually save us money. Because

58:13

everywhere that this has been done, the

58:15

cost of electricity actually goes down, the

58:18

grid becomes more reliable. Look at what

58:20

Tesla did with their massive

58:23

battery that they installed in Australia. They

58:26

said, you know, we can do this in, I can't

58:28

remember what it was, under three months.

58:32

Turned out they did it in 90 days or

58:34

something like that. So they

58:36

basically said, we can stabilize the

58:38

Australian power grid during the worst of the

58:41

summer months. And it

58:43

did, they stabilized it, they were able to

58:46

install it within 90 days or

58:48

something like that. And their cost

58:50

of energy actually went down because

58:53

it's more reliable. And all of

58:55

your equipment lasts longer because of

58:57

that. All your

58:59

heavy machinery

59:01

and stuff like that. If you have

59:04

like a brown out or if you

59:06

have an interruption to your power supply,

59:08

that's really freaking expensive. Sorry,

59:10

I'm a nerd. I love a rant, I just wish

59:12

it would happen. I feel like maybe

59:14

in the case of that being effective,

59:17

you know, not necessarily at the end

59:19

of the session, but like Elon

59:21

himself might pose a risk to that because

59:24

he's so bombastic. He's

59:27

so polarizing. You want

59:29

somebody like that who's willing to be risky with

59:31

SpaceX and Tesla and Boring Company, these

59:34

things. But at the same time,

59:36

he's kind of a weirdo. You

59:39

can't really trust him very much to be

59:41

a good person. Yeah. He's fallen

59:43

in with a bad crowd. Well,

59:46

you know, I want that. I

59:49

want somebody to focus on that

59:51

and do well. My question to that plan

59:53

is how far do you have to zoom

59:56

out realistically inside the United States? Not far.

1:00:00

Because of the way our grid is set up, we have an east

1:00:02

grid

1:00:04

and a west grid and then you got ERCOT and you

1:00:06

got one or two other smaller grids along

1:00:08

the way but for the most part it's two

1:00:11

big grids. So we need a co-op

1:00:13

for all of the US. We

1:00:16

need a co-op for this for the grid

1:00:18

not against the grid. It's not for higher

1:00:20

cost of energy, it's for stabilized sustainable

1:00:23

energy for everyone and

1:00:26

it needs to almost go side by side to it and

1:00:29

incrementally replace old with new similar

1:00:32

to the way the internet is grown

1:00:34

from dial up to fiber.

1:00:37

Exactly. The problem is the

1:00:40

oldest sources of energy, coal,

1:00:43

oil, natural gas, all those guys, they

1:00:46

have permanent structural tax

1:00:49

incentives built in. Everything that

1:00:51

is remotely renewable always has

1:00:53

a sunset. It always has

1:00:56

EVs up until the

1:00:58

act that was passed by Biden within the last couple

1:01:00

of years, Chevy,

1:01:02

Tesla and a number of others, all

1:01:05

of the tax incentives went bye-bye because

1:01:08

as soon as you succeed to a certain

1:01:10

point then all the tax incentives went bye-bye.

1:01:13

So I did want to bring it

1:01:15

back to energy efficiency and

1:01:17

specifically the computing industry. I'm

1:01:19

down for that. What's that? I'm down for

1:01:21

that too. Okay. I'm talking about energy

1:01:23

efficiency myself in my home lab. Yes we

1:01:26

could build up

1:01:28

more generation capacity but

1:01:30

what if we took what we're already doing

1:01:32

and we made data centers

1:01:35

and the fabric of the internet hundreds

1:01:38

or even thousands of times more efficient than they

1:01:40

are right now. That's

1:01:42

the kind of problem that I've been wrestling

1:01:44

with for the last 10 plus years working

1:01:47

in technology. For

1:01:50

example, image optimization. Right

1:01:53

now something like three

1:01:55

quarters of all images shuffled,

1:01:57

moved over the internet are not actually

1:02:00

optimized. And so there's

1:02:02

a tremendous waste of energy

1:02:04

and compute resources and storage

1:02:07

and bandwidth just

1:02:09

with image optimization or lack thereof,

1:02:11

right? Microservice architectures, you know, very,

1:02:13

very popular. It's kind of the

1:02:16

tool that large enterprises use so

1:02:19

that they don't have to worry about back

1:02:21

end for front end, right? And

1:02:24

the problem is I had

1:02:26

a view in an app that I

1:02:28

was developing for HPE and

1:02:31

it required over two dozen

1:02:33

rest calls before

1:02:36

it could render a single view. So

1:02:39

I reimagined what that would look like if I

1:02:41

did all my data fetching and edge functions. And

1:02:44

then I optimized it, tree shook it,

1:02:46

okay? And then of course

1:02:48

encoded using Brotli. By the time it was

1:02:51

all said and done, I was

1:02:53

able to reduce bandwidth by 99.916%. So

1:02:57

that's interesting. That's a lot of percents. That's

1:03:00

a big improvement, right? 99.916, did you say? Yeah,

1:03:06

that's like almost 100%. Almost.
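To make the aggregation idea concrete, here is a minimal sketch assuming a Node-style serverless runtime; the upstream endpoints and field names are hypothetical, and many platforms apply Brotli at the CDN layer for you rather than in the handler:

```typescript
// Sketch: collapse many client-side REST calls into one compact edge response.
import { brotliCompressSync } from "node:zlib";

// Stand-ins for the couple dozen upstream calls the client used to make itself.
const UPSTREAMS = [
  "https://api.example.internal/devices",
  "https://api.example.internal/alerts",
  "https://api.example.internal/metrics",
];

export async function handler(): Promise<{ status: number; headers: Record<string, string>; body: Buffer }> {
  // Fan out in parallel, close to the data, instead of serially from the browser.
  const [devices, alerts, metrics] = await Promise.all(
    UPSTREAMS.map(async (url) => (await fetch(url)).json()),
  );

  // Keep only the fields this one view actually renders.
  const view = {
    devices: (devices.items ?? []).map((d: any) => ({ id: d.id, status: d.status })),
    openAlerts: alerts.items?.length ?? 0,
    cpu: metrics.summary?.cpu,
  };

  // One compressed payload back to the client.
  const body = brotliCompressSync(JSON.stringify(view));
  return {
    status: 200,
    headers: { "content-type": "application/json", "content-encoding": "br" },
    body,
  };
}
```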

1:03:10

But here's the thing. We still had

1:03:12

real time data and

1:03:14

we still were missing

1:03:16

out on a tremendous opportunity

1:03:19

to vastly improve the efficiency

1:03:21

of even that system. Why?

1:03:23

Because when you have lots of data changing

1:03:25

all the time, every time

1:03:28

you deliver data to the client, the

1:03:30

client has no idea if its

1:03:32

data is fresh or stale.

1:03:34

So you always have to start with

1:03:36

the assumption, if five seconds have passed,

1:03:40

stale. It's stale. So you

1:03:42

always have to call back to the

1:03:44

data source and you always have

1:03:46

to fetch and then it

1:03:48

has to exercise database queries, it has

1:03:51

to do all the things that

1:03:53

are expensive and it's

1:03:55

a huge waste of resources

1:03:58

throughout our entire industry. If

1:04:01

you take and add just

1:04:03

a little bit of intelligence and move

1:04:05

over to an event driven architecture where

1:04:07

the data source knows what the data

1:04:09

dependencies are, it knows when

1:04:11

something has changed and it pushes

1:04:15

those changes, publishes those

1:04:17

changes to let's say

1:04:19

an edge function. Instead of re-validating

1:04:21

the data source at the edge,

1:04:25

you can actually say, hey, I

1:04:27

know that you're interested in this data and by the way,

1:04:29

here's the delta, here's the

1:04:31

set of changes and by

1:04:33

the time you're done applying that

1:04:35

change at the edge, the

1:04:38

hash of what that modified resource

1:04:40

will look like is included

1:04:43

in the e-tag that

1:04:46

I'm including from the data source

1:04:48

to the edge function and the edge function can

1:04:50

do a patch at

1:04:52

the edge and then update the

1:04:54

cache at the edge and

1:04:57

then you use server sent events to go

1:04:59

from edge function to client

1:05:01

and then the client only

1:05:03

updates when there's actually a

1:05:06

useful benefit to

1:05:08

actually making a network request and

1:05:11

so you've eliminated polling, you've

1:05:13

eliminated all of this wasted

1:05:15

resources and wasted infra bills,

1:05:17

right? Your AWS cost is off

1:05:20

the charts. You

1:05:22

go from, you know, instead

1:05:24

of 99.916% reduction you

1:05:27

can take that and you

1:05:29

can eliminate every single network

1:05:31

request, every single DB op

1:05:34

that does not produce a useful result

1:05:37

and in my previous role, we figured out

1:05:39

that 99% of the time it

1:05:42

was read only, there was no mutation of

1:05:44

the data so what that means

1:05:46

is you have 99% opportunity for cache hits

1:05:50

so only 1% of that would

1:05:52

actually require a new

1:05:54

database operation and then you

1:05:56

could just push the changes to the edge

1:05:59

and update the cache right there. So

1:06:02

imagine 99.916% more efficient times

1:06:08

an additional 99% reduction in infra costs. And

1:06:13

all of a sudden, you're talking

1:06:15

thousands of times more efficient. And

1:06:18

saving yourself potentially millions of

1:06:20

dollars in AWS or other

1:06:22

infra costs. So those

1:06:24

are the types of things that I get

1:06:26

excited about because if we just

1:06:28

take the tools that are right now on the

1:06:30

shelf. That's the key word right there. If

1:06:33

we, no, tools. Tools, okay. If

1:06:35

we just take this set of

1:06:37

techniques, this set of

1:06:39

architectural patterns that have been in

1:06:42

existence for decades, we know

1:06:44

that they work and they're actually

1:06:46

easier to do now than ever before.
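A platform-agnostic sketch of that flow, with every name hypothetical; a real system would also handle ordering, reconnects, and periodic full re-validation:

```typescript
// Sketch: the data source publishes a delta plus the ETag of the post-patch
// resource; the edge applies the patch, verifies the hash, updates its cache,
// and streams the change to clients over server-sent events (SSE).
import { createHash } from "node:crypto";

type Delta = { resource: string; patch: Record<string, unknown>; etag: string };

const cache = new Map<string, Record<string, unknown>>();            // cached copies at the edge
const subscribers = new Map<string, Set<(chunk: string) => void>>(); // SSE writers per resource

function etagOf(doc: Record<string, unknown>): string {
  return createHash("sha256").update(JSON.stringify(doc)).digest("hex");
}

// Called whenever the data source publishes a change (via NATS, Kafka, etc.).
export function onDelta({ resource, patch, etag }: Delta): void {
  const current = cache.get(resource) ?? {};
  const next = { ...current, ...patch };          // shallow "patch at the edge"

  if (etagOf(next) !== etag) {
    // Cache drifted: fall back to one full re-fetch instead of constant polling.
    return;
  }

  cache.set(resource, next);                      // update the cache at the edge
  const frame = `event: patch\ndata: ${JSON.stringify(patch)}\n\n`;
  for (const write of subscribers.get(resource) ?? []) write(frame);  // push, don't poll
}

// A client subscribing over SSE just registers a writer for its resource.
export function subscribe(resource: string, write: (chunk: string) => void): void {
  const set = subscribers.get(resource) ?? new Set<(chunk: string) => void>();
  set.add(write);
  subscribers.set(resource, set);
}
```

The point of the sketch is the inversion: the client never asks whether its copy is stale; it only does work when a useful change is pushed to it.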

1:06:49

Right, but we don't have the tools. Yes

1:06:51

we do. You got them? Yeah, I've been

1:06:53

building with them for the last decade or

1:06:56

so. But you're the only ones got them. No,

1:06:59

no, no, no. There are other people who

1:07:01

do it. Give us the tools, Sam. Give us the tools, Sam.

1:07:03

Tell us. Okay, so for edge functions,

1:07:06

I'm a big fan of Vercel. And

1:07:08

the reason why I'm a big fan

1:07:10

of Vercel is because they

1:07:13

basically built the

1:07:16

same exact infra that I've

1:07:18

built at previous roles. Where you

1:07:20

evaluate best of breed and in

1:07:22

the case of edge workers, CloudFlare,

1:07:25

those guys, are

1:07:27

amazing. All of their

1:07:30

isolates mean no cold

1:07:32

starts. You don't have to wait, you know, dozens

1:07:35

or hundreds of milliseconds for

1:07:37

your edge function to spin

1:07:39

up. It's ready to go right there.

1:07:42

And so it's fundamentally more efficient

1:07:44

if you've got easy access

1:07:47

to your data and

1:07:49

there are ways to architect that, whether you're

1:07:51

using CloudFlare, R2, D1, any

1:07:55

of that, PlanetScale. There

1:07:57

are ways for you to move your

1:07:59

data. into an event-driven

1:08:01

architecture. Postgres has triggers. You

1:08:05

know, a ton of different message mesh solutions. You

1:08:10

know, you've got nats.io, you've

1:08:12

got Red Panda and Kafka

1:08:14

and RabbitMQ. You've

1:08:16

got a ton of different options out there to be able

1:08:18

to really wire up all

1:08:21

of these parts without having to reinvent the

1:08:23

wheel, without having to do everything. Let me change my

1:08:25

one word then, sorry. Because I know that all these

1:08:28

tools exist. The word is not tooling,

1:08:30

the word is packaging. Gotcha.
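As one concrete example of wiring those off-the-shelf parts together (a sketch only, with made-up table, channel, and subject names), a Postgres trigger can publish every mutation to a NATS subject that edge caches subscribe to:

```typescript
// Sketch: turn Postgres writes into events on a NATS subject, so caches are
// told about changes instead of re-querying the database.
import { Client } from "pg";
import { connect, StringCodec } from "nats";

const SETUP_SQL = `
  CREATE OR REPLACE FUNCTION notify_change() RETURNS trigger AS $$
  BEGIN
    PERFORM pg_notify('data_changes',
      json_build_object('table', TG_TABLE_NAME, 'id', NEW.id)::text);
    RETURN NEW;
  END;
  $$ LANGUAGE plpgsql;

  DROP TRIGGER IF EXISTS readings_notify ON readings;
  CREATE TRIGGER readings_notify
    AFTER INSERT OR UPDATE ON readings
    FOR EACH ROW EXECUTE FUNCTION notify_change();
`;

async function main(): Promise<void> {
  const pg = new Client({ connectionString: process.env.DATABASE_URL });
  await pg.connect();
  await pg.query(SETUP_SQL);               // install the trigger
  await pg.query("LISTEN data_changes");   // receive its notifications

  const nc = await connect({ servers: process.env.NATS_URL ?? "localhost:4222" });
  const sc = StringCodec();

  // Every insert/update now becomes a published event that downstream
  // consumers (edge functions, cache invalidators) can react to.
  pg.on("notification", (msg) => {
    nc.publish("data.changes", sc.encode(msg.payload ?? "{}"));
  });
}

main().catch(console.error);
```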

1:08:32

And so, because you're talking about architecture,

1:08:34

you're talking about a practice, a technique,

1:08:37

that you can use tools in order to accomplish. Yeah.

1:08:40

Right? But that technique has to be packaged. That's

1:08:42

why I said tooling isn't the word. Gotcha. For

1:08:45

you out of the box, blammo. What you're saying

1:08:47

is really a SAS that makes it

1:08:49

an easy button for people. Doesn't necessarily

1:08:51

have to be a SAS, but something

1:08:53

that says, hello, web developers in the

1:08:55

world. Here's a much better way of

1:08:57

doing it. Gotcha. And you describe it.

1:08:59

Yeah. And then you say, and here's how

1:09:01

you do it. Out of the box, it just works. Yeah. Like

1:09:04

that's how you get that technique, which has

1:09:07

to be moved around, permeate the

1:09:09

industry for it to actually have the huge order

1:09:11

of effects that you'd like to see, right?

1:09:13

Yeah. Not just at HPE, but

1:09:15

at every shop. Exactly. Exactly. And

1:09:17

that requires not my original word, tooling, but

1:09:20

packaging of the tools that are existing.

1:09:22

Yeah. And like, education, and here

1:09:24

it is. Yeah. Is

1:09:26

there anything like that, like a stealth mode startup

1:09:28

or what's going on? Well, I can tell you,

1:09:31

but then. But then he wouldn't, then he'd be

1:09:33

out of stealth mode. Exactly.

1:09:35

So let's just say that

1:09:37

the next six months to a year

1:09:39

are going to be very interesting. Okay.

1:09:42

Okay. Now we're hopeful. Trust

1:09:45

in something. We totally trust you. Well,

1:09:49

okay. So let's put it this way. What I've just

1:09:51

described to you is I'm

1:09:54

not inventing anything

1:09:56

terribly unique by describing what

1:09:58

I just described to you. guys. This

1:10:01

is a set of patterns that I think a

1:10:03

lot of people are very well familiar with, but

1:10:06

you're right. There is no easy button. It

1:10:09

takes a lot more work and

1:10:11

effort and experimentation and profiling,

1:10:15

and there are a lot of foot guns. There are a lot

1:10:17

of ways that even if you're

1:10:19

doing everything right, except for

1:10:21

one or two things, instead of

1:10:23

being a cost savings, it can be a

1:10:26

cost multiplier. So yeah, individual results will vary,

1:10:28

but I suspect it's about to get easier

1:10:30

within the next year. Okay. Oddly

1:10:32

enough we think about this as a podcast because we

1:10:34

build our own platform and we think about CDNs

1:10:37

and delivering MP3s around the

1:10:39

world and how to do it well. We've

1:10:42

been with Fastly for many years,

1:10:44

we're considering a change. We've even

1:10:46

considered on a podcast building our own CDN. Yeah,

1:10:48

I've done that. But this is kind of like

1:10:50

in the similar vein, and I agree with you.

1:10:52

I think if we had more efficiency, it's interesting

1:10:54

to think about the client not pulling, but the

1:10:57

edge pushing, the whole push mechanism,

1:10:59

because that's where the intelligence is at

1:11:01

of the data being changed or not.

1:11:03

Because it knows every time

1:11:05

data is added. Right? Exactly.

1:11:07

Every time it writes new data, it

1:11:10

knows there's change somewhere to a client.

1:11:13

That seems very smart to me, but like Jared

1:11:15

said, how do we buy the package? Yeah.

1:11:18

No, really, how do we buy the package? In six months

1:11:20

or so, it can be very interesting. Let's leave it right

1:11:22

there as a tease. Thanks, Sam. This has been awesome. Appreciate

1:11:24

it. All right. That's been fun. Thank you. Yeah, it's been

1:11:26

a lot of fun. Thank you. What's

1:11:51

up, friends? This episode of Change

1:11:53

log & Friends is brought to you

1:11:55

by our friends over at Vercel.

1:12:00

product. Lee, I know

1:12:02

you know the tagline for Vercel, develop

1:12:04

preview ship which has been perfect but

1:12:06

now there's more after the ship process.

1:12:08

You have to worry about security, observability

1:12:10

and other parts of just running an

1:12:12

application production. What's the story there? What's

1:12:14

beyond shipping for Vercel? Yeah, you know,

1:12:16

when I'm building my side projects or

1:12:18

when I'm building my personal site, it

1:12:20

often looks like develop preview ship, you

1:12:23

know, I try out some new features,

1:12:25

I try out a new framework, I'm

1:12:27

just hacking around with something on the

1:12:29

weekends, everything looks good, great, I ship

1:12:31

it, I'm done. But as we talk to more

1:12:33

customers, as we've grown as a company, as we've

1:12:35

added new products, there's a lot more

1:12:37

to the product portfolio of Vercel nowadays

1:12:40

to help pass that experience. So when

1:12:42

you're building larger, more complex products and

1:12:45

when you're working with larger teams, you

1:12:47

want to have more features, more functionality.

1:12:49

So, tangibly, what that means is features

1:12:51

like our Vercel firewall product to help

1:12:53

you be safe and to have that

1:12:55

layer of security. Features like our logging

1:12:57

and observability tool so you can understand

1:12:59

and observe your application and production, understand

1:13:02

if there's errors, understand if things are

1:13:04

running smoothly and get alerted on those.

1:13:06

And also then really an expansion of

1:13:08

our integration suite as well too because

1:13:11

you might already be using a tool

1:13:13

like a Datadog or you might already

1:13:15

be using a tool at the end

1:13:17

of this software development lifecycle that you

1:13:19

want to integrate with to continue to

1:13:21

scale and secure and observe your application.

1:13:23

And we try to fit into those

1:13:25

as well too. So, we've kind of

1:13:27

continued to bolster and improve the last

1:13:30

mile of delivery. That

1:13:32

sounds amazing. So, who's using the Vercel platform

1:13:34

like that? Can you share some names? Yeah,

1:13:37

I'm thrilled that we have some

1:13:39

amazing customers like Under Armour, Nintendo,

1:13:41

Washington Post, Zapier, who use

1:13:44

Vercel's running cloud to not only help

1:13:46

scale their infrastructure, scale their business and

1:13:48

their product, but then also enable their

1:13:51

team of many developers to be able

1:13:53

to iterate on their products really quickly

1:13:55

and take their ideas and build the

1:13:57

next great thing. Very cool. With zero

1:14:00

configuration for over thirty five

1:14:02

frameworks, Vercel's frontend cloud

1:14:04

makes it easy for any team

1:14:07

to deploy their apps. So get

1:14:09

started with the fourteen

1:14:11

day free trial of Vercel Pro,

1:14:13

or get a customized enterprise

1:14:16

demo from their team. Visit

1:14:18

vercel.com/changelogpod to get

1:14:20

started. That's vercel.com/changelogpod.

1:14:32

Next up we are speaking

1:14:34

with Jess from Coder

1:14:36

Coder, a YouTube channel that's

1:14:38

filled with practical tips for

1:14:40

the beginner web developer. What's

1:14:54

your favorite thing in life? That's

1:14:58

a deep question to ask.

1:15:00

Off the top of my head, tiramisu.

1:15:02

Yeah. My

1:15:05

wife loves it too, so it's good. I

1:15:07

had a college roommate who is from

1:15:09

Italy and she made, like, tiramisu

1:15:11

with, like, the real

1:15:13

espresso and everything, it's like really good.

1:15:15

So that set the bar, and every other tiramisu

1:15:17

since then falls a little short? So

1:15:19

it's almost... yeah, yeah. But you

1:15:21

still enjoy them, you can still eat them?

1:15:23

Yeah, I've had better in my life.

1:15:25

The thing about tiramisu, this is via

1:15:27

my wife who's told me this, and

1:15:29

she's made her own, is they're underappreciated.

1:15:31

It's a very complicated thing. Or

1:15:33

they're not complicated; complex, perhaps.

1:15:35

Yeah, not the kind of dessert

1:15:37

that just anybody can whip up. You

1:15:39

know, there's a lot of different steps

1:15:41

in there. And the thing on

1:15:43

the top, right? Am I making

1:15:45

this up, what goes toward the top of it,

1:15:47

is that the one that has... I'm

1:15:49

really only aware of, as part of

1:15:51

it, the primary ingredient, what do you call it, the

1:15:54

ladyfinger? Yeah, cookies or

1:15:56

whatever. So then rum, rum is

1:15:58

very important. Yeah. If

1:16:00

that's true, I mean, it's like rum. Yeah, I

1:16:02

love rum. I was like married

1:16:04

in Jamaica. Have to. Some kind of cream on

1:16:06

top. Right. And also espresso. I

1:16:09

never made it. So like I don't know how to make

1:16:11

it. I just eat it. You just like it. Yeah. I

1:16:13

love espresso too. So coffee, sweet, rum,

1:16:17

and pastry, right? Is

1:16:19

like a pastry around it? No.

1:16:22

Isn't like a lady finger a pastry thing? What

1:16:24

you mean by a pastry? Lady fingers are soaked

1:16:26

in rum. So they're basically like a cake. Yeah.

1:16:29

I'm way off then. It's not your thing in a

1:16:31

creme brulee. Yeah, I'm not thinking of no, not this

1:16:33

case. No, you're not. I was thinking like a cream

1:16:35

horn, which is like a pastry with cream

1:16:37

in it. And we've called those. I've

1:16:39

heard them called lady fingers before. Oh, maybe there's

1:16:41

multiple times a lady finger. Like an Eclair type

1:16:43

thing maybe? Yeah, like in a creme. But smaller.

1:16:45

You have bigger ones. It's kind of like a

1:16:47

cannoli too. It's similar to a cannoli. It's always

1:16:50

great. But not the exact same shape wise, but

1:16:52

the pastry is different than a cannoli. Anyways,

1:16:55

that got me excited. I would

1:16:57

love to have me some tiramisu. We're

1:16:59

a cannoli. We're a lady finger. Well, let's talk

1:17:01

for 90 minutes about different desserts. Right? No, no,

1:17:04

no. This is just for the fun, right? That

1:17:06

was an interesting answer to what's your favorite thing

1:17:08

though. Tiramisu. That's a good answer.

1:17:10

I didn't want to ask her about her breakfast. Because we

1:17:12

all had the same breakfast. Yeah, it's like, you know. That's

1:17:15

exciting. If you're excited about breakfast, you like breakfast?

1:17:17

I love breakfast food. If

1:17:20

we laid down the best breakfast for you right

1:17:22

now, what would it be? Like the one, like

1:17:24

you're dying tomorrow. This is the last breakfast. What

1:17:26

would it be? I'd say corned beef hash. OK.

1:17:29

But it has to be sort of crunchy

1:17:31

on one side. Because it's like,

1:17:33

you know, been seared. And

1:17:35

then eggs. Yeah, of

1:17:37

course. I like poached eggs. Same.

1:17:40

Yeah. And then

1:17:42

some English breakfast tea

1:17:45

with milk and sugar. Wow. Little

1:17:47

pastries? I'm

1:17:50

not a huge like pastry person

1:17:52

normally. Yeah. What about croissants? I

1:17:55

like croissants. I don't know

1:17:57

anybody who dislikes them. Yeah, I guess it's

1:17:59

a pastry. You don't eat them because you don't want to have carbs

1:18:02

or gluten or something, but no one's like croissants.

1:18:04

Those are terrible So we've established you

1:18:06

have good taste in food. You do have good taste So

1:18:08

what else do you have good taste in like let's talk

1:18:10

about software and tech and stuff like that Like what are

1:18:13

you into? I mean on my youtube

1:18:15

channel? I like pretty much deal with like

1:18:17

the basics HTML CSS a

1:18:19

tiny bit of JavaScript. Yeah, but it's really

1:18:21

just about trying to Talk

1:18:24

about practical things So it's like all of the things

1:18:26

that I wish I had known when I was just starting

1:18:28

out because I'm a self-taught developer I didn't

1:18:30

like get a CS degree or whatever Yeah And

1:18:33

so learning for me and I learned on

1:18:35

the job because I got a job first

1:18:37

off a Craigslist and then I landed a

1:18:39

job A couple years later at an advertising

1:18:42

agency. So I had to learn it

1:18:44

kind of a breakneck pace You

1:18:46

know while I'm frantically googling trying

1:18:48

to meet my deadlines So it's just

1:18:51

kind of like educating people who are trying to

1:18:53

get into the field with the things that I

1:18:55

wish that you know I had known when I

1:18:57

was starting out to hopefully make it a little

1:18:59

easier and less painful for them So

1:19:01

in terms of tech stacks, I'm not really you

1:19:04

know up to date on sure the hottest

1:19:07

Technologies and stuff. I kind of deal with like

1:19:09

the basics I think there's not enough people talking

1:19:11

about the I don't want to say this necessarily

1:19:13

boring stuff like the practice Right. It's kind of

1:19:15

in a vein. You might think oh, that's not

1:19:17

as cool because you just said that kind of

1:19:19

yourself What would you consider practical

1:19:21

then like give me an example of some recent

1:19:23

videos you've done? It's been practical knowledge you wish

1:19:25

you had when you first started. I think

1:19:28

some of it is research and

1:19:30

then problem solving so I started

1:19:33

making some videos where I'm literally just building

1:19:35

a website from like a design file or

1:19:38

whatever and talking through my thought

1:19:40

process and being willing

1:19:42

to show in the video like the things that

1:19:44

I get stuck on and You

1:19:47

know, there's some things that had to be edited out if I'm

1:19:49

just spending 45 minutes like reading the

1:19:51

documentation and like yeah trial and error kind

1:19:53

of thing but that kind

1:19:55

of learning how to problem solve seems to be

1:19:57

a skill that I think a lot of People

1:20:00

starting out don't know how to develop,

1:20:02

so I think it's helpful for them to sort of see

1:20:04

that in action. So yeah, I

1:20:06

would say problem solving is kind of a good

1:20:09

practical skill to have. For sure. So

1:20:11

Adam, to give you a little bit of background that

1:20:13

I got before. Please do. On

1:20:16

the mic here, Jess has built a channel

1:20:18

since 2017. Called

1:20:20

Coder Coder. And she's built

1:20:22

it to almost 500,000 subscribers. Dang!

1:20:26

And her husband is her editor.

1:20:28

Yeah. So this is

1:20:30

now sustainable. It's not like a software

1:20:32

engineering salary, but it's enough that she

1:20:34

can do it. And it can

1:20:36

sustain her. And pretty

1:20:38

cool. That is pretty cool. I did subscribe. I haven't watched any

1:20:41

of your videos, so I just met you five minutes ago. That's

1:20:43

very cool. But I did notice you got the classic,

1:20:46

mouth wide open, excited thumbnail.

1:20:49

And I love to

1:20:52

hear your thoughts on your thumbnails. It seems

1:20:54

like most YouTubers I meet, they're just doing

1:20:56

it, because they feel like they have to.

1:20:58

Is that pretty much you? Yeah. I mean,

1:21:01

it works. I do try to

1:21:03

not get overly

1:21:05

sensational, like Mr. Beast

1:21:07

style. But I think having the

1:21:09

picture of yourself, it

1:21:13

works for people who recognize your channel.

1:21:16

Because I was trying some A-B testing with

1:21:18

thumbnails, and the ones that I'm not in

1:21:20

in the thumbnail, they don't click this much. So

1:21:23

yeah. That's what people say. You gotta get

1:21:25

that lizard brain emotion to get people to

1:21:28

click on your video. It's

1:21:30

interesting how the algorithm wins there for creators.

1:21:33

Because there's actually, if you paid attention to this,

1:21:35

there's like a revolt. A lot of

1:21:38

creators are stepping away from YouTube. They have been

1:21:40

there for sometimes a couple years,

1:21:42

maybe even a half a decade or longer,

1:21:45

because of the treadmill of YouTube.

1:21:47

And some of it's YouTube's fault,

1:21:50

and some of it that they feel like

1:21:52

they have to create content that serve the

1:21:54

algorithm, not so much their creative

1:21:57

creativity, what they think should

1:21:59

exist. And so they almost have

1:22:01

to like this one in particular He runs

1:22:03

his name's Caleb and he runs DSLR video

1:22:05

shooter And I've been paying attention for years

1:22:07

because I kind of get into video and

1:22:09

photography and it's helped us over the years

1:22:12

And I like the guy a lot. I

1:22:15

respect his work big time. He's got great

1:22:17

opinions. And so he's got videos on super

1:22:20

simple YouTube you maybe even see him over

1:22:22

the years. He

1:22:25

was like I I just would procrastinate

1:22:27

on my videos because I would like

1:22:29

overly make them perfect To

1:22:32

not have to ship it because

1:22:34

my identity would then be rooted in

1:22:37

its reach. I did

1:22:39

this one flop I put this

1:22:41

effort into it and he was just saying

1:22:43

how the input doesn't always match the output

1:22:45

right that he desires Because

1:22:48

he thinks you know Creatively, this

1:22:50

is what I want to put out there and sometimes The

1:22:53

algorithm is kind of in control. So you mentioned the

1:22:55

thumbnail, like he's got good thumbnails too, but you kind

1:22:57

of have to fall into this algorithm-

1:23:00

serving trap. Are

1:23:02

you feeling any of that? Do you feel that at all?

1:23:04

Yes, I understand that I Take

1:23:08

kind of an opposite approach I

1:23:11

feel like a lot of the creators who are stepping

1:23:13

down are because you know They've done it for 10

1:23:15

years, like maybe it's time to move on to something

1:23:17

new. But there's definitely, like, a treadmill mindset that

1:23:19

I personally try to not sort of

1:23:21

be Driven

1:23:24

by just because I'm trying to

1:23:26

do it as sustainably as possible Like

1:23:29

I don't want to burn out so I

1:23:31

actually don't upload very often. So I upload

1:23:33

maybe every couple weeks or

1:23:36

months Recently, but I

1:23:38

did take a hiatus of like nine months

1:23:40

cuz I'm working on finishing this course that

1:23:42

I have So I feel like you

1:23:45

can get on that treadmill and feel like you have to churn

1:23:47

out content every week But I think it's

1:23:49

possible to make it work without doing that.

1:23:51

What did you find during that nine months?

1:23:54

Did you find stagnation? Did you lose subscribers

1:23:56

or did you when you when you posted

1:23:58

that first one back? Was it

1:24:00

bigger, smaller? Like, did it

1:24:02

feel like it had real ramifications on your channel or

1:24:04

was it just like, nah, you can take nine months

1:24:06

off, there's still video. I think I

1:24:09

did have some slowdown in

1:24:11

the views for the first few videos,

1:24:14

but then I released another video that has

1:24:16

done really well, so it's really

1:24:18

hard with YouTube because you can't truly

1:24:20

A-B test something because a video

1:24:23

might not succeed and a lot of people kind

1:24:25

of blame the algorithm, they're like, oh, I'm getting

1:24:27

shadow banned or whatever, but in all

1:24:29

honesty, it's like, the video kind of sucked. So

1:24:32

I had to look back at the videos that weren't

1:24:34

doing well and be like, you know what, those videos

1:24:36

kind of sucked. And so you kind of learn from

1:24:38

that and you move on and then you do better,

1:24:40

so I think I'm doing okay, taking

1:24:42

a nine month break. Obviously I'm

1:24:44

losing a lot of views in that meantime,

1:24:46

but I think YouTube is

1:24:48

actually one of the more forgiving platforms

1:24:51

where you don't have to necessarily keep churning

1:24:54

that content and running on that

1:24:56

treadmill. So yeah.

1:24:59

How do you feel Jared about our treadmill?

1:25:02

So we just posted a few. I pay attention to the stats

1:25:04

less, it doesn't make me

1:25:06

get more or less excited about what we're doing,

1:25:08

although I do pay attention to, okay, that one,

1:25:11

we look back at stats and we're sort of like, okay,

1:25:13

that one trended higher than others. How

1:25:15

do you, what's your lens on? So we're

1:25:17

not putting a lot of work into our

1:25:19

stuff, it's all side effects of our podcast.

1:25:21

Like we're not crafting videos, we're making clips,

1:25:23

you can see them here. So it's people

1:25:25

talking with captions, right? So we're putting work

1:25:27

into it so far as we're taking interesting

1:25:29

parts of our podcast and putting them in

1:25:31

a video, completely different kind of channel, right?

1:25:34

I find that if we post

1:25:36

consistently daily a clip, like a clip a

1:25:38

day, basically five days a week, maybe

1:25:41

on a Saturday if I'm bored, that

1:25:43

everything goes better and if we don't,

1:25:45

then everything just kind of chills out.

1:25:48

I'm just figuring that's the way the algorithm wants you to

1:25:50

post more, so it's easy for us, because again,

1:25:53

they're just clips. But I Don't even

1:25:55

know how much of that's just my intuition or accurate

1:25:57

or wrong, but that's just the way it feels. It

1:26:00

might be the audience that is kind of

1:26:02

trained to keep watching. I guess you

1:26:04

are uploading clips every day and like your

1:26:06

audience is kind of expecting that, right? And those

1:26:08

are all the people following, and then if

1:26:10

you don't, you sort of don't

1:26:12

meet those expectations anymore and things slow

1:26:15

down a bit. Yeah. Well, a lot of

1:26:17

our stuff is short, and I know that

1:26:17

like ten minutes people consider to be like

1:26:19

the right length of a YouTube video, and

1:26:21

I have noticed we'll post the longer ones,

1:26:23

like, long for us is five minutes; a

1:26:25

lot of others are inside sixty seconds, or just over

1:26:27

a minute. We put them on Shorts and Instagram; if

1:26:29

over sixty seconds, it's horizontal, and for

1:26:32

the main feed, so maybe second class, and you're not

1:26:34

gonna get a lot of watch time, because

1:26:36

even a full watch isn't that

1:26:38

long. I know they look at watch time

1:26:41

quite a bit, like the ones that are

1:26:43

longer generally, that's a big deal for the

1:26:45

algorithm. Yeah, like what percent did you watch? Yeah,

1:26:48

that's a big deal, right? That's engagement, so

1:26:50

that's good for shorter content; it's

1:26:52

more likely to get pushed than the longer ones,

1:26:54

and they do have better watch time

1:26:57

in there. But we're counting hundreds and

1:26:59

thousands of watches, not huge amounts,

1:27:01

so also, like, sample size: is

1:27:03

it even a big enough sample size

1:27:06

to be meaningful? And, all right. Well,

1:27:08

there's a lot of clips. There's some that

1:27:10

went viral on different platforms for us

1:27:12

here and there, but I think

1:27:15

he even wrote a post, where the

1:27:17

theme, part of it, was we've never gone viral,

1:27:19

something like that, that I love. The

1:27:21

truth is, we've never really gone viral.

1:27:24

Podcasts don't really do what YouTube

1:27:26

does, what TikTok does, yeah, Instagram.

1:27:31

It's always slow and steady, and I

1:27:33

think people get burnt out a lot

1:27:35

because they don't see the impact of

1:27:37

their work, and with podcasts you don't much either. YouTube

1:27:39

is really helpful in that way.

1:27:41

I'm sure you get the

1:27:43

comments, the watches, and

1:27:45

immediate feedback of like, yeah, people are

1:27:47

watching my stuff. With podcasts you only

1:27:49

get that so much, and so, with

1:27:51

all the work... And so the burnout,

1:27:53

part of it, it's because you put in the work and

1:27:56

we don't see the impact as well; there's

1:27:58

a disconnect with the audience that the platform plans for, to

1:28:00

really grease those skids for you, which is

1:28:02

great. Yeah, for sure. And so the post

1:28:04

I put out was basically encouraging podcasters that

1:28:07

just because you're not having this huge impact

1:28:09

in terms of numbers that you can see,

1:28:11

there's still a depth there that's really meaningful.

1:28:14

And so that's what the post is about, but it's easy to

1:28:16

burn out in the podcasting game because it's a lot of

1:28:18

work and because you don't necessarily know if

1:28:20

you have an audience or not. Yeah. It's

1:28:23

a bit more disconnected. Yeah. And even

1:28:25

if you have an audience, you're probably not hearing from them

1:28:27

very much because they have to email you or follow you

1:28:29

on social media and these kinds of things

1:28:32

where it's really nice to have the comment threads right

1:28:34

there with the video, you know, and this is very

1:28:36

nice interaction with your audience that way. For

1:28:38

sure. What makes you do it? Like,

1:28:41

why do you do YouTube at all? Yeah.

1:28:44

So I worked in marketing and

1:28:46

advertising for several years and I

1:28:48

loved it. I liked learning, but I

1:28:50

felt like at least toward the end of the

1:28:53

time working a regular corporate job that I was

1:28:55

just spending a lot of my time filling

1:28:57

these like marketing landing pages. And

1:29:00

if I do a good job, I make the company money, but

1:29:03

I'm not really necessarily benefiting from

1:29:05

that. And I don't feel like I was really

1:29:07

helping people because I'm just like encouraging people to buy something.

1:29:11

So making content that's educational and

1:29:13

can give people, you know, marketable

1:29:16

skills has been way more

1:29:18

satisfying. And also, like you said,

1:29:20

with the comments, getting direct feedback, like, you know,

1:29:23

I've gotten comments from people who've said I've helped

1:29:25

them like get a job and like now they

1:29:27

are working as a software engineer and like that's

1:29:29

like incredibly like motivating and

1:29:32

like it's I feel like I am

1:29:34

actually helping people, you

1:29:36

know, in the little space that I have.

1:29:39

And so like that's been good. And

1:29:41

yeah, I just enjoy helping people get

1:29:44

from a point of like not understanding

1:29:46

something to understanding something and like helping

1:29:48

them achieve excellence in a certain skill

1:29:50

set. So you said when you look

1:29:52

back at some of your videos that didn't

1:29:54

do as well, it's because they sucked. And

1:29:57

Then you have some that do well, and you think they're probably better. So

1:29:59

what makes a good video for

1:30:01

your channel, you know, specifically, versus

1:30:04

the sucky ones? What do you think? A

1:30:06

good one, I think, is,

1:30:08

you need to be giving value

1:30:11

to your audience. So I think

1:30:13

success on YouTube, and probably

1:30:15

any kind of content creation, is like, understand

1:30:17

your audience: what are their struggles, what are

1:30:19

they trying to do, and speak

1:30:21

to those specific pain points. And I

1:30:23

think the videos that haven't done as

1:30:26

well have been too

1:30:28

focused on me. For example, like we

1:30:30

made a video a couple years ago

1:30:32

where it was like, office tour, or like

1:30:34

setup tour, gear that I use, and

1:30:36

like that didn't do very well, which was

1:30:38

surprising, because that was a trend.

1:30:40

Yeah. I'm glad you tried it, experimented. Yeah, I

1:30:42

think my audience is not there to

1:30:44

see my stuff; they're there to ask,

1:30:47

you know, can

1:30:49

you teach me something new, right? That's

1:30:51

more for like a lifestyle influencer,

1:30:53

not so much a teacher,

1:30:55

because you're a teacher, in effect. Yeah,

1:30:57

yeah. So I think if my channel

1:30:59

was like, you know, focused on different

1:31:01

keyboards or gear, then like that would

1:31:03

make a lot more sense, but I

1:31:05

think that didn't connect to what

1:31:08

my normal niche is. Exactly. Yeah, you

1:31:10

want to make your viewer the

1:31:12

hero of their story. You don't want

1:31:14

to put the focus on yourself as

1:31:16

a creator, so it's like you are

1:31:18

helping your viewers on their journey to

1:31:20

at least in my channel's case,

1:31:23

become a web developer or get better in

1:31:25

your career. So it's a matter of

1:31:27

understanding what your audience is hoping to

1:31:29

see and it can be difficult

1:31:31

or maybe even limiting sometimes because

1:31:33

I do think that you tube

1:31:35

is not super forgiving when it

1:31:37

comes to being experimental. So I actually

1:31:39

just created a second channel, because

1:31:41

I have two types of

1:31:43

videos: one type is

1:31:45

shorter, like super edited, sort of

1:31:47

tutorial; the other type was

1:31:49

like four to eight hour long

1:31:51

live coding, like where I was building

1:31:53

a website from scratch, and

1:31:55

like it's not super edited.

1:31:57

Those words, it's not doing well And. My

1:32:00

theory that I'm still in progress is

1:32:02

that the two formats are too different.

1:32:04

So I'm putting all the long videos

1:32:06

on a new channel, hoping that it'll

1:32:08

attract an audience who are looking for

1:32:10

those kinds of videos, and sort

1:32:13

of keep the main channel focused.

1:32:15

So yes, interesting, because I think

1:32:17

you often, Adam, have talked about YouTube

1:32:19

people being able to put everything on

1:32:21

one channel, versus splitting it up, with

1:32:23

people maybe thinking that's not

1:32:25

working, or we've seen them experiment on

1:32:27

the channel and then they would say,

1:32:29

if you like this, I'm now creating a new

1:32:31

channel for that. I've

1:32:34

seen that happen too, but I've also seen, you

1:32:36

know, like three or four different format styles within

1:32:38

a single channel succeed, or two here. So

1:32:40

there's really no recipe. It's like, this is the

1:32:42

way. Yeah, you know. But I do agree with

1:32:45

that. Like, in the case of

1:32:47

like freeCodeCamp, for example, I've talked with

1:32:49

Quincy Larson about it, they have courses

1:32:51

on their, you know, their YouTube,

1:32:53

and it's like, oh yeah, it didn't happen overnight,

1:32:56

it's taken a long time

1:32:58

to get there, it has, but they

1:33:00

focus on like this super long format, and

1:33:02

wasn't it like concise little, you know, educational things?

1:33:05

It was like, oh, these twelve hour

1:33:07

courses with chapters, and like he was like,

1:33:09

no, no, no, this long format is thriving

1:33:11

there. I've forgotten now why, I

1:33:14

can't recall what he said, but I was surprised

1:33:16

by that, because I didn't expect that

1:33:18

long of a format. And I

1:33:20

guess in their case, compared to yours, it's

1:33:22

that there's a bit more curriculum,

1:33:24

and it's edited, probably, to some degree, at least

1:33:27

chapters, whereas maybe, you know, the four-to-eight-hour

1:33:29

one is a bit more like a livestream.

1:33:32

It's sort of like, you're exhausted, it's almost

1:33:34

a byproduct, but it's still

1:33:36

valuable, and she would probably still share it,

1:33:38

because you get to see everything

1:33:40

versus like this zoomed-in version of

1:33:42

the problem only. Right, exactly, it's

1:33:44

like a bird's eye view. Now,

1:33:46

yeah, the livestreams, those, are you

1:33:48

just recording yourself, or are they

1:33:50

pre-recorded? Yeah, and I don't really do

1:33:53

live streaming, and I don't really understand

1:33:55

livestreams. For me, I suppose, I

1:33:57

get it for the community, the personal aspect,

1:33:59

but like, it seems ephemeral

1:34:01

like Twitter spaces even or x spaces

1:34:03

well Yes, and no because

1:34:06

you can there's tools in order to capture

1:34:08

them and turn that into something else. I

1:34:11

think that's what a lot of people do, they live stream on Twitch, and

1:34:14

then they'll pull sections out of that Mm-hmm

1:34:16

into YouTube and you know that kind of thing

1:34:18

seems to work It seems really

1:34:21

good if you have a tight-knit community people who

1:34:23

like to hang out and talk and hang out

1:34:25

with you as a personality As

1:34:27

a viewer. I don't have any time for

1:34:29

that like synchronously watch somebody else code like

1:34:31

no I would love to have time to

1:34:34

watch me code That's

1:34:36

just where I am in my life. Yeah. But for

1:34:38

young people especially if you're learning and you're learning a

1:34:41

lot from somebody, watching them code live, it can

1:34:43

be very valuable. Yeah, exactly. As

1:34:45

a young person getting in, if I

1:34:47

could find like somebody I resonated with, yeah, somebody I'm gonna

1:34:49

identify with, and I can like just be

1:34:51

a fly on their shoulder, as they all say, you

1:34:53

know, like that kind of thing, or the fly on

1:34:55

the wall. That's kind of what Twitch is,

1:34:57

a fly on the wall with a microphone. A fly with a microphone. And

1:35:00

that's kind of what Twitch is, and in that case

1:35:02

then I can see a lot of value, but

1:35:05

the live streaming thing is just It's

1:35:07

not my game personally But as a young person if I

1:35:09

was trying to get in or I was trying to learn

1:35:11

I would want to see like Caleb Like

1:35:13

I mentioned him really I would totally be like Let

1:35:16

me just just live stream your whole

1:35:18

YouTube set up that you've just done Like

1:35:21

don't give me the video give me the behind the scenes

1:35:23

of the video and don't even worry about editing it I

1:35:25

want to see you turning those knobs. I

1:35:27

want to see you attaching the thing I want to see how

1:35:29

it works not the finalized

1:35:32

thumbnail that YouTube blesses as

1:35:35

Good for the algorithm like give me the

1:35:37

unpolished version of it. Yeah, I think I

1:35:39

just don't have the personality to be a live

1:35:41

streamer I feel like you need to really

1:35:43

lean into the entertaining side. Yeah. Yeah, I think

1:35:46

successful live streamers are definitely

1:35:48

more entertainers than anything else

1:35:51

not that they aren't good at their coding or what they're doing But

1:35:54

you have to show folks they are they put on a

1:35:56

show yeah, and they can do it for long periods of

1:35:58

time. I mean, it's very impressive in some cases

1:36:01

that you can command an audience for that long. For like

1:36:03

eight hours a day? Yeah, for like eight hours a day,

1:36:06

you know, with Vim and Tmux or something.

1:36:08

Like that's impressive. But

1:36:10

definitely not for all creators. That's why it's great that

1:36:13

there's different kinds of things to do. Here's

1:36:15

a question for you. If there's listeners out there, listeners right

1:36:17

now, is it your advice to go

1:36:19

and create a YouTube channel? Like what's your advice for

1:36:22

those thinking, man, I can, I like what Jess has

1:36:24

done and I've got a version of that for me.

1:36:26

Are you encouraging to say YouTube is

1:36:28

like the hub and you have spokes? Should

1:36:31

you, you know, like how should someone, should someone even

1:36:33

get into this kind of thing? What's

1:36:35

your recommendation? Yeah, I think, you

1:36:37

know, we always think that

1:36:39

the content creation space is super saturated,

1:36:41

which it is. I do think that

1:36:43

YouTube for the programming niche has gotten

1:36:45

more competitive in the last like, you

1:36:48

know, six, seven years since I've been there. But

1:36:50

I do think there's always room for

1:36:52

more people. Yeah,

1:36:55

I do use YouTube as kind

1:36:57

of the main hub. And then I sometimes post

1:36:59

on Twitter, sometimes I post on Instagram, not

1:37:01

really on TikTok. And then I've

1:37:04

email, like an email newsletter. So

1:37:07

YouTube is kind of the main like bread and butter.

1:37:09

And the reason for that is, and I actually started

1:37:11

out on Instagram and then I did

1:37:13

YouTube later. Really? But

1:37:15

I felt like Instagram was a little bit too ephemeral,

1:37:17

I think as was mentioned before. And, you know, they

1:37:20

will really punish you if you don't post, you know,

1:37:22

a certain number of times a week. And

1:37:24

I was trying to not have to feel like I had to

1:37:26

do that. So I thought YouTube is

1:37:28

nice because you can get that

1:37:30

immediate sort of viral traffic. But you

1:37:32

can also get a lot of traffic

1:37:34

later on down the road with like

1:37:36

SEO type, you know, keyword titles and stuff

1:37:39

like that. So there's a

1:37:41

lot more, I would say longevity on YouTube,

1:37:44

which is why I've kind of planted

1:37:46

my flag there. Have you experimented with

1:37:48

shorts? Short form video? I've done

1:37:50

some shorts. I feel like that's

1:37:52

again with like the different types of formats.

1:37:55

Like, you know, this is all like anecdotal,

1:37:57

but like I've heard of people

1:38:00

who felt that shorts has kind of killed their

1:38:02

channel. But I've also heard of people who felt

1:38:04

shorts has really helped grow their channel. So

1:38:07

I'm not really making shorts at

1:38:09

the moment. And if I did, I might actually, I

1:38:12

would put that on another channel, and I don't really want

1:38:14

to make a third channel. Just

1:38:16

again, if you have a lot of shorts,

1:38:19

you're sort of training your audience to expect

1:38:21

these 60 second videos from you, and

1:38:23

they might not want to watch your longer eight

1:38:26

to 20 minute videos. So

1:38:29

I don't know, it's still like, it's

1:38:31

a mystery how YouTube works,

1:38:33

and things are changing all

1:38:36

the time. Have you considered open platforms?

1:38:38

PeerTube is one that I'm thinking about for

1:38:41

us. Okay. Haven't jumped on it yet. Not

1:38:43

familiar. Okay, so it's a decentralized thing.

1:38:45

It's a fediverse kind of thing. Decentralized

1:38:48

video platform, which

1:38:50

for a long time languished in

1:38:52

obscurity, I think, in the world of people

1:38:55

who like decentralized things and open source things.

1:38:58

It's gotten a bit of a bump

1:39:01

of late because Mastodon has done kind of

1:39:03

well. Yeah. And because people

1:39:05

are starting to say, okay, what if

1:39:07

we actually have open, decentralized, non-algorithm driven

1:39:09

platforms? And so for

1:39:12

instance, Flipboard, which is, remember Flipboard, the digital

1:39:14

magazine that was so cool on the iPad?

1:39:16

That was cool. That was awesome. They're still

1:39:18

around, they're still making moves. And

1:39:21

they have a lot of publishers on their

1:39:23

magazine app, and they're all going in on

1:39:25

the fediverse now, so

1:39:27

they have Flipboard.video, which they also post on

1:39:29

YouTube, but it's the same videos, and they

1:39:32

put them on their PeerTube, and

1:39:34

it's all federated, blah, blah, blah,

1:39:36

blah. I'm not sure how the bandwidth requirements

1:39:38

and stuff are. I haven't gotten that far

1:39:40

into it. What's interesting is this is an

1:39:43

established, high quality company

1:39:45

and brand who are now posting

1:39:48

their videos onto YouTube, but also

1:39:50

on the PeerTube, and the

1:39:52

one they embed into their blog post is

1:39:54

the PeerTube one. They're pushing people towards Flipboard.video,

1:39:56

a domain that they own, and

1:39:58

which is not algorithm based driven. And

1:40:01

maybe that's a future for YouTubers,

1:40:03

a potential side future. I

1:40:06

don't know much more beyond that. But

1:40:08

I'm curious, I always ask YouTubers, are you considering

1:40:10

that kind of thing? Because if you

1:40:12

are a slave to the algorithm and you

1:40:15

can somehow find freedom somewhere else,

1:40:17

obviously the audience is why you're there.

1:40:20

Maybe the centralized web doesn't have

1:40:22

the audience, or maybe the same audience.

1:40:25

But if there's enough people there that you can help them, and

1:40:28

it's just a matter of uploading twice, or

1:40:30

whatever it is, maybe it's an idea worth pursuing.

1:40:33

Interesting, I've honestly not heard of that. I think

1:40:35

that's really interesting. I think a lot of people

1:40:38

have not heard of it. I

1:40:40

have not heard of it. PeerTube, they're on

1:40:42

my list of people to bring on

1:40:44

the show and talk to about it. That'd

1:40:46

be cool. I think my perspective is creating

1:40:49

a business and a product. Do you

1:40:51

care about where you sell your thing? No, I must sell it

1:40:53

in this brick and mortar store. And if I don't sell it

1:40:55

in this brick and mortar store, then

1:40:57

I don't care about making the thing. I'm more

1:40:59

like, okay, I make a thing, I

1:41:01

want distribution. So wherever distribution is happening

1:41:03

for me, and I suppose the

1:41:06

long-term freedom, and I suppose the

1:41:08

shackles being taken off in terms of

1:41:10

the algorithm, I think if over

1:41:12

time PeerTube was better for

1:41:14

distribution, where's my audience? I wanna tell people

1:41:16

what I have to say in the world.

1:41:19

As all creators, it's not necessarily the topic, it's

1:41:21

that you have something to say. But

1:41:23

you have something of value to give back to the

1:41:26

world. And I think for me, with

1:41:29

podcasts, it is the freedom. I was asking

1:41:31

Jared earlier what he thinks

1:41:33

about our stats and how that plays

1:41:35

into feelings, basically. And I

1:41:37

don't really feel that pressure and stress

1:41:39

of having to match numbers every

1:41:41

episode. I kinda get bummed, oh,

1:41:43

I love that show, but the

1:41:46

listenership wasn't there for whatever reason. Or maybe that

1:41:48

was a summertime and people were on vacation. Who

1:41:50

knows why? I might feel that, but I'm not

1:41:52

like, my identity is not crushed by it.

1:41:55

But I feel like I would

1:41:57

be of the mindset to be wherever

1:41:59

distribution is. I want to

1:42:01

put my product there and that's a podcast or

1:42:03

that's my words or my prose or whatever

1:42:05

That's probably how I would approach it Rather

1:42:08

than saying nah, it must be YouTube, you

1:42:10

know, right? Versus PeerTube or whatever.

1:42:12

Wherever I get freedom wherever the audience wants

1:42:15

to go. Kind of like your thought, too,

1:42:17

with Twitter versus X

1:42:19

versus mastodon. Where should we post

1:42:21

to social wherever people wherever our

1:42:23

people are? We want to be there Yeah, that's kind

1:42:25

of yeah, that's our current strategy is

1:42:27

go where the people are that makes sense Give

1:42:30

them what they want and go where they are. Yeah,

1:42:32

give them what they want Go where they are and

1:42:34

give them what they want, right? But one example I

1:42:36

think which is a nice analog to YouTube and potentially

1:42:38

peer to I don't know is that

1:42:41

You know on Twitter we have a following and

1:42:43

on mastodon We have a following and

1:42:45

the following on mastodon is about

1:42:48

a tenth of the following on Twitter Mm-hmm, but we

1:42:50

can post this exact same poll on

1:42:52

Twitter and on mastodon and get double the

1:42:54

responses on mastodon with one tenth of The

1:42:57

audience so there's something about that which to

1:42:59

me it's like, okay less people but they're

1:43:01

actually there, you know?
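Editor's note: a back-of-the-envelope on why that feels so different, using only the rough numbers mentioned here (one tenth the followers, double the responses); the absolute follower and vote counts below are invented purely for illustration.

```python
# If Mastodon has 1/10th the followers but 2x the poll responses,
# the per-follower response rate works out to roughly 20x higher there.
twitter_followers, mastodon_followers = 100_000, 10_000  # illustrative only
twitter_votes, mastodon_votes = 50, 100                  # "double the responses"

twitter_rate = twitter_votes / twitter_followers
mastodon_rate = mastodon_votes / mastodon_followers
print(round(mastodon_rate / twitter_rate))  # -> 20
```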

1:43:04

You know, most of your subscribers don't get to see your videos.

1:43:07

That's lame, right? Mm-hmm Like you work really

1:43:09

hard to get a new sub and you

1:43:11

have to say subscribe and hit the notification

1:43:13

bell It's like I'm not gonna say hit

1:43:15

the notification bell. Sorry. I'm just not going

1:43:17

to. I've never hit the notification bell, literally, one

1:43:19

time. Lots of love.

1:43:22

I'm big-time on YouTube. But

1:43:25

I do want to see the new videos of the

1:43:27

people I'm subscribed to and the fact that YouTube doesn't

1:43:29

just show me those Things makes me mad as a

1:43:31

viewer and as a creator even more mad like you

1:43:33

work real hard to get that sub And now they're

1:43:35

never gonna I mean they may they may not I

1:43:38

have a few subscriptions on YouTube Where

1:43:41

I was going to my subscriptions right? I'm like, I

1:43:43

forgot I subscribe to this channel It's been nine months

1:43:45

since I've seen their videos. I go click on it.

1:43:47

Oh, they got plenty of new videos in the last

1:43:49

nine months Why am I not seeing those? So to

1:43:52

me that's Lame and that's

1:43:54

the kind of stuff that we could get away from

1:43:56

if we had enough people on these alternative networks Maybe

1:43:58

PeerTube may never get to the

1:44:00

point where that matters for that network, but

1:44:02

it'd be really cool if it did. The

1:44:05

analogy I'm drawing is the Twitter versus Mastodon

1:44:07

poll thing where maybe it's smaller in terms

1:44:09

of subs, but you've got higher engagement. Yeah.

1:44:12

Anyways. Always food for thought.

1:44:14

Food for thought. Touché,

1:44:16

man. All right. We podcast together way

1:44:18

too much, so. I can tell. Thanks

1:44:22

for talking to us today. Yeah, thanks for having

1:44:24

me on. Thanks, that was interesting. Coder Coder, check it

1:44:26

out on YouTube. Hit that notification bell so

1:44:29

you get all the notifications. Do it.

1:44:31

Do it now. Do it. If

1:44:56

you're listening, you may remember the

1:44:58

early days of the internet where

1:45:00

open networks like HTTP and SMTP

1:45:02

led to an explosion of websites

1:45:05

and online communities. Building a fan

1:45:07

site and connecting over shared passions

1:45:09

led so many of us to

1:45:11

careers in software. Back then, it

1:45:13

seemed like anything was possible because

1:45:15

the internet of the 90s was

1:45:17

built to democratize

1:45:19

information, not consolidate it with

1:45:22

a handful of big tech

1:45:24

companies. Read, Write, Own.

1:45:26

Building the next era of the

1:45:28

internet is a new book from

1:45:30

startup investor, Chris Dixon, that explores

1:45:33

how network architecture plays out in

1:45:35

our online lives and the decisions

1:45:37

that took us from open networks

1:45:39

governed by communities of developers to

1:45:41

massive social networks run by internet

1:45:43

giants. Read, Write, Own

1:45:45

is a playbook for reclaiming control

1:45:48

and for reimagining applications so users

1:45:50

can own, co-create, and even profit

1:45:52

from the platforms they use every

1:45:55

day. From AI that compensates

1:45:57

creators to protocols that reward open

1:45:59

source contributions, this

1:46:02

is our chance to build the

1:46:04

internet we want, not the one

1:46:06

we inherited. Order your copy of

1:46:08

Read Write Own today or go

1:46:11

to readwriteown.com to learn more. An

1:46:16

hour and 46 minutes in and you're

1:46:18

still here. Or you skipped

1:46:21

straight to this spot using your podcast

1:46:23

app's handy dandy chapters feature. Either way,

1:46:25

we have one more treat for ya.

1:46:27

Adam talks to Vanessa Villa and Noah

1:46:29

Jenkins about ag tech. Just Adam on

1:46:31

this one, sadly, as I had already

1:46:34

flown home to Omaha by day three. But still,

1:46:36

it's a great convo. Let's get to

1:46:38

it. Ooh,

1:46:46

okay, so there's this huevos rancheros

1:46:48

thing. So it's like tortillas and

1:46:51

like salsa. And then you put in. Oh

1:46:54

my gosh, I love it. That's

1:46:56

our icebreaker, by the way. Really? We

1:46:58

usually ask people what they like for breakfast. Everyone's passionate about breakfast,

1:47:00

right? Sure. You have a favorite breakfast

1:47:02

or a favorite dessert. Yeah. So

1:47:05

something like that, that usually gets ya going,

1:47:07

but it also gets you hungry. This

1:47:09

is true. Well, then there's a barbecue tonight. There

1:47:11

is a barbecue tonight. So I think we could

1:47:13

do it. Yeah, we can eat the barbecue, the

1:47:15

ag barbecue, right? So we're here with Vanessa and

1:47:19

Mike. No, sorry. Noah. Noah,

1:47:21

there was a Mike over there. That's why I said Mike. There

1:47:23

was a Mike, yeah. That's why I said, how's your Mike? Sorry

1:47:26

about that. We're here with Vanessa and Noah.

1:47:28

All good. Noah was in the

1:47:30

home lab session with us and then the podcasting

1:47:33

one, I think just by happenstance. Yeah, I just

1:47:35

kind of stuck around after. But then Vanessa came

1:47:37

on purpose. The only person who came on purpose.

1:47:39

Really? Was Jake, of course. Oh.

1:47:42

Just talking about podcasting and tech. And then we

1:47:44

started talking about Ag Tech, which is a

1:47:46

podcast you're thinking about creating with a friend of

1:47:49

yours. Yes. That has heads of

1:47:51

cattle or head of cattle, as you

1:47:53

said. Yeah. In the thousands and you

1:47:55

care about agricultural because you went to

1:47:57

school for this. Yeah. So

1:48:00

I grew up in a really rural

1:48:02

agricultural town and it was about you

1:48:04

know, the only industry in that town

1:48:06

was oranges. So oranges,

1:48:08

like, they partnered with Sunkist,

1:48:10

and everybody that I grew

1:48:12

up with was basically an orange

1:48:14

grower for Sunkist. And so

1:48:17

we had the packing house was literally across the street

1:48:19

from my house for all of the oranges that we

1:48:21

you know, were growing in my town. And so I

1:48:24

was like this sounds like an opportunity for technology Especially

1:48:26

at the packing house so you don't have to

1:48:28

have manual labor to sort through oranges So

1:48:31

I originally went to school for

1:48:33

computer engineering at Cal Poly

1:48:35

San Luis Obispo Which is half agriculture

1:48:38

and half technology in order to study

1:48:40

ag tech So that's kind of

1:48:42

how I got into it and I met my roommate there

1:48:44

and she has cattle And so that's how

1:48:46

we started talking and do you live near each

1:48:48

other now? Or you have you moved on

1:48:50

your careers and stuff like that, right? We've moved on

1:48:53

our careers I've been in tech now I

1:48:55

think full-time for about six and a

1:48:57

half almost seven years and she's been

1:49:00

in cattle as well She's doing cattle

1:49:02

and she's doing veterinary science and

1:49:04

what's your story, Noah? You were talking about, so

1:49:06

this began because we were like knee-deep and good

1:49:08

content and I'm like always be

1:49:10

recording. That's my philosophy, ABR, and our audience

1:49:12

knows that yeah, and so we had a whole

1:49:15

conversation outside of this But you were talking about

1:49:17

hydroponics you were talking about Precision

1:49:19

what was the precision? Precision agriculture is

1:49:21

agriculture where you, like, shoot

1:49:24

fertilizer directly at it with computer

1:49:26

vision Yeah, so I forget how the

1:49:28

conversation started, but it was just we're

1:49:31

talking about agriculture technology And and I

1:49:33

think we actually started with distributed agriculture

1:49:35

and the idea of and we used

1:49:37

the analogy load balancing Right comparing instead

1:49:39

of having centralized pockets of agriculture

1:49:41

distributing it among communities and households. Yeah, it's kind

1:49:43

of empowering, right? Isn't that like a means of

1:49:45

empowering people to be in charge of their own?

1:49:47

Yeah, so it's a few things number one you're

1:49:50

empowering the consumer to produce their own food You

1:49:52

could argue that helps that could help fight like rising

1:49:54

costs especially as you grow You're at

1:49:56

because at the same time you're growing supply, right maybe

1:49:58

on a smaller scale But you're also lowering demand

1:50:01

because people are now producing their own food so

1:50:03

their demand is Lower

1:50:05

while the supply is higher because they are

1:50:07

producing it. They might be trying to sell

1:50:09

it and then also it makes the entire

1:50:11

because you could say ecosystem of agriculture more

1:50:13

durable, right? So when you have these centralized

1:50:15

locations if there's a natural disaster a war

1:50:18

for example the war in Ukraine that drove

1:50:20

wheat prices up So if you have it

1:50:22

more distributed, it's more durable and it's it's

1:50:24

more sticky. I would yeah So

1:50:26

what is your story? How are you into ag?

1:50:29

I I've always just been fascinated with agriculture, you

1:50:31

know, growing up I always liked growing

1:50:33

plants. I love the idea of crossbreeding. I

1:50:35

love the idea of like cloning plants I

1:50:37

say clone, you basically just, they're reproducing

1:50:39

asexually, right, so you're taking, yeah, exactly. Yeah,

1:50:41

exactly So that's always

1:50:43

fascinated me and I've always loved the idea

1:50:45

of owning a process and what what can

1:50:48

be done on scale What can I do

1:50:50

myself? So, growing stuff, at home I

1:50:52

have two hydroponic, basically little farms, for

1:50:54

growing herbs and little vegetables I've grown my

1:50:56

own like jalapeno peppers my own tomatoes So

1:50:59

that always fascinates me. And let's be honest, who

1:51:01

wants to go to the store for like one jalapeno

1:51:04

exactly, right? I don't grow my own.

1:51:06

So I'm I'm that. Oh, okay. Love it. I

1:51:08

would love it Yeah, you know it but I

1:51:10

think it's the point is like I don't have

1:51:12

access easily without like learning a bunch of stuff

1:51:14

to Sure be empowered in certain ways Like I

1:51:17

watch certain TikTok videos. There's this guy who will

1:51:19

just take a pineapple and make another pineapple tree

1:51:21

from the pineapple Like he shows you with these

1:51:23

little ninja tricks I'm like how you can take

1:51:26

an avocado and turn an avocado into a tree

1:51:28

From the seed and the same with like

1:51:30

taking bananas and putting them the peels at least

1:51:33

into water and letting that put the nitrates Into

1:51:35

it using that as fertilizer. Yeah, you're playing So

1:51:37

I got all these little things that like we

1:51:39

just don't know. Yeah, it's like individuals out there

1:51:41

who just know this stuff. It's kind of

1:51:43

lost knowledge like in the 40s like you

1:51:46

had the Victory Gardens and that was a

1:51:48

well-known pamphlet distributed by the US government out

1:51:50

to people in order to combat the war

1:51:52

during you know World War two and

1:51:54

so you had these Victory Gardens where people were growing

1:51:56

their own tomatoes their own lentils their own You know,

1:51:58

whatever they needed at home for their

1:52:01

own household. And that knowledge

1:52:03

just kind of got lost and that's okay. That's

1:52:05

okay. Like we're in an- That's

1:52:07

not okay. Well, I

1:52:09

mean, we went to a- My grandmother, she

1:52:11

had all this stuff outside of like, I

1:52:13

remember like, just

1:52:15

all these things outside. Aloe vera, things that like,

1:52:18

you know, that's the healing. I think a lot

1:52:20

of people know about that, right? Aloe vera is

1:52:22

a pretty common one, but just

1:52:24

so much stuff that my grandparents, like, this is

1:52:26

like what? Two generations in my family? My dad

1:52:29

and his parents, my grandparents,

1:52:31

right? That's two generations. They were

1:52:33

knee deep in tomatoes and all

1:52:35

the things. And here's me not

1:52:37

into all those things. And

1:52:39

it's because of that, I think like

1:52:41

it's a lost art form in

1:52:43

a way, maybe a lost practice, a lost

1:52:46

knowledge base that, and maybe it's

1:52:48

some of it's the industrial complex

1:52:50

of food, the food industrial complex. Is

1:52:52

that the thing? I don't know what the term is

1:52:55

for it. I think it's fair. That's

1:52:57

fair. You got GMOs, you got,

1:52:59

you know, certified seeds, you've got patented seeds.

1:53:01

You got all this lock out. It's been

1:53:03

industrialized. Right. Like food has been

1:53:05

industrialized to a point where like, it's

1:53:07

a monolith and a monopoly at the

1:53:09

same time, right? Which is not good. It's

1:53:12

not great. It's two monos, right? Yeah. Monolith,

1:53:15

monopoly. Like

1:53:17

if you look at the growers, if you look

1:53:19

like, hey, you know, all types of corn in

1:53:22

this particular area is done by this one seed

1:53:24

and you only have one seed variety for that

1:53:26

season. But that seed variety has

1:53:28

been optimized, you know, to combat pests. It's

1:53:30

been optimized for that climate. It's been optimized

1:53:33

to handle, you know, that type of soil.

1:53:35

So it's like, yes, it was

1:53:37

just slowly industrializing farming to

1:53:39

a point where it doesn't seem

1:53:41

as accessible knowledge wise for an

1:53:44

everyday person. Like that's, that's kind

1:53:46

of how we ended up here.

1:53:48

It's weird because when you

1:53:50

juxtapose that to cloud, right?

1:53:54

And to, I just had a conversation with somebody

1:53:56

who's very smart and we're talking about how her

1:53:58

photos are in the cloud. And

1:54:00

she's having trouble getting them down, but just this whole

1:54:02

idea that like that should be pretty simple for the

1:54:04

most part I should be Yeah,

1:54:06

but this we have the cloud now we

1:54:08

have this thing there and like at some

1:54:11

point We'll just rely on the

1:54:13

cloud. We won't on-prem anything

1:54:15

anymore And we talked home lab earlier,

1:54:17

which is like kind of like on-preming

1:54:19

your own things your own household I've

1:54:21

got my own large-scale storage for my Plex server. I've got a Pi-hole,

1:54:24

I've got a piehole I've got home automation and

1:54:26

that's like a version of us taking power back

1:54:28

Right into our own hands not that it's being

1:54:30

clawed away from us But if

1:54:32

you don't know you can have it, you know You don't know

1:54:35

if you want it or that you need it and I feel

1:54:37

like food is like that like if I could more

1:54:40

realistically farm my own food not crazily,

1:54:43

but like, yeah, what lettuces do we

1:54:45

use often and what greens we use often,

1:54:48

like is a jalapeno tree or whatever it

1:54:50

is a plant plant? Yeah, like I don't

1:54:52

even know, see, this, I don't

1:54:54

even understand what they call it. I love the idea

1:54:57

of a jalapeno tree now I want

1:54:59

I want that well and here's what's cool because a lot

1:55:01

of people now I feel like real estate is very different

1:55:03

than it was back in the 40s, right? Like

1:55:05

now you have a lot of people in apartments

1:55:08

and a lot of people with less real estate

1:55:10

And yeah, exactly and people might say hey, I

1:55:12

don't have the real estate for farming Well going

1:55:14

back to hydroponics one of the things I love

1:55:16

about those is it's in you can do it

1:55:19

indoor So you can literally in your kitchen have

1:55:21

anywhere from one to five even varying sizes You

1:55:24

can make them yourselves you can you can buy

1:55:26

them pre-built and it lets you basically turn

1:55:28

any Real estate you have into

1:55:30

whatever you need it to be and you

1:55:32

can grow and you can get bigger ones

1:55:34

You can grow full-on bushes or just small

1:55:36

little plants and it's really exciting with how

1:55:38

accessible it is. And really, yes,

1:55:41

you can spend a lot on them There's some that

1:55:43

have like smart capabilities some that are just very basic

1:55:45

So there there's you know range of product there, but

1:55:47

the barrier to entry to start getting into that producing

1:55:50

your own food It's like a home depot bucket Yeah,

1:55:52

it's very low barrier to entry which is exciting. I

1:55:54

feel like we need like a home lab For

1:55:57

AG. Yes in my house Like

1:56:00

I there's certain things I would totally grow. Oh, yeah, right

1:56:02

There's certain things I would just definitely not mm-hmm Like

1:56:04

I'm not gonna have a cow in my backyard cuz that's

1:56:06

just not feasible, right? Exactly. No, the HOA will

1:56:08

be like, no. And then, two, it's just like, I don't

1:56:10

care for it sure You know maybe my kids will love

1:56:13

it. Maybe we have some zebus. They don't do anything

1:56:15

Yeah, I'm not gonna kill my zebu and eat it right

1:56:17

which is like a mini cow basically Yeah, if you

1:56:19

know what a zebu is yeah, I learned about this recently

1:56:22

You know it's not a real cow that you've already made

1:56:24

you do eat them? I don't know, maybe people eat zebus.

1:56:26

Do people eat zebus? Not that I'm aware of. I

1:56:29

could be wrong. I'm open to being wrong. Yeah, maybe

1:56:31

maybe you mentioned doing a home lab project right here

1:56:34

So it's cool if you really wanted to there's

1:56:36

so many fun IT projects You could do based around that

1:56:38

you could set up your own like camera and do instant

1:56:40

basic image recognition See which plants are

1:56:42

performing you can then set up like hey Let

1:56:44

me try different types of food And then you

1:56:47

can actually measure that take the data and then

1:56:49

actually build your own like data analytics project and

1:56:51

basic AI image recognition Projects and say

1:56:53

hey, I'm gonna test these different types of plant

1:56:55

food. I'm gonna see across these different forms I'm

1:56:57

gonna see which one performs better So there's a

1:56:59

lot of even like what's

1:57:01

happening on scale like we talked about the idea

1:57:03

of the precision agriculture tech right where? You have

1:57:06

these robots going in and instead of like spraying

1:57:08

all this fertilizer It's being very precise and saying

1:57:10

okay only these you know plants need it you

1:57:13

can do something like that Obviously not on

1:57:15

scale, but you can do something at home

1:57:17

with the similar foundation, right? There's also a

1:57:19

lot of room for potential projects. You can

1:57:21

do in the IT technology space.
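Editor's note: if you want to tinker with the plant-monitoring experiment Noah sketches here, a minimal version could look like the following, assuming OpenCV and NumPy are installed; the folder names and the HSV green range are made up and would need tuning for your own camera and lighting.

```python
# Rough sketch of the "which plant food performs better" home-lab experiment:
# estimate green leaf area per photo and compare the averages per treatment.
import glob
import cv2
import numpy as np

def green_area_fraction(path: str) -> float:
    """Fraction of pixels that look like healthy green foliage."""
    img = cv2.imread(path)
    if img is None:
        raise FileNotFoundError(path)
    hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)
    # Loose "green" band in HSV; tune for your setup.
    mask = cv2.inRange(hsv, np.array([35, 40, 40]), np.array([85, 255, 255]))
    return cv2.countNonZero(mask) / mask.size

def average_for(folder: str) -> float:
    paths = glob.glob(f"{folder}/*.jpg")
    return sum(green_area_fraction(p) for p in paths) / max(len(paths), 1)

if __name__ == "__main__":
    # Hypothetical folders, one per type of plant food being tested.
    for treatment in ("photos/treatment_a", "photos/treatment_b"):
        print(treatment, round(average_for(treatment), 3))
```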

1:57:23

I want to add this one point I think you'll like

1:57:25

it too is that I feel like there's a recipe

1:57:27

you ever often have ingredients in your house And

1:57:29

you don't have a meal yeah, I have ingredients and

1:57:32

not a meal I feel like you described ingredients to

1:57:34

the thing who can package it

1:57:36

and I think my question is like you're into ag

1:57:38

tech Yeah, right and you want to do this podcast

1:57:40

which is where this began Who

1:57:42

can package this thing into something? Consumers

1:57:45

can actually use because we have ingredients

1:57:47

not a meal so to speak right

1:57:49

there's a couple of folks already in

1:57:52

that kind of space where you have like

1:57:54

okay? Here's your hydroponic herb garden like that's

1:57:56

what my friend got for Christmas is she got

1:57:58

a hydroponic herb garden and it reminds her through

1:58:00

an app, like, hey, your water

1:58:02

is running low and your hydroponic, hey,

1:58:04

your fertilizer tablet has fully dissolved and

1:58:07

your fertilizer levels are too low, so

1:58:09

you need to add a new one.

1:58:11

So it's just kind of like, there's

1:58:13

already tech, I would say,

1:58:15

distributors on this idea for an

1:58:17

at-home garden, but it's on the

1:58:20

very small scale, I think, to

1:58:22

get up to a

1:58:24

larger scale, like a jalapeno plant, or maybe like

1:58:26

a dwarf orange, yes,

1:58:29

you could do a dwarf orange, they only

1:58:31

get up to about five feet, so that's

1:58:33

not too bad. If you

1:58:35

get up to that scale, now you're talking about like a larger

1:58:37

manufacturing effort, and so that's

1:58:40

where the questions start to come in,

1:58:42

it's like, okay, you have a large

1:58:44

tree, that requires a different water level,

1:58:46

a different type of fertilizer, it requires

1:58:48

a slightly different system than something like

1:58:51

an herb garden. Sure, right. What would be

1:58:53

fascinating is, because you mentioned how

1:58:55

your friend has a system where it reminds

1:58:57

her. If

1:58:59

you do get that bigger system, where you have a collection of

1:59:01

systems, I'm thinking how cool would it be to add more automation,

1:59:04

like, okay, this adds to the manufacturing process, right? Yeah,

1:59:07

this adds to manufacturing process. And again, this is probably

1:59:09

my Python developer in me talking, like, okay, what if

1:59:11

you built like a tank that could distribute the water

1:59:13

based on the needs, or you have a little, almost

1:59:15

like a PEZ dispenser, but for fertilizer tablets, it'll automatically dispense them.
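Editor's note: a minimal sketch of that kind of automation, with every sensor and actuator stubbed out, since the real hardware hookup (probes, relays, a dosing mechanism) would vary; the thresholds and function names are assumptions, not anyone's actual setup.

```python
# Toy control loop for the hypothetical "tank plus tablet dispenser" idea:
# top up water when the level is low, drop a fertilizer tablet when the
# nutrient reading falls below a threshold. All hardware calls are stubs.
import random
import time

LOW_WATER_CM = 5.0      # assumed tank depth threshold
LOW_NUTRIENT_EC = 1.2   # assumed electrical-conductivity threshold

def read_water_level_cm() -> float:   # stub for a real depth sensor
    return random.uniform(3.0, 12.0)

def read_nutrient_ec() -> float:      # stub for a real EC probe
    return random.uniform(0.8, 2.0)

def open_fill_valve(seconds: float) -> None:
    print(f"filling for {seconds}s")  # would drive a relay or valve here

def dispense_tablet() -> None:
    print("dropping one fertilizer tablet")  # would step a dispenser motor

def control_loop(cycles: int = 3, interval_s: float = 1.0) -> None:
    for _ in range(cycles):
        level, ec = read_water_level_cm(), read_nutrient_ec()
        print(f"level={level:.1f}cm ec={ec:.2f}")
        if level < LOW_WATER_CM:
            open_fill_valve(seconds=10)
        if ec < LOW_NUTRIENT_EC:
            dispense_tablet()
        time.sleep(interval_s)

if __name__ == "__main__":
    control_loop()
```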

1:59:18

Have you ever been to a

1:59:20

hydroponic greenhouse? No, I've not. Ooh,

1:59:22

okay. I've seen footage, but never like

1:59:24

in person. So they essentially

1:59:27

have like that distributed system

1:59:29

across all the barrels for

1:59:31

the hydroponic systems. But the neat part

1:59:33

is like some of them, you

1:59:35

know, you say like, oh, this one needs more

1:59:37

fertilizer. Fertilizer is essentially

1:59:39

like fish poop. And so

1:59:41

they'll have fish, and they'll

1:59:44

just funnel the fish to the different

1:59:46

barrels. I love that so much. That

1:59:48

is awesome. That's a natural way to do

1:59:50

it. It's a natural way to do it. And the fish is probably like,

1:59:53

yeah, I don't wanna poop over there anymore. I'm

1:59:55

doing it over here now. Yeah. for

1:59:58

the consumer, maybe, instead of doing like a

2:00:00

hard-coded plastic, make it clear, and

2:00:02

then like, hey, not only are you growing food,

2:00:05

you also are keeping fish. How cool is that?

2:00:07

For sure, yeah. And so they

2:00:09

do that though through computer vision, and then

2:00:11

they'll open the pipes or close the respective

2:00:13

pipes to allow the fish to swim from barrel to barrel.
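Editor's note: purely to illustrate the routing logic Vanessa describes (the computer-vision and plumbing parts are far beyond a sketch), here is a toy version that picks the neediest barrel and sets valve states; all names and numbers are invented.

```python
# Toy "send the fish to the hungriest barrel" scheduler.
# Real systems use computer vision to confirm where the fish actually are;
# here we just pick the barrel with the lowest nutrient reading.
from dataclasses import dataclass

@dataclass
class Barrel:
    name: str
    nutrient_level: float  # lower means it needs the fish (fertilizer) more

def route_fish(barrels: list[Barrel]) -> dict[str, bool]:
    """Return a valve map: True means open the pipe into that barrel."""
    target = min(barrels, key=lambda b: b.nutrient_level)
    return {b.name: b is target for b in barrels}

if __name__ == "__main__":
    barrels = [Barrel("A", 0.9), Barrel("B", 0.4), Barrel("C", 0.7)]
    print(route_fish(barrels))  # {'A': False, 'B': True, 'C': False}
```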

2:00:15

I love that so much, I can't even explain

2:00:17

that. So

2:00:20

what is it that excites you about Ag

2:00:22

Tech? You're gonna do a

2:00:24

podcast on this. I think so. It

2:00:27

might be agtech.fm. You have to now, actually.

2:00:29

You just have to now. I know, I've been in

2:00:32

this space, or I've been studying this space for a

2:00:34

long time. I went to school for this space. It

2:00:36

sounds like it's time to make the podcast. See you

2:00:38

later. I know. Okay, sorry, what

2:00:40

was the question? Well, the question is, what is

2:00:42

possible, I suppose, in this, what do people need

2:00:45

to know about Ag Tech? Like, what is out

2:00:47

there? I know we talked about John Deere and

2:00:49

Right to Repair and the fact that

2:00:51

tractors are not computers, basically. What

2:00:53

else is in the ether of Ag Tech?

2:00:55

What is out there? Is it time to

2:00:58

talk about drones? I

2:01:00

think so. Okay. I think so. Bust

2:01:02

out the drones. All right, okay. So, one

2:01:04

of the hardest problems to

2:01:07

solve is monitoring agricultural stuff

2:01:09

at scale. So, nobody's

2:01:11

farming at five acres anymore. People

2:01:15

are farming in like the tens of acres,

2:01:17

hundreds of acres, for one farmer.

2:01:19

Like, that's how farming works nowadays.

2:01:22

And monitoring all of that for pests,

2:01:24

for gophers, for snakes, for whatever, it's

2:01:27

impossible. And so, what they're doing

2:01:29

now is either they're leveraging satellite

2:01:32

imagery or they're using drones.

2:01:34

Do you wanna talk about drones? I do.

2:01:36

Do you know about drones, though? So, I'm

2:01:38

actually an FAA licensed drone pilot. Oh, that's

2:01:40

right, you are. I forgot about that. Yeah,

2:01:42

good, yeah. So, now granted, I've not used

2:01:45

that license in the agriculture space. It's primarily

2:01:47

been just for commercial, real estate commercial advertising,

2:01:49

things like that. But yeah, so like Vanessa

2:01:51

was saying with drones, there's a few different

2:01:53

applications. So, number one is for monitoring, right?

2:01:56

You can monitor. And what's cool is you

2:01:58

can use different types of cameras. Sensors

2:02:00

you can you can actually be measuring

2:02:02

the thermal, in case you need to know what's the temperature like in these different regions, you need to monitor thermal in order to know that. So, okay, adding onto that idea of temps, not every agricultural space needs just, like, your RGB camera. In particular, avocados. So avocados, because it's green on green plants, the only way you can detect the quantity of avocados on a tree is by using a thermal camera. Not, is the tree hot or cold in general, it's, is it hotter or colder than the respective leaves? So cool. Avocados will usually be hotter than the respective leaves. Okay, so that's the only way to spot them, with a thermal camera. Yep.
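Editor's note: to make the thermal trick concrete, the idea is just "count the blobs that stand out from the canopy temperature." A toy sketch with a synthetic thermal frame follows, assuming OpenCV and NumPy; a real pipeline would need calibration, perspective handling, and much more, and the 1.5-degree offset is an invented number.

```python
# Count "warmer than the surrounding leaves" blobs in a thermal frame.
import cv2
import numpy as np

def count_warm_blobs(thermal: np.ndarray, offset_c: float = 1.5, min_px: int = 20) -> int:
    canopy = float(np.median(thermal))             # typical leaf temperature
    mask = (thermal > canopy + offset_c).astype(np.uint8)
    n, _, stats, _ = cv2.connectedComponentsWithStats(mask, connectivity=8)
    # Label 0 is the background; keep blobs big enough to be fruit, not noise.
    return sum(1 for i in range(1, n) if stats[i, cv2.CC_STAT_AREA] >= min_px)

if __name__ == "__main__":
    frame = np.full((120, 160), 24.0)              # fake 24 C canopy
    frame[40:50, 60:70] = 27.0                     # two fake warm "avocados"
    frame[80:88, 100:108] = 26.5
    print(count_warm_blobs(frame))                 # -> 2
```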

2:02:44

And then on top of that, you also have people that, and I think this will become less and less as precision agriculture takes over, you also have people that will use drones to say, hey, here's what areas need more fertilizer, what areas need more food, whatever, and then go distribute that said fertilizer or that food via drone, which is very cool, and you get semi-precise application of it. And again, I think that will go down as you have, you know, more of the plant-level precision, where it's going to the root and actually, like, sniping which plant, which I think was coming out of, like, UC Davis, right? Yeah, yeah. But the drones are fascinating in that regard. The original question was, like, what would AgTech need, right? Right. And most of it, again, we talked about image recognition and AI detecting, in my case, which pockets need what. The image recognition, some of it's in the cloud, and a lot of the detection that the mainstream industry uses is existing libraries. In this space it's just a matter of how it's packaged, positioned, and made easily accessible, which is, like, the biggest part now, and practical even. Is it possible later? I mean, what's her name, Temple Grandin. Temple Grandin. Isn't there one north of Dallas? All right. So there's

2:04:01

this lady in the cattle space

2:04:04

and she wanted to move cattle

2:04:06

essentially. Like how do you efficiently

2:04:08

move cattle from pen to the

2:04:11

butcher house? And

2:04:13

she revolutionized the way

2:04:16

that cattle essentially,

2:04:18

they approached this, is like cattle move

2:04:20

in herds. So you cannot move them

2:04:22

in a straight line and you cannot

2:04:24

move them down a funnel, they'll freak

2:04:27

out. So you have to gradually move

2:04:29

them along a curve in order for

2:04:31

them to not freak out. And that's

2:04:33

just the way cattle move in herds

2:04:35

when they're out in grasses. And so

2:04:37

that's what they did. And that's now

2:04:40

what they do at butchering houses. But

2:04:42

in order for her to get that

2:04:44

movement up, it took her

2:04:46

decades. Like it took decades for that

2:04:48

to be adopted by the industry, even

2:04:50

though she had study after study showing

2:04:53

that if you don't want cattle to

2:04:55

essentially spook, you have to move them along

2:04:58

a curve. So coming

2:05:01

back to this, it's like we introduced tech

2:05:03

and sensors and moved

2:05:06

tractors to the cloud. Like John Deere

2:05:08

has done this, they've been doing it at least since 2012.

2:05:10

Great. How much is

2:05:14

a tractor now that's connected to

2:05:16

the cloud? How can you repair

2:05:18

it, right? If any of the

2:05:20

sensors break, how much is that

2:05:22

cloud subscription for them to even look at the

2:05:25

data that they're aggregating through these sensors

2:05:27

versus the generational knowledge that

2:05:29

they've had, passed

2:05:31

down. So this is the big debate. It's like

2:05:33

how long is it going to take to get layman

2:05:36

tech to farmers,

2:05:39

to everyday farmers, or is it going to

2:05:41

be controlled by at scale farming? So

2:05:46

That's the question, right? And Then I would

2:05:48

ask you, okay, because the goal is we

2:05:50

want to make it accessible, right? And I

2:05:52

would argue like year over year, most of

2:05:54

that does become accessible just over time to

2:05:56

a degree. For Example, I mean cloud in

2:05:59

general, for most people, is accessible. I can go to Azure, I can spin up a VM. If you have a credit card, right? You can go to Azure, you can set up a VM. You know how to do that because you've been educated in this. So that's my point, though. Okay, do you need the smaller farmer to have that goal, to learn, right? Or do you also have to engage with them? And I think, for example, if you started a podcast, as someone, when, not if, when, thank you, yes, when you start the podcast, are you creating a resource with farmers, for the smaller farmers, like, hey, I'm someone that understands the ag space and I want to work with you, I want you guys to get there, and let me show you how? And I think, especially maybe the next generation of farmers, I think we'll just get there as more of this stuff becomes easier, if you're right.

2:06:54

Like the FFA, like the FFA, the Future Farmers of America, their entire organization basically takes, you know, I think it's like middle school students all the way through high school students and then it's university. They start them early? Yeah, they take em early and they have courses on, like, hey, cloud is coming and what does that mean for farming? Okay, cool, what does that mean for, you know, the pork industry, what does that mean for, you know, poultry? What does that mean for cattle? And so, like, yeah, the FFA has that educational course. I would say, though, they're, like, still behind. And that ties in with right to repair. Like, they're very hesitant to buy into it, because you're buying into debt, right? Like, your everyday farmer, it's like they live, like, season to season, and it's subsidized by the government, hopefully, and

2:07:43

as soon as they don't make

2:07:46

a payment, it's foreclosures and all these

2:07:48

threats or whatever. And I'm curious

2:07:50

if you know this, how much

2:07:53

is big ag tech involved in the FFAs? Like, is there influence over the curriculum and what's being taught? Cause if you control the knowledge, you control the people. So it's the FFA, similar? Okay. What, I know it as a real thing, am I wrong here? No, I know, I know. I wanna make sure I've got it together, the FFA, the Future Farmers of America. I know this because we have ag deep in my family. I was an idiot in this moment and will be judged for it. I was thinking of the FFA, and then, the city where I grew up was, like, super civic about all that. I mean, are these separate? Is this one operation? They're separate, they're different organizations. I would say there's a council, but they're not super involved necessarily. Like, each chapter will have their own policies on how they want to get them on it, and so that's something to consider. But, you know, sponsoring for scholarships is a thing. So

2:08:51

I'm, I'm picking your brain on this, and we're talking about how it's hard for the farmer and there's hesitancy. Was there any hesitancy for the farmer to go into tech? Yeah, I mean, there was hesitancy with me going into this ecosystem. So, for the industry to get where it needs to be, do you think it's people with tech coming in to help out the farmers? Maybe it's partnering with them, saying, hey, whether that's starting a service specifically catering to that smaller farmer, maybe it's, hey, I want to come in as your business partner, here's the value I can add. Do you think it will take tech coming into the farmers rather than the farmers coming into tech? I think it's going to be, it's going to have to be big ag tech working with big farming. That's the only way that it's gonna move forward, and then enforce that trickling down to the smaller farmers, but also making the price point and accessibility of it lower and more accessible for your everyday consumer. Like, that's kind of gonna have to be the move, because coming at, you know, your everyday farmer, like, that's not gonna be realistic, just with manufacturing prices and education level and even

2:10:01

getting connected. I used

2:10:04

to work on a project where it's like,

2:10:06

okay, how do we solve farming in Africa

2:10:08

and India? How

2:10:10

do we help them with tech?

2:10:12

Because there's droughts, there's famine, there's

2:10:14

maybe unreliable weather patterns,

2:10:16

or you have to use precision

2:10:19

agriculture due to the resources. So

2:10:21

how do you get that connected, and how

2:10:23

do you, there's no such thing as

2:10:25

5G. So

2:10:28

what do you do? And they discovered that

2:10:30

TV white spaces travel

2:10:32

across large areas,

2:10:35

and there's very little data loss, and it's very

2:10:37

low power. So that's kind of

2:10:39

the ongoing thing. It's like, okay,

2:10:41

we're going to do TV white spaces, put

2:10:44

sensors that communicate in that protocol,

2:10:46

and then you have a hub

2:10:48

on the edge that they

2:10:50

connect to, and now you're using kind of

2:10:53

like a mesh system. In order to aggregate

2:10:55

the data, you have a centralized hub. Only

2:10:57

that centralized hub needs to connect up to

2:10:59

satellite or up to the cloud. Yeah, okay, very cool.
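Editor's note: the hub-and-mesh idea described here, in plain code terms. Standard library only; the sensor names, batch size, and the "uplink" print are stand-ins for whatever radio link and backhaul an actual deployment would use.

```python
# Toy edge hub: field sensors report in over a cheap local link,
# the hub buffers readings, and only the hub talks to the expensive uplink.
import json
import time
from dataclasses import dataclass, asdict

@dataclass
class Reading:
    sensor_id: str
    soil_moisture: float
    timestamp: float

class EdgeHub:
    def __init__(self, batch_size: int = 4):
        self.batch_size = batch_size
        self.buffer: list[Reading] = []

    def receive(self, reading: Reading) -> None:
        self.buffer.append(reading)
        if len(self.buffer) >= self.batch_size:
            self.flush()

    def flush(self) -> None:
        payload = json.dumps([asdict(r) for r in self.buffer])
        print(f"uplink -> cloud ({len(self.buffer)} readings): {payload[:60]}...")
        self.buffer.clear()  # pretend the satellite/cloud upload succeeded

if __name__ == "__main__":
    hub = EdgeHub()
    for i in range(6):
        hub.receive(Reading(f"node-{i % 3}", 0.2 + 0.1 * i, time.time()))
    hub.flush()  # push whatever is left over
```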

2:11:01

So I

2:11:03

wonder, so I'm thinking immediately open

2:11:05

source, what open source solutions could

2:11:07

be developed for AgTech? I'll

2:11:09

be thinking of hardware, that's something you have to

2:11:12

take into mind, that's hard to open source, but

2:11:14

you also might have, hey, here's some software solutions

2:11:16

that leverage, like let's say a Raspberry Pi, or

2:11:19

like an Arduino board or something, or

2:11:21

hey, here's an open source solution I

2:11:23

built that's based off Google Maps satellite

2:11:26

imagery, right? I'm wondering if tech got

2:11:28

more involved or was inspired to build

2:11:30

open source solutions, if that would help

2:11:32

make more cost-accessible solutions for everyday farming.

2:11:34

I think it would be more cost-accessible.

2:11:37

Again, it's like, it's

2:11:39

the mindset of like, hey, I've

2:11:41

been dealing with this for

2:11:44

my generation, my parents have been dealing this

2:11:47

for their generation, my grandparents have been dealing

2:11:49

with their generation. If

2:11:51

I go back, it's like all the

2:11:53

way, my great-grandparents, like a

2:11:55

hundred years ago, like she had like the same plot

2:11:58

of land, like we know the seasons that go. goes

2:12:00

through, we know which areas are problems, it's

2:12:02

the generational knowledge, and I'm like, okay, cool,

2:12:04

tech. Tech's already telling me something that

2:12:06

I already know. Or quote unquote, already

2:12:08

know, right? Is it

2:12:11

an easier solution and more convenient? Sure, but why

2:12:13

would I need that? Like, I already know it.

2:12:16

It comes down to the education. It comes

2:12:18

down to the education, and the small farmer

2:12:21

knows their land. I don't want to

2:12:23

discount their knowledge. It's like, we were talking

2:12:25

earlier about how gardening and stuff was very popular

2:12:27

during the 40s because of the food shortage. It's

2:12:30

the same thing. It's like, they maintain that

2:12:32

generational knowledge. It's just been lost to the

2:12:34

rest of us. Well, H-E-B

2:12:36

is like right down the street. True. And

2:12:40

that's the beloved Texas grocery store.

2:12:42

It is basically the epicenter of

2:12:44

all love. I'm getting one like five

2:12:46

minutes from my house. I'm so excited. It's

2:12:50

the best. And I suppose

2:12:53

when you have accessibility, do

2:12:55

you need to grow your own thing? I

2:12:58

say kind of. Probably should

2:13:01

in some cases, but should you in every case?

2:13:03

I'm not gonna grow my own grapes. No, right?

2:13:05

I'm not gonna grow my own raspberries. There

2:13:08

might be certain things I might be willing to. Would

2:13:10

I be willing to because I want to

2:13:12

have a lower price point or because I

2:13:14

want to reduce the, like we're doing with

2:13:16

solar even in our homes. Do we want

2:13:18

to reduce the pressure on the system? Is

2:13:21

that really the best way? Is

2:13:24

industrialized farming the best for

2:13:26

humanity? If done sustainably and

2:13:29

in wise ways that really is for the people and not

2:13:31

just for the profit. Cause I

2:13:33

think H-E-B has been a brand that's been

2:13:35

uniquely positioned in the food industry to

2:13:37

be for the people. Now are they for profit? For sure. And

2:13:40

I pray that they continue to be the H-E-B we

2:13:43

love today because they've been very good stewards of the

2:13:46

food buying process. However, when

2:13:48

I go to H-E-B right now, my food

2:13:50

bill is way high. Yeah. I'm

2:13:53

sure they're doing something to keep my costs low, but

2:13:55

you have a very particular

2:13:57

well-known Texas loved grocery store.

2:14:00

that seems to be for the people. When we had,

2:14:02

I think it was Hurricane Ike, or just one of

2:14:05

the recent hurricanes in the last four or five years,

2:14:07

the very first trucks to come in for support was

2:14:09

not the National Guard. Loved them too, of course, I

2:14:11

was in the military. But the very

2:14:13

first truck was an HEB truck to

2:14:16

come in and save the day. How cool. To

2:14:18

provide resources because it was either PR

2:14:20

or they just truly love, I don't know. But they've been

2:14:22

first in a lot of cases. I know they served with

2:14:24

the FAA. F-A-A?

2:14:26

F-F-A? Gosh,

2:14:28

so messed up here with the FAA. And

2:14:31

they're very involved with Houston Rodeo. Like,

2:14:33

I know that, because that's their market.

2:14:35

But I guess the question is, should

2:14:38

we be farming at the

2:14:40

local level in hydroponics and

2:14:42

things like that? Or should we develop

2:14:45

because of the population density or

2:14:48

the availability of land in places like Texas

2:14:50

or elsewhere where there's more acreage, enable

2:14:52

farmers to just do their job

2:14:54

better and keep the big industrial

2:14:58

food industry going? I

2:15:01

don't wanna say going, but I realize it

2:15:03

has to be industrialized in a way to

2:15:06

meet the needs of population. But

2:15:09

is the right answer to do what we've

2:15:11

done with solar and bring it localized to

2:15:13

the household? Or does it make sense

2:15:15

just to bolster and better enable the

2:15:17

complex? So there's an

2:15:19

in-between here that we haven't really discussed. Okay, what's

2:15:22

the in-between? So there's certain, I would

2:15:24

say, produce levels that you could do

2:15:26

at a community level. You don't have

2:15:28

to do it in the house, or

2:15:30

you don't have to do it. Point taken, I love that, community

2:15:33

gardens. Community gardens, or even like,

2:15:35

okay, on the outskirts of large centralized

2:15:37

hubs, we have a couple of warehouses

2:15:39

that do produce certain

2:15:41

amounts of produce for that localized

2:15:43

area. And so that's something like strawberries,

2:15:46

very easy to produce in hydroponics.

2:15:49

Instead of getting all of our strawberries from

2:15:51

Fresno, California, or Oxnard, California, where the best

2:15:53

strawberries come from, by the way, then

2:15:57

you have your own little strawberry hydroponic system.

2:15:59

You know just outside of Seattle or just

2:16:01

outside of New York and you don't have

2:16:03

to ship strawberries from Salinas or

2:16:05

Oxnard anymore And should it be

2:16:07

the community being invited in there

2:16:09

to like maybe you work for

2:16:11

free and tend the garden, so to speak, to have

2:16:13

access to a membership I don't know like Because

2:16:17

the knowledge base is being lost because like you're like

2:16:19

you can give that knowledge back is to

2:16:21

bring the people in who are consuming it

2:16:23

and care about it, right? And it's just like when

2:16:25

you say you want to get water to XYZ, whatever

2:16:27

country, you don't just go and give them money, you

2:16:29

go and help them teach them how to build a

2:16:32

well and maintain the well Because if they don't care

2:16:34

and they don't know how to deal with it

2:16:36

They're gonna rely upon you as the third-party resource

2:16:38

to save their day or to educate them if

2:16:41

you give them the knowledge They can fish of

2:16:43

course, right? So how do

2:16:45

we work out the community's involvement

2:16:47

in these community gardens? That's

2:16:49

a wonderful question and I don't

2:16:51

know if it's the community necessarily

2:16:53

or if it's just distributing Ag

2:16:55

a little bit, right? Do

2:16:58

you mind if I offer it? So the original

2:17:00

question was, or there's been several original questions. Yeah,

2:17:04

The most recent one was you know, do we

2:17:06

keep it industrialized or to be distributed, right? And

2:17:08

I'm with Vanessa here I think it's not an

2:17:10

or situation, it's an and situation. So number one,

2:17:12

I think like let's acknowledge to

2:17:14

sustain the food production that like, you

2:17:16

know Where we're at today. It needs

2:17:18

to continue to be industrialized Like I

2:17:21

think that's I don't know if anyone

2:17:23

disagree with that They might but I think I think it

2:17:25

would need to be just to sustain the levels No,

2:17:27

I think can we get better can we

2:17:29

become more sustainable? Can we get better more

2:17:32

efficient and not just again for profit sake

2:17:34

but also like hey, can we make it

2:17:36

more efficient? So like we're producing more and

2:17:38

lowering costs and helping that for just the

2:17:40

better of humanity Absolutely, but with that I

2:17:43

think we can also encourage people and grow

2:17:45

that knowledge to build a more

2:17:47

individualized local solution. Now, to the second

2:17:49

follow-up which was how do you get the

2:17:51

community involved, right? I think number

2:17:53

one is okay. What

2:17:55

can you do in the household? I Think people

2:17:57

naturally have a they would like a sense of ownership.

2:18:00

Right? This is not our garden. This is

2:18:02

my garden, or some family's garden. So I think building resources, I'm hesitating to say that, I'm not sure if I'm contradicting her point, or, go for it. Yeah, I sensed as much. I think starting there is where it's important right now, and I'd say after that, I think building community, you know, doing more community-centric things, that's going to be the best of both worlds. Okay, but then who gets the community involved, and who owns that process? Is it just, hey, some nice people who donate the land and say, hey, this is something we own, this is going to be for community purposes? Is it the city or county? I'm not sure. I think ultimately where it starts is in the home, and I think that's how you get people started, at least for now, building that knowledge in there, so that then I go to the community garden with my literal neighbor. Yep. You know, the only way you're going to have stronger neighborhoods, safer neighborhoods, is people caring about the people next to them. Sure. People caring about people, right? That's the only way. And if I care about my neighbor, I would never be desiring any sort of insult against them physically, because they're my friend, they're for me. Now, that's also assuming you have a neighbor that is

2:19:13

willing to also love you back.

2:19:15

Yeah, but you can also sizes

2:19:17

Hawksbill. You're not welcome here because

2:19:20

the the know a whole roll

2:19:22

right right as A as a

2:19:24

bookish Read it. The Axle Rose

2:19:26

has a was a lot of

2:19:28

our why others. But. I

2:19:30

would love to go to my

2:19:32

community garden and do whatever makes

2:19:35

sense to for myself or support

2:19:37

that process with my little neighbors.

2:19:39

I mean, it could be definitely more the

2:19:41

co-op, like, you sign up, like, okay, I

2:19:44

want to be part of this farming community

2:19:46

i want to be part of growing this

2:19:48

thing and like okay cool awesome so now

2:19:50

by signing up here agreeing like okay hey

2:19:52

i'm gonna you know how with the fertilizer

2:19:55

costs or if you don't want to be

2:19:57

and fertilizer okay i'm gonna help take the

2:19:59

strawberry and containerize them and like okay

2:20:01

by signing up and putting in you know you put

2:20:03

in as much as you get. Maybe there's a

2:20:05

credit system I give you an hour or

2:20:08

two a week or an hour

2:20:10

a week or every two weeks or whatever it

2:20:12

is like you like you do with sports like

2:20:14

I go with my son Saturdays and Mondays we

2:20:16

spend a couple hours a week in a sport

2:20:18

now that's for not him to be an excellent

2:20:21

basketball player it's for him to be an

2:20:23

excellent team member right right which is what

2:20:25

a community is I'm an excellent team member

2:20:28

of my community sure right so I would

2:20:31

love to see the whole foods of this

2:20:33

right, yeah, or the H-E-B of this, like

2:20:35

who can do this at a scale that

2:20:37

profits yeah maybe like you said I'm gonna

2:20:39

take that back with the profit the co-op

2:20:42

idea which is like it is there to

2:20:44

serve the community is not there to serve

2:20:46

the profit yeah and maybe there are literal

2:20:48

profits but they're not there to be just

2:20:51

scooped up by shareholders it's meant to reinvest

2:20:53

and re-enable communities and ag tech just seems

2:20:55

to be all in this I think right

2:20:57

yeah all in this I like the idea

2:21:00

of co-op especially in this specific

2:21:02

community if you have tech people right right

2:21:04

that might be the perfect opportunity for people at

2:21:06

home labs and if they want to like help

2:21:09

allocate some of those resources to say hey let

2:21:11

me get some sensors installed and let me make

2:21:13

this data available to everyone involved but then that

2:21:15

way I can publish these reports hey these are

2:21:18

my findings where do we want to go with

2:21:20

right that would open up some really exciting collaborative

2:21:22

opportunities and also that would get more

2:21:24

people, I think, interested in tech, which I'm all for.
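Editor's note: a minimal sketch of the "publish the findings back to the co-op" part, standard library only; the CSV file name and column names are invented for illustration.

```python
# Summarize shared garden-sensor logs into a simple per-plot report.
# Expects a CSV like: plot,soil_moisture,temperature (names are assumptions).
import csv
from collections import defaultdict
from statistics import mean

def coop_report(path: str = "coop_sensors.csv") -> dict[str, dict[str, float]]:
    rows = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rows[row["plot"]].append(row)
    return {
        plot: {
            "avg_moisture": mean(float(r["soil_moisture"]) for r in plot_rows),
            "avg_temp": mean(float(r["temperature"]) for r in plot_rows),
        }
        for plot, plot_rows in rows.items()
    }

if __name__ == "__main__":
    for plot, stats in coop_report().items():
        print(plot, stats)
```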

2:21:26

Right, again, ultimately it goes back to how

2:21:28

can we serve the community which I like the

2:21:30

way you put that Adam or

2:21:32

even like maybe there's farmers maybe you don't have

2:21:34

to like reinvent the wheel maybe there's already the wheel sure

2:21:37

So maybe it's about farmers enabling

2:21:40

the community to support them one by

2:21:43

buying potentially directly yeah right yeah and then

2:21:46

which you could do at farmers markets. The

2:21:48

whole point of those things is like enabling

2:21:50

community accessibility, but those kinds of things have become

2:21:53

like basically where you go get your coffee where

2:21:55

you go get your other things I don't know

2:21:57

some random things they tend to be like flea

2:21:59

markets but upscale versions of them and not

2:22:01

just simply farmers markets. You can get your

2:22:04

corn there, of course, and whatever else might

2:22:06

be growing. But maybe it's also about

2:22:08

the farmers letting

2:22:10

us in to support them. But

2:22:13

there has to be a system. The system is

2:22:15

like we can only be efficient if

2:22:17

there is a system, a workflow. And it seems

2:22:19

like there are no systems and workflows in

2:22:21

this realm that enable communities to serve and be

2:22:24

a part of the food

2:22:26

that serves their community. So what do you

2:22:28

think that system looks like? Would you envision

2:22:30

that like an open source platform communities

2:22:32

can sign up for, give them access

2:22:35

to some basic resources? What does that

2:22:37

system look like? Is it a service

2:22:39

that a company offers to communities? I

2:22:42

think it has to be the latter, to be

2:22:44

honest. It definitely has to be the latter because

2:22:46

an open source system, like you're

2:22:48

losing one or you're losing the other. And that's kind

2:22:50

of the hard part. You're either losing

2:22:52

the tech community or you're losing the farmers.

2:22:55

And so it definitely has

2:22:57

to be an established player in

2:23:00

farming, in tech, in ag. That kind

2:23:03

of brings all three together and makes

2:23:05

that a thing. I'm now

2:23:08

picturing, okay, let's say you have a private player

2:23:10

building this. Right, like a Whole Foods Amazon

2:23:12

kind of situation. I'm trying to picture what they

2:23:14

could offer. Could they offer a hardware solution where

2:23:16

it's like, hey, these are like greenhouses that you

2:23:19

can use. Right. Do you offer add-ons like,

2:23:21

hey, this gives you additional monitoring that you guys

2:23:23

all get access to. Yeah, I think that one

2:23:25

would work the best. I think that makes for

2:23:27

an interesting play. I'd be curious how... A

2:23:29

market play. A market play, yeah. Because it's tricky,

2:23:32

right? Because you're offering both physical

2:23:34

solutions potentially. Or maybe you offer

2:23:36

solutions that could go into an existing one, like just

2:23:38

sensors and just the platform. Okay, it

2:23:40

has to be a physical solution

2:23:43

because this is a physical problem. Like

2:23:46

if a lettuce was a piece of software,

2:23:48

I think we would have solved it by now.

2:23:51

Right, that's fair. Maybe. It might have bugs.

2:23:54

That's a good pun. I love it. I love it. Is

2:23:57

it a pun when you say it's a good pun? I

2:23:59

think so. Yeah. Yeah.

2:24:01

So it's like buying the warehouse, setting up

2:24:04

the hydroponic system. Yeah. Like, okay,

2:24:06

yeah, the software and stuff, like we could definitely open

2:24:08

source. You can like get the community building that. Sure.

2:24:11

Right? But if you think

2:24:13

about where Texas is centralized, it's cities. If you

2:24:15

think about where your major populations are, it's

2:24:17

cities. That's what a major population is. Who

2:24:19

has the highest need for produce and the

2:24:21

highest demand? It's going to be cities. Can

2:24:23

they grow it? They don't have the land

2:24:25

for it. Yeah. And so that's

2:24:27

why it's like a smaller scale community gardens.

2:24:29

Yeah. Like, okay, you know, 20 by

2:24:32

20 square feet. My rooftop gardens. Yeah.

2:24:35

Like rooftop greenhouses. Exactly. There's probably

2:24:37

so much of this out there that maybe you're

2:24:39

aware of. I'm just not. And

2:24:41

you asked that question really like, should we do it? I don't think

2:24:43

I'm equipped to even be like, yes or no. I

2:24:45

can give a lot of ideas and obviously provide a

2:24:47

platform for folks like you to talk and share what

2:24:49

you do know. But I would say I would love

2:24:51

to see you do agtech.fm.

2:24:55

If you like that domain, we were just riffing, but if

2:24:57

that's not your thing, whatever. Either way, I would

2:24:59

love to see this podcast be done because food

2:25:02

is humanity's utility. We cannot

2:25:04

not have food, right? We

2:25:07

can't even not have electricity, but

2:25:09

even more so we have to have food

2:25:11

and we have to have food that is loved

2:25:14

by us. I suppose that's not crazy

2:25:16

expensive. The grocery bills out the, I

2:25:18

can't even take going to the grocery

2:25:20

store now, like anxiety of like, what

2:25:23

will my bill be today? And I'm like,

2:25:25

like bearing down on what I'm getting, like being

2:25:28

more mindful of what I'm purchasing even, you

2:25:30

know, and trying to be more mindful of

2:25:32

that. The idea of community is super important

2:25:34

to me. Obviously podcasting and this conference, THAT

2:25:37

Conference, is all about community. It's one of the

2:25:40

three pillars, right? Education, community,

2:25:42

and what was the last one? Oh,

2:25:44

I don't remember. Networking. Yeah, that was

2:25:46

it. There you go. And

2:25:48

I think that's really what this conversation was about. I don't know

2:25:50

how much more we have to say, but I think

2:25:53

the conversation could probably go on. I would encourage

2:25:55

you to do this podcast because it sounds like

2:25:58

there's some connections out there and some opportunities out

2:26:00

there for sure. You like that kinda thing,

2:26:02

you like tech, and there's a

2:26:04

goal: food will play a role. Hardware,

2:26:06

software, yeah, in some way, shape, or

2:26:09

form. There's a lot of ways you can slice

2:26:11

and dice. More and more plans for

2:26:13

this thing. Love it! I can't help

2:26:15

it. But as we

2:26:17

finalize, is there anything else you want to

2:26:19

give? Oh man, I guess,

2:26:21

like if you would like to address

2:26:23

any of it, as far as rebuilding these at-scale

2:26:25

platforms, you know, they're going to have a

2:26:28

lot of security vulnerabilities and security

2:26:30

needs. So if you want

2:26:32

to check out any security services available through

2:26:34

APIs and such, check

2:26:36

out Pangea Cyber. You're better

2:26:39

than I am. I'm just going to say I'm

2:26:41

on twitter I Gg voices now You are

2:26:43

better than I have ended my death as

2:26:45

in my final know I was I would

2:26:47

say we ended with the question, right? And

2:26:49

I would say the answer is, of course you

2:26:52

should definitely start this podcast sort

2:26:54

of platform and see if you find

2:26:56

the answer. Yeah, look for answers. I think

2:26:58

that's really exciting. I agree. I agree. We

2:27:00

need champions in it. And

2:27:02

who better to be? the same for

2:27:05

the new Vanessa things as ones are

2:27:07

they good for dinner. And

2:27:13

we're at two and a half hours.

2:27:15

This is certainly our longest Changelog

2:27:17

& Friends ever, and lucky

2:27:20

you, still here. Listening, you just might

2:27:22

be a crazy fool. You're crazy cool

2:27:24

for listening for this long. Here's

2:27:26

something else crazy cool: Dance Party,

2:27:28

our newest Breakmaster Cylinder album,

2:27:30

is now out on Changelog

2:27:32

Beats. Enjoy the drop. You can

2:27:34

find it on Spotify,

2:27:36

on Apple Music, on YouTube

2:27:39

Music, and all the rest. Just

2:27:41

search for Changelog Beats. Or

2:27:43

simply follow the link in your

2:27:45

show notes. Breakmaster Cylinder's robots

2:27:47

make our music! Thanks once again

2:27:49

to fly.io, to BMC of

2:27:51

course, to Cloudflare for bringing

2:27:53

us to THAT Conference, and to

2:27:55

Clark and his team for putting

2:27:57

on an awesome event. Oh, and

2:28:00

also to our long-time sponsor Sentry:

2:28:02

use code CHANGELOG to save $100 on

2:28:05

the team plan at sentry.io. Next

2:28:08

week on the CHANGELOG, news on

2:28:10

Monday, Stefano Maffulli, the executive director

2:28:12

of the Open Source Initiative on

2:28:14

Wednesday, and long-time listener and

2:28:16

CHANGELOG++ supporter Jamie Tanna right here

2:28:19

on CHANGELOG and Friends on Friday.

2:28:21

Have a great weekend, share our

2:28:23

work with your friends who might

2:28:25

dig it, and we'll talk to

2:28:27

you again. No sense. Thanks

2:28:30

for watching. I'll see you next week.
