168 - What Happens When the Robots Do the Editing?

Released Tuesday, 31st October 2023

Episode Transcript

1:55

I

2:00

wrong it is a lot. It's a daily right?

2:02

Yeah, we haven't missed a day. The goal was every

2:05

weekday morning have a new one ready to go and

2:07

four years coming up right on four years. That's

2:09

crazy That's fun. Yeah, that's

2:12

nuts And that's like navigating

2:14

around things that are outside of your control

2:17

like your voice working and that's that's really

2:19

hard to do. Which

2:21

is why I have programmed this

2:23

AI using a learning model

2:26

based on the 10-minute Bible hour podcast right

2:28

now. Would you like me to play some of it for you? Oh, that'd be

2:30

spectacular. Thank you. All right, listen to this. Herpes

2:33

aren't the worst experience of my life

2:36

It's probably the double herpes that

2:38

I would say is the worst experience

2:40

of my life. Also make sure to read your

2:42

Bible. React to that. What do you think about

2:44

what I just did? I'm very disappointed. I would

2:47

never say something like that. I don't know, just

2:49

like you did that was it sounded

2:51

a lot like me Listen to this what all I mean

2:54

and this one if I were stranded

2:56

on a desert island with Hitler and

2:59

four monkeys and I had a submarine

3:01

that could only take the four monkeys off of the

3:03

island. But if so, I'd have to eat Hitler. I

3:06

wish I wasn't on that island. React

3:08

to that one. What did you think? I think those

3:10

AI learning models, they degrade over time. Yeah, they make

3:13

less sense. That one wasn't as good. But I mean

3:15

you've given me so much data, huh? Here,

3:19

I'm gonna just throw one out there. I'm gonna just have

3:21

it make the most offensive Arrangement

3:24

of words I can. Listen to this and go. What

3:30

do you think about that one?

3:31

Well, I mean it was just

3:32

it was just

3:33

all the swears in order. I don't

3:36

think I literally have ever heard a swear that wasn't

3:39

in that list. How long do you think it took me

3:41

to train that model in order to use your

3:43

voice like that? A year? Two years? Yeah.

3:46

Yeah, actually I didn't. I just got you to play along

3:48

and just say things at a microphone. No,

3:55

but you're you're using AI

3:57

to do what? You're well, at least you're

3:59

testing it Or is it even AI testing?

4:02

If it's even AI, and is any of the

4:04

AI we're using really even AI? It's

4:07

all just modeling at this point. I don't

4:09

think it is Skynet at this point, it's

4:11

my understanding. We're repurposing existing

4:14

material and reformatting it based

4:17

on pattern recognition, but I don't think we're

4:20

doing thought. I don't think it's true artificial

4:22

intelligence. It's sort of a workflow

4:24

optimization. Yeah. Put technology

4:26

at this point. But Twitter,

4:29

or whatever you call that platform now, is just rife

4:32

with all kinds of threads about,

4:34

oh, I use this language

4:37

model to monetize

4:39

my whatever. If you're not doing this, you're not,

4:42

I don't know, man, these influencers today. But

4:45

what are you actually trying to do? You're taking

4:47

your existing back catalog of the 10 Minute

4:49

Bible Hour podcast, and

4:52

you are trying to do what with it? All

4:54

I'm looking for is help with editing. I'm

4:58

not actually trying to use an AI

5:00

editor, but I'm okay

5:02

with working with a remote editor who

5:05

does use AI to crawl

5:08

a bunch of different stock footage

5:11

services and grab things

5:13

that might work and give me ideas

5:15

to brainstorm and to get things going for

5:18

sort of a, I guess like a video

5:20

essay kind of format. You've seen those on YouTube where

5:22

you don't see the person's face. You

5:24

put together a little video essay and then they cover it with

5:27

stock footage and relevant images that

5:29

get the point across. Yeah. Even

5:32

though really it was a podcast. Yeah. What's

5:34

interesting is you've got a person talking and then every four

5:36

to five seconds or I don't know, maybe it's

5:38

even lower than that, three to six or whatever,

5:41

you have an image change. I think it's

5:43

interesting because the visual part of your brain

5:45

is so powerful that you look at these

5:48

images and you associate it with what the person is saying.

5:51

It's almost like someone listens

5:53

to the podcast and if you say cake, they

5:55

would just go to some stock footage

5:58

site and they would look up cake. Yes.

6:01

And then like five images of cake come up

6:03

and they pick the one that they want to go in there. Yeah.

6:06

Yeah. And so figuring out, you know, what

6:09

role AI could play in the first

6:11

pass or in collecting a bunch of good options

6:14

for stock images or footage, that's

6:16

not really the goal. The goal is I made

6:19

a series of podcasts that through the books of

6:21

the Bible stuff is eight to 13

6:23

minutes a pop. And it's

6:25

just, I think, a really, I hope, helpful

6:28

introduction and survey, a covering

6:30

of the themes and all of that in a playful, fun way

6:33

of each book of the Bible. So what I want to

6:35

do is put that on the YouTube channel, but

6:37

I want something to be happening visually that

6:39

adds more value to that. Charts,

6:42

facts, figures, maps, classical

6:45

art, pictures of artifacts

6:48

and relics and ruins and things like

6:50

that, just so people can see and feel

6:52

and connect with the Bible a little bit

6:54

more and have that kind of support and bolster

6:57

the words I'm saying. What ends up going on the screen,

6:59

right? Pretty simple text on screen, Bible verses,

7:01

et cetera. I bought a set of books

7:03

a long time ago at a used bookstore. I think

7:05

it was Matthew Henry's commentary. Oh, wow. Yeah.

7:09

On the Bible. I have them in the other room. Yeah.

7:12

And so you just pick any book in the Bible and you're like, oh, I'm in Judges chapter

7:14

five or whatever. And you just go get Matthew

7:17

Henry's commentary and you open it up and

7:19

you get to see exactly what this person said

7:22

about that book, which is interesting.

7:25

Now what I think is interesting and what

7:27

you're doing is the modern format is

7:29

video, like the equivalent of a scroll

7:32

today or the equivalent of a book today is video.

7:35

I mean, podcasts are great, but ultimately

7:38

the authoritative, I don't know, the

7:40

thing you look for when you're trying to

7:42

solve a problem is the video. I don't know why. That's

7:44

just the way it is. You have an opportunity here

7:47

to build videos of,

7:49

I don't know, the Matthew Henry commentary

7:51

of the Bible, but it would be the Matthew Whitman commentary

7:53

of the Bible. And when

7:56

we turn into dust, that potentially

7:58

could live a few hundred years past us. So

8:02

you're trying to brute force, I don't

8:04

know if that's the way to say it, you're trying to optimize

8:07

a way to turn these

8:09

podcasts that you've said into a microphone

8:12

into video. Yes. And

8:15

you're trying to streamline that process and

8:17

people are employing modern artificial

8:19

intelligence techniques in order to, I'm

8:22

assuming, just write the words

8:24

that you're saying and

8:26

then convert those spoken

8:29

words into written words and then those written words

8:31

into images. Yes. That's what you're doing. That's

8:33

what I think is happening. And you're investing into this,

8:36

you're putting money into this. I've hired a lot

8:39

of people for an audition and

8:42

ultimately I haven't found the right partner

8:44

here. So I'm sort of back to square one,

8:46

but back to square one having learned what

8:48

isn't the answer. So I've gone to

8:51

different editing companies, I've hired individual

8:54

editors and all good

8:56

and talented people. I'm asking them for a weird project,

8:59

but I'm just giving them an audio file that's 10 minutes

9:02

long and saying, cover this and make

9:04

it educationally effective. Clearly,

9:07

some of the people I've hired just went

9:09

out and ran that through a transcription program,

9:13

then took the transcription and ran that through

9:15

some sort of AI web crawler, something

9:18

that would go and search websites and

9:21

then did an auto-populate of

9:23

whatever returns they got. And

9:25

they were just like,

9:27

here you go.
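[Editor's note: a minimal sketch, not from the episode, of the kind of naive transcribe-then-keyword-match pipeline being described here. It is a guess at the workflow; the function names and the stock-search URL are hypothetical placeholders. It illustrates why a word like "bushes" can surface a picture of a President Bush: each keyword is matched on its own, with no sense of the surrounding story.]

```python
import re
from collections import Counter

# Hypothetical sketch of a "see word, show word" auto-populate pass:
# transcribe, pick frequent keywords, pair each with the top stock hit.
# The service URL and helper names are made up for illustration.

STOPWORDS = {"the", "and", "that", "this", "with", "there", "whole", "about"}

def top_keywords(transcript: str, n: int = 5) -> list[str]:
    words = re.findall(r"[a-z']+", transcript.lower())
    counts = Counter(w for w in words if len(w) > 3 and w not in STOPWORDS)
    return [word for word, _ in counts.most_common(n)]

def first_stock_hit(term: str) -> str:
    # Stand-in for "search a stock site, keep whatever comes back first."
    return f"https://stock.example.com/search?q={term}&rank=1"

transcript = ("Abraham and Isaac, and there's a ram in the bushes, "
              "and it's a whole dramatic story with life and death stakes")

for keyword in top_keywords(transcript):
    print(keyword, "->", first_stock_hit(keyword))
```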

9:28

This is an edit of your thing. Other people

9:31

just clearly went and manually did the work

9:33

and tried to understand the material a little

9:35

bit more. Ultimately, nobody I worked with

9:38

thinks in these categories enough. It's just not

9:40

what they do. It's not their area of specialization and

9:42

it's fine. It's a weird niche. They

9:44

just haven't been able to add enough value to the

9:46

viewer for it to work. But the AI editing

9:49

has been really interesting because it's

9:51

just some word that I

9:53

say that the AI fixates

9:56

on. It's like, okay, well, here's a picture of that. So

9:58

I'm talking about... Abraham and

10:00

Isaac and there's a ram in the

10:03

bushes and it's a whole dramatic story

10:05

with life and death stakes And it's like

10:07

did somebody say Bush? Well,

10:09

he was president of the United States from 2001 till 2009. Really? During that time

10:16

he fought against the... like, wait, what? What do we

10:18

do? Huh? What? And I mean, it's

10:21

a little bit hyperbole, but kind of not. It's

10:23

been that kind of thing where you can tell

10:26

that AI is involved, because it'll be pretty

10:28

close on a couple of cuts

10:30

like oh, yeah.

10:31

Well sure enough. That is that is Israel.

10:34

Mm-hmm Now it's modern

10:36

Israel. I'm I'm talking about the Old Testament

10:38

and you showed me a picture of Benjamin Netanyahu

10:41

and you know the Israeli flag

10:43

and maybe an Israeli fighter

10:45

jet. So that doesn't really work for an Old

10:47

Testament survey. Though it would be awesome

10:50

if there were fighter jets in the Old Testament

10:52

Maybe Ezekiel Art

10:54

things running around So

10:57

you can just see what happens. It doesn't understand

10:59

the whole concept. It doesn't understand the theme

11:02

or the motif or the feel of the video,

11:04

which I think is the art of it all. So

11:06

it's just like you said word. This

11:09

is picture of word picture of word.

11:11

You said behold, there it is. I serve

11:13

you, I give you thing, master. And it's

11:15

clunky and bad. But the other thing that's interesting

11:18

is... Okay, let me ask you this:

11:20

when you edit what is your timing

11:22

philosophy? Like, you say something

11:24

and then you make your next cut. It's called see-say.

11:27

Explain that, please. Yeah, so see-say is

11:30

Sometimes you want the... it depends on if I want to

11:32

be ahead of the viewer or behind the viewer

11:34

or right on time. So if I'm saying

11:36

a word that's difficult, such as azimuth.

11:40

Azimuth is a word that means

11:42

an angular deflection. Like, he

11:45

rotated 15 degrees in azimuth. So you rotate

11:48

right and you point the gun 15 degrees to the right

11:50

azimuth. Azimuth and elevation,

11:52

those are the measurement angles that you would

11:54

take when you're, you know, targeting something. So

11:57

if I were to say azimuth what I

11:59

would do is So
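[Editor's note: for readers who want the two angles pinned down, here is a minimal sketch, not from the episode, of azimuth and elevation as just described: azimuth is the left/right rotation from a forward-facing reference, elevation is the up/down tilt. The axis convention is an assumption made for the example.]

```python
import math

# Minimal illustration of the two targeting angles described above.
# Convention assumed here: +y is forward, +x is right, +z is up.

def azimuth_elevation(x: float, y: float, z: float) -> tuple[float, float]:
    azimuth = math.degrees(math.atan2(x, y))                    # rotate right = positive
    elevation = math.degrees(math.atan2(z, math.hypot(x, y)))   # tilt up = positive
    return azimuth, elevation

# "Rotated 15 degrees in azimuth": a level target 15 degrees to the right.
az, el = azimuth_elevation(math.sin(math.radians(15)), math.cos(math.radians(15)), 0.0)
print(round(az, 1), round(el, 1))  # 15.0 0.0
```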

14:00

then you just show it. Yeah, the timing

14:02

of that matters. Yes, because you

14:05

don't think it matters, but it really really

14:08

does. Say it again, that a

14:10

bring to boy leaves his hand It's about that point

14:12

it would have been Abram. Abram leaves his home

14:14

in Ur of the Chaldeans and makes his way west

14:16

Okay, in that case, makes his

14:19

way west. So there's an idea that's

14:21

gonna follow that I would not expect

14:23

the map to stay up on screen for another

14:25

two seconds It would be makes

14:27

his way west, and then when the 't'

14:29

hits, I might give it three

14:32

or four frames and Then

14:35

boom screen change. Okay, and

14:37

what I would do there is I would

14:39

have a map of the ancient near east and I

14:41

have Ur of the Chaldeans over here in a little

14:43

picture of maybe Abram's face and Then

14:46

he makes his way west and he starts tracking

14:49

You know kind of you kind of bend up north to get around

14:51

the desert when you travel in that part of the world He

14:53

starts tracking west and then I would pull

14:55

in on the next part of the story until he arrives

14:58

Yeah, that's even better and

15:01

then maybe if I was covering the whole video I would go

15:03

to a picture of that place like

15:05

a nice big panoramic stock

15:07

photo that you just used the credits on,

15:09

because I don't have access. I haven't been there

15:12

and you show this beautiful

15:14

picture. Maybe I even put the name of it up there.

15:16

This is where he went to. And visual

15:19

storytelling, visual storytelling. Yeah, the next

15:21

thing, here's the next thing. I like the way you did that even better,

15:23

because, he makes his way west. So

15:26

in the same motion, you're moving west, then you start

15:28

pushing in. The natural visual

15:30

question is where is he going? And

15:33

you didn't have to say it, the image did it

15:35

for you. Mm-hmm. Yeah, I like that way

15:37

better. So The

15:39

AI can't do that. Yeah,

15:44

there's nothing more to it, man, it's just bad. Really,

15:47

all of those satisfying timings that you talked about. But I

15:49

remember one time I edited a video where

15:52

we were flying a drone in front of this man

15:54

near me if I said Can I want to know? That

15:57

video is still available. Yeah patrons

15:59

who are I'm curious to see it. That's great.

16:02

Uh-huh, that happened. No, it wasn't that time at all. I

16:04

was flying a drone that didn't hit anyone in the face. It

16:06

was way up high. I was looking at the gargoyles

16:08

on top of the Catholic Cathedral. And

16:11

we got Salt Lake City in the background. It's a beautiful shot.

16:14

And there's a dove that flies through

16:16

the shot. No way. Behind

16:19

the bell tower, and then it comes out the other

16:21

side of the bell tower, and in an early

16:23

draft, I cut it before the dove

16:26

made it to the edge of the screen. Yeah, you can't do

16:28

that. Yeah, this was such a complex video.

16:30

There were so many things in it. And

16:32

I sent it to you. I was like, oh, take a look at this. I don't hear

16:34

anything. I don't hear anything. I don't

16:37

hear anything. Finally, I got a phone call. It's one in the morning.

16:40

Hey,

16:41

go in there right now and let me see the

16:43

end of that dove flying to the edge of the screen. Okay,

16:46

I'll call you back if I have more.

16:47

That's it. You didn't give me any other

16:49

feedback on anything. It was just,

16:52

I need to see the dove finish the flight. It was the

16:54

wrong cut. It was the wrong cut. You

16:56

were right. It did really well, and clearly it was because

16:58

we included the little... It was because of the dove. Yeah, yeah,

17:00

exactly. There are all these sensibilities that you have.

17:03

I share a lot of them. I mean, every editor

17:05

has their own taste,

17:06

but

17:07

there are all of these sensibilities that are kind of best

17:09

practice, and they feel satisfying.

17:12

Even if you don't edit video, your brain

17:14

knows when a thing is edited right, and it's

17:17

satisfying. Everything

17:19

I wanted to see, every time I had a next question,

17:21

whoever edited this and gave it to me, that's great.

17:24

Have a like. Have a nice comment even.

17:26

That was wonderful, but when it doesn't

17:28

happen, it's so disorienting.

17:31

It's frustrating. Frustrating. For

17:33

me, it's frustrating. It's like, why

17:35

did you do that? When I make

17:37

a music montage in Smarter Every Day

17:40

and I have slow motion footage. Gordon

17:42

McGladdery writes the music for Smarter Every Day.

17:44

His company's called A Shell in the Pit.

17:47

He writes gorgeous music. Super talented. Very

17:49

different. Sometimes it sounds like a video game.

17:52

Sometimes it sounds like acoustic guitar. He has

17:54

a large quiver

17:57

of arrows he can pull from at any one moment to

17:59

give you a certain... And so when

18:01

I'm doing slow-mo, I make sure to hit

18:03

any changes on the beat. I'm

18:06

thinking I'm thinking of one song he has

18:08

in particular. Can I play it for you real quick?

18:11

Yeah, sure

18:19

It is very pentatonic. Oh,

18:22

yeah. Yeah So

18:25

he's got music like so you slowly

18:27

ramp this up and then as

18:30

this music comes in more you're like, okay Well,

18:32

I'm gonna let it get more complicated visually and

18:35

then eventually, okay, mechanical

18:37

thing happens now. Yes,

18:40

right. Yes. This is a build montage.

18:44

You know what this makes me think of? What?

18:46

Smarter Every Day, the YouTube channel. Yeah.

18:49

In fact this the name of this song is

18:51

on the Smarter Every Day edit. So he,

18:53

he makes edits of the videos specifically

18:56

for smarter every day So this right here is

18:58

where it gets really complicated and we're gonna hit it

19:00

now. Sorry, I waited

19:02

too long. Wow,

19:08

it's like 8-bit Baroque. Yeah, he

19:10

does the order of the cosmos being

19:12

conveyed. But

19:14

the intricacy on the deep great deep

19:16

hits of the music, if I ever have to change,

19:19

I would change. And if there's a... right

19:21

here, we have to shift down. Now

19:27

you get the idea. So it's

19:29

important to hit on the music and each

19:31

different... This is weird because

19:33

you're better at music than me. Each movement of

19:35

the music. I might change

19:37

a scene, go back to real-time

19:40

footage to show me tightening a wrench on the thing

19:42

that's about to do the thing so

19:44

I don't know. That's the kind of stuff I do. Yeah,

19:47

and it's great. You're a really good editor, and George is

19:51

too. George is great, and you are too. Yeah,

19:53

my buddy Nate who helps me out with some of my church

19:55

visit videos. And he's

19:57

wonderful. I mean, of the four of us, he's the only one who's

19:59

won an Emmy.

19:59

Do shorts have

20:02

Emmys? He might. I don't know.

20:03

He's good. He's got quite the track record. Yeah,

20:06

he's good. Yeah. Nate's real good. Yeah,

20:08

so the not humanness of

20:11

the AI decision-making is

20:13

really evident on a few of these attempts.

20:16

And I think all of these attempts are a little bit human,

20:18

a little bit AI, but the cuts

20:20

just don't make sense. And so it's frustrating when

20:22

you watch somebody else's video that is on

20:25

YouTube just not do

20:27

the things it ought to do. If you just put that

20:30

there and put that there, it would make sense, but you didn't.

20:32

It's really frustrating when you're watching a draft

20:35

of your own material with your own voice.

20:37

And so then what's happened

20:39

in some of these rough edits that I've got back

20:42

is clearly the editors or

20:44

the editing AI software suites

20:46

that they're trying to use. They don't

20:49

recognize certain historical artifacts

20:51

or details and they

20:53

just take a stab at it. And so

20:56

in some of these edits I'm getting these full

20:58

AI panels, AI written panels that

21:00

are just word salads, horrible

21:04

writing, atrocious, overly

21:06

busy. What do you mean? Just tortured

21:09

language. So for example, there

21:11

was, now just understand too, I've

21:14

sent three videos, one

21:16

that is a bread and butter video,

21:19

me sitting at my desk talking about history

21:21

stuff in the Bible. Another that is

21:23

a very typical video for me where I go and visit

21:25

a church and I ask somebody to tell me about

21:27

their understanding of religion and God

21:30

and Christianity and I ask questions and just listen

21:33

and marvel at their church building. And

21:35

then I sent out another video where it's

21:37

just my voice talking about the book of Genesis.

21:40

I'm comparing what I get back from different companies

21:43

on all of these edits. On one

21:45

of the edits I got back on this particular

21:47

church visit video. It's a church

21:50

that is all built

21:52

in the style artistically of the book of

21:54

Kells. Have you heard of the book of

21:56

Kells? It's something to do with

21:59

Ireland.

23:45

of

24:00

what worship should look like. It's too

24:02

busy. It just plastered over it? They

24:04

plastered over all of it. That's how- Wait.

24:07

That doesn't sound like you're being nice to people. That sounds

24:09

like you're- I don't know. Dude,

24:11

I wasn't there. It was World War

24:13

II. That's what they told you. That's what they told

24:15

me. Okay. So I'm going to choose to give them the benefit of the doubt.

24:18

Okay. That they were clearly doing so to

24:20

try to accommodate their Christian,

24:22

albeit different, Christian brothers and

24:25

sisters who were in a very difficult time. I

24:27

took it as a pretty neat story. And then in the 90s,

24:30

times had changed again. They do a restoration

24:33

on the church. And now all

24:35

of this, old St. Patrick's Church in downtown

24:38

Chicago, now all of the Book of Kells-ness

24:41

is on display. It's just beautiful. So I

24:43

had a few different editors take a run at this thing. And

24:46

one of the editors relied heavily

24:48

on AI for the writing. Not even

24:50

Google searches, I don't think. I'm just guessing.

24:53

You're not allowed to talk to the editors when you use these

24:55

companies. Oh, okay. Are they people

24:58

in the same country? Like are these people in America?

25:00

Uh-uh. No, I'm almost a hundred percent

25:02

sure they are not. Okay. I think

25:04

they are using AI

25:07

to look- like I think

25:09

they're entering something and saying, but now

25:11

frame that in a natural Midwestern American

25:13

style. Because you notice some of the AI

25:15

blemishes even in the written communication back

25:17

and forth between editor and myself. Really?

25:20

I think so. And I think these people are

25:22

native English speakers? Yeah,

25:26

maybe as a second language. Okay. They're

25:28

all obviously incredibly competent people,

25:30

but they're clearly working from overseas.

25:33

And that's what makes the economics of it probably

25:35

work. I don't know, you're not allowed to know what the economics

25:38

of it are or how it works at all. Maybe it's

25:40

a dubious problematic arrangement. I don't

25:42

know. What I do know is everybody I worked with is

25:45

really talented at all of these different companies, even

25:47

if they don't know the material. And why would I expect them to know

25:49

this material? It's very obscure. But

25:51

they would run into things clearly in

25:53

the editorial decision-making process and be

25:55

like, all right, AI, bail me out.

25:58

Write me a thing about the

26:00

book of Kells. But here's the

26:02

thing, man. In the last year

26:05

of the evolution of AI and the integration

26:07

of AI, there are a lot of people who

26:09

I've read who are making compelling cases that

26:12

the writing side of it at least is getting worse.

26:15

Yes, I've heard this. What have you heard about

26:17

it? It just seems like in the early days,

26:19

it was learning off good stuff. I

26:22

don't know if this is it, but now the

26:24

things that are being input into the model are trash

26:27

and so you're getting trash out. So it's

26:29

almost like when you make, you remember back in the day, we

26:31

had Xerox machines, copiers, photocopiers?

26:34

Yeah, you make a photocopy of your butt? Yeah,

26:37

exactly. No, no. When you get a photocopy

26:40

of a photocopy, you'd be like, oh, it brought

26:42

over the blemishes from the blemishes and then,

26:45

and you could keep doing that over and over

26:48

and over and eventually the worksheet

26:50

your teacher would hand you, you'd be like, what is up with

26:52

it? I'm sorry, I can't read this number.

26:55

Yeah, it'd have all the artifacts and flecks and...
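[Editor's note: a tiny sketch, not from the episode, of the photocopy-of-a-photocopy idea applied to models. Each generation here is fit only to samples drawn from the previous generation's fit instead of real data; with small samples the estimates wander and the spread tends to decay, the statistical version of recopying a copy.]

```python
import random
import statistics

# Toy "training on your own output" loop: fit a mean and standard
# deviation, generate synthetic data from that fit, then refit on the
# synthetic data alone. Repeated enough times, the estimates drift
# and the spread often shrinks, like recopying a photocopy.

random.seed(42)

real_data = [random.gauss(0.0, 1.0) for _ in range(1000)]
mu, sigma = statistics.mean(real_data), statistics.stdev(real_data)

for generation in range(1, 21):
    synthetic = [random.gauss(mu, sigma) for _ in range(25)]  # small "dataset"
    mu, sigma = statistics.mean(synthetic), statistics.stdev(synthetic)
    print(f"gen {generation:2d}: mean={mu:+.3f}  stdev={sigma:.3f}")
```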

26:57

Yeah, I wonder if it's something like that. I

27:00

don't understand how it works well enough at all. I

27:02

mean, I'm tinkering with it as an end

27:04

user and I'm tinkering with it. Holy cow,

27:06

you just made me think something, dude. I'm sorry to interrupt you.

27:09

Go for it. That means we're gonna have to start

27:11

seeding. Like you're gonna be able to make

27:13

money with the seed for a

27:15

language model. Like, hey, here

27:18

is a, you got a good clean foundation

27:20

for a language model. You know what

27:22

I mean?

27:23

Like we're gonna be... A real transcode? Yeah, exactly.

27:26

That's interesting. So

27:28

the name of the game is going to be infect other people's

27:30

AI. Right. Scary.

27:33

And so even if we think of say a YouTube

27:35

channel's algorithm as being a

27:37

little bit AI-ish, I mean, it's machine learning. We've

27:40

talked before and we've looked at hard

27:42

empirical data to suggest

27:45

this video or this channel

27:47

is stuck. The AI is fixed. It doesn't

27:50

know what to do with this particular product.

27:52

And we've both talked with people who have arrived

27:55

at that place of being AI stuck or

27:57

algorithmically fixed, overfitted.

28:00

Overfitted, thank you. I knew there was another term.

28:02

And they go and

28:04

launch something else with the same talent the same material

28:06

They're still the same person and it works

28:09

the algorithm likes it. Mmm, it goes. So

28:11

we know that overfitting happens and

28:14

it looks like it's happening in, particularly,

28:17

the language models, the chatbots,

28:19

over the course of the last year. It's been overused

28:22

with bad prompts. And now it's

28:25

learning off of itself. So

28:27

it's learning off of garbage it has written

28:29

It is its own echo chamber. There

28:32

you go.

28:33

That's what I think is happening. Really?

28:36

I think you are reading stuff written by AI

28:38

so much more than you realize and

28:41

The more that happens the more that increases

28:43

the percentage of material that

28:46

the AI is gleaning from

28:48

that is also AI. It's

28:51

putting a second and third and fourth layer

28:54

between actual human minds writing

28:57

things and what we are modeling

28:59

from with our AI assistants. One

29:09

being the least amount ten being the maximum

29:11

amount What level of candor do you

29:13

think makes sense for this little

29:15

spot where we talk about the fact that there isn't

29:17

an ad on this episode? Was... on

29:21

your scale, was it? Yeah, I don't

29:23

know like from zero to one. That's not

29:25

part of the scale. What would

29:28

zero to one be like? We just

29:30

don't admit that there's no ad

29:32

on this episode. Okay, and we just

29:34

hope nobody heard me just say that. Oh. Let's

29:38

go ten. Yeah,

29:42

great. Yeah, there's no ad which means this one

29:45

is fully brought to you by The

29:47

handful of people who support this program to whom we are

29:49

super, super grateful. Yeah, Patreon

29:52

is becoming increasingly more important

29:54

for No Dumb Questions, and I am also

29:57

grateful. So I think

29:59

it's time to do something for the patrons again. Well,

30:02

I think so too. And I've been cooking

30:05

up ideas, because it's not just No Dumb Questions.

30:07

I mean, the whole economy of everything

30:09

is making it such that Patreon

30:11

is kind of the way, it's the most important

30:14

thing for pretty much everybody who makes

30:16

stuff on the internet.

30:17

And so, yeah, I've been thinking the same stuff that...

30:20

The reason...

30:21

I'm definitely going to start this again. I absolutely hate

30:23

talking about things like this, dude. It

30:27

cripples me. It's so easy and so natural

30:29

for you and it just cripples me. I don't... Well,

30:32

let's just... I don't know. Let's just keep going.

30:34

Oh my goodness. Come on. I mean,

30:37

it's just the way it is, right? So I think

30:39

the reasons behind all this could

30:41

fill a whole episode. I think it

30:43

has a lot to do with inflation, which has a lot

30:45

to do with printing money, which

30:48

has a lot to do with all the things. Yes.

30:51

And so that affects marketing budgets. And

30:54

yeah, that's just where it's at.

30:57

Yeah, and it's

30:59

okay also. Yeah,

31:02

absolutely. It's okay that economies

31:04

ebb and flow and things like that. This is part of

31:06

reality, but the most

31:09

meaningful bedrock thing in terms

31:11

of how something like this happens is the

31:13

people who like it the most kicking in on it. That's

31:16

the way it works. Yeah, I also want to do an

31:18

episode on the Medici family. I

31:20

think that'd be cool. Whoa. Yeah.

31:23

Not now. I mean, that's a hard pivot

31:25

right there. Well, you just want to run it right now

31:27

and just keep going. No. It could

31:29

just be like a little bonus 90-minute episode

31:32

inside of this other episode. No, I

31:34

don't. I don't. But the fact that they... Why

31:36

do you want to talk about the Medicis? What's happening to you? Well,

31:39

the fact that they were patrons of

31:41

the arts and stuff like that, I think it's very interesting.

31:43

Oh. Yeah. Okay,

31:45

I see what you did. I'm grateful for the patrons.

31:48

It makes No Dumb Questions possible. It pays

31:50

us. It pays Tina. And

31:52

I'm grateful. And I think we should send stickers out.

31:55

I would like to send stickers out. It's

31:57

time. I really like what you're doing on...

33:59

or

34:01

there's gotta be a good idea for something out there.

34:04

That's a really interesting idea. We could

34:06

do several things. We've talked about all kinds of stuff this

34:08

year. Yeah, it needs to

34:10

be this year.

34:12

And so then we send it out, and it's like, thanks

34:14

for supporting this last year

34:16

when we talked about the thing. And if you weren't

34:18

supporting this last year, then you don't get that sticker,

34:21

but you can get next year's sticker. Yep, but

34:23

it's an evergreen sticker. Think about something we talk about next

34:25

year. Like, it's a sticker that works all the time.

34:27

We don't put the year on it, but it's only like

34:29

a limited run. That's how we do it on Smarter

34:32

Every Day. I think it'd be awesome. Let's do it. Yeah,

34:34

okay, I like that a lot. And yeah, it is

34:36

time to send something out. I mean, that's always kind of been the model

34:39

is just hear something fun from time

34:41

to time and thank you, and it's a goodwill engine.

34:43

And yeah,

34:44

more than ever, that's what Patreon

34:47

is.

34:47

It's a goodwill engine, thick and thin.

34:49

It's really appreciated. Yeah, it is. So

34:52

if you wanna support, you can go to patreon.com

34:54

slash no dumb questions. And

34:56

you will, like, if you're a supporter at the

34:58

end of the year, you will get the sticker.

35:01

Let's do it that way, right? I like that a lot. Yep,

35:03

those are good ground rules, but we should probably go make

35:05

that sticker. But you gotta tell us what the

35:07

sticker needs to be first. That's, so it's not our fault.

35:10

The ball's in your court, y'all. You gotta tell

35:12

us what the sticker's supposed to be. Oh, we're off the hook. Look

35:14

at that. Thank you for your support. Obviously,

35:16

it means a lot, especially right now.

35:18

Thank you. Yeah, agreed. Is

35:21

that good? I think it's good. All

35:24

right. Do

35:26

we close file? That's close file. Closing three,

35:29

two, one, close.

35:35

I wonder what the first word to enter

35:37

the English language that was not

35:39

invented by a human mind will be.

35:42

Scrotulate.

35:43

Just a guess.

35:48

Excellent choice of prefixes,

35:50

my friend. Scrotulate.

35:55

Yeah, that was your brain. That was definitely your brain.

35:57

Oh, you got me again. Definitely your brain.

36:00

in one episode.

36:02

No, I think it's gonna happen.

36:05

I think there will be something

36:08

that authentically happens in our culture

36:10

that did not have a human brain origin.

36:12

It was a derivative product. I just

36:14

don't know when that's gonna happen, but it'll happen. Interesting.

36:18

Well, I can tell you this. What

36:21

the AI churned out on my videos

36:24

in terms of the written sections was

36:26

an absolute dumpster fire. Well,

36:28

you also are dealing with matters

36:31

of religion, faith, Bible.

36:34

So how did it do on the doctrine? Terribly.

36:37

What was the most heretical thing it did?

36:40

It's not even so much that it's heretical. It's just

36:42

a mishmash of a whole

36:44

bunch of incompatible ideas. And

36:47

so, you know, if you don't think hard about

36:49

it, I just here's

36:51

a Christian religious sounding phrase

36:53

and here's a Muslim one. I mean, they

36:55

both more or less sort of acknowledge a God

36:58

thing and there's some Mormon assumptions

37:00

over here and here's a phrase from

37:02

Hinduism and really there might be truth in all

37:05

of those particular phrases. There might

37:07

be beauty in all of those phrases.

37:10

But if you really look carefully at all

37:12

of the words and meanings behind all of those phrases,

37:14

you realize there are some mutually incompatible

37:17

belief systems being crammed into one thing

37:19

here and it just doesn't work. So

37:22

it would be like if you

37:24

saw something from your field that was just

37:26

a bunch of engineering drivel,

37:29

just a bunch of engineering philosophy, I do see this, just

37:31

flung together into a paragraph. You know where

37:33

I see this? Tell me. I see this on videos,

37:36

not just modern ones that have been

37:38

created by AI. I've seen footage

37:41

on the internet that I filmed

37:43

with my hands that

37:45

is being talked about by this other channel.

37:48

Yes. And I'm listening to them articulate

37:52

and speculate about what's going on and

37:54

say it with authority and

37:57

I was physically there when it was filmed and

37:59

I know

37:59

it's horsepucky. And

38:01

I'm sitting there watching and I'm like, this

38:04

is a lie. And I look down at the view count

38:07

and I say, holy moly. And I look

38:09

at the channel and I'll be like, this channel has

38:11

a lot of viewers and they're propagating

38:14

misinformation right now. And

38:16

they're doing it as if it's true. And I've

38:18

thought to myself, you're an unethical

38:20

person. What

38:22

a gentle thing to think, given the circumstances.

38:25

Yeah. Yeah. I've seen that. I've seen other things

38:28

where it's a field that I know a lot about

38:30

and I've seen something spoken

38:33

with authority, even with an animation.

38:35

And it was not correct.

38:37

And I look at the subscriber count and

38:39

I look at the views and I'm like, wow,

38:43

this is just the thing I know about. What

38:45

are you doing with the things I don't know about?

38:49

Yeah. In my field, a lot of times when people are like,

38:51

hey, that was wrong. What they mean is I don't

38:54

hold that theological opinion on that

38:56

issue. But it's not one or zero

38:58

wrong. It just isn't. I mean, that's

39:00

just like, that's your read, that's your opinion.

39:03

But where it can be wrong is

39:05

if let's say we are talking about

39:07

a Presbyterian set of assumptions.

39:09

That's a certain brand of historic

39:12

Christianity, Presbyterians.

39:14

And they have a certain way of governing themselves.

39:17

If we went and we were talking

39:19

about what the Presbyterians think and

39:22

we talked about a Catholic form of

39:24

governance, well, then that would be wrong

39:26

because that isn't what the Presbyterians think

39:29

and it's incompatible with all of their other assumptions

39:32

about how church ought to be. It

39:34

doesn't mean it's universally wrong

39:36

and like the grand ethical, biblical,

39:39

what mind of God kind of sense.

39:42

It's just you're trying to take something that's incompatible

39:45

with a system and pretend

39:47

like it fits in that system. Well, it doesn't. It's wrong

39:49

when you put it there.

39:51

That can't work.

39:52

It's taxonomically wrong.

39:55

That makes sense.

39:56

Taxonomically, what does that mean?

39:58

Like in terms of.

39:59

organizing and ordering and categorizing

40:02

things. Okay, it doesn't fit with that category, I

40:04

see.

40:06

That's the kind of stuff that the AI kept getting wrong

40:08

on these edits. It would say things that

40:10

are Christian from a different expression

40:12

of Christianity, or maybe were

40:15

Muslim, but sort of sound Christian, Muslim

40:18

in their assumptions, and try to make

40:20

it fit with something Catholic. So the same

40:23

feeling I get when I read a phishing attempt

40:25

email, where it's like, you're

40:27

saying all the right words, but

40:30

you're, there's something off. I

40:32

don't know what this is. Did it even get that close, or was

40:35

it? I don't think it got quite that close. Really? There's

40:37

one moment where I'm looking at

40:40

the stained glass ceiling, depicting

40:42

the four authors of the Book of Kells.

40:45

Their symbols, for Matthew, Mark, Luke, and John, the

40:47

four biblical authors of the stories of Jesus,

40:49

the Gospels, and it depicts their four symbols,

40:52

a beautiful stained glass, just magnificent.

40:55

And I'm looking at it, and the AI

40:57

editing cuts to a panel that

40:59

describes, I have no idea

41:02

what. It doesn't make any sense. It shows

41:04

a picture of something. We were just looking at it. We

41:06

don't need to cut to a picture. It shows

41:08

a picture of something that makes no sense from another

41:10

document that isn't in any way related

41:13

to the Book of Kells. And then it goes on to

41:15

describe what's happening in this document

41:17

with just tortured language, no

41:19

context. No, here's an image

41:22

of, it just Kool-Aid man's

41:24

through the wall. Here's an eagle. The eagle has

41:26

a hook. There's a man with a beard pulling

41:29

on the hook and a... Wait, what? What the heck is happening?

41:31

Oh, so they put this weird Celtic image

41:33

through an AI interpretive.

41:36

I don't know. Can you put an image in AI

41:38

and it will tell you what it is? Is that a thing you can do? I

41:41

don't know. Okay. What is going on with

41:43

the things I hired? I have no idea. I

41:45

wanted to see what would happen if I gave it this

41:47

input. You showed me a couple of things, and it looked like some...

41:51

I don't know. The short thing you showed me,

41:54

some of it looked okay. Some of it looked

41:56

like some kind of caterpillar Alice

41:58

in Wonderland drug trip or

42:00

something. Okay, that's fair. What I imagine drugs would

42:02

feel like? Okay, alright. I don't know. So

42:06

what are you gonna do with this? Oh, did

42:08

you waste your money? No, certainly

42:10

not. I'm learning ways not to do it. So

42:13

I'm starting at the lowest hanging fruit.

42:15

Okay, well what happens if we

42:18

just hire an affordable editing

42:20

company that you can just Google? Editing company,

42:22

edit my videos. Can they do this? No. They

42:25

can't come close to doing it. That won't work. Alright,

42:28

well what happens if I hire

42:30

some different individuals? They can't because of money? Why

42:32

can't an editing company edit the videos? Because

42:35

they don't know the material. I see. It's

42:38

not fair to ask somebody for whatever

42:41

amount of money they charge to

42:43

go and learn all of the intricate

42:45

details of the Desert Wandering and the Book

42:47

of Deuteronomy and the beginning

42:50

of the conquest of Canaan at

42:52

the beginning of the Book of Judges to make a 10-minute

42:54

video. That's not fair. They're just not

42:56

conversant. So the amount of time it takes them

42:59

to vet an image or a

43:01

painting or, okay, what archaeological

43:04

site is this? Is it tell this or

43:07

is it tell that? And so

43:10

the mistakes that have been made

43:12

in these edits, they're innocent.

43:14

They just demonstrate

43:16

a lack of knowledge of the material. What's the process?

43:19

Do you say, hey, that

43:21

right there is a, I don't know, that's

43:23

a ball of fire. I don't even know what I'm

43:25

talking about right now. Well, I can fill in your

43:27

question. Okay. So for example, I

43:29

worked on a video about Susa, one

43:32

that ultimately I ended up kind of landing

43:34

the plane on and getting to where it needed to be. But it

43:36

was fun. That's a city, right? I had a lot of people take a run at

43:38

this ancient Persian capital of Susa.

43:41

I'm sorry I haven't watched the video yet. You're just

43:43

fine. Yeah, okay. Good.

43:45

And yeah,

43:47

I got different things back stylistically from different

43:49

editors I put it in front of and I've

43:51

already published the video and it turned out I

43:53

think fine, it's informative, it works.

43:56

But

43:57

the returns that were happening from search

43:59

engines and from the AI

44:02

were just anachronistic and incorrect. So for example,

44:04

I make a remark at one point where I say Xerxes

44:08

sitting upon his throne in the

44:11

palace at Susa, thought to himself,

44:14

and the image that comes up is

44:16

like a clipart of Xerxes

44:19

sitting upon a portable throne overlooking

44:21

the Battle of Salamis in Greece.

44:24

Close, close. When he's

44:27

thinking, did like the statue of Rodin

44:29

come up, the thinker? That's the

44:31

kind of stuff that you get. Yeah. That's the kind

44:33

of stuff it does. It's every low-hanging

44:36

visual joke, and not even joke,

44:39

just association. So

44:41

what's the feedback loop? Okay, clearly it's wrong.

44:44

What do you do to correct it? Oh yeah. So anyways, I've

44:46

been using a program called Frame.io,

44:49

Frame.io, and the

44:51

interface is a lot like Adobe Premiere, which is what

44:53

you and I edit in. And

44:56

I just go through, and everywhere

44:58

that something's wrong, I have

45:00

to type a note. I can't go through

45:02

it verbally with these editors because

45:04

you're not allowed to talk with the editors ever.

45:07

No email, no Zoom calls. You

45:09

can't know their name. You just get

45:11

a first name that may or may not be true, a

45:14

picture that may or may not be them, and

45:16

everything is fed through a communication

45:19

channel with the American-based

45:21

project manager. Clearly,

45:24

they don't want their talent getting poached. I

45:26

don't blame them for that, but

45:28

it makes getting good edits super

45:31

hard because

45:33

you can't actually communicate.

45:35

So then I would mark the error

45:37

on the picture of Xerxes and be like, hey, this is great.

45:40

This is definitely King Xerxes, but

45:42

he's in Greece here, and

45:44

that's the wrong part of the planet. You know,

45:46

we need a different shot. And then

45:48

ultimately, I'd go into my archives

45:51

because I have an ever-growing database of pictures

45:53

I've taken in my travels, and

45:55

I would grab the really cool

45:58

relief of Xerxes from...

45:59

another old Persian capital, Persepolis,

46:35

and it's important to know, I mean you're

46:37

doing the Thomas Edison thing, I

46:39

made a thousand things that didn't work, you'll

46:42

have another tool in your toolbox, and so maybe

46:44

in the future, you're like, hey,

46:46

I have the simple thing, I just

46:48

need to cover the screen with XYZ, and then maybe it'll

46:50

give you a good first cut, and then

46:53

you can go in there, and it'll give

46:55

you ideas or something like that. Do you mind

46:57

me asking, how much does this cost for

46:59

a single like first pass edit? Do

47:02

you pay for a first pass of an edit? I've paid

47:05

a variety of amounts because I wanted to figure out... What

47:08

does what buy me? And it's

47:10

been painful. I intentionally set

47:12

aside a budget for this, the budget hurts,

47:14

but I'm trying to solve a

47:18

bigger equation than just get a couple videos published.

47:20

I've got an idea,

47:22

I have a mission, I have a plan for what I want

47:24

to accomplish with my YouTube channel moving forward, and

47:27

that was kind of a question mark for a while because

47:30

the podcast sort of took over everything

47:32

I was doing, and that's great. I

47:35

mean, it's great, it's so much fun. I'm glad people

47:37

like it, I like making it, but now this YouTube

47:39

channel, finally, I'm like,

47:41

all right, I know what I'm doing, I know

47:43

what needs to happen next, but I want to build

47:45

out a whole system for how I do it before I tear

47:47

off in this direction. So I set money aside, probably

47:50

the least I paid for

47:52

an edit was $300. Wow.

47:55

Which hurt? Yes. $300 is a lot

47:59

of money.

48:01

But in terms of sitting down and editing,

48:03

that's a lot of hours. Right.

48:07

Unless you super know the material, in which case

48:09

it's, oh, well, the right picture here is this, the

48:11

right picture here is that. But nobody knows that

48:13

material. Nobody studies that. The

48:16

most I paid for an edit was $1,500. You're

48:19

going out into unknown territory and you're

48:22

actually taking stock of the situation.

48:25

This was a huge investment.

48:27

Yeah. And I've been able to publish

48:29

almost nothing. Yeah.

48:32

Yeah. And next to what I have published,

48:34

I had to do the work on ultimately. Interesting.

48:37

I mean, do you get project files?

48:39

And... they still have not provided

48:41

me with those. I have asked. I haven't been

48:43

given those. So it makes it really hard to go and fix

48:45

their last few mistakes. Wow.

48:48

It's just a bad workflow. What that kind of stuff

48:50

is for is lowbrow, simplistic,

48:55

faceless, automated, crass.

48:58

Let's just try to get some money out of YouTube,

49:01

audience building channels where you make

49:03

a YouTube farm. For bot farms. 15

49:06

channels. Amazing things from

49:08

the internet today. This amazing

49:10

thing happened on the internet today. Yeah.

49:12

This man thought he could catch the cat, but

49:14

he couldn't. This man had to unclog

49:17

the drain. He did. Interesting.

49:19

That's reassuring actually. Like

49:22

this actually makes me feel good because

49:25

it makes, not that it's

49:27

hard to do what we do, but it makes

49:29

me feel good that for example,

49:32

my friends who might be older and watching

49:34

more things when they swipe on their phone, they

49:37

won't be tricked. It's harder to make

49:40

inauthentic material than I thought. Yes.

49:43

I thought you'd be able to plug

49:45

in words to a major politician's

49:47

mouth and have him or her say

49:49

something crazy and have it pass.

49:52

I mean, I know this can be done. It can be done.

49:55

I thought it would be far easier than this

49:57

or like narrating.

49:59

something. So it's

50:01

reassuring that it's not like pop, pop,

50:03

fizz, fizz. It's far more

50:05

hands-on. This is where I'd say it's at

50:07

right now. This isn't a very evergreen

50:09

observation because we know it'll be different in six months.

50:12

But right now, I think AI

50:14

language modeling is getting really bad, really

50:17

predictable, and people can smell

50:19

it. It feels sterile and

50:22

trite and safe and

50:25

institutional. Even when it tries

50:27

to sound human and friendly and colloquial, it

50:29

still sounds that way. There's just a

50:31

stink on AI writing. I think

50:33

it's bad, and I wouldn't be too worried about

50:36

that replacing anybody's jobs right now. It makes the value

50:38

of authenticity go up. It does. I

50:41

think that AI video looks

50:43

like total garbage right now. There's

50:46

a show that came out a while back

50:48

called Secret Invasion, another one of these Disney

50:51

Plus utter flops, just

50:53

total failure. I think that show went so

50:55

badly, they're going to reevaluate their whole Star

50:58

Wars streaming. I mean, it

51:00

was awful. But their

51:02

opening was entirely generated by AI, and

51:05

it's horrid. It feels

51:08

completely inhuman and

51:10

like a mockery of a human-making

51:13

art. It's

51:14

bad.

51:15

However, I will say that

51:18

AI stable visual art,

51:20

it's really getting somewhere. What do you mean, like a single

51:23

image? Yeah. Okay. Yeah. It's

51:25

getting somewhere. It's the tells

51:27

and the things that used to give it away. As

51:30

we get into these new versions of Midjourney

51:32

in particular, which is what everybody's using, it's

51:35

getting good. I mean, you can

51:37

go in and specifically

51:40

circle and edit things you didn't like

51:42

in-app now. So if everything's

51:45

perfect... Do you pay for Midjourney? I've never used it.

51:47

I do. Yeah. Yeah. It's 80 bucks, 80

51:50

bucks a month. Wow. So I'm not going to pay for it forever.

51:53

But I'm using it really hard right now. I'm

51:55

trying to build out something that I'm going to use in a whole

51:58

series of videos that are coming up. There's

52:00

just a few pieces of specific art,

52:03

avatars, characters that I'm trying to build.

52:05

I'm really deep into the Midjourney world, learning

52:08

how to seed it with images, so that

52:10

rather than just randomly generating characters

52:13

that don't look the same, I want the same character from

52:15

image to image to image. I'm using

52:18

Adobe's generative AI

52:20

to select that, separate it from the background,

52:23

make any final augmentations. I'm

52:26

using AI a lot right now for image

52:29

art and photo images,

52:32

and I'm even figuring out how to use AI

52:34

tools to improve garbage

52:37

old photos of Nineveh before ISIS

52:39

destroyed it. The pictures just aren't any good

52:41

that we have of Nineveh because people took pictures

52:44

on potatoes of this thing when it emerged

52:46

from the sand, and then ISIS came

52:48

along. It never got photographed well,

52:51

even though it was out of the dirt during the age

52:53

of photography. Well, there

52:55

are ways to recapture that and reestablish

52:58

it and make it look like people took great photos

53:00

of it. I've been working on things like that

53:02

for future videos as well. I

53:05

think that's trending in a really impressive

53:07

direction in terms of what it could be. That also bothers

53:09

me about the SR-71 Blackbird. You

53:12

know the Blackbird, the coolest airplane ever? Yeah, the Cobra

53:15

Night Raven. The

53:17

thing about the Blackbird is we had

53:19

this period where we went from film to

53:22

digital, and the first few years

53:24

of digital were just trash. Hot

53:27

trash. DV, progressive. I

53:30

mean, what was the scan where

53:32

it interlaced? That was it. Interlaced

53:34

video is like, why are we doing this? A

53:37

lot of that was when the SR-71

53:39

was flying, and so we didn't have really,

53:41

really, really good stuff from there.

53:44

Some of the best stuff is old film photos from

53:46

there. So anyway, I'm sorry. I just wanted

53:48

to lament that for a second. No, I mean, you're

53:50

on the same page. We went right to a perfectly logical

53:53

place. In your field,

53:54

that's something that just, because of the unfolding

53:56

of history, we just don't have a good look at.

53:59

Same thing for me.

54:00

with Nineveh for example.

54:03

So I think there's a ton of potential in all

54:05

of this. It's just, I

54:08

needed this by the way. Okay. I

54:10

needed to understand because the

54:12

last time we talked about AI I was like, it's coming

54:15

for us. I hate this. But

54:18

this is actually encouraging. Okay.

54:20

It's encouraging that people can sniff it out.

54:23

It's encouraging. So got

54:25

it. Still photos, need to be careful

54:28

what I'm looking at. However, the

54:30

video stuff and the writing stuff, that's

54:33

really, really good because it

54:35

lets me know that people aren't

54:37

going to be fooled quite to the degree that

54:39

I thought. And so this is encouraging. And I appreciate

54:42

the little, I don't know what you want to call this, this trek

54:44

into the unknown that you've done at your

54:46

great expense with money to figure

54:49

this out because it's, yeah, I

54:51

feel really good about it. So thanks. I'm

54:54

feeling better about where it's at too. I like

54:56

that the writing is failing. I

54:58

like that the visual images are getting better

55:01

and more controllable because I'm

55:03

seeing how it will advance knowledge

55:05

instead of destroy the concept of truth

55:08

and knowledge. I'm excited about where these iterations

55:10

are headed. And I'm also excited about

55:12

what it means for creativity and what

55:15

people do. I think there's going

55:17

to be more interest than ever in actual

55:19

human effort that goes into making

55:22

something beautiful, composing music,

55:24

trying to understand a thing and

55:27

understand it so well that you can spread it on the mental

55:29

desk in front of you and draw connections

55:31

between disparate things to find

55:34

meaning. Knowledge is great. Everybody

55:36

loves it. But meaning is

55:39

what we seem to be wired for. What does

55:41

it mean? Why is it like this? What

55:43

is the underlying truth of these physical

55:46

realities, these metaphysical

55:48

realities that we encounter in life? And

55:51

that's where the human thing happens. And

55:53

I think there's going to be more hunger for that than ever

55:55

as cheap AI things do

55:57

the busy work of creativity. Those

56:00

who can still spread a series of disparate

56:03

things out on the table and make the connections. I think

56:06

there's job security in that and I think there's beauty

56:09

in that. When's the last time you watched

56:11

Inception, the movie?

56:14

Three weeks ago. Really? That's about how

56:16

long it's been since I watched it. Yeah, we watched it ahead of

56:19

Oppenheimer. So you know the

56:21

point in the movie where they had a totem and

56:24

the totem was the thing that would tell you if you were

56:26

in the real world or in a dream. Yes. It's

56:28

like AI doesn't have

56:30

the ability to cross that boundary and that

56:32

threshold. There's always something there

56:36

that lets us know that it's authentically

56:38

human and that thing that you're talking about, that metaphysical

56:41

place that the human mind occupies, I don't know,

56:43

I said a big word, I don't even know what metaphysical means, but

56:46

that place that the human mind occupies, it's

56:48

like AI hasn't

56:50

figured out how to cross that completely

56:53

yet and that makes me feel good. Yeah,

56:56

I think it takes a soul. I don't think it's getting there.

56:59

I think the greater fear for

57:01

me is not that it will cross

57:04

the line and become indistinguishable from humans.

57:07

It is that the majority of humanity will

57:09

regress back the other way across

57:12

the line and therefore become indistinguishable

57:14

from AI. So

57:17

more Brave New World, less Orwell,

57:20

or it's the book Amusing

57:22

Ourselves to Death, Neil Postman. But

57:25

that's another conversation. It is.

57:28

Bottom line on this, I eliminated

57:31

a whole bunch of options for how I want to

57:33

do this next big series of videos on

57:35

my channel and now

57:37

I'm going deeper down the well with different

57:40

strategies for working with humans.

57:42

Is it more important to have a great

57:44

editor who I can teach things about ancient

57:46

history or is it more important to have a great

57:49

ancient historian who I can teach things about

57:51

editing? It's kind of the Armageddon question.

57:53

Do you want the astronaut or do you want the explosive

57:55

miner expert to deal with the asteroid?

58:00

I love it. Thanks for the update.

58:30

Thank you.
