How AI copyright lawsuits could make the whole industry go extinct

Released Thursday, 15th February 2024

Episode Transcript

Transcripts are displayed as originally observed. Some content, including advertisements, may have changed.

0:01

Support for Decoder comes from SAP Business

0:03

AI. Sure, we've all had

0:05

fun messing around with AI image generators and

0:07

conversation bots, but AI is more than

0:10

a novelty. Businesses around

0:12

the world have found ways to harness

0:14

its potential, like spotting inventory shortages before

0:16

they happen, or supporting supply chain management.

0:19

And it's very possible that your business could

0:21

benefit from AI integration too. Unlock

0:23

the potential of AI and discover even

0:26

more possibilities with SAP Business AI. Your

0:29

technology, real world results. That's

0:32

SAP Business AI. Learn

0:34

more at sap.com/AI. Hello,

0:41

and welcome to Decoder. I'm Nilay Patel, editor-in-chief of

0:43

The Verge, and Decoder is my show about big

0:45

ideas and other problems. We're

0:47

doing two Decoders a week now. On Mondays,

0:49

we're going to have our regular interviews, but

0:51

our new Thursday episodes, like this one, are

0:54

all about deep dives into big topics in

0:56

the news. And for the next few

0:58

weeks, we're going to stay focused on one of the

1:00

biggest topics of all, generative AI.

1:03

There's a lot going on in the world

1:05

of generative AI, and maybe the biggest thing

1:07

going is the increasing number of copyright lawsuits

1:09

being filed against AI companies like

1:12

OpenAI and Stability AI. So,

1:14

for this episode, we brought on Verge features

1:16

editor Sarah Jeong, who is a former lawyer

1:18

just like me, and we're going to talk

1:20

about those cases. And the main defense the

1:22

AI companies are relying on, an idea called

1:24

fair use. Let's back

1:26

up a sec. All the big generative AI

1:28

models from every company are trained on huge

1:30

swaths of data that are scraped from the

1:32

entire internet. And big media

1:34

companies like the New York Times and

1:36

Getty Images have filed copyright lawsuits against

1:38

those AI companies, saying that basically, they've

1:40

stolen their work and are profiting from

1:43

it. A claim that amounts

1:45

to straightforward copyright infringement. I made something,

1:47

you made a copy without my permission,

1:49

that's copyright infringement. If there's

1:51

one thing to know about copyright law, it's

1:54

that it's still very much rooted in

1:56

the idea of making copies, and regulating

1:58

which copies are legal and which

2:00

aren't. And since computers can't do

2:03

anything at all without making copies, copyright

2:05

law shows up again and again in

2:07

the history of computing, and especially the

2:10

history of the internet, which allows anyone

2:12

to make and distribute perfect copies faster

2:14

than ever before. But there's a

2:16

check on all of the control that

2:18

copyright law provides. Fair use. Fair

2:21

use is written right into the Copyright Act,

2:23

and it says that certain kinds of copies

2:25

are okay. You can quote things. You can

2:28

quote books in commentary about books. You can

2:30

run clips of movies in video criticism of

2:32

movies. And you can make copies of articles

2:34

to share in a classroom. There's a long

2:36

list of these things in the Copyright Act,

2:39

but since the law can't list or predict

2:41

everything that people might want to do, it

2:44

also has a four-factor test written into it

2:46

that courts can use to determine if a

2:48

copy is fair use or not. But

2:51

here's the thing about the legal system

2:53

in general, and fair use specifically, it

2:56

is not deterministic or predictable. I

2:59

know we have a lot of engineers and product

3:01

managers in the Decoder audience, and it's tempting to

3:03

think about the legal system like a computer that

3:05

you put in some inputs, and you can predictably

3:07

get some outputs. But that's not

3:09

how it works at all. And it is

3:11

especially not how fair use works. Every

3:13

court gets to run that four-factor fair

3:16

use test any way they want. And

3:18

one court's fair use determination isn't actually

3:20

precedent for the next court. That

3:23

means fair use is a very vibes-based

3:25

situation. It's anyone's guess how a lot

3:27

of copyright lawsuits are going to go.

3:30

Many of them feel like a coin flip. And

3:32

when you add in the amount of hype,

3:34

uncertainty, and money that comes with AI, it

3:37

gets even more complicated. So I

3:39

wanted Sarah to come on and help me explain what's

3:41

going on to everyone. Sarah is one

3:43

of my very favorite people to talk to about copyright law.

3:45

I promise you, we didn't get totally off the rails and

3:48

ranting about it. But we went a little

3:50

off the rails. But we had to

3:52

start at the start. The first thing we had

3:54

to figure out was how big a deal are

3:56

these AI copyright lawsuits? I feel

3:58

like there's sort of a... potential extinction level

4:00

event on the horizon. It's pretty weird because

4:03

all the lawyers seem to think so and

4:05

for whatever reason, like the CEOs don't seem

4:07

to think so. My read when

4:09

I talk to the CEOs is that they think

4:11

this is a money problem. That something's going to

4:13

happen and their general counsels and their

4:16

policy people are going to walk through

4:18

some court cases and maybe get some

4:20

policy changes passed in Congress and

4:22

they'll have to pay some money, but it will be fine.

4:24

And in the end, the money is always fine. You

4:27

are an extinction level event. I'm feeling

4:30

like the noise I'm hearing indicates

4:33

extinction level event. Why

4:35

do you think it's that bad? I

4:38

mean, like we lived through Napster,

4:40

right? Which

4:42

is weird because the CEOs also lived

4:44

through Napster, but maybe

4:46

they didn't like maybe they're from another universe.

4:49

But yeah, like it's the level of

4:51

copyright direness in these cases, the

4:54

effects on existing industries plus

4:57

how applicable law is lining up. It's

4:59

got Napster vibes to it. And

5:02

when Napster happened to the law, entire

5:04

companies went bust, entire industries went bust,

5:06

copyright changed forever in a way that

5:09

was not great. It was

5:11

an extinction level event and AI

5:13

has a similar thing going on there.

5:16

They got sued, and that went all the way to the Supreme Court.

5:18

The Supreme Court made some changes

5:20

to copyright law in that case. When

5:23

most people think about Napster, I'm pretty sure they

5:25

think about Justin Timberlake playing Sean Parker in the

5:27

movie The Social Network. And they

5:29

might think about the idea that a

5:32

company that just quote unquote facilitates piracy

5:34

is a bad idea.

5:36

They do not think about the Supreme

5:38

Court eventually issuing changes to copyright

5:40

law wholesale that we now live inside. So

5:42

explain what you mean there. Quickly give people

5:44

the capsule summary of Napster and Grokster and

5:47

what happened to the law in those cases.

5:49

Yeah, we exit one era where we

5:51

had just sort of softened fair use

5:54

so that it was okay for people

5:56

to use their VCR setups to record

5:58

off the television. So that was the

6:00

era we were exiting, where it was like, okay, so

6:02

like, there are these new technologies, and people are

6:04

going to use them for themselves in

6:06

these, like, you know, pretty benign ways,

6:08

and that's okay. And copyright doesn't

6:10

have to restrict that. And

6:12

then we enter into sort of the Napster era where they go,

6:15

everyone can be a pirate now. And

6:17

that's not good. It could destroy this

6:20

industry. So now we have to change

6:23

copyright law in a way that we've never seen

6:25

before. That's

6:28

a really good example of something I want to

6:30

hammer on as we cover the AI companies. The

6:32

concept of fair use was enshrined into federal law

6:34

in the Copyright Act of 1976. It's

6:37

almost 20 years before the consumer

6:39

internet came along. So when

6:42

digital culture hit and companies like Napster arrived on the

6:44

scene, we had no idea what was going to happen.

6:47

No one in 1976 could predict Napster. And

6:49

the record labels and Napster had to go

6:51

to court to figure out if Napster was

6:53

legal at all. Turns out it wasn't. Napster

6:55

basically met its end in 2001 when a

6:57

federal appeals court upheld a ruling

7:00

that determined Napster by facilitating the

7:02

copyright infringement of its users was

7:04

also liable for copyright infringement. A

7:07

few years later, another peer-to-peer file sharing network

7:09

called Grokster went all the way to the

7:11

Supreme Court with a very similar lawsuit. Grokster

7:14

was a different company and the same

7:16

federal court that had shut down Napster

7:18

said Grokster was not infringing on copyright

7:21

law. But because Napster was the forerunner

7:23

of a whole bunch of file swapping

7:25

platforms ending in -ster, Grokster ended up

7:27

being painted with the same brush. And

7:29

that left the Supreme Court to make

7:31

a big decision that had major ramifications

7:33

on everything that's happened online in the

7:35

last 20 years. Ultimately,

7:37

the court said if you market a

7:39

tool for people specifically to do copyright

7:42

infringement with, you are liable for the

7:44

copyright infringement that happens as a result.

7:47

That is a judicial construction. That idea had to be

7:49

invented. And there was a lot of disagreement about it

7:51

at the time. You can go into the history and

7:53

drama of that case, the cases that came before it

7:55

and the cases that came after it. But the point

7:58

I want to make is that no one knew

8:00

what was going to happen. The Supreme

8:02

Court had the power to effectively

8:04

create or destroy a company and

8:06

an entire industry based on its

8:08

understanding of copyright law at that

8:10

time. And at that moment,

8:12

they said, this is illegal and those

8:14

companies basically disappeared. And that is the

8:17

extinction level event that Sarah is describing.

8:19

The justices might say, well, actually all

8:21

this is illegal. And then the entire

8:23

AI industry might disappear. So

8:30

now we come to the AI companies, which are also making a bunch

8:32

of copies. And the

8:34

argument that is getting made

8:36

everywhere is this is something called fair

8:38

use. Yep, we acknowledge that we've made

8:40

the copies, but we've done them

8:42

in a way that makes it OK because of something called

8:44

fair use. Can you quickly explain what fair use is? So

8:47

fair use is the escape valve for

8:49

copyright because it is wild for

8:52

the law to restrict other

8:54

people's speech based on whether or

8:56

not you've published it in a book or set

8:58

it on a tape or whatever. And

9:00

so you have these four factors in

9:03

the law. You can look them up, 17 USC 107. They're

9:06

like intertwined. There's not really like a clear

9:08

logic behind how the four are like lined

9:10

up. And in fact, if you go and

9:13

look at the cases where fair use is

9:15

implicated, you can see the factors being weighed

9:17

very differently per case. You don't even need

9:19

to meet all four depending on the

9:21

use. It's not super

9:23

clear. It is meant to

9:25

be very flexible because speech is important

9:28

and you want to have a really

9:30

flexible escape valve. But that

9:32

also means it's not super

9:35

predictive in cases of new

9:37

technologies. So AI companies are

9:39

getting sued. The New York Times, for example,

9:41

has sued OpenAI. And the New York Times' complaint is

9:44

very compelling because it has all of these

9:46

examples where OpenAI will just spit out word

9:49

for word recitations of New York Times

9:51

articles. They've obviously copied the information. OpenAI's

9:54

response is, yep, we've acknowledged that we've

9:56

made these copies. But that copying is

9:58

fair use. We're allowed to do it

10:01

and we're going to show it to you by

10:03

going through the four factors. So the first factor

10:05

is purpose and character of the use. What

10:07

does that mean? And how do you think it applies in this case? So

10:10

what's the difference between for instance a middle

10:12

schooler opening up the New York Times

10:14

and like quoting the

10:16

New York Times in their book report about soybeans?

10:21

Right? Like, it's

10:23

a source. So why not

10:25

go to the New York Times to

10:27

like get the definitive information on some

10:29

news story? People are also going to

10:31

OpenAI against the advice of their

10:33

teachers to like copy paste

10:35

a paragraph about soybeans for their book

10:37

report. Right. It's like the purpose and

10:39

character of the use including whether such

10:41

use is of commercial nature or

10:44

for nonprofit educational purposes. Yes,

10:46

OpenAI is ultimately going to make money

10:48

off of this thing. They charge you

10:50

for GPT-4 right now, right? Right, you pay

10:53

them a subscription and you get access to

10:55

someone else's information. That

10:57

seems tough, right? It is tough. The

11:00

big missing word in this that's sort

11:02

of been added over time through

11:05

the courts actually is that

11:07

they're looking for transformative use.

11:10

That's just something that's evolved over

11:13

the years. If a work

11:15

is transformed by the copying,

11:19

There's like a stronger argument to be made

11:21

that it was a fair use. So

11:24

you've got, like, you know, a parody. You've

11:26

got mashups. That's like a classic one if

11:28

you're like doing a YouTube

11:30

like clap back and you like have

11:33

a little clip of the person you're

11:35

clapping back like that's a transformative

11:37

use. But you

11:39

can kind of tell from all of those easy

11:41

examples I used, you can definitely think of a

11:43

time someone got in trouble with

11:46

copyright law for doing exactly that. Which

11:48

goes again to like fair use is kind

11:51

of a funky thing where like because

11:53

it's case-by-case. Even if it seems

11:55

like it's easy, you can still get in

11:57

trouble. You might win in the end, but you'll still

12:00

get in trouble. If you get into like a

12:02

much more difficult scenario like OpenAI,

12:04

something that has never been to court

12:06

period, you're up-leveling

12:09

the difficulty to another place.

12:11

And I feel like in the case of

12:13

OpenAI, whether or not

12:16

copying all the information on the internet so that

12:18

a robot can spit it back out at you

12:20

in slightly different formats, whether

12:22

or not that's transformative is wildly up in the

12:24

air. In some cases, it

12:26

clearly is transforming. But

12:29

the New York Times has all those examples where

12:31

the robot just spits it back verbatim.

12:33

It's not transforming. And so you can

12:35

sort of see like the New

12:37

York Times is trying to preempt that

12:39

transformativeness debate. They're like, yeah,

12:41

like, if it's spitting it out

12:43

verbatim, how much transforming is actually going

12:45

on in here? You get kind of like

12:48

an almost circular argument there

12:50

where it's like if it's not doing

12:52

it verbatim in some cases, then

12:56

when it is doing it verbatim, it's still

12:58

transformative because that like whatever internal

13:00

guts are happening in there, like it's

13:02

like clearly changing things just because we

13:04

got like a one off

13:06

like verbatim quote, surely

13:08

that means it's like it's

13:11

still okay. It's a weird one. Yeah,

13:13

I will just point out to the audience.

13:16

We're already in the middle of like a

13:18

deeply existential debate on the very nature of

13:20

how AI systems work, and whether

13:22

they're transforming the source text. We're

13:24

at the first factor. We haven't

13:26

even made it out of the first one. There's three more to

13:28

go. And they're all like this. And some of them

13:30

are even wonkier. This is why, when

13:33

I say it's a coin flip, this is what I

13:35

mean. Like I feel like Sarah and I could just

13:37

sit here debating whether or not AI is transformative for

13:39

the rest of the show and not

13:41

reach a conclusion. And it's not us who's

13:43

deciding it in the end. It's a bunch

13:45

of judges. And I don't know what they're gonna think. We

13:49

have to take a quick break. When we come back, Sarah

13:51

and I will start diving into the other three factors in

13:53

a fair use case.

14:01

Corporate A Coder comes from S A P

14:03

Business Ai It's all over the internet. A

14:05

I dare say I that your friend is

14:08

turning his cat into a Monet painting. your

14:10

coworker used a chap ah to read solid

14:12

about Pancakes A eyes in the stuff of

14:14

science fiction anymore, but it's also more than

14:16

the gimmicks we see on a day to

14:18

day basis. If your business owner A I

14:20

can offer real solutions to help you scale

14:23

an enemy, it might be time to check

14:25

out S A P Business Ai Sep Business

14:27

A I can help you automate repetitive tasks,

14:29

optimize inventory management and supply chain analysis. And

14:32

identify opportunities for growth in your

14:34

operations. Sep Business A I can

14:36

help you with finance, sales, marketing,

14:38

human resources, procurement, supply chains, and

14:40

so much more. Like guarding against

14:43

fraud with A I assisted anomaly

14:45

detection or receive data driven prescriptive

14:47

guidance A critical decision points. They

14:49

even have a generative Ai road

14:51

map to help you discover upcoming

14:54

and cutting edge innovations for your

14:56

business. Who knows when innovations are

14:58

around the corner? Revolutionary technology, real

15:00

world results. That s a P

15:02

Business A I learn more it as

15:04

a P.com/a I. Was

15:12

back. We were just talking about factor one of

15:14

the fair use test, which is purpose and character of

15:16

use, and we talked about the idea that some

15:18

copies are transformative. They take the original work and

15:21

turn it into something else. That

15:23

is majorly up for debate in the

15:25

case of a generative AI system.

15:27

The

15:29

second fair use factor is the nature of

15:32

the work. Is it widely available? Is it

15:34

a secret you had stashed away? Is it the

15:36

news everyone else has? How

15:38

does nature of the work play in

15:40

these cases? Some things are

15:42

considered, like, a little more in

15:45

the purview of copyright than

15:47

others. So, like, you know,

15:49

art, plays, anything you

15:51

know, creative works. That's much

15:54

more, like, valuable than,

15:56

say, the sequence,

15:58

structure, and organization of

16:00

Java, the language, right?

16:03

Which is, by the way, copyrightable. But

16:05

it's like kind of

16:07

a little bit less copyrightable, like on the

16:09

down low, than the other stuff. Because it's

16:11

weird. It's just a weird thing that you

16:13

don't really want to be like, oh yeah,

16:15

this is clearly what the founding fathers

16:18

wanted us to protect with copyright

16:20

law. I

16:22

always think of nature of the work as how

16:25

the judge feels about it. Like

16:28

it's just up for whatever it is. It's like,

16:30

how do I feel about the song Pretty Woman?

16:32

Is it important that I'm going to turn up

16:34

the dial on this one and say you've got

16:36

to overcome it a lot? Is

16:38

it a list of APIs in a spreadsheet in

16:40

order? I'll turn the dial down a little bit

16:42

because that seems pretty silly. And that, as

16:45

far as I can tell, is how this

16:47

one gets assessed. It feels

16:49

like the entire contents of the New

16:51

York Times archive might get the, I

16:53

feel like this is pretty important, weighting.

16:55

It's hard to know. I think the dial

16:57

is kind of in the middle on this one because on the one

16:59

hand, it's the kind of

17:02

creativity that you want to incentivize. But on the other

17:04

hand, it's full of facts. And

17:06

when it comes to facts and

17:08

history, facts themselves aren't copyrightable.

17:11

And you don't want to put a big fence

17:13

around the first draft of history, essentially. So

17:15

I think it does end up sort of

17:17

in the middle there. The part

17:19

of the nature of the work factor that I think is

17:21

also a bit of a coin flip here is

17:24

how widely available something is. So the

17:26

New York Times thinks of

17:28

itself as the news. It has

17:30

a very high self regard. And it thinks of itself

17:33

as pervasive. And it is the first draft of history

17:35

in all the ways the Times thinks about itself. It

17:37

is also famously behind a paywall. It's a thing you

17:39

have to pay for. You have to pay a lot

17:41

of money for it. They can mail you pieces of

17:43

paper, which is a very interesting way of receiving the

17:46

news. There's a tension there. The

17:48

tension is in the complaint. They're saying,

17:50

look, ChatGPT lets people get past the paywall. It's

17:53

the thing you have to pay for. And you can pay someone else to

17:55

get it for free. The argument is you're

17:57

reducing the market for the New York Times. How do you think those two

17:59

things square? Yeah, like ChatGPT,

18:02

letting you get around the

18:04

New York Times paywall, and the

18:06

New York Times being paywalled in the

18:08

first place. That does

18:11

seem to run in the

18:13

favor of the New York Times and against

18:15

open AI. Yeah. And again,

18:18

I will tell the audience, you can see

18:20

how complicated this is from the jump. This

18:22

is not, there's nothing deterministic about

18:24

this analysis. At every point, you can have

18:27

a pretty existential argument. But to

18:29

me, it's the nature of the work, it is

18:31

the least deterministic. Because if the judge decides that

18:33

they don't like the New York Times that day,

18:36

they can just turn the knob down and

18:38

say something like the news is the news. It's

18:41

not a poem. Everyone

18:44

has the news. OpenAI didn't steal it from

18:46

the New York Times, they would take it from the AP. And it's the same

18:48

news. And like, that is a

18:50

thing that they could logically say here to

18:52

devalue this factor. Yes, they

18:55

could say that. We're

18:58

about to get into the third and fourth fair use factors.

19:00

And one thing we're going to talk a lot about

19:02

is Google, specifically Google Books. 20 years

19:04

ago, in 2004, Google said it wanted to

19:06

scan and make searchable all of the books

19:09

in some research libraries. The

19:11

copyright holders, authors and publishers said, No,

19:13

you can't do that. And they filed

19:15

copyright lawsuits. It took nearly

19:17

a decade. The case was

19:20

finally resolved in Google's favor in 2015.

19:22

The federal appeals court held that turning

19:24

books into search snippets was fundamentally transformative

19:26

of the original work. That's factor

19:28

one, and said that even though Google

19:30

has scraped the entirety of the books

19:33

and made a profit from offering its

19:35

services, it is not hurting the

19:37

market for the original books. That's

19:39

where factors three and four come in. Factor

19:44

three in the analysis is how much of the

19:46

work was used. This one

19:49

might be the most obvious one on its face, which

19:51

is, if it's all of it, it's bad. If it's as

19:53

little as possible, it's good. Is there any nuance to

19:55

this one? I actually do think there's a little bit

19:57

of nuance to this one. This is where the Google

20:00

Books cases, I think, sort of cut

20:02

in favor of OpenAI where like, yes,

20:04

they're taking 100% of the

20:07

New York Times, like as much of the New York Times as

20:09

they can get. Yes. But like, what it's

20:12

spitting out is small

20:15

in comparison to what they're taking. And

20:17

so like, the fact that they've taken everything

20:20

is sort of minimized in the Google

20:22

Books cases, for instance, where they're like,

20:25

yeah, they had to read

20:27

everything. But because a

20:29

robot was doing all of the reading, that

20:32

means a lot less. That

20:35

makes sense. Like the human literate

20:38

part of the copying

20:40

is what's important here. That's sort

20:42

of, I think, the upshot

20:44

of like those cases. And

20:46

so here, I think it's

20:48

a little bit more mitigated for OpenAI.

20:52

So even though OpenAI has taken everything the fact

20:54

that it's literally really everything, the

20:56

fact that it's not sort of immediately available,

21:00

kind of diminishes this factor. Yes.

21:02

That's fascinating. I honestly wonder

21:05

if that argument will be

21:07

made well, and if the judges will accept it

21:09

well, given that the main thing

21:11

that AI companies have to do with this data is

21:13

train on it, right? Google Books is like we made

21:15

an index of all the books, and

21:18

we can show you parts of the index and then kick

21:20

you out to buying a book. OpenAI

21:22

is like, we copied everything, we trained

21:24

a model on it. And

21:26

we might have even thrown away the database of

21:28

copies that we made. And now the model can

21:30

just go converse. And

21:32

it's like you had to take all

21:35

of the stuff to train the model.

21:37

And that just seems like

21:39

very complicated to me, because there's

21:41

a little bit of technical nuance

21:44

to how AI models work, that

21:46

requires all the stuff, even if all the stuff isn't

21:48

out in the world, or

21:50

exposed to the user. I mean, that's also

21:52

how search engines work, right? Like, it's like you have to

21:54

have all the stuff in order to be able to search

21:56

it. I mean, the technology is subtly different,

21:59

how the technology is being used is subtly different. The

22:02

other part of it that no one wants

22:04

to like really even think about or hear

22:06

about is that the Google Books cases, we

22:09

don't have a Supreme Court decision out of

22:11

them. Like the important bits come out of

22:13

the appellate courts and it's been

22:15

almost 10 years.

22:17

So we've seen a shift

22:20

in how copyright, especially fair use is

22:23

being addressed by the Supreme Court. It's a completely different

22:25

court. Like we could get something

22:28

very strange out of this that

22:30

is unexpected. Let's talk

22:32

about the fourth factor here, which feels like

22:34

the most important one in this

22:37

case and often feels like the most important

22:39

one in any fair use case, the effect

22:41

of your copy on the market for the

22:43

original work. I make a song, it samples

22:46

your song. The sample

22:48

is de minimis in some way, it's just

22:50

like background noise. My song is really popular.

22:53

That doesn't mean people are going to listen to my song

22:55

instead of listening to the original. And in fact, it

22:58

might mean that people love the sample so much that

23:01

sales of the original song go up. So

23:03

you've had some impacts on the market. It could

23:05

be positive or negative or nothing. And we're going

23:07

to try to figure that out. And if it's

23:09

positive or nothing, maybe that use

23:11

is fair. If I have taken so much

23:13

of your stuff and replaced your work

23:15

with it and your market goes down and

23:17

your sales go down, that's negative and that's

23:20

going to cut against fair use. I

23:23

don't know how to evaluate that in the case of AI

23:25

at all. It feels like

23:27

they're erasing the market for all human

23:29

generated content in the world. But

23:32

then I use the tools and I'm like, I

23:34

think I think I still have something to say here. I

23:38

mean, I actually think that the fourth

23:40

factor is really, really

23:42

against open AI. And

23:44

I think that it's because of the

23:46

Warhol case. So we have this case

23:48

where you get like an Andy Warhol

23:51

style portrait of Prince, or

23:54

you've got the famous Marilyn Monroe thing

23:57

where it's like the cutouts and

23:59

the choppy print thing. So

24:01

a magazine thinks about

24:03

putting a picture like a photo of

24:05

Prince on their cover, and they're like, no,

24:07

everyone's going to put a photo of Prince

24:09

on their cover after he died. Let's

24:12

get an Andy Warhol style thing instead

24:15

from the Andy Warhol Foundation. So

24:17

they like make a portrait of

24:20

Prince in the

24:22

style of Andy Warhol. And

24:24

the base that they use is a

24:26

photograph that someone else took. They

24:29

do not license the photograph. And

24:32

the person who took the photograph

24:34

is the kind of person whose

24:36

photographs get licensed to

24:38

be put on the cover of magazines. And

24:40

the court basically just goes, look, you like snapped

24:43

up an opportunity that this person theoretically had.

24:45

And the Warhol Foundation is like, no, this

24:47

is like, it's not the thing. They didn't

24:49

want a photograph. They wanted Andy Warhol. They're

24:51

like, yeah, but it's like the same market.

24:53

It's about the same. Right. I

24:55

don't think that there's been a Supreme

24:57

Court case that emphasize Factor 4 that

25:00

heavily before. And it's,

25:03

I think, like sort of a

25:05

warning shot, actually, for these new

25:07

technologies. I don't know if they had the new

25:10

technologies in mind. But like, definitely, this is

25:12

not something we've seen out of courts

25:14

before. Is that heavy of an emphasis

25:16

on Factor 4? One thing that's

25:18

interesting about that note on Factor 4, which is

25:20

the economic factor, you could call it, is that

25:22

in the time since you and I graduated from

25:24

law school, and now there is

25:26

a movement called law and economics in

25:29

the law that really emphasizes these ideas.

25:31

Like the law should be measurable. We

25:33

can apply economic thinking to it. That

25:36

was not so much in vogue when

25:38

the Napster cases were getting decided, the Grokster cases

25:40

were getting decided. And so now you have this

25:42

other fair use thing where it's like, is a

25:45

painting replacing the market for a photo? And

25:47

the judges are like, we can do some economic

25:49

thinking here. And we're going to prove it by

25:51

saying, yes, there is a market for

25:54

depictions of Prince. And

25:56

this painting can serve that market as well as

25:58

the photograph, which sounds... But

26:02

I think in the case of OpenAI,

26:04

the economics of it actually become pretty

26:07

direct. There is a

26:09

market for information or writing or books

26:12

that this robot for

26:14

20 bucks a month can

26:16

just substitute for

26:18

users all of the other kinds of products that they

26:20

might otherwise buy. Again,

26:23

I think there's complexity

26:25

here because I actually don't think the

26:28

GPTs right now, the GPT-4,

26:31

can actually do the work.

26:33

It's not quite good enough. So

26:35

you have to pull your mind ahead to where it

26:37

obviously will be good enough in the future. But

26:41

right now, I wonder if the difference between

26:43

what it can do right now and what

26:45

it might do will weigh into this analysis.

26:48

You're putting OpenAI in the position of going like, our shit's

26:50

not that good. So you

26:52

can't sue us because our... I mean, any big

26:54

company will argue that it is a piece of

26:56

crap if that is legally advantageous. I

26:59

mean, yeah, but that's

27:02

where they're going to have to go in order

27:04

to make Factor 4 work with them, right? It's just

27:06

like, oh, yeah, we're not that great. We actually suck

27:09

and we're always going to suck. Like,

27:11

therefore, we will never impact the

27:14

commercial value of this work. I actually don't

27:16

know if they're going to be willing to go that far because

27:18

it's a bit much. I straight up

27:20

think that they are going to have to really

27:22

minimize Factor 4 as much as they can and

27:24

just talk around it and try to really push

27:27

their case on the other factors. I think Factor

27:29

4 is like, that's a

27:31

rough one for them. I think it's especially

27:33

rough given that the

27:35

most recent fair use case we have out of SCOTUS

27:37

is a Factor 4 case. I

27:40

think that might be the biggest sign to me that

27:42

we're headed towards an extinction level event. We

27:46

have to take another quick break. We'll be right back. Thank

27:58

you. AI.

28:01

It's all over the internet. AI this, AI

28:03

that, your friend is turning his cat into

28:05

a Monet painting, your co-worker used a chatbot

28:07

to write a song about pancakes. AI

28:10

isn't the stuff of science fiction anymore, but it's

28:12

also more than the gimmicks we see on a

28:14

day-to-day basis. If you're a business

28:16

owner, AI can offer real solutions to help you

28:18

scale and innovate. It might be time to check

28:21

out SAP Business AI. SAP

28:23

Business AI can help you automate

28:25

repetitive tasks, optimize inventory management and

28:27

supply chain analysis, and

28:29

identify opportunities for growth in your operations.

28:32

SAP Business AI can help you

28:34

with finance, sales, marketing, human resources,

28:36

procurement, supply chain, and so much

28:39

more. Like guarding against fraud

28:41

with AI-assisted anomaly detection, or receiving

28:43

data-driven prescriptive guidance at critical decision

28:46

points. They even have a

28:48

generative AI roadmap to help you discover

28:50

upcoming and cutting-edge innovations for your business.

28:53

Who knows what innovations are around the corner? Revolutionary

28:56

technology, real-world results. That's SAP

28:58

Business AI. Learn more

29:01

at sap.com/AI. We're

29:09

back. I'm talking about fair use with Verge Features

29:11

Editor, Sarah Jeong. We've talked about

29:13

what the four factors in a fair use case are,

29:15

but that leaves us with a really big question. How

29:18

is this all gonna play out for the AI companies?

29:22

I often use the Blurred Lines case. Oh

29:24

no. An example of how much of a

29:26

coin flip all of this is. The

29:29

estate of Marvin Gaye and Pharrell

29:32

Williams get into a dispute. The estate of

29:34

Marvin Gaye wins, even though there's no copying

29:36

in that song at all, which is still

29:38

crazy to me. Later

29:40

on, the estate of Marvin Gaye goes

29:42

after Ed Sheeran. There is actual musical

29:47

similarity in the two songs there, but Ed

29:49

Sheeran takes the stand. He's very sympathetic. He

29:51

says, this is the death of all music. And the jury agrees with him,

29:53

and he wins. Total

29:55

coin flip. The facts actually,

29:58

in both cases, were what they

30:00

needed to be for the outcomes in my opinion. But

30:02

they are straightforward fair use analyses. Why do

30:04

you think that's applicable or not applicable in

30:06

the case of opening eye? I don't think

30:08

it's applicable. The thing about the

30:10

Blurred Lines case, it's a case about vibes, right?

30:13

Does this sound vibe with Marvin Gaye? And

30:15

is that infringement? It's just

30:17

such a terrible case. Yeah, it's really bad. Yeah,

30:20

it's a really terrible, I mean, it's a good

30:22

example of the fact of like, this

30:24

is why you never want to go to trial for

30:26

literally anything. The court is random, right? The jury

30:29

is random and the decision is random. And then

30:31

eventually you end up in an appellate court with

30:33

a bunch of unelected weirdos and they're extraordinarily random

30:35

lately. This is why the CEOs think

30:37

it's just gonna be some money, right?

30:40

Because they're gonna say, look, I don't want to go to trial.

30:42

I doubt the New York Times wants to go to trial. Like

30:44

we'll just pay you a bunch of money and you'll go away

30:46

and we'll build our businesses under a

30:48

legal regime that exists now. Yeah, Getty,

30:50

The Times, the media, authors,

30:52

basically everyone who works in a

30:54

creative industry is very

30:57

mad and very concerned. And

31:00

they've seen sort of their bottom line eaten away

31:02

by big tech. And

31:05

this is like no longer an era where you

31:07

go, oh, well, we can form these partnerships with these companies.

31:09

And it'll work out for us in the long run. Like

31:12

people are tired of getting

31:14

backstabbed, essentially. Like that's sort of

31:16

the perception, I think. I

31:18

think that like there is an appetite to take this

31:20

all the way to the end rather

31:22

than give in and

31:25

let things blow up the future.

31:28

And like whether or not that's like

31:30

a warranted feeling, maybe,

31:34

but it's like the feeling

31:36

is just different. So you don't think there are

31:38

settlements here? If you have nothing to

31:40

lose, then of course go to trial, right? If

31:43

you have literally nothing to lose. And I think like we've sort of hit

31:45

the wall where you go, oh, either

31:47

we go to court and we destroy copyright law

31:49

forever, maybe. Yeah. Or

31:52

we lose everything, right? Like I

31:54

think that we're like running up against the wall where

31:56

people are sort of like, oh, it's not just that

31:58

the robots are gonna take our jobs. The robots

32:00

are going to take like the

32:02

future literacy of humanity. Like it's

32:04

like people are starting to spin up these visions

32:06

of the future that are increasingly

32:09

apocalyptic. And I

32:11

think that there's like an appetite to take this to

32:13

the end. So I want to end there

32:15

on kind of a big thing. This

32:17

is an idea that I have ruthlessly stolen from you over

32:19

the years. It's the idea that

32:21

copyright law is the only real limiting regulation on the

32:24

internet. Because it's the only thing

32:26

that can consistently get things taken down. Like

32:28

everything else aside from child

32:31

pornography and sex trafficking to

32:33

some degree even. It's those

32:36

two things and copyright law. And

32:38

those are the regulating factors on the internet. Those

32:40

are things that you can send a letter

32:42

and get something taken down if you claim it's copyright infringement.

32:45

And then you have this chaotic fair

32:47

use argument happening in the background. Like

32:49

the exit ramp is supposed to be

32:51

really flexible and you know

32:53

lend itself to these kinds of existential arguments

32:56

that might go either way in front of

32:58

a court. Entire industries might

33:00

live or die depending on how people are

33:02

feeling that day about the nature of the

33:04

New York Times. Is

33:06

this working? Like I, is

33:08

this the right way to go about it? Because it's

33:10

what we've been doing for a long time. And

33:13

I still don't know if

33:15

we've made any of the correct policy

33:17

choices using copyright law as our only real

33:19

tool. I think it's a

33:21

terrible tool to regulate

33:24

speech. It's clearly

33:26

not working out I think in the

33:29

context of like you know individual

33:31

creators. Like we've set up just

33:33

a very bizarre kangaroo court system

33:35

essentially through platforms. Everyone is

33:37

familiar with the idea of copyright

33:40

strikes and DMCA and everyone

33:42

sort of knows that it like it's a lot

33:44

of BS and doesn't work super well. Doesn't make

33:46

a ton of sense and it's weaponized.

33:50

That said, it's like, when you're looking

33:52

at sort of the changes that are

33:54

coming to the culture through generative AI

33:57

and what that poses for

34:00

society and for the way we

34:02

live, and you know, all kinds

34:04

of things, like how do we

34:06

learn in schools, right? The

34:08

nature of creativity itself, the value

34:10

of literature and art. I

34:13

like don't even know how to quantify

34:15

the changes that are coming down the pipeline or

34:17

what to do to address them. And historically,

34:21

when you're looking at a technology

34:23

that's about to blow up culture itself, bring

34:27

in the copyright, actually. Right? Like, it's like,

34:29

the printing press shows up, you bring in sort

34:32

of proto-copyright, like the Stationers' monopoly,

34:34

right? You bring in something

34:36

like copyright. Does it work super well?

34:38

Is it a good thing? I don't

34:40

know, kind of not,

34:43

not totally. I'm not 100% on

34:45

board. But like, yeah, traditionally, what

34:47

we do when technology is about to

34:50

blow up culture is we bring on

34:52

something like copyright. And so like, I

34:55

don't know if that's the right tool for this, because I

34:57

don't even know if we really understand what

35:00

generative AI is about to do to us. But

35:02

I think it does make sense to me that

35:04

it's shown up at this time as

35:07

sort of the front line. Does it make sense

35:09

that it's shown up to you as sort of an extinction-level

35:11

event for these companies? It makes

35:13

sense to me in like, oh

35:16

yes, this is a Chekhov's gun moment,

35:18

right? Like, it makes sense, and like,

35:20

oh yes, this was the

35:22

destiny of copyright law and the destiny

35:24

of generative AI. But will

35:26

it be a good tool? About

35:29

as good as anything else, I think. It's

35:31

like, not great, it's not super good, but

35:33

like, if I run my head

35:35

through anything else that we've got on the

35:37

books, I don't think that there's like,

35:40

something where I'm like, oh yeah, this isn't a copyright

35:42

thing, this is a something-else thing. There's one I

35:44

can think of, actually. Yeah, this

35:46

is a hint towards our next episode: deepfakes.

35:50

There's no copies. Well, there's

35:52

a copy somewhere in the model, but

35:55

then you're looking at the deepfake of Taylor Swift

35:57

or Joe Biden or Donald Trump or whoever, and it's

35:59

not a copy of anything. So

36:01

if those characters want to show up and say

36:04

take this down, they have to

36:06

use some other tool because they

36:08

can't just go to copyright law and say that you're not

36:10

authorized to use that photo of me. The

36:12

way that I don't know, even celebrity revenge

36:14

porn gets taken down because they own the

36:16

copyrights to the underlying images that get stolen. Like

36:19

there's something else that needs to happen

36:21

for in particular deep fakes that

36:24

I don't think that we have an answer to yet either. It's like,

36:26

I don't know, we've had

36:28

senators on the show proposing new causes of

36:30

action around likenesses, which

36:32

just gets us to other weird places.

36:35

But it's like everyone will have the same rights as any

36:37

celebrity to endorse or not endorse

36:39

Twitter. And it's like that is really weird. But

36:42

it feels like we're going to need that for the deep fake problem.

36:45

Yeah, that problem is just another

36:47

rat's nest because it makes copyright

36:49

look easy. Because

36:51

like once you get into sort of the deep

36:53

fake problem and likenesses. Oh,

36:56

man, like you think copyright sucks? Wait until

36:58

you get to the right of publicity slash

37:00

the right of privacy, which is the same

37:02

thing depending on which state you're in. Amazing.

37:05

Well, that's a big hint towards the next episode. I got to

37:07

ask you though, just to wrap this one up. How

37:10

do you think New York Times versus OpenAI is gonna play out? Oh,

37:13

why would you ask me? Like it's

37:15

you're basically just setting me up for

37:17

like, whatever I answer is the wrong

37:19

thing. I think that this is one

37:21

of those things where I think the

37:23

most you can really hope for

37:25

is that whatever comes out doesn't

37:28

damage copyright law in a way

37:30

that makes it unworkable. Like

37:32

that is like the worst case scenario. The worst

37:34

case scenario isn't actually that generative

37:36

AI gets banned forever, or

37:39

that it gets a green

37:41

light forever. The worst case is that

37:43

copyright law changes in a way

37:45

that's unworkable. I really do

37:47

feel like the tension you've identified with

37:49

the CEOs of these companies, who continue to

37:51

make huge investments is that

37:53

they feel they can solve the problem with money. And I

37:56

think the tension is you actually have to solve

37:58

the problem. My prediction is

38:02

that if the Times notches

38:04

a big early victory, OpenAI will just throw money

38:06

at the Times and make it

38:08

go away, right? They will just throw money until the

38:10

Times says, fine, we'll do a 10

38:12

year deal. But that doesn't stop the Authors Guild

38:15

and that doesn't stop Getty. And

38:17

then suddenly it becomes too costly to run. I

38:20

think that first case, the trial case

38:22

actually determines a lot of what happens next because

38:26

if the authors win there, if the creators win in

38:28

any of those early cases, the

38:30

entire AI industry is going to light up with

38:32

settlement offers. And then the prices

38:34

are just going to rise and maybe that will determine what

38:36

happens next. But if the AI

38:38

companies win first, I do

38:40

think it's existential for all these creative companies and they

38:42

are going to fight tooth and nail until they do

38:44

get in front of a Supreme Court and

38:47

then all bets are off. And there's also

38:49

the sort of thing where some

38:51

of these companies have a worse case than the others. So

38:54

whichever ends up in front of a court first is also

38:56

going to be interesting, I think some

38:59

of these companies have played fast and loose with

39:01

copyright a little more than others. And

39:05

you're right that the first test case, like

39:07

the first trial balloon is going to determine

39:09

a lot of what happens next. Thanks

39:14

again to Verge features editor Sarah Jeong for joining us on

39:16

the show. I hope you can tell Sarah and I love

39:18

talking about this stuff. That was a lot of fun. From

39:21

now on, we're going to keep bringing

39:23

you second episodes of Decoder every Thursday

39:25

to deliver more analysis and storytelling like

39:28

this. In addition to our classic regular

39:30

weekly interviews with CEOs, lawmakers and automakers,

39:32

stay tuned for parts two and three of our

39:35

series over the coming week. If you have

39:37

thoughts about this episode, or what you'd like to hear

39:39

us talk about more, you can email us at decoder

39:41

@theverge.com. We really do read every email. You can

39:43

also hit me up directly on Threads. I'm @reckless1280.

39:45

We're also on TikTok. Check it

39:47

out. It's @decoderpod. If you

39:49

like Decoder, please share it with your friends and

39:51

subscribe wherever you get your podcasts. If you really like the

39:53

show, hit us with that five-star review.

39:56

Decoder is

39:58

a production of

40:00

The Verge and part of the Vox Media Podcast Network. If

40:28

you're looking for ways to innovate your business, it might be

40:30

time to consider SAP Business AI.

40:33

With dozens of potential integrations to

40:35

optimize sales, procurement, finance, human resources,

40:37

and more, SAP Business AI may

40:39

be able to improve your business

40:41

operations inside and out. Revolutionary

40:44

technology. Real-world results.

40:46

That's SAP Business AI. Learn

40:49

more at sap.com/AI.
