The Penultimate Episode of The Biztape

Released Wednesday, 27th September 2023

Episode Transcript

0:01

Indie musicians can now or

0:03

maybe go against DSPs.

0:07

And TikTok combats AI. You're

0:09

listening to The Biz Tape.

0:23

Welcome to The Biz Tape, your all things

0:25

music, business and media podcast. I'm

0:27

your host, Colin McKay with my lovely

0:30

host Coast to Coast. I'm running out of times

0:32

I can say that, Joseph Wazelski.

0:35

It's true.

0:36

I guess I could adopt it in regular US,

0:39

but I feel like it would be really weird if I was

0:41

like, oh, my host Coast to Coast is calling

0:43

me.

0:44

I feel like you should change my name

0:46

and your phone to host Coast to

0:48

Coast.

0:49

Oh my god, you want to hear something funny? I forgot

0:51

to tell you this, speaking of our relationship

0:53

as a host to hosts like kind of thing. So

0:56

I was having coffee with some people, right,

0:58

and I was I,

1:01

you know, you're my business partner. Correct

1:04

you would say that's a correct thing, but sometimes

1:07

I like to say partner. So

1:10

I was talking to this person who

1:12

is part of the LGBT community and they

1:14

were like, I was like, yeah, me and my partner

1:16

have the show we do, and she was like, oh my god,

1:19

congratulations and I was like, look, we

1:21

would make a great couple together.

1:23

I'm sure we could get some Vogue magazine

1:26

cover kind of stuff.

1:27

Oh yeah, you can.

1:28

I I could. I could clean the dishes,

1:30

you could make thirty dishes. We would have a very

1:32

good relationship. But

1:36

it was like, I was like, I cannot. I

1:40

cannot hold a candle. I cannot hold a candle

1:42

to your partner. So you know, it's

1:45

just I you know, it was really

1:47

funny, and I was like, oh no, no, no, no. And I

1:49

was like, I you know, I'm not I'm like,

1:51

I'm not a fake ally, I'm just

1:54

just sadly straight. You know.

1:57

I can just imagine you and

2:00

in the middle of Portland brew So.

2:03

That was pretty close. Yeah, you're where

2:05

I was. Yeah, right, Oh

2:10

god, folks, Now

2:12

we're getting more unhinged and you're gonna probably

2:14

see it till the end of the show. And if you haven't heard, this

2:16

is our second to last episode of

2:18

the show, one forty nine, going

2:21

into one fifty. We're really at a higher number

2:23

than that, but fin my number

2:26

says, but you're

2:29

we're gonna get a little off the cuff in the last episode.

2:31

Just to give you guys some uh stuff kind

2:34

of maybe just some stories kind of

2:36

hit some week, get canceled Mike

2:40

as a fake ally yeah

2:42

no, but like yeah, it was just really funny and I was like,

2:44

but but seriously though, like you know

2:47

I am and how I So it was just like full

2:49

circle. I was like, you know, going with

2:51

that, but yeah, it was it

2:53

was just it made me laugh so hard. I was

2:55

just like, yeah, my partner Joe, he

2:58

like we worked together all the time.

3:00

Yeah, I love it. Anyway,

3:03

Joe intends to start with some of our last

3:06

stories that we're really going to cover on the show

3:08

before we get into our really crazy last

3:10

episode.

3:11

I feel like I can. I'll go

3:13

first. Mine's pretty, uh, pretty

3:15

cut and dry, but I feel like it has some interesting

3:19

elements to it. It might be

3:21

dated, honestly, like after it

3:24

would be funny our last episode, like true

3:26

episode is TikTok that

3:31

yeah, but TikTok has currently

3:33

rolled out some tools

3:36

for labels when it comes

3:38

to labeling AI created

3:40

content and more so

3:43

yeah and more specifically

3:46

in the music realm uh AI

3:48

created content that's like sounds

3:51

pretty much. So there's a

3:53

lot that AI can do at this at

3:55

this point in the

3:57

the our generation are

4:00

our current timeframe. We're

4:02

at this culmination of like AI

4:04

could could forever change

4:06

the way humans

4:09

live their lives. At least that's what

4:11

Elon Musk is pushing

4:14

to some extent and a lot of these big

4:16

tech people. But whether

4:19

or not that's going to be true is hearsay.

4:22

But what I feel like everybody

4:24

can kind of get around is that

4:26

it is very true that AI can create

4:29

content that can mislead

4:31

you into thinking that a human

4:34

made it in a lot of cases,

4:36

especially if it's not tagged on

4:39

with created by AI. More

4:42

recently, Colin, did you see that

4:45

crazy TikTok that was borderline

4:49

racist that a

4:51

lot of people were seeing?

4:53

It was this creator who her whole

4:55

thing is that she face maps herself

4:59

and she does a transition and she changes into

5:01

like a new character that

5:03

she creates. Yeah,

5:06

and she she did a

5:08

little TikTok swipe

5:11

when it comes to like the transition shot,

5:14

and she turned into a

5:18

black woman who the

5:20

creator is originally a white woman. And so

5:22

a lot of people had a

5:25

lot of people were making

5:28

jokes. They were kind of like, is this like

5:30

the new era? Like

5:33

is this the new era of black face?

5:35

Like it just keeps going.

5:38

That's honestly, that is a major concern

5:40

with you know, with like AI and

5:42

specifically with the strikes that are going

5:44

on in LA is how AI

5:47

could disrupt a lot of diversity

5:49

and casting, specifically on the actors

5:52

but also on the writers by being like, oh,

5:54

we have this AI that

5:56

can you know, create a story

5:59

or create a character and

6:01

it looks real and they're you know, black or

6:03

Hispanic or Asian or something like that. And that's

6:06

a real concern. So I don't think

6:08

it's you know, some people acting

6:10

you know, rightfully concerned about

6:12

it. It's true, like this is a concern

6:14

that unions are fighting over right now,

6:17

you know what I mean.

6:18

I think especially to like even you

6:20

know, I've heard a lot of talk about AI when

6:23

when it pertains to entertainment

6:25

specifically. But I also

6:27

think that there's a lot of potential

6:30

neglect being brought

6:33

on when it comes to political

6:36

ads and like political statements

6:38

and.

6:39

Oh yeah for that, like you

6:41

know, a fake you know, if there was a really

6:43

fake news your favorite politician. Literally

6:46

it's literally literally news.

6:48

Yeah, it's literally the fake news generator,

6:51

like and which Colin, I don't

6:53

know if you heard this. There's like a really interesting daily

6:55

episode about like how there

6:58

was this. It was like two

7:00

years ago they came out with this episode. I think it

7:02

was like during the election craziness

7:05

two or three years ago, and

7:08

it was about how this person gets

7:10

like hired out to just cut

7:13

memes together. And

7:16

now you don't even have to hire that person

7:18

as a political person.

7:20

You could just get an AI to cut a bunch of memes

7:22

together for you and then like spread

7:24

it everywhere. And to be honest, the way that social

7:27

media is now, it's quantity

7:29

over quality, and people still

7:32

consume it. So it's a

7:34

lot of like just post every day stuff

7:38

that can be throwaway and be terrible.

7:40

It can be fake, it can look

7:42

fake, and people will still think it's real. It

7:45

is kind of the crazy thing behind

7:47

it. But according to Billboard,

7:50

in July, President Biden's administration

7:52

announced that seven leading AI companies

7:55

made voluntary commitments quote

7:57

to help move toward a safe, secure,

8:00

and transparent development of AI technology,

8:03

and one key point quote the companies

8:05

commit to developing robust technical

8:08

mechanisms to ensure that users

8:10

know when content is AI generated, such

8:12

as a watermarking system,

8:15

because that can't get

8:19

this action enables creativity

8:21

with AI to flourish but reduces the

8:24

dangers of fraud and deception.

8:27

And so they're basically kind of

8:30

like testing. There's

8:32

a couple of things they're kind of testing when

8:35

it's like giving people the option to

8:37

put the stamp of like this is AI technology,

8:40

which I do think, like I feel like it's

8:42

so buzzworthy right now, and like it's getting

8:44

so much attention online that like people and creators

8:47

are doing it any ways to get that

8:49

like hype bubble and to

8:52

get a lot of views. But once

8:54

that kind of goes out of fashion,

8:57

like AI becomes normal and

8:59

people are like I don't

9:01

care that you

9:03

know, like telling if this is AI, Yeah,

9:06

it becomes like this unsafe territory

9:09

of like oh shit, we're

9:11

gonna like think that this is like a real thing. And

9:14

so like there's like a lot of research that's kind

9:16

of being put in to figure out

9:18

if there's even a way to use

9:20

AI, to to find

9:23

out if something was created

9:26

with AI, and to like kind

9:28

of funnel that in and self police

9:31

in some ways which

9:33

have been proven to be useful,

9:36

especially when it comes to people

9:40

in schools, which

9:43

if you don't know if you have chat GPT write

9:45

your paper for you, unless you're very crafty

9:48

in having chat GPT write it. A

9:51

lot of the time it's still going to get

9:53

caught with their own

9:55

AI generated thing because guess what

9:57

that thing is using a

10:00

ChatGPT.

10:02

So it's every time I hear

10:05

that, it just reminds me of like, what was it like

10:07

Turnitin dot com you know that kind

10:09

of stuff.

10:09

Well, that's what it is. I think. It's like it's

10:12

it's finding the percentage chance

10:14

that like this was created with

10:16

AI.

10:17

I mean, if you think about it, it's very similar to the

10:19

same technology where it's like where

10:24

we're basically it says, you know, hey,

10:26

we found this other piece

10:28

of literature that you plagiarize,

10:31

you know what I mean. So it's it's not really that far

10:34

I mean, it's a new technology. AI

10:36

is very new, but it's like at the same time, it's like the

10:38

concept is not that far off. It's the same reason you

10:40

would get caught for plagiarizing, you know what I mean.
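
A minimal sketch of the score-and-threshold idea being described here: a plagiarism checker or an AI-content detector returns a confidence score, and a policy layer flags anything above a cutoff. The detector_score heuristic and the 0.8 threshold below are made-up placeholders for illustration, not any real tool's API.

# Hypothetical score-and-threshold detection, as discussed above.
def detector_score(text: str) -> float:
    """Return a stand-in 'probability this text is AI-generated' in [0, 1]."""
    # Placeholder heuristic purely for illustration; real detectors use trained models.
    return min(1.0, 0.1 + 0.2 * text.count("delve"))

def flag_if_likely_ai(text: str, threshold: float = 0.8) -> bool:
    """Flag content when the detector's confidence crosses a policy threshold."""
    return detector_score(text) >= threshold

print(flag_if_likely_ai("Let us delve into the delve of delving delve delve."))  # True
print(flag_if_likely_ai("Hey, here's my essay about my summer."))                # False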

10:43

Yeah, so yeah, exactly, yeah,

10:45

but Billboard continues. Voluntary

10:47

commitments are of course voluntary,

10:50

which is likely why TikTok also announced

10:53

that it will quote begin testing an AI

10:55

generated label that we eventually

10:57

plan to apply automatically to content

11:00

that we detect was edited or

11:02

created with AI. Sure tools

11:05

to determine whether an image has been crafted

11:07

by AI already exist and

11:09

some are better than others. But

11:11

in June, The New York Times

11:14

tested five programs, finding that

11:17

services are advancing rapidly but at times

11:19

fall short. So basically what's

11:22

happening is like, because it's

11:24

started so late this

11:27

because like we're already like almost

11:29

a year into AI, right,

11:32

I think a year like it's

11:34

it's it's been out, it's been it's

11:37

been AI version one

11:39

eight twenty eight thousand,

11:42

you know, at this point, and it's it's continuously

11:44

getting smarter and smarter. And so because

11:47

of that, a lot of these programs

11:49

that are starting at the at the base

11:52

ground are having a hard time like catching up

11:55

to its AI competitors because

11:58

there's so much information that it still has to process

12:00

through and it just hasn't had the time to

12:03

do it. And also of course people are shifty

12:05

and can get past it.

12:07

Speaking which can I can I interrupt

12:09

you for a second, Joe, just to see something that I've

12:11

been doing this whole time while we're talking that you're gonna

12:14

laugh at

12:16

the thumbnail of this podcast episode

12:18

is going to be AI generated. Hell

12:21

yeah, because what have I got to lose

12:23

anymore? I did it. I generated it, but

12:25

I did music business podcasts

12:28

and I whatever the first one was. I was like, I'll do

12:30

it as long as it's not, you know, crazy offensive.

12:32

So I just think it'd be funny if you've been wondering this

12:35

whole time listening to the show, what the hell

12:37

is that? That's what it is? Anyway?

12:39

Hell are you going? Hell yeah,

12:42

I've I've actually, I mean we got

12:44

into here. You know what we're gonna talk about it

12:46

call it Colin. We almost got canceled

12:49

like weeks

12:51

ago because I I

12:54

stupidly made like I threw

12:56

in some of our video clips into like

12:58

an AI program and it chopped

13:00

it up and like it to its credit,

13:03

it got us to go viral. Like

13:05

it found those like key

13:07

moments in an hour of footage that I didn't

13:10

have time to cut

13:12

together, and I posted

13:14

to TikTok and reels and on reels

13:16

it just exploded. We

13:19

got to almost a million views. Yea,

13:22

within like I

13:25

think what was it. It

13:27

was like a million views in like

13:29

like three days or something, and

13:32

but we were getting the

13:35

way the cut happened, it made

13:37

it seem like we were making fun of gen Z, which

13:39

in the context of watching it,

13:42

it does kind of seem like we are, like,

13:44

we're just two dudes like making fun of gen Z. It

13:47

cut out. Yeah, what it cut out

13:49

was the context of like, oh,

13:52

we're actually like in agreement

13:55

with like gen Z, like not drinking as

13:57

much like in these like bars and stuff.

14:00

The entire time, I'm going like, oh

14:02

man, I wish I could drink. I'm

14:04

so cool, and people are.

14:06

Like, yeah, people took it very People

14:08

were like Colin was bullying gen

14:11

Z.

14:11

They were like, Colin is a you

14:14

know, like.

14:15

You called them nerds

14:18

ironically, and I've never heard

14:21

anyone called nerds. Anyone

14:24

called anybody nerds

14:26

unironically, especially

14:28

online. But people believed

14:30

it. And that's that is

14:33

kind of to to what we're talking about

14:35

here, Uh, kind

14:37

of the toss up of like your voice

14:39

can get diluted when it

14:41

comes to these AI technologies

14:44

because they're so I mean,

14:46

they're making choices on

14:49

your behalf right inside. How they're

14:51

editing something.

14:52

I'm gonna put this, uh this

14:54

ai this thing I just generated

14:56

into the chat, Joe, so you can see it, so you can

14:58

see what the audience is seeing. There's

15:01

a lot going on.

15:02

So have you been doing this for like weeks?

15:04

No, I just did this right now while we were talking.

15:08

Oh oh, this is you brought up thought

15:10

this was the funniest thing.

15:12

I was like this, wait, why

15:14

didn't we make this? We're gonna art? Can

15:16

we make this her?

15:19

Yeah, I I are episode

15:22

absolutely, I'm gonna do that. I don't know what that thing

15:24

is in the middle. It looks like some sort

15:27

of antenna or.

15:28

Like there's a lot of pop filters

15:30

happening.

15:31

There's a lot of pop filtering. There's clearly

15:33

a guitar that's been thrown in there. This

15:35

is very like Eastern

15:38

European art.

15:39

Like, yeah, I love it, though, I would,

15:42

honestly I want this on a T shirt. This

15:45

looks amazing.

15:47

Oh god, Joe, God, And

15:49

this is the power of it?

15:50

What is it? One one percentage

15:53

or one half of dot?

15:57

Yeah? Right, no,

15:59

get it? Like but seriously, we've been talking

16:01

about it the whole time, especially with TikTok. It

16:03

does take It's a lot easier

16:06

to generate than it is to

16:08

find tools to you know, find

16:10

what generation is right, because you got

16:12

to think about it. That's like an extra step. It's like

16:14

you have to understand how this stuff

16:17

is kind of generated, and then on top of

16:19

that, you have to understand how to

16:21

identify those things that are generated.

16:23

So I'm really glad to see that TikTok

16:26

is taking some

16:28

steps to try to you know,

16:30

filter the content and go this is AI

16:33

generated, this is AI generated, this is AI generated,

16:35

because it's seriously going to be a problem

16:38

as the technology gets better. Now I'm sure

16:40

some of you are laughing at the greatest

16:42

thumbnail that's ever been made, but like it's

16:45

you know, a lot of this AI stuff,

16:48

you know, they're like it's made with AI, and

16:51

like there's two things to that one

16:53

it's and this is honestly, if

16:55

you know, one of my favorite books, the Toyota Production System

16:59

is uh. Automation with a human

17:01

touch is a principle in that book for efficiency,

17:04

and that's very similar to AI, where this

17:06

is automate. This is automation, like

17:08

it's automatically making this thing

17:10

right, and then with

17:12

a human touch you can make AI look

17:15

incredibly realistic. Like I'm sure if

17:17

me and Joe went into this image and

17:20

we you know, fixed up the words and

17:22

put the percentage and maybe took the logos out

17:24

and the you know, they'd be like, oh, this is just like

17:26

some art you know what I mean that like they

17:28

had made or commissioned or something,

17:31

But it's AI generated and the same thing.

17:33

Obviously the bar gets higher with video,

17:36

but like the same thing happens is

17:38

that a lot of this AI stuff is also

17:42

you know, one it's getting to a point where it's

17:44

just it's good enough to be on its own. And

17:46

two it is like with a human touch,

17:48

like a lot of people because you think about it,

17:51

like people generate these AI content

17:53

and they generate and they generate and they generate,

17:55

and then the human touch sometimes is just

17:57

as simple as that one's the best one,

18:00

you know what I mean. Like so

18:03

that that's kind of the thing with AIS. It's got a long

18:05

way to go, but with

18:07

human touch, this can

18:09

really be used to deceive people. This

18:11

can be used to literally create fake news,

18:14

as Joe was saying, and so like

18:16

it's definitely something that I think needs

18:19

to have some system to flag

18:21

or filter, not in a negative way. You know, it's

18:23

like this is innovation at the end of the day,

18:26

but at least say, hey, this is AI, you

18:28

know what I mean. So people that

18:30

use social media as their primary news

18:33

source, which a lot of people do, don't

18:35

get confused about something or

18:37

worse, you know, like we could have like

18:39

you're saying political ramifications

18:42

from this because everybody thought, you know, the

18:44

President said something or like some other

18:46

you know, legal officials said something and it

18:48

wasn't true, you know what I mean. So

18:51

and you have to think about it this too. It's like

18:55

AI is its own fledgling

18:58

interests in like industry,

19:01

and everyone's excited about it, but

19:03

the amount of experts in it are few

19:05

and far between. And just

19:08

you guys listening to the show and us having

19:10

interest in talking about AI has

19:12

already put you ahead of so many other

19:14

people, you know what I mean, that don't even think

19:16

about that AI is out in the landscape.

19:19

So that's why it's really dangerous,

19:21

is that there's a lot of people that haven't even

19:24

factored in AI as a thing

19:26

in the same way that like if you look up, you

19:28

know, everything goes back to school for me

19:30

today for some reason, like if you're looking up sources

19:33

on the internet for like a paper, you're like,

19:35

that's a non credible source, you

19:37

know what I mean. Like people you know,

19:39

have just been looking at stuff on TikTok

19:42

regularly before AI and going, this

19:44

is the gospel, this is true, this is absolutely

19:47

true. And now with AI, it's like

19:49

we can make that gospel say whatever it wants, you

19:51

know what I mean. So I think

19:53

it's interesting. I do think that we're

19:55

gonna see people one

19:59

who are you know, trying to be maleficent

20:01

with it, but also people that maybe

20:05

fearing that this will stifle kind

20:07

of AI content, you know, maybe

20:09

go into suppression. But I think

20:11

that this is a tool that's extremely powerful.

20:14

So the you know, the question of how much

20:16

or how little we use needs

20:18

to be answered, you know what I mean. So

20:20

yeah, and that's that's a question for

20:22

everybody, you know what I mean. Like then it could come

20:24

down to context, it could come down to what's

20:27

going on in life. You know, there's a lot of things going

20:29

on with that. So I don't know, right,

20:32

like it's.

20:33

Yeah, I mean, it's it's such

20:35

a it's

20:38

it's like it reminds

20:40

me so much to some extent

20:43

on the hype that was NFTs,

20:46

and I feel like they kind of bled into each

20:48

other.

20:48

Yeah a little bit.

20:49

Okay, Like with any tech thing, there's like,

20:52

you know, like the big tech things

20:54

that happened in our lifetime were

20:56

streaming, and then it

20:59

goes into Bitcoin, and then

21:01

it goes into NFTs,

21:04

and now it's going into AI.

21:07

But I think what's different about AI in

21:09

terms of those other ones, and what makes it more

21:11

powerful is because of the

21:13

ramifications and the real ramifications

21:16

that it has on how it interprets mass

21:19

data and like how it like

21:22

collectively shapes

21:26

not only shapes, but can can build on

21:29

itself. It can like it

21:31

has its own I

21:33

don't want to say personality, but brain

21:36

right that it can. It can stack up.

21:38

And like I think, I

21:40

think like the hype behind those other things

21:42

that I just listed off, you

21:44

know it did it definitely seemed more like a

21:47

fad, right, Like

21:49

this is a fad. This isn't actually going to like help

21:52

the A lot of people with crypto

21:54

actually thought it would help like a bunch

21:56

of people, and like.

21:58

It didn't have you know, it didn't have a really

22:01

innate value, you know what I mean, Like.

22:03

It didn't have a it didn't have an everyday

22:05

value.

22:06

Well in the sense

22:08

that what created value for

22:11

it was that people believed it had

22:13

value as opposed to having something

22:15

intrinsic at the end of the day like not.

22:18

And I don't believe in this whatsoever because

22:20

this is like the argument people used to go back

22:22

to the gold standard, which should never come

22:24

back ever when it comes to finance. But like

22:27

at least at the end of it, like gold, like

22:29

you know, it could be used in electronics

22:32

a lot, you know what I mean, Like it has a very high

22:34

usage. And the crypto it's like, yeah, there's

22:36

nothing that this does except takeaway

22:38

resources and everything

22:40

like that, and AI like you're saying, has

22:42

an intended use. So I think in

22:44

suppression when it comes to TikTok it,

22:47

you know, it's we gotta This

22:49

is a very careful walk

22:52

through a minefield that I hope

22:54

that we're prepared for as a society to walk,

22:56

you know what I mean. This isn't like we've

22:58

solved it, we're just gonna ban everything

23:01

or like put a label on everything immediately.

23:03

It's just more of like, well,

23:05

we need to look at this and see

23:07

how we walk through this minefield and

23:09

how we assess this kind of information and

23:12

like the systems that we have, and like

23:14

what as a kind of free

23:16

market society we want to put constraints

23:18

on because it's a it's a very dangerous

23:21

you know, piece of technology in the wrong hands,

23:24

So we need to watch out for that explicitly

23:27

at the end of the day. So I

23:29

think, yeah, it's something

23:31

that I love your point

23:34

about technology kind of blending together because

23:36

there's things that we don't think about that really

23:38

like it all blends together. And then we're here

23:40

and we kind of have to take a step to be like, no,

23:43

we are at this step, right, we have to take

23:45

ourselves out of it and be like, we're at this step with

23:47

AI, so we have to figure it out.

23:49

Like one that came to mind when you kind of went down that argument

23:52

for me was the dot com

23:54

boom to like social media. Yeah,

23:57

I you know, Joe and I

23:59

for this example, are blessed to have older

24:02

siblings, so like we understand

24:05

what that kind of was different. I mean,

24:07

the dot com boom was like websites

24:09

are out there, you can use them for services,

24:12

you know what I mean, that's amazing and

24:14

it made the stock market go insane until it

24:16

eventually crashed because people were just getting

24:19

websites to have websites that didn't have a

24:21

use or a service ie crypto, So

24:25

that happened, and then that was like

24:27

around two thousand and then like

24:29

right around there is when social media really

24:32

started to come off. And if you talk to people

24:34

that are around our age that

24:36

don't have a lot of older siblings, that aren't,

24:38

you know, very historical about technology,

24:40

they think all of that blends together,

24:43

right, that we had websites, social media,

24:45

you know, and it's all just happened at once, But that

24:47

was one step we took, and

24:50

you know, the ramifications of having

24:52

websites that we could get services

24:54

through and you know, good services, vital

24:57

services, right, good things like we hadn't even

24:59

fully explored that yet. And then it goes, here's

25:01

social media that has the power to

25:03

socially change how we talk to each other

25:05

and whatever we're and then I'm like, wait, we're still

25:07

talking about dot com. And then it's like here

25:10

is streaming music, media

25:12

whatever, and it's like we're not even on the dot com

25:14

thing. Stop doing this, and

25:16

here is cryptocurrency and you're just

25:18

like, we need a minute to talk about this,

25:21

right, So, I think it's one of those things

25:23

where you can't stop

25:25

it. And like anytime any

25:28

of these social media companies or the you know, the

25:30

government are doing this, we're already

25:32

behind. So there kind of has to be some urgency

25:34

to this, right, There has to be some urgency

25:36

to like, what are we doing, how can we do it?

25:39

Let's do it. But at the same time, I

25:41

did say it's a careful minefield, which is what makes

25:43

this all freaking crazy.

25:46

Yeah, so, yeah, I mean it's it's

25:48

truly the wild West.

25:49

I'm gonna take my tinfoil hat off now, No.

25:54

That's great, it's great,

25:56

But yeah, I mean TikTok's like

25:59

not the only company that's rolling out plans.

26:01

Streaming service Deezer has also

26:03

laid out their plans to develop

26:06

some tools to grab them.

26:09

And according to Billboard, which

26:11

is also another quote a quote from an economic

26:14

point of view, what matters most is regulating

26:16

the things that really go viral, and

26:19

usually those are the AI generated songs

26:21

that use fake voices or copied

26:23

voices without approval. And

26:26

that's a quote from CEO of

26:28

Deezer, Jeronimo

26:32

Folgueira, and

26:35

according to Dennis

26:38

like this is a crazy name. I'm just

26:40

gonna say, Dennis L who

26:43

is the Believe co founder and CEO,

26:46

quote, you have technologies out

26:48

there in the market today that can detect an

26:50

AI generated track with a

26:52

ninety nine point nine percent accuracy

26:55

versus a human created

26:58

track. So he

27:00

goes on, quote, we need to finalize the

27:02

testing, we need to deploy, but

27:04

these technologies exist. And

27:07

then someone else pointed out

27:10

that there's definitely, definitely

27:13

like a lot of kind of countering

27:15

points. Especially

27:18

Cynthia Rudin from Duke University

27:21

said that quote, every time somebody builds

27:23

a better generator, people build

27:25

better discriminators, and then people

27:28

use the better discriminator

27:30

to build a better generator. So

27:33

the generators are designed to be able to

27:35

fool a detector. So I'm

27:37

gonna I don't know about you, Colin, I'm going to listen

27:40

to the Duke University person over the

27:43

Believe CEO.
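
Rudin's generator-versus-discriminator point is the same adversarial loop that GAN-style training formalizes: each side is retrained against the other's latest version. A rough, purely illustrative sketch, with stand-in training functions rather than any real machine-learning framework:

# Illustrative arms race: each side retrains against the other's newest version.
# The train_* functions are placeholders, not real training code.
def train_discriminator(generator):
    """Return a detector fitted against the current generator's output."""
    return {"tuned_against": generator["version"]}

def train_generator(discriminator):
    """Return a generator tuned to fool the current detector."""
    return {"version": discriminator["tuned_against"] + 1}

generator = {"version": 0}
for round_number in range(3):
    discriminator = train_discriminator(generator)  # a better discriminator...
    generator = train_generator(discriminator)      # ...is used to build a better generator
    print(f"round {round_number}: generator v{generator['version']}")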

27:44

Again, it's all based on each other, right,

27:46

I mean this is why at the end of the day,

27:49

like it's very similar to the field

27:51

and the questions that cybersecurity deals

27:53

with all the time, like where our passwords

27:56

are, where our data is. That's what they do

27:58

is they make systems to protect data.

28:00

Something breaks the system, and guess

28:02

what they do. They examine that

28:05

thing that broke their system, and then they adapt

28:07

and it's better. That's just how

28:09

one, life kind of works, but two, specifically

28:11

technology works. The problem with technology,

28:14

and especially digital technology is it's insanely

28:17

fast. So if you get

28:19

a whole you know, if TikTok

28:22

or any of these social media companies make this

28:24

whole system where it's like this is how

28:26

we identify AI and this

28:28

is how we label it, and then like literally it's like

28:30

by the time you put the system out, it's already

28:32

gone. Now people use that

28:36

in a bad way, Like people

28:38

say, then it's not even worth trying, and

28:41

that's not true. Right, This is a

28:43

constant battle, and this is definitely

28:46

something that I think, you know, social media

28:48

companies with the content that

28:50

you know, users create that is not

28:53

exactly cultivated by TikTok

28:56

will have to have literally departments

28:58

to deal with, right, We'll have to.

29:01

And I hope that you know all of them are like we're making

29:03

these investments now, but every one of them is like we're

29:05

making these investments now. So we can make more content

29:07

and make more money. And it's like at the same time,

29:10

it's like, well, you need to be getting these people

29:12

in there, and so you guys can do it differentiate

29:14

this content and so you don't accidentally like

29:17

sway people into you know, crazy

29:20

ideologies or thinking that

29:22

the world's gonna end in two days or something

29:24

like that, Right, And that's kind of the power. And

29:26

to be honest, it can be doomsday like that, or

29:28

it could be something as little as like you

29:31

know, in our show with music business, like imagine

29:33

an artist you like and then they come out

29:35

as an AI and they say like something super

29:38

you know, kind of controversial to their

29:40

fan base even you know, and they're

29:42

just like I didn't say that, and AI said

29:44

that, and there was nothing I could do about it. Right, You

29:47

basically lose the ability to control your

29:49

own pr which is kind of insane.

29:52

Yeah, it's it's

29:55

just so crazy,

29:59

honest, Like, I don't know.

30:03

I don't know, that's beyond the scope for the show. That's why

30:05

we're that's where we're ending at. We were just too

30:08

busy thinking about AI all the

30:10

time.

30:10

So Colin,

30:14

what do you have all

30:17

right?

30:17

Well, speaking sort of about AI,

30:19

but not really. We're talking about a

30:21

proposed US bill which is

30:24

very interesting as this could allow indie

30:26

artists to negotiate collectively with streaming

30:28

services, which I was like, hello,

30:31

you know this is casually going by my newsfeed.

30:34

So this bill is called the Protect Working

30:36

Musicians Act of twenty twenty three. It

30:39

was introduced on September nineteenth, so that was

30:41

where we go from today. And basically

30:44

the House rep Deborah Ross, a

30:46

North Carolina Democrat, updates

30:48

an earlier bill that was introduced

30:51

by Representative Ted Deutch,

30:53

a Florida Democrat, and that bill basically

30:55

focuses on the relationship between indie

30:57

artists and music streaming services. So

31:01

basically what this allows them to do,

31:03

and we'll get to AI companies

31:05

in a minute, but it allows these

31:08

musicians to work together

31:11

as indie artists to collectively

31:13

bargain against streaming services

31:15

for these rates.

31:16

Yes.

31:17

So the reason why that's important is

31:19

because most indie artists use

31:22

like an independent distributor like a

31:24

TuneCore or a DistroKid or

31:27

CD Baby for instance, and those

31:29

people negotiate licensing deals with

31:31

the DSPs. Some even

31:34

do like you know, a digital Rights

31:36

Music Licensing Network. If you've ever heard of Merlin,

31:38

it's a little bit different. But the point is is, like there's

31:41

another company that's already made this agreement

31:43

with Spotify, with Apple whatever, and

31:45

their rates are their rates are their rates right, and

31:49

so like they can't really do much here. And

31:51

the bonus of a label, and

31:54

honestly, one of the biggest bonuses of a label

31:56

is they collectively bargain, you

31:58

know, and because labels, if you

32:00

think about it,

32:03

basically act as a union of themselves,

32:05

right they go. We have, you know,

32:07

hundreds of thousands of artists on

32:10

our roster. We have you

32:12

know, we'll probably get the next big

32:14

people because that's what we do. We promote

32:16

the next big people. So give us

32:18

a deal Spotify, give us a deal,

32:20

Apple Music, give us a deal Amazon.

32:23

And that can be in like promotion, actual

32:25

payment, upfront payment, you know

32:27

what I mean. We've talked about on the show, like Drake

32:30

was the most streamed artist of last year, Like the just

32:32

straight payments he would get right that

32:34

were just like you're a big artist, here's

32:36

like one hundred thousand dollars, you know what I mean, Like

32:38

not like, oh, you're paid for it. Was just straight

32:41

up from Spotify. The point is

32:43

is that this bill basically lets

32:45

you collectively argue on that

32:48

together, and it creates an exemption to

32:50

the US anti trust laws that

32:52

would allow them to negotiate licensing

32:54

deals with a quote dominant

32:56

online music distribution platform

32:59

or a company engaged in development

33:02

of generative

33:05

artificial intelligence. Isn't

33:07

that interesting? So the

33:10

bill basically defines quote a dominant

33:12

online music distribution platform

33:14

as an entity that operates an app,

33:17

website, or other online service that

33:19

is used by members of the public to listen to sound

33:21

recordings via digital audio

33:23

transmission and audio visual presentation

33:25

or any other means basically being like, there's

33:28

no way you can, you know, argue like Spotify

33:30

tomorrow can't be like we only do CDs

33:32

now like it's like nope, you're you're doing

33:34

it. And the main

33:36

caveat for these online music

33:38

distribution platform like what

33:41

fits in it and what doesn't is just

33:44

revenue. Mostly, it's saying if you

33:46

make annual revenues

33:48

of over one hundred million dollars or

33:50

more, then you can be

33:53

you know, collectively bargained with under

33:55

these you know, artists, and

33:57

these artists basically can come together. And what

34:00

what's pretty kind of cool about it and

34:03

I kind of interesting to me is

34:05

they can do it under one lawyer as

34:08

like a big class action lawsuit.

34:10

Basically, so like imagine,

34:13

if you will, like

34:16

we have like imagine like all

34:18

these scenes of music, like Nashville has

34:20

a scene, Atlanta has a scene, La has a scene,

34:22

New York has a scene. You know, I'm sure Oklahoma has a

34:24

scene, right, Like imagine

34:27

okay, see okay scene yeah

34:29

right, But like seriously, like imagine

34:32

like all of them, you know, they go to shows together

34:34

to do this. They're just all onside and they were waiting for

34:36

a record label and they just turn to each other and be like,

34:38

well, I know you, you know me, we know each

34:40

other, we know everybody else in this town. Why don't

34:42

we just go get a lawyer together, and

34:44

let's go get us some good rates on

34:47

Spotify.

34:47

You know what I mean, You're going to recite the Barney

34:50

song.

34:51

I Love you, you love me?

34:53

You know, but like, yeah, it's

34:56

it's basically that's kind of what it is

34:58

is that they can collectively bargain against them

35:00

as like one big thing to try

35:03

to get you know, basically

35:06

better rates and privileges that

35:08

the labels usually use as their like defining

35:11

thing, you know what I mean. But

35:15

yeah, it's kind of it's kind of crazy, and

35:18

so I kind of wanted to talk about it a little bit.

35:20

Is like, with these kind of

35:22

tools, do you think this weakens labels

35:24

in general? You know what I mean? Like, this

35:27

feels like what a label does collectively

35:29

bargain. So do you think this, you know, weakens

35:32

people, you know, wanting to maybe

35:34

be on a label even more than they have been in

35:36

recent years.

35:38

You know, I don't. I don't

35:40

want to say yes because I

35:42

think I think that there's my

35:45

gut is telling me that, Like, honestly, the thing

35:47

that labels have up on pretty much

35:49

everybody in the industry is they're grandfathered

35:52

in in a lot of

35:54

ways. They have long standing relationships,

35:56

They have long they have a lot

35:58

of money that is saved and pooled

36:01

to do a lot of things, and a lot of brand

36:05

awareness from not only the general

36:07

public but from

36:10

the legal space as well.

36:13

And industry

36:15

relationships too.

36:17

I do think that, Yeah, I do think

36:19

that, like, labels

36:21

have been losing power in a lot

36:23

of ways. I think people have been saying no

36:26

to label deals a lot more recently

36:29

because you

36:31

know, a lot of the stuff you can do yourself. There

36:33

there are certain areas like, especially

36:36

when you get to be a certain point

36:38

as an artist, you do kind

36:40

of want that signing

36:42

bonus. That's kind of like where people

36:44

get like, Okay, I'm ready to

36:46

have this good signing bonus right

36:49

now, right to help

36:51

out with life stuff. And

36:53

I think that that helps.

36:56

And also having like a team of people like you

36:59

know, not not every label

37:02

is the

37:05

boogeyman, you know, but

37:10

the label system

37:12

is changing, and I will give you this,

37:15

I do think that this will change the ways

37:17

labels operate. I don't

37:19

know if it's going to take power fully

37:22

away from now.

37:24

I think that's that's a very astute point because

37:26

there's one thing I also haven't mentioned here actually,

37:29

So this is at the same time

37:32

that apparently major record companies

37:34

are trying to negotiate with streaming services

37:36

for better payment models aka collective

37:39

bargaining. But one

37:41

that's interesting here is that they

37:44

want this new model that I've seen.

37:46

It's called that they have like two groups

37:48

of basically royalty groups, and one is

37:51

professional artists, and

37:53

they're defined as having a minimum of a thousand

37:55

streams per month or a minimum of five hundred

37:58

unique listeners. And so they

38:01

want the differential between professional

38:03

and non professional, who are people that don't

38:05

fit that you know, catalyst of being

38:08

over a thousand streams a month and five hundred unique

38:10

listeners. They want a

38:12

higher rate for the professional artists,

38:14

and then they want to lower the rate

38:17

for the non professional basically,

38:20

which is interesting for a

38:22

couple reasons. One, we we had a whole story

38:24

on the show one of when I was talking

38:26

about that, I think it's close to

38:28

it was like close to like eighty or ninety percent of

38:31

musicians have like less than like

38:33

a thousand or to

38:36

one hundred, like or a thousand

38:38

streams per month, right, And

38:40

so it makes sense that it's

38:42

like in the way that

38:45

streaming is calculated, it's

38:47

calculated as a pie. Like

38:50

basically it says like it's

38:52

not just like you get this money right away,

38:54

it's calculated in big pie percentages.

38:57

So basically saying like, you know, if

38:59

Spotify made five hundred million dollars in revenue,

39:02

they go to the labels first, and they'll be like, okay, Universal

39:05

Music Group, you were five hundred, you

39:07

know you were would say half

39:10

even though maybe that's not true. But they're the biggest

39:12

label in the world, Universal Music Group. You were

39:14

half of the streams from Spotify. We're

39:17

gonna give you two hundred and fifty million dollars,

39:20

okay, and there

39:22

you go. And then it's like, now we're going to this. There's

39:25

a lot of computation and like different stuff

39:27

in there. I know I'm ignoring. And then Universal

39:29

goes the same way and they go, Okay, Drake

39:32

was let's say something crazy

39:34

was forty percent or fifty percent

39:37

of our basis. So then it goes to like one

39:39

twenty five million to Drake, right,

39:42

then they cut out based on his deal, they

39:44

take the ninety, they take it, Drake gets like the

39:46

ten to twenty. Then Drake makes like

39:49

like ten million dollars, right? So that's how that works.
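
To make the pro-rata pie arithmetic above concrete, here is the same flow in a few lines of Python. The figures are the made-up ones from this conversation (a 500 million dollar pool, a label credited with half the platform's streams, a star with half of that label's streams, and a deal paying the artist roughly ten percent), not real Spotify, Universal, or Drake numbers.

# Illustrative pro-rata split using the made-up numbers from the conversation above.
# Real DSP accounting involves many more adjustments; this only shows the shape.
platform_revenue = 500_000_000   # hypothetical royalty pool for the period
label_stream_share = 0.50        # label's share of all streams on the platform
artist_share_of_label = 0.50     # star artist's share of that label's streams
artist_royalty_rate = 0.10       # artist's cut under their deal (the "ten to twenty" range)

label_payout = platform_revenue * label_stream_share   # 250,000,000 to the label
artist_pool = label_payout * artist_share_of_label     # 125,000,000 attributed to the artist
artist_payout = artist_pool * artist_royalty_rate      # 12,500,000 to the artist

print(f"label payout:  {label_payout:,.0f}")
print(f"artist pool:   {artist_pool:,.0f}")
print(f"artist payout: {artist_payout:,.0f}")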

39:53

So it's interesting because it's like

39:55

if you take out the other ones,

39:57

right, like you're you're making the pie

39:59

bigger from the beginning, right,

40:02

You're making like going like give us

40:04

And they literally call it a double boost

40:07

to royalty payments. And one

40:09

thing that's beneficial for you

40:12

know, professional kind of artists

40:15

quote unquote, is that you get these double boosts and

40:17

who's maybe gonna have continual streams,

40:19

continual listeners, which you notice

40:22

it's not based on total it's based on

40:24

like a continual people that have teams

40:27

who are probably signed to labels. So like, I

40:29

think that's kind of the idea, is that

40:31

in a way, it would, you know,

40:33

take this money and put it more towards

40:36

people that have this continuous

40:38

career who are relying on

40:40

it. But I think it's gonna make it way,

40:42

It would make it way harder to come up, you

40:44

know, via royal royalty payments. Because

40:46

we joke on the show all the time about how royalty payments

40:48

are a joke. But at the same time, I'm like, there's

40:51

something after a while, there's something, you

40:53

know what I mean, And so it

40:56

would just be interesting, you know for these people

40:58

that get just like a couple bucks, you

41:01

know what I mean. It's not crazy, but yeah,

41:04

that's I thought that was interesting that you brought that

41:06

up though, like the collectiveness of it

41:08

and kind of like how we could have incentives.

41:10

And I think that's why they, meaning

41:14

basically the big three labels are

41:16

really campaigning for that kind of model because

41:18

there's kind of some complex math there and it doesn't

41:21

make it feel like like from

41:23

a political standpoint, it doesn't make it feel like,

41:25

oh, we're taking money from other people. It's like real

41:28

artists get paid more. Are you against

41:30

that? You know what I mean? So

41:33

yeah, I thought that was very astute.
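
As a rough sketch of the two-tier proposal as it was characterized above: an artist counts as "professional" past a streams-per-month or unique-listener threshold, and professional streams carry extra weight when the royalty pool is divided. The exact thresholds and the size of the "double boost" below are illustrative assumptions, not the text of any actual DSP model.

# Rough sketch of the two-tier ("professional" vs. "non-professional") weighting
# described above. The thresholds and the 2x boost are illustrative assumptions.
def is_professional(monthly_streams: int, unique_listeners: int) -> bool:
    """Tier test as described: about 1,000 streams a month or 500 unique listeners."""
    return monthly_streams >= 1_000 or unique_listeners >= 500

def weighted_streams(monthly_streams: int, unique_listeners: int, boost: float = 2.0) -> float:
    """Professional artists' streams count extra when the pool is split pro rata."""
    weight = boost if is_professional(monthly_streams, unique_listeners) else 1.0
    return monthly_streams * weight

catalog = {
    "signed_act": (250_000, 40_000),  # well above both thresholds
    "hobby_act": (300, 80),           # below both thresholds
}
total_weight = sum(weighted_streams(s, u) for s, u in catalog.values())
for name, (streams, listeners) in catalog.items():
    share = weighted_streams(streams, listeners) / total_weight
    print(f"{name}: {share:.2%} of the pool")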

41:37

Yeah, I man,

41:40

we'd really just as like,

41:42

collectively as the music industry, we really

41:44

just love cannibalizing ourselves.

41:47

Oh absolutely when it comes to

41:49

like which I guess it's not totally

41:51

like streaming. I mean we could go back

41:53

to like the day one argument of streaming of

41:56

like, you know what you.

41:58

Could do instead of being like you get when

42:00

you get one, just raise the fucking rate.

42:03

Yeah.

42:04

Yeah, like that's kind of that today.

42:06

To do that, we need a union and the music

42:08

industry doesn't want.

42:10

You mean, you need to collectively bargain.

42:12

Yeah, but it could be interesting. I think

42:14

this would be really interesting if this gets through.

42:17

Again, like it's notable because

42:20

I just thought it was a very interesting argument to

42:22

make. But I don't know if it's gonna get

42:24

through. I mean, one of the congressmen on this said,

42:26

As you've seen in Congress, lots of bills aren't passing,

42:28

like the budget, so it

42:31

might be hard but basically, he says, this

42:34

has been a very bipartisan issue in the Judiciary

42:36

Committee. It's the perfect time to bring these

42:38

issues up. So maybe this could be like, you

42:40

know, if Congress is getting crapped on from

42:43

like the American people because they're not doing their job,

42:45

they could be like, we did something. Everybody's

42:48

for paying more musicians, you know what I mean,

42:50

Like we did something. But uh, yeah,

42:53

it's good. It'd be interesting because I feel like there's

42:55

some labels that would probably be like, no, don't do

42:57

this, because then that deincentivizes

43:00

labels, you know, to their Congress friends. But

43:02

like at the same time, it's

43:04

interesting to see kind of

43:06

this argument be made, and I like, I

43:09

just imagine and I would just like to see

43:11

it to see it. Like imagine like

43:14

one giant company

43:16

decides like, hey,

43:20

everybody that's an independent musician,

43:22

hit us up if you want to be part of this collective

43:25

thing, and like thousands and thousands

43:27

of musicians joined together under

43:30

this like being of like a

43:32

lawyer or two, and they're going like, now we will

43:34

collectively bargain as the thousands and thousands,

43:37

you know what I mean. I would love to see

43:39

like the independence of that, though,

43:41

I am worried about, you know, like the business

43:43

model of it, because being like, we

43:46

could just do it. It's like ten people, you

43:48

know what I mean, And you could pay me forty thousand

43:50

dollars as a lawyer, and then instead of making two hundred

43:52

dollars a year, you make four hundred, you know what I

43:54

mean, except you still owe me forty

43:57

thousand dollars. That's where I worry

43:59

about this law. But anyway,

44:04

Joe, I guess I don't know what we'll

44:06

do for the last episode. This is maybe the last time. What

44:08

have you been listening to?

44:09

Yeah, let's let's I

44:12

feel like we could do it for the last episode. But

44:16

I haven't listened yet,

44:18

but I will. But Devendra Banhart just

44:20

released a new record. It is

44:22

so beautiful. It's called Flying

44:25

Wig. I have actually listened to it because

44:27

I was honored enough to work

44:31

on this project and I

44:33

am very proud of a lot of the stuff that

44:35

we've created, and

44:38

such an amazing team and

44:41

such an amazing like songwriter

44:44

and just being

44:46

Devendra is like super super

44:48

sweet human. So please check

44:51

out his newest record. It's

44:53

called Flying Wig, available

44:56

everywhere. But I've also been

44:58

listening to. I listened to that Troye

45:00

Sivan song, which please

45:02

tell me you listen to that?

45:04

No, oh my god, yours right now

45:06

and see what I think.

45:06

Dude. Okay, I'm not going

45:09

to tell you anything, but you have to listen to

45:11

it purely for the sample.

45:13

Purely for this is the newest song.

45:16

It's the newest one. It's

45:20

called Got Me Started, Got

45:22

Me Started. Okay, well yeah.

45:27

Somebody. Yeah, I'm gonna look at somebody is gonna

45:29

be like, are you kidding me? Colin?

45:32

You don't know Troye Sivan's whole catalog. I'm like,

45:34

I'm sorry, man, all right, Oh

45:36

my god? Is this really a starstruck?

45:39

Is that what that is? Yeah, it's just it's

45:43

it's the Yeah,

45:46

I can't remember what that is. It's like because

45:48

it's god, I know what

45:50

this sample is.

45:52

I mean, I can just add that new Google

45:55

that Google voice search thing.

45:58

Oh what's the sample?

46:00

Yeah, hold on, let's see it.

46:02

No, I gotta I got it. It's shooting stars

46:04

that's what it is. I was like, it's something like starship

46:07

or something. Yeah, that's

46:09

so.

46:09

Funny, Cobra

46:11

Starship? Was that what you're It's

46:15

what did you say Cobra Starship.

46:17

No, I was not thinking of Cobra Starship.

46:21

God. I might have been thinking of the band

46:23

Jefferson Starship and then later Starship.

46:25

But oh nice, let's see

46:28

thing of old bands as I mean, that's

46:30

half my thing on the show. We've made one hundred and fifty

46:32

episodes. I'm sure if we averaged them

46:34

out, it probably wouldn't be later than like nineteen

46:36

ninety ninety percent of the time. But

46:40

basically, we recorded

46:43

yesterday a little bit for one episode,

46:45

and I for or two days ago,

46:47

and then I forgot that. I got to work the next

46:49

day and I was like, oh my god, today's

46:51

the day and it's the twenty

46:54

first of September. Baby. So I've been listening

46:56

to that. Yeah,

46:58

I mean that song holds up incredibly

47:01

well, like like well all of

47:03

Earth, Wind and Fire's catalog does. I mean,

47:05

it's like one of the few that of

47:09

like the disco era that like

47:11

really holds up. Like you can play

47:14

this song at like a wedding or

47:16

like, you know, and with regular people around

47:19

and people aren't like, what the fuck is this? You

47:21

know what I mean? So

47:23

I was listening to it.

47:25

It's like one of my favorites. I

47:27

think I talked on the show. I mean, we've been doing

47:30

it for three years, and I'm sure I made the September joke,

47:32

but like there was a guy who used to do videos

47:35

every year for the like song, and like

47:37

his whole like joke was like

47:40

today's the day like kind of

47:42

joke, and he would do it literally

47:44

every year, and then he stopped doing it. So I was really sad

47:46

about it, but I did I

47:48

did have some interesting like song trivia

47:51

about it, which was which was fun to learn,

47:53

is that like basically the

47:55

song, like the songwriter for it

47:57

was like kind of approached by like the

48:00

lead singer Maurice and was

48:02

like, hey, you want to write the song? And

48:04

uh, he wrote the song.

48:06

And it was really funny because apparently,

48:09

like according to everyone,

48:11

everyone is always like what the hell is happening on

48:13

the twenty first of September And they literally were

48:15

like it just sounded the best, Like

48:18

there's nothing going on that day, There's

48:20

not There wasn't like that's when I was married,

48:23

you know, Like.

48:23

It was like they're just like that is

48:26

it rolls off the tongue.

48:28

And weird part of the weird thing about the song

48:30

is the guy wrote like I think he wrote

48:32

just the like he wrote the lyrics. I don't

48:34

know if he wrote the chorus. I mean, I'll give him credit

48:36

if you did. But like the

48:38

funniest part about it

48:41

was that he wrote the song apparently,

48:43

and it's you know, uh got

48:46

you know, the regular lyrics, and then the chorus

48:48

starts and he goes, man,

48:51

you know like that. And first off,

48:54

side note, this is one fight I've gotten

48:56

in with somebody pretty much every two years.

48:59

It is ba-dee-ya. It is not party

49:01

on. It is not any other It is

49:03

just.

49:06

Yeah that says party on.

49:08

There. There are people I have had arguments

49:10

with that go that's not what it says.

49:13

I go look it up, look up what the lyrics are.

49:15

And the amount of times I've looked at the Google

49:17

search results for this song is too

49:19

many because everyone's like, oh god,

49:21

you were right.

49:22

I was like, I know, but basically

49:25

like wait, Colin, why are you so upset over this?

49:27

And You're like, it's just it's my thing, Okay,

49:30

Yeah, it's my thing.

49:31

One thing that's really funny with this is

49:34

that apparently when the when the

49:36

lead writer like wrote the song,

49:39

he didn't have that part in there, and apparently

49:42

he hated that part,

49:44

like.

49:44

Yeah, it's kind of it doesn't make sense.

49:46

It's yeah, it's uh, I'm sorry

49:48

what I've been misgendering this person

49:50

the whole time. It's like, uh, Allee

49:53

Willis wrote the song, so like, uh,

49:55

they they wrote the song. I'm just gonna say they let's

49:58

just go with that. They wrote the song, and

50:00

and like it was so funny

50:02

because they have this quote that I love that it says,

50:04

quote, I just said, what the fuck does

50:06

ba-dee-ya mean? And

50:09

he and Maurice essentially said, quote,

50:11

who the fuck cares? And they said

50:14

I learned my greatest lesson in songwriting

50:16

from him that not every lyric,

50:19

uh, that not a lyric should get

50:21

in the way of the groove, you know what I mean? Basically

50:23

being like, yeah, it's just fun you know what I

50:25

mean? Like, honestly, if I

50:27

asked you to like sing that song, that's probably

50:29

the part you remember, you know what I mean? Other than

50:32

do you remember?

50:35

But yeah, it's

50:37

so funny. Like I was just I

50:40

was just dying at that is. I was just

50:42

like, what the fuck is ba-dee-ya? Yeah, anyway,

50:47

we're gonna leave you.

50:47

With that question, audience: What the fuck is ba-dee-ya?

50:49

What the fuck

50:52

is ba-dee-ya? Yeah,
