Episode Transcript
0:11
Shalom, Jesse, you're back.
0:13
Shalom, my friend. How's
0:15
it going?
0:15
Good. How was that? How was what? Your
0:17
big trip to the Middle East. Here's
0:15
how I thought it was gonna go. This is
0:22
my prediction. I thought there was a fifty
0:24
percent chance that
0:25
on your trip to Israel, the food would
0:28
be so good that you would join
0:30
the IDF and a fifty percent
0:32
chance that you would feel so guilty about
0:34
being in Israel, that you would get
0:36
kicked out of the country for trying to do a land acknowledgment.
0:38
Was I right?
0:39
Land acknowledgments in Israel would
0:41
be rather broad. It's
0:44
not like this isn't quite like America
0:46
where it's like three hundred years
0:48
ago. Like, you
0:49
know, sixty years. They
0:51
take a while there.
0:52
Yeah. Yes. I
0:54
was in Israel, there was a
0:56
trip put together by Israel's Foreign Ministry,
0:58
the New York consulate, and This
1:00
is good. Wait. Let's just pause there for a moment.
1:02
Yep.
1:03
Why?
1:04
Why the trip? Yeah.
1:05
Why did they put this together? I was also
1:07
invited on this trip. I didn't go because I
1:09
thought that if people found out I took a free
1:11
trip to Israel paid for by the Israeli
1:13
consulate. I would never hear the
1:15
end of it. You however are less wise
1:17
than I am.
1:18
Well, yeah. That would have sucked if people were
1:20
mad at Katie Herzog on Twitter. That's a good basis
1:22
for making decisions about things. Yeah.
1:26
The reason they did it is propaganda. I mean, I mean
1:28
that in a friendly way. I'm buddies
1:30
with the guy who sort of ran it and led
1:32
it. The
1:32
head propagandist. The
1:34
head propagandist of the Jewish cabal. And
1:36
when they invited me, I said, you
1:38
know, because you guys are paying for it, if I'm gonna
1:40
do it, I would wanna come earlier, stay late.
1:43
and do programming with a group critical
1:45
of Israel, which they were fine with.
1:46
And this was with a group. You went
1:48
with a bunch of other journalists,
1:49
friends. Yeah. Some are friends. Some have -- I'm not
1:51
gonna, like, say the names of folks who haven't. Although,
1:53
I admit, I disclosed it. I think everyone should disclose
1:55
it, but some of them have already talked about it, like, on
1:57
The Fifth Column, Michael Moynihan.
1:59
Well, Kmele did not go because they wouldn't
2:02
pay for him to go first class. Yeah.
2:03
He wanted, like, basically slave
2:05
boys to fan him down on the flight and feed him
2:07
grapes, and they were like And
2:08
they said no for some reason. You know what, I think that's racist.
2:11
The fact that they would not invite
2:13
that they would not pay for Kmele to go business
2:15
class. Way more Black
2:15
Jews than you would think once you get over there, by
2:17
the way.
2:17
Well, I have heard that Blacks are the real Jews.
2:20
Mhmm. Yeah. No
2:22
comment on that. So basically, the
2:24
way I squared it with myself was:
2:27
yeah, they said
2:29
they'd fly me there earlier, let me stay late.
2:32
And I said, I would do something with
2:34
a, you know, a group critical of Israel. I would disclose
2:36
it on my website and people can decide for themselves.
2:38
So we did three days with B'Tselem, which
2:40
is a lefty group that's very critical of Israel
2:42
and documents what it sees as
2:44
human rights abuses against Palestinians, and then
2:46
we did the much better funded
2:49
six day trip with the Israelis. I
2:51
said early on whichever group sort of wanted
2:53
to guide me more, I would side with them. So
2:56
Shalom Israel. Well, hi. I'm
2:58
Israel. Hi, as I say.
3:00
Can you tell us once and for all: is Kanye
3:02
right? Did you find the banking center?
3:04
All I'll say
3:06
is that it's an incredibly fucked
3:08
up place. As I wrote in my newsletter, I think
3:10
more Jewish people in particular should go to the
3:12
West Bank. But
3:15
it's sort of hard to come back from a trip like
3:17
that and like say anything in particular because
3:19
there's so many layers of, like, trauma and history
3:21
surrounding everything. I've been pretty, you know,
3:23
it's not something I'm going to write about a lot. I think the occupation's
3:26
horrible. I think the situation in the West Bank is
3:28
basically an apartheid system. I also think Israel
3:30
has legitimate security concerns. So I'm sure
3:32
by saying it's complicated that will satisfy
3:34
everyone and no one will comment angrily.
3:36
What I heard you just say is that you took
3:38
money from an apartheid government. That's
3:41
what I'm picking up from
3:42
this. Mhmm. Just to
3:44
be clear, I don't think Israel's an apartheid state.
3:46
I think the situation in the West Bank is basically
3:48
apartheid.
3:48
Okay. So semi-apartheid. Semi-
3:50
apartheid. But you took
3:52
you take... what
3:55
are your dirty sources of money? Our listeners. Our listeners.
3:58
God knows what our listeners are. It's the same
3:59
thing. Five dollars at a time. Yes.
4:01
Yeah. So I was very grateful
4:04
I got to go over there, to be clear. It's an incredibly
4:06
fucked up place. I wanna learn more about it. I'm trying to read up
4:08
more on the history, but Yeah. On
4:10
the question of, like, should you do junkets like
4:12
that? I don't know. At some outlets you're not allowed
4:14
to, but I think if you disclose
4:16
it. From now on, whenever I
4:19
write something that's like seen as too soft on Israel,
4:21
people can say, well, you took money from the government,
4:23
which is true. So it's up to people to
4:25
decide where the limit is. I
4:26
do find it interesting and amusing
4:28
that the propaganda wing
4:31
of the Israeli government decided that the people
4:33
they really needed to target were a bunch of heterodox
4:35
podcasters.
4:35
That is sort of funny. It's, like, sort of preaching
2:38
to the converted. Yeah. Yeah. The
4:40
other thing I should say is that partly because
4:43
Israelis are hilariously adversarial
4:46
and argue about everything. One
4:48
thing I will say,
4:49
American Jews are very polarized
4:51
on Israel and there's like a big Not
4:54
everyone, but there's a lot of either Israel
4:56
is an apartheid state. That's the worst place that was
4:58
ever created versus Israel can
5:00
do no wrong. I found anecdotally Israelis
5:03
themselves are much more capable of nuance
5:06
than American Jews are, which which was interesting.
5:09
And also, I joke about it being a propaganda tour,
5:11
which, of course, it was a little bit, but, like, you
5:13
know, we spent an
5:15
evening in Nazareth. We talked
5:17
to an Arab Israeli who was
5:19
very critical of Israel. Now, they didn't
5:21
take us to the West Bank or anything. We didn't meet
5:23
with Palestinians, but it wasn't quite,
5:25
like, what I'd love to do is a
5:27
North Korean propaganda tour. I'm trying to get in
5:29
on that. It definitely wasn't at that level.
5:31
And how was the food? It's
5:34
incredible. It's probably some of the best
5:36
food in the world. The problem was, for the Israeli portion
5:39
of it, we had to be all kosher. This
5:41
means you can't have dairy and meat in the
5:43
same meal. I destroyed my
5:45
body, Katie. Even by my own standards, I think I
5:48
had eggplant fifteen
5:50
meals in a row.
5:51
Well, is that just, like, normal
5:53
dietary preferences there? Is that
5:55
Oh, no. Not a lot of people are
5:57
kosher. The idea, which is reasonable, is
5:59
that at some of the meals we were joined by a
6:01
guest, like, you know, someone who works in Israel
6:04
or a journalist, some of them might be kosher,
6:06
and they wanted to make sure there was never a situation
6:08
where people couldn't eat. Anyway,
6:10
but, like, Tel Aviv, legit,
6:13
probably
6:15
if you like Middle Eastern food, some of the best food in the world.
6:17
And we couldn't quite get the full thing, but we ate
6:19
a lot of delicious stuff. Oh, also Katie, one last
6:21
important thing. I've said this before.
6:23
Israelis are super hot. Really? And there's a
6:25
photo of Michael Moynihan, like, trying not
6:27
to look at nineteen year old
6:29
IDF soldiers that I highly recommend
6:32
people look at because it's hilarious. People can just put it in their
6:34
own show notes. They don't have a copyright on that
6:35
photo. So they didn't try to recruit you into the IDF.
6:37
They did not try to recruit me into the IDF, but we did
6:40
have some very good jokes that even the
6:42
Zionists laughed at, about
6:44
what if Hamas launched, like, a Jewish regiment?
6:47
Wouldn't
6:47
that be funny? Wait. What's the joke? The
6:49
joke: Hamas.
6:50
I got it. But
6:51
it's like, if there was a guy who was like, will
6:54
the terror tunnels be dusty?
6:58
Sorry.
6:58
No. It's funny. This vest
7:00
is too tight. Larry David would be ahead of
7:02
it. Yep. Anyway -- Yep. Maybe
7:05
at some point, we'll revisit this. But for now,
7:07
I'm glad I got to go. And if you think I'm now
7:10
compromised, so be it. Well, welcome
7:12
back. You were sorely
7:14
missed by at least a couple of our listeners,
7:16
but I think we maintained without you.
7:18
We did bank a lot. I think we ran some good
7:20
stuff. This is the first time we're recording together in
7:22
a while. Yeah.
7:23
Since New York, actually. Yeah. We've been --
7:25
it's been a weird couple of months, a lot of travel,
7:27
and we are hoping to get back to normal now
7:30
as of today.
7:30
Yes. It's gonna be incredible. Katie,
7:34
What is the name of this increasingly Semitic
7:36
podcast? This
7:37
is Black and Reporting. I'm Katie Herzog.
7:39
And I'm Jesse Singal. And today,
7:41
we're going to talk about Elon
7:43
Musk. Very few people have heard of him,
7:46
very few people are talking about him. We're finally gonna
7:48
chime in on this whole Musk,
7:50
Twitter, Mastodon thing.
7:52
If you're asking yourself, what is Mastodon,
7:55
that's good. You shouldn't know what it is because it's dumb.
7:57
We'll explain. But first, can we
7:59
do one little loose ends thing from our live
8:01
shows? Let's do it. Okay. So during the live show,
8:04
we released -- it was the New York one.
8:06
Right? Yes. It was. There's a segment
8:08
where you talked about the different styles of
8:10
incelcore music. This is music with
8:13
the theme of young men who are angry. They don't have
8:15
friends, are sad, don't have girlfriends. Blah blah blah.
8:17
You pointed out there's different styles of it,
8:19
and then I made this hilarious joke.
8:21
Is there klezmer incelcore?
8:24
Like, I want a great
8:26
girlfriend. I want a great girlfriend.
8:28
Okay. So about a week and a half ago, we got
8:30
an email from a listener who just wants to go
8:33
by the name musician with regrets.
8:35
Hi, Katie and Jesse, they wrote. I've been
8:38
a listener since the Libs of TikTok hoax.
8:40
The furry did nothing wrong. Preach,
8:42
and I've been enjoying it a lot. In
8:44
your recent live episode, Jesse inquired about
8:46
the existence of klezmer incelcore. I
8:48
happen to be an expert at mediocre klezmer-
8:51
sounding mock-ups, so I figured I could help.
8:53
I've created a little demo of what that would
8:55
sound like. Jesse, if you wanna hire me for your
8:57
bar mitzvah, feel free to reach out.
8:59
and they attached a file called klezmer
9:01
incelcore, featuring Jesse
9:04
Forever Single. Here's what
9:06
that sounds like.
9:19
Yeah. They play that at your
9:21
bar mitzvah.
9:23
Jesse, that might be even better
9:25
than the sucking and fucking remix. Excellent
9:27
work. So far, I've
9:30
been featured in the sucking and fucking remix, Jesse
9:33
Forever Single -- wait, whatever they called this.
9:35
and Replication Crisis
9:37
Raps. So I'm really getting out there. I'm
9:40
expanding my media.
9:41
this might be a good theme song for us. I really
9:43
like it. Yeah.
9:43
I like it too. Okay. So should
9:46
we move on to Mastodon and Twitter and all this bullshit?
9:48
Yes, let's do it. I was afraid that
9:50
everything would have calmed down and
9:52
the story would have moved on since it's been so
9:54
long since we've been able to record together. But thankfully,
9:57
people are still complaining about
9:58
your theory
9:59
was that journalists would stop
10:01
being narcissistic and complaining about things. Yes.
10:03
Yes. How did that theory turn out? Not well.
10:05
Right?
10:06
I'll take a loss on that one. Okay. So
10:08
when Elon Musk announced that he wanted
10:10
to buy Twitter, I didn't think that
10:12
things were gonna change all that much. If
10:14
you will recall, I think my only concern
10:17
was that he was going to fuck
10:19
up the verification system, which he immediately did.
10:21
We'll get to that in a moment. And there was so
10:23
much hand wringing and literal tears
10:25
like, literally,
10:27
employees crying when they found out that Elon
10:29
was gonna buy Twitter, that it all seemed
10:31
very hyperbolic. And I figured that
10:33
he would buy the company, loosen content
10:35
moderation, and it would basically keep operating
10:37
as normal. In some sense,
10:39
that is what has happened, and
10:41
in some sense that's not what has happened, especially
10:43
if you're a Twitter employee. And I think the
10:46
catastrophizers were more correct about Elon
10:48
than I thought at first. And I think the employees
10:50
were right to be scared and
10:53
to cry. And frankly, if I were a
10:55
Twitter employee, I would be pretty fucking
10:57
mad at the the old leadership for forcing
10:59
Elon to buy the company when he tried
11:01
to back out of it. And I wanna be clear
11:04
here, I don't hate Elon Musk. Every
11:06
time I criticize him, I get accused of hating
11:08
him. I don't hate him. And I think he's made a couple
11:10
of good decisions, which I'll get to in a moment.
11:12
But let's start with what he's done wrong, and I wanna start,
11:15
of
11:15
course, with the blue check. The most important
11:17
thing in the world, the most important status signal,
11:19
the blue check.
11:19
Exactly. So for people who aren't on
11:21
Twitter, there's a blue check on accounts
11:23
that have been designated notable. That
11:26
in itself is kinda bullshit because there are a lot
11:28
of people with blue checks who got them solely
11:30
because they, like, work at a media outlet that
11:32
has a relationship with Twitter. And so
11:34
all staff members or editorial members
11:36
were verified. That's how I got the check mark.
11:38
That's how you got the check mark. Yeah. They
11:40
literally emailed us at New York magazine. They're like, if
11:42
you wanna be verified, send us your name, forward it to Twitter,
11:44
you'll get verified.
11:45
Yeah. Same thing. I was verified when I was at
11:47
Grist. I think I had less than a thousand followers
11:49
at the time. There's no reason that I should have
11:51
been verified then. There's arguably no
11:53
reason that I should be verified now, but that's how we
11:55
got
11:55
it. I also had to submit a semen sample. I don't know
11:57
what that was about, but it was part of it.
11:58
We just did a hair sample.
12:00
Okay. Yeah. Anyway.
12:01
So, and a lot of people are
12:03
not verified who should be. There is no reason
12:05
that some BuzzFeed staffer with a hundred
12:07
followers should be verified when Andrew Sullivan
12:09
and Thomas Chatterton Williams are
12:11
not. I just love that there's no
12:13
lower form of life than some BuzzFeed
12:15
staffer with a hundred followers.
12:18
Well, there's clearly a bias in
12:20
the verification system. And for a while,
12:22
you could apply to be verified. And it said, like,
12:25
you had to have, you know, bylines at
12:27
x number of outlets or whatever.
12:29
Thomas Chatterton Williams and Andrew Sullivan
12:31
both very clearly fit this metric and they
12:33
weren't verified. I think that shows bias
12:36
within the system. Did they ask
12:38
to be? I know that Thomas did. I don't know about Andrew.
12:40
I doubt Andrew did. Yeah. But
12:42
he should be verified, for sure.
12:44
Yeah. Definitely. Yeah. So setting that aside,
12:47
the blue check has value in two
12:49
ways. One is as a
12:51
status symbol, and two, to verify
12:54
someone's identity. So that's it. Basically,
12:56
if you
12:58
have a verified account and you look at your mentions,
13:00
there's a column for verified mentions.
13:03
That's the only sort of, like, extra privilege
13:06
as far as I know. It's
13:09
like when they used to pull the curtain for
13:11
first class -- Yeah. -- you no longer have to
13:13
deal with the people, basically. Right.
13:14
But now he's fucked it all up, so the
13:16
verification column is absolutely fucking worthless.
13:19
Okay. So Elon Musk takes
13:21
over. And one of the first things that
13:23
he does is start selling the checkmarks
13:26
for eight dollars a month and announce that he's
13:28
doing away entirely with legacy checkmarks,
13:30
which means that the check mark will no longer
13:32
demarcate the user as notable. Therefore,
13:35
it loses its utility as
13:37
a status symbol. This ironically
13:39
makes it have basically no value even
13:41
at eight dollars a month. It's going to be,
13:43
and I think it already is, cringe as hell
13:46
to pay for a blue check. And this will
13:48
even be more true when legacy checks
13:50
are gone because it's no longer going to be even
13:52
mimicking this thing that used to have value.
13:55
So it's a giant neon sign that says,
13:57
I paid for clout. Would you
13:59
pay for a blue
13:59
check? Well,
14:01
I would hope I wouldn't, but I will say anecdotally, I
14:04
think a huge number of people are. In my opinion,
14:06
at least, I've noticed a huge uptick
14:08
in folks who
14:10
wouldn't have been verified before
14:12
and now are. Maybe I just have a lot of, like, a
14:14
certain type of folk in my mentions, but I've noticed
14:16
a lot of them, which suggests that, like, there's
14:18
a market for it at least.
14:19
Yeah, there's a ton of people in my mentions
14:22
who are now verified. I don't think
14:24
that we can extrapolate from that,
14:26
that this was a huge success. I
14:28
just don't think that we have the numbers, and I think
14:30
that, yes, there's
14:32
probably some selection bias there. Okay.
14:36
Again, the blue check
14:38
system was biased. It was fucked up. I
14:40
imagine that's a relatively easy fix, though.
14:42
Just apply the guidelines evenly and without
14:44
prejudice. But then he did this other
14:46
dumb thing. And so now instead
14:48
of the blue check, there's an official badge
14:51
which just recreates the system in
14:53
a different format. Well, not in
14:54
instead of. It's like in addition because
14:57
you could have one without the other.
14:59
You you
15:00
I don't know if anybody has one without the
15:03
other.
15:03
Yeah. Mine -- I have a blue check. It doesn't
15:05
say official. Yeah. Oh, I'm sorry. I don't think anybody has official
15:07
without the blue check. Gotcha. Okay. So
15:09
the official badge means we've
15:11
verified this person is who they say they
15:13
are.
15:14
Well, no, because he was only
15:16
initially, he was only rolling it out to
15:18
Twitter advertisers. You'll see
15:20
why in a moment. But okay. I've seen
15:22
exactly one journalist with
15:25
an official
15:25
demarcation. Guess who.
15:29
It's gonna be someone we don't like, but who?
15:31
Jeet Heer. No comment. How did he get --
15:34
okay. Why are a lot of people,
15:36
like, pretending to be Jeet Heer? Is he such a hot
15:38
brand?
15:38
Yeah. I highly doubt that there are a
15:40
lot of Jeet Heer impersonators. I don't know how he
15:43
got the official badge. Apparently, he lost it. But
15:45
if
15:45
you hired a Jeet Heer impersonator for your
15:47
bar mitzvah. Actually.
15:49
I think he lost the official badge. I
15:51
don't know why, but he at one point had the official
15:53
badge, and he was literally the only journalist
15:56
who had this thing. I don't know who he's been fucking
15:58
on Twitter to get that. Okay.
15:59
So, but it just recreates the
16:02
same system.
16:02
He still has it, hasn't he? I
16:04
blocked him, but he's official.
16:06
he lost it at one point, then he got it back.
16:08
How did Jeet Heer get it? I don't know exactly. It's
16:11
a stupid system.
16:11
I'm so confused by the system. Anyway, continue.
16:13
I'm sorry.
16:14
So the point is it just recreates
16:16
the same system with
16:18
a different status symbol. Right? But
16:20
then he did something even dumber, which
16:22
was choose not to verify the identity
16:24
of the paid users. So obviously, the first
16:26
thing that people did was start impersonating
16:29
other people, including Elon Musk,
16:31
and then a bunch of brands, including Twitter advertisers
16:33
like Eli Lilly, the pharmaceutical company.
16:36
Somebody got a verified account under the name
16:38
Eli Lilly and then used it to tweet that
16:40
insulin is now free. So immediately,
16:42
the company stock price plummets, and
16:44
Eli Lilly says that they're no longer going to be
16:46
advertising on Twitter.
16:47
Okay. So if I have this
16:49
right, for a while, if you see
16:51
a blue check next to someone's name, it
16:54
basically means they are who they say they are,
16:56
although there's weird limits to that for, like, pseudonymous
16:58
accounts. But pseudonymous accounts, I mean,
17:01
wouldn't have a blue check. No.
17:03
There's some that
17:04
I guess you're -- I wish I could think
17:06
of an example off the top of my head. Maybe
17:07
you could, like, in theory, get
17:09
the blue check and then change your name.
17:11
Either way, since Twitter started, it's been
17:13
a symbol meaning this person is legit.
17:15
So then he simultaneously changes that
17:17
to either it means you're legit
17:19
and got grandfathered in or it means you paid
17:21
eight dollars. At the same time, he introduces
17:24
a new verified thing. And
17:26
so, of course, everyone is primed to
17:28
think a blue check means something. Right. So the impersonation --
17:30
like, this was an incredibly dumb
17:32
fuck up.
17:33
Yeah. And then other social media platforms
17:35
also have some sort of check. Usually, it's a blue check,
17:37
not always (Instagram, Facebook), and it means
17:39
the same thing there. Like, it's
17:41
a universal symbol for this identity has
17:43
been verified. This is
17:44
troubling for me because I thought Elon Musk was
17:46
a genius who was great at everything. This is
17:48
why I tried to DM him so much, but it sounds like he didn't
17:51
handle this well so far.
17:52
I do not think he handled this well. Okay.
17:54
So a bunch of advertisers announced
17:56
that they are cutting ties with Twitter in
17:59
part because of these impersonations. And
18:01
so this thing that he's doing to
18:04
raise money for the company is now
18:06
costing the company money. Does
18:08
it even out in the end? I sort of doubt it.
18:10
I don't know that a couple hundred thousand people
18:12
or however many it is that pay for the blue check
18:14
is gonna make up for, like, actual
18:16
-- Yeah. -- these ads. What if
18:18
a billion people bought blue checks?
18:20
That'd be eight billion dollars a month. Think
18:22
about it. I
18:22
mean, I think what he should do is just charge everybody
18:24
on the platform a dollar if he needs to monetize,
18:26
or monetize users. Charge everybody a dollar.
18:28
There are three hundred million users on Twitter. Charge
18:31
them all a dollar. You don't have to fuck with the verification
18:33
stuff. Charge them a dollar a year. You have
18:35
a good -- Yeah. Well, but no.
18:37
As soon as you have to whip out a credit card, you're gonna
18:39
get
18:40
fifteen percent of people to pay, I think. Yeah.
18:43
It's such an interesting thought experiment because that would be
18:45
such a healthier Internet if people had to pay for it
18:47
anyway. I don't know.
18:47
That's classist, Jesse. I'm
18:49
so classist.
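For what it's worth, the back-of-the-envelope math in this exchange can be sketched out. This is purely illustrative: the three hundred million user figure, the "couple hundred thousand" subscriber estimate, and the fifteen percent take-up rate are just the rough numbers floated in the conversation, not real Twitter financials.

```python
# Rough annual revenue comparison of the two monetization ideas
# floated above. All inputs are illustrative figures from the
# conversation, not real Twitter financials.

users = 300_000_000          # "three hundred million users on Twitter"

# Option 1: $8/month Twitter Blue, bought by a couple hundred thousand people.
blue_subscribers = 200_000   # "a couple hundred thousand people"
blue_annual = blue_subscribers * 8 * 12

# Option 2: a flat $1/year charge, with heavy drop-off once a credit
# card is required ("fifteen percent of people pay").
flat_payers = int(users * 0.15)
flat_annual = flat_payers * 1

print(f"$8/month Blue:  ${blue_annual:,}/year")   # $19,200,000/year
print(f"$1/year flat:   ${flat_annual:,}/year")   # $45,000,000/year
```

On these (made-up) assumptions, the unglamorous flat dollar-a-year charge actually out-earns the subscription, which is roughly the point Katie is making.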
18:50
Okay. So he fucks up the verification
18:52
system. And then he also
18:55
starts immediately laying people
18:57
off. He laid off half the staff
18:59
within, I believe, the first week after he'd taken
19:01
over. And then he also announced
19:03
that Twitter would no longer be allowing remote
19:05
work. He did this on, like,
19:07
a Wednesday or something, and it was, like, five PM
19:10
on Wednesday. Everybody has to be in the office
19:12
on Thursday. Twitter
19:14
announced that they were going permanently remote
19:16
during the pandemic, so I imagine that
19:18
a number of people, you know, moved
19:20
when they thought they were never gonna have to go into an office
19:23
again and could live someplace cheaper than Silicon
19:25
Valley. And they find out
19:27
that the next day they have to be in the office. So
19:29
he's making these announcements, things go haywire,
19:31
and then he just walks them back immediately. He's like
19:33
throwing everything at the wall in a very
19:35
public way. Maybe this
19:37
So he walked back the remote thing.
19:39
Right? He
19:39
did walk it back. But now it's like,
19:42
not everybody has to be in the office, but
19:44
you have to get permission from your manager and
19:46
your manager is now responsible for you.
19:48
So if, I don't know, you're not performing at
19:50
the task, your manager is the person who is
19:52
going to be penalized if the
19:55
employee who's working remotely isn't
19:57
performing. It's just
19:59
bad management.
19:59
No. No. I mean, none of this makes sense. It's just,
20:02
like, there's a type of
20:04
dude who just loves to, I mean, not always
20:06
a dude, but often, who just loves to throw his weight
20:08
around and be like, I have power. None of this
20:10
sounds like that. Like, what you would do
20:13
if you got to the company is, like, meet with the folks
20:15
who already know how shit works and you can
20:17
still make changes. But you don't just go in
20:19
and start, like, tweeting out massive changes. It's
20:21
crazy. Right.
20:21
And especially, like, tomorrow, you have to be in the
20:23
office tomorrow. It's five PM tomorrow,
20:26
be in the office, be in San Francisco or wherever tomorrow.
20:28
It just like, it shows just this absolute
20:30
lack of consideration for the people who
20:33
work for you. And these are the people who built the company.
20:35
He is not the person who built the company. These people
20:37
have all been there for longer than him. And then
20:39
he issues these weird demands that people
20:42
resign if they're not willing to be, quote,
20:44
hardcore. I assume that is
20:46
some reference to a musical genre? Like,
20:48
what does that even fucking
20:50
mean here? Yeah. Was this
20:53
a public demand or, like, an internal
20:55
email? No.
20:55
This was an internal email. Okay.
20:57
He just said, hey, you have to be hardcore.
20:59
Yeah. You have to be hardcore. If
21:02
you are willing to be hardcore and you wanna
21:04
stay on, check this box in this Google form
21:06
or whatever it was. And if not, we're gonna
21:08
we're gonna take that as your resignation by,
21:11
like, five PM on Friday. If you don't check it --
21:13
If we
21:13
ever have a full-blown staff, I'm just gonna
21:15
use Google Docs for retention. Yeah.
21:17
He should do a Twitter poll instead. And
21:19
this apparently was not appealing to much
21:21
of the remaining staff, who were already in
21:23
this extremely chaotic environment. And so a bunch
21:25
of them announced that they are leaving.
21:27
Yeah. And I thought this
21:29
article in The New York Times really captured how
21:31
the first part of his tenure went. This is from
21:33
November eighteenth. Elon Musk
21:35
sent a flurry of emails to Twitter employees
21:38
on Friday morning with a plea. Quote, anyone
21:40
who actually writes software, please report to the
21:42
tenth floor at two PM today. End quote,
21:44
he wrote in a two paragraph message, which was viewed
21:46
by The New York Times.
21:48
Thanks, Elon. About thirty minutes later,
21:50
mister Musk sent another email saying he wanted to
21:52
learn about Twitter's tech stack, a term used
21:54
to describe a company's software and
21:56
related systems. Then another email,
21:58
he asked some people to fly to Twitter's headquarters
22:00
in San Francisco to meet in person. Later
22:03
in the article, it notes: Some
22:05
internal estimates show that at least twelve hundred
22:07
full time employees resigned on Thursday. Three people
22:09
close to the company said Twitter had seventy
22:12
five hundred full time employees at the end of October, which
22:14
dropped to about thirty seven hundred after mass
22:16
layoffs this month. I found this amazing
22:18
because it's like if I took over a cat food factory
22:21
without knowing anything about how to make cat
22:23
food. And then on Monday, I was like, there's gonna
22:25
be big changes around here. Watch out everyone.
22:27
And then by Friday, I'm like, Attention.
22:30
If any of you know how to make cat food,
22:32
please report to my office.
22:33
Yeah. Exactly. And, like,
22:36
there was a lot of speculation that Twitter was gonna
22:38
die. Were you online that one night,
22:40
like, Friday at five PM after
22:43
there were all of these resignations and people were
22:45
tweeting as though it was the last day on Twitter.
22:47
I
22:49
saw tweets about that. It was so
22:51
fucking hilarious. It was literally, like, the
22:53
deck of the Titanic. Like, there's
22:55
a band playing, people saying goodbye to their
22:57
friends. It was so -- all the
23:01
cringiness and melodrama and
23:03
narcissism of, like, blue check
23:05
media Twitter was just allowed to flourish
23:07
in this, like, supernova burst. It was
23:10
hilarious and horrible.
23:10
Yeah. I mean, it was very funny. And
23:12
also the fact that everybody just woke up the next day
23:14
and continued tweeting was very funny. That said,
23:17
I don't think that many people literally
23:19
thought that it was going to evaporate overnight.
23:22
I I just don't think most people thought that. I think
23:24
the concern is that if you lay
23:26
off this huge percentage of your staff,
23:28
that the machine isn't going to keep running
23:31
forever. Now, it'll probably keep running
23:33
on autopilot for a while, but websites
23:35
do need constant maintenance. I'm
23:37
more skeptical
23:38
on that, because don't you think, like, if
23:40
the firings had this horrible effect,
23:42
we would have noticed, like, performance issues, the site
23:44
would have gone down. It's just, despite all
23:46
the noise and smoke and lost jobs and
23:48
I don't want these people to have lost their jobs, it
23:50
doesn't seem to have affected the actual product unless
23:52
I'm missing something.
23:53
Okay. Well, so, Ben Smith, formerly
23:55
of The New York Times, his new project, Semafor,
23:58
ran an interview with Alex Stamos, who's the former
23:59
chief of security at Facebook. And he
24:02
said that, basically, Twitter should be able to keep
24:04
running with just a few hundred employees if they
24:06
are the right employees for a while. But he also
24:08
said this, quote, Eventually, there will be
24:10
an issue that has to be addressed by SREs
24:13
that is, site reliability engineers,
24:15
or it will cause a cascading failure. The
24:17
question will be if the right team exists
24:20
at that point to stop the cascade? I
24:22
have no idea if the right team exists or not, or
24:24
if Elon is filling those roles or Elon
24:26
even understands those roles. Stamos
24:29
also said, quote, it will be very hard
24:31
to make money with the key relationships broken.
24:33
I think that's also a major issue.
24:36
Was Twitter ever making money? I don't know
24:38
if it was profitable, but NPR published a
24:40
piece today that said that Twitter has lost half
24:42
of its top hundred advertisers since Elon Musk
24:44
came back. And you can see why advertisers
24:47
would be reticent to give
24:49
them money at this point, given the bad
24:51
press. But regardless, Elon's
24:53
public response to all of his criticism has
24:56
been to shit post, which just confirms
24:58
my belief that anyone who spends that much time
25:00
on Twitter should not be running a lemonade stand,
25:03
much less a fucking
25:03
corporation. Dude, the shitposting
25:05
has been so weird, even
25:08
by Elon Musk standards. And one weird
25:10
thing is he's sort of mixing these questions
25:13
about the company's future with shit posting.
25:15
So, like, he did a Twitter poll
25:17
to determine whether he should issue a general
25:20
Twitter amnesty for banned accounts.
25:22
And with Donald Trump, that's why he
25:24
invited Donald Trump back on the platform,
25:26
because Twitter decided. And, hilariously,
25:28
Donald Trump has sort of ignored him,
25:31
which is, I gotta say, a pretty chad move.
25:32
Yeah, it is. But so, of course, the
25:34
sorts of people who follow Elon were gonna be like, yeah,
25:37
do an amnesty, seventy two percent. So then
25:39
he tweets, the people have spoken. Amnesty
25:41
begins next week.
25:42
Vox populi, vox Dei. It basically
25:45
translates to the voice of God. The
25:47
voice of the people is the voice of
25:48
God. Okay. So another hilarious thing
25:50
about this. Okay. Well, first of all,
25:52
I think this is sort of the shady thing that he's doing
25:55
where instead of just making decisions himself and
25:57
taking responsibility for them, he's saying
25:59
no, no, no, this is
25:59
a democracy, the people have spoken. But
26:02
he also like, he himself when he was
26:04
trying to get out of buying Twitter, presumably
26:06
because he realized it was a terrible fucking deal.
26:08
His claim was that Twitter was overrun
26:11
by bots. So why should
26:13
we believe that these polls aren't contaminated,
26:16
if Twitter is overrun by bots, as
26:18
he himself said it is? Yeah.
26:20
I mean, that's true, but just
26:22
like,
26:22
even if it's all real people, Twitter
26:24
polls are completely useless. So it's just Totally.
26:27
Totally useless. There's also much purer
26:29
shitposting he's done. So, like, he tweeted,
26:31
our love will never die, with
26:33
a still from brokeback mountain
26:36
where it's
26:38
him and
26:39
CBS News. Why can't
26:41
I quit you? It's just bizarre. He
26:44
also had an interesting Well, do you know what that's
26:46
about?
26:46
No. Actually, I don't. He had some beef
26:48
with CBS News.
26:49
CBS News, and I'm gonna
26:51
say, this is embarrassing on their part. CBS
26:53
News announced that they would no longer be
26:56
tweeting.
26:56
And then, exactly, took
26:58
it back the next day. Wait. What? Well, how
27:00
did they explain it?
27:01
Okay. So this was a CBS news
27:04
correspondent. He said this during a
27:06
broadcast. In light of the uncertainty
27:08
around Twitter and out of an abundance of caution,
27:11
CBS News is pausing its activity on
27:13
the social media site as it continues to monitor
27:15
the platform. It's so dramatic.
27:17
So melodramatic. Out
27:19
of an
27:19
abundance of caution, like, what do they
27:21
think is gonna happen?
27:22
Marginalized people will be harmed?
27:24
Yes. That's the issue.
27:25
so okay. So he's getting in
27:28
fights with people. He's the CEO of this very
27:30
important media company, social
27:32
media company, and he's just tweeting stuff out.
27:34
He's tweeting memes out. He also had an
27:36
interesting... He's
27:36
also he, like,
27:38
he steals memes, whatever, memes are
27:40
meant to be stolen. But to me, there's something
27:43
just, like, sort of dirty about the fact that he will take
27:45
people's content and repost it without
27:46
crediting them. It's not great. You should
27:48
be banned from Twitter. He also had
27:50
an interesting interaction with
27:54
@catturd2.
27:56
Are you familiar with @catturd2?
27:58
No. I know catturd one.
27:59
I was gonna say, will you understand
28:02
catturd2's tweets if you haven't read catturd1?
28:05
This is someone who has a million followers. I I
28:07
think maybe they were recently reinstated, but
28:10
So catturd2 says: breaking, forty
28:12
eight hours since Elon Musk reinstated President Trump's
28:14
Twitter account and the world still hasn't ended.
28:17
Elon Musk replies. And it
28:19
turns out that Trent "Nine Inch Nails"
28:21
Reznor is actually a crybaby, sideways
28:23
laughing emoji. So he's
28:25
Did Trent Reznor leave? You
28:27
must
28:27
I don't even know. You must not care.
28:29
But just... you're the CEO
28:32
of a company and you're just constantly picking
28:34
fights with everyone, like, we're allowed to do that because we
28:36
can get content out of it. But if you're a CEO
28:38
of a company, it's a bad idea.
28:40
It's like if James Lindsay ran a giant
28:42
corporation.
28:43
Oh, yeah. They have similar
28:45
dispositions in that way.
28:46
He was also back on Twitter.
28:47
Yes. We had that little sideways laughing
28:49
emoji thing, which comes across as just, like, deranged.
28:52
Like, I'm not even mad. My buddy Aria
28:54
Cohen Wade sent me a Twitter search
28:56
for Elon Musk using the halfway
28:59
sideways laughing emoji. He's used it forty
29:01
one times since November
29:04
fourteenth, or about three point seven a day. So
29:06
I'm a data-driven journalist. He
29:08
really likes the half... what
29:10
is that? How is it halfway,
29:12
sideways, laughing, crying, emoji? I guess
29:14
that's
29:15
crying, laughing so hard.
29:17
It's turning your face sideways.
29:18
Yes. So, you know when that happens? He's
29:20
basically throwing himself headfirst into
29:22
all these culture wars. He just, like,
29:25
opined that people should vote Republican to
29:27
keep, like, checks on power in
29:29
government, which, like, whatever, you're allowed to have your opinion,
29:31
but you can't just, like... you're the CEO
29:33
of a company. You want to appeal
29:36
to as many people as possible.
29:38
This sort of reminded me of a friend of the
29:40
pod, Ethan. Ethan
29:42
Strauss. Ethan Strauss often
29:44
points out that, like, the NBA, for example,
29:47
will go way overboard in, like, a woke
29:49
direction that its audience isn't
29:51
really into, and, for lack of a better word, woke. We
29:53
don't like that word, but whatever. Everyone knows what we
29:55
mean. Elon's sort of doing that in
29:57
the opposite direction where he's just like -- Yeah. -- he's
29:59
playing footsy with these, like, weird right wing troll
30:01
accounts saying dumb political shit.
30:03
The Twitter user
30:05
base is not overwhelmingly
30:08
liberal, but it is disproportionately liberal. You're just
30:10
pissing off your core audience. What's the point
30:12
of that?
30:12
Right. It's like he's trying
30:15
to drive people away, advertisers
30:17
and users, and he's doing that.
30:19
He keeps saying that traffic is higher than it's
30:21
ever been. I think that's probably true, although I don't
30:23
trust him when he says basically anything. Because he also,
30:26
at one point tweeted that Twitter is
30:28
the social media platform that drives the most traffic.
30:30
That is not true. Twitter drives hardly
30:33
any traffic to other sites. Like
30:35
anybody who's worked in social media can tell you this.
30:37
Like Facebook does. Search does. Twitter does
30:39
not. Well, yeah.
30:40
it's much less effective in
30:42
terms of, like, efficiency than Facebook, or at least
30:44
it was when I was at
30:46
New York Magazine, but but so many journalists
30:48
are on there and have such big platforms. At the end of
30:50
this, like, I think we're in, like, late-stage Twitter. It
30:52
can help somewhat, just
30:55
way less than people think.
30:55
Well, it drives the conversation for sure. I just
30:57
don't think it drives a huge amount of traffic to, you
31:00
know, to publishers.
31:00
Much less than someone
31:03
would think who hasn't had access to those
31:05
numbers as I think we have.
31:06
Much less than Elon thinks. Yeah.
31:08
And to me, the most baffling part of this
31:11
is that people treat this
31:13
guy like he is, like,
31:15
their hero. It's this weird demigod
31:17
thing. And I think what's going on is
31:20
that people think that Elon is the enemy
31:22
of their enemy, and therefore he's their friend.
31:24
And so it doesn't matter to his fans
31:27
that a bunch of people just lost their jobs because
31:29
those people were woke assholes and so fuck
31:31
them, learn to code, bitches. Except
31:32
they already knew how to code. They
31:34
didn't know how to... do journalism? And
31:36
a whole, like, this idea that
31:38
Twitter was full of wokies,
31:40
for lack of a better term, I'm sure there was a lot
31:42
of that, like, Twitter self-selects for
31:44
progressives, I think, in some ways. But
31:47
Twitter, like almost every giant
31:49
Silicon Valley tech company, a huge
31:51
percentage of their employees, are
31:53
Asian. They are immigrants. Do you really
31:55
think that Indian immigrants who are engineers
31:58
at Twitter are the people who give a shit about pronouns?
32:00
Like, those are people who lost their jobs as well.
32:03
And if you just take, like, a step back for a moment
32:05
and look at what happened, here's what happened.
32:07
A billionaire bought a publicly traded
32:09
company, took it private, and
32:11
then laid off half the staff. And under
32:14
what other circumstances would we be cheering
32:16
for that? I cannot think of any. Well,
32:18
it depends who we is. A lot of the people cheering
32:20
for it have politics very different from ours
32:22
even if we share the belief that there's,
32:24
like, liberal excess. These are folks who
32:26
worship tech bros and hate so-called
32:28
SJWs. So I just don't find it surprising
32:31
that they don't care if people get fired. They're a
32:33
little bit cruel about it.
32:34
No. It's not surprising, but it's
32:36
cruel. I find it really gross. Like, he's basically
32:38
doing what a predatory capitalist like Carl
32:41
Icahn would do, but instead of criticizing him
32:43
for that... A billionaire buys a company and
32:45
lays off half the staff. People are cheering because
32:47
they think that their employees are too woke,
32:49
and I just find this attitude absolutely
32:52
repulsive. Losing a job is one
32:54
of the more stressful life events that people go
32:57
through. That includes people in tech, especially
32:59
because right now, layoffs there are layoffs all
33:01
across the industry. Amazon just laid off ten
33:03
thousand people. Facebook just laid off ten thousand
33:05
people. These are people. You might disagree
33:08
with their politics. but I don't see why
33:10
you should cheer when
33:12
they're losing their jobs. I just I
33:14
find it so
33:16
gross. It's the same thing
33:17
whenever there's, like, journalism layoffs. I always try
33:19
to explain to people like, we hear
33:21
disproportionately on Twitter from the ten percent
33:23
of journalists who are annoying asses, but most
33:26
most journalists really are just trying to, like, do their
33:28
job. And
33:28
I think that's true of Twitter engineers too. Yeah.
33:30
No. That's what I'm
33:31
saying. I don't even want the assholes to lose their jobs. I just
33:33
want them to hashtag do better. So
33:36
cheering for people getting fired sucks.
33:38
Also, if you're like a big
33:40
capitalist smart money guy, you should
33:42
know that a CEO
33:45
steamrolling in and immediately firing half
33:47
of the people without any sign he knows what he's doing,
33:49
this is just ego. It's just like him trying to throw
33:51
his dick around. There's no
33:54
intelligence at work
33:55
here. No. I don't think he went through and was like, we
33:57
need to make budget cuts here
33:58
and here and here because we're overloaded
33:59
in these areas. No, he didn't do
34:02
that. He wouldn't have had time to do that. I will
34:03
say that I can almost understand where some of the cruel
34:06
people are coming from, just because the hysteria reached
34:08
such a fever pitch. And there's this weird
34:11
wishcasting going on where commentators
34:14
mix reporting
34:15
with what they want to happen.
34:17
So Yeah. Alejandra
34:19
Caraballo is
34:22
just a pretty noxious presence on social
34:24
media. She's like some sort of instructor in
34:26
cyber law at Harvard. So
34:28
on the seventeenth, she tweeted: I don't think Twitter
34:30
will last through the weekend. Twitter is restricting employees
34:32
access to all its buildings through the weekend with no reason
34:34
given. The entire Android team resigned. The World
34:37
Cup, the largest sporting event in the world, starts this
34:39
weekend. She continues, you
34:41
know, payroll tax, SRE, Twitter, Blue
34:43
teams all gone. The end is nigh. You know, that
34:45
was now
34:47
more than a week ago. Twitter's still around. There haven't even
34:50
been performance issues. Then in
34:52
a Washington Post article by
34:55
Taylor Lorenz. Lorenz
34:57
quotes Caraballo saying that
34:59
by letting all these people back on Twitter,
35:02
quote, what Musk is doing
35:04
is existentially dangerous for
35:07
various marginalized communities. Existentially
35:09
dangerous, Katie.
35:10
Yes. People die every time someone tweets
35:13
the t word.
35:13
So I think part of it is that it's
35:16
deranged too. There are,
35:18
in
35:19
the world, communities that face existential
35:22
threats. No one faces existential threats
35:24
because of Twitter. You have to be so far
35:26
up your own ass to think that's the case. So I think
35:29
the people cheering about Twitter engineers
35:31
getting fired see discourse like this. They're like,
35:34
This is crazy. This is so over the
35:36
top. And they maybe adopt a little bit of a nihilistic
35:38
attitude.
35:38
Yeah. I think you're right about that. And I think
35:40
there's also this sense that Twitter
35:43
is now... that leftists on
35:45
Twitter are now experiencing what conservatives
35:47
have always experienced. Like for instance, there's
35:50
this list going around, basically
35:52
a spreadsheet of thousands, I
35:54
think thousands, of leftist accounts.
35:57
And the idea is that right wingers
35:59
will take this list, mass
35:59
report the accounts, and then they'll get banned
36:02
from Twitter. And people like Ben Collins
36:04
are tweeting about this. Lots of people... Ben
36:06
Collins of NBC News, the misinformation reporter.
36:09
Lots
36:09
of Disinformation. I'm sorry. Disinformation
36:10
reporter. Lots of people very upset
36:13
to find themselves on this list.
36:15
These lists have existed.
36:17
Wow. Yeah. That much? That much. It sucked that
36:19
fucking people, deranged morons,
36:21
put your name on a list. God. Right.
36:23
And, A, it's probably
36:25
not going to have any impact at all.
36:28
But, B, these lists have existed forever.
36:31
And
36:31
Dude, it's so bad. Right. You're
36:33
right. They're just mad that, in
36:36
some cases, the
36:38
medicine they readily dish out to others, they might
36:40
have to deal with a taste of -- Yeah. -- this fantasy world
36:42
in which, until Elon Musk took
36:44
over, this was, like, an at all functional
36:47
or non-sociopathic place, I
36:49
don't... that doesn't resonate with me at all.
36:51
Yeah. They're basically losing what they thought
36:53
of as their site. And
36:55
Twitter wasn't their site. It was never their
36:57
site. Although then again, the people who built it are probably
36:59
more in line with their values than people like Elon Musk.
37:01
And so the other thing about Elon, like,
37:03
he pretends to be a champion of free speech.
37:06
If he actually were, I would support
37:08
him more through all of these mistakes that he's making,
37:11
but he's not proven himself to
37:13
actually be a champion of free
37:15
speech. When he got there, he tweeted,
37:18
comedy is now legal on Twitter. And then
37:20
the next day, he permanently banned
37:22
accounts that impersonated him personally.
37:25
Making fun of people is free speech,
37:27
but he is incredibly thin-skinned and he
37:29
can't take it. He fired employees who
37:31
criticized him on Slack. And I realize this is his
37:33
company, and he can do what he wants, blah blah blah.
37:35
But you can't claim to be a champion of free
37:37
speech and then fire people for criticizing
37:40
you personally. And I think there
37:42
is a difference between tweeting Elon
37:44
sucks balls on Twitter. I don't think you can tweet
37:46
that the boss sucks balls on Twitter and then expect
37:48
to have your job. But if people were
37:51
criticizing his decisions in good
37:53
faith on Slack and he fired them for
37:55
that, That is not being a champion of free
37:57
speech. Yeah. I think that I'm
37:59
less moved
37:59
by the whole "I think
38:02
you shouldn't criticize your boss publicly"
38:04
thing. I think in any... But Slack
38:06
isn't public. Sure. Sure. Yeah. I
38:09
don't know. I always find myself a little bit of a conservative
38:11
on that, because I just think, like, it's a pretty basic thing
38:13
that you need to, you know... If
38:16
they're not allowed to
38:18
internally criticize him or if they get in trouble
38:20
for, like, quietly reporting a qualm they
38:22
have or problem they have, that'd be one thing. But, like,
38:24
I don't know. I bet some of these people were sloppy
38:26
about it. Well,
38:27
okay. So On
38:28
the Media, of course, covered this last weekend, and they
38:30
had somebody who'd been reporting on Twitter. And apparently,
38:32
before Elon took over Twitter
38:34
had a very sort of open door policy about
38:37
criticism, and it was part of the culture
38:39
that they wanted to improve the site, and so employees
38:41
were encouraged to criticize the
38:43
leadership decisions. And then
38:45
Elon comes in, and the next day, he's firing
38:47
people who criticized the boss.
38:49
Okay. I again, I would wanna know more details,
38:51
especially because On the Media is sort of a... Yeah.
38:55
Okay. That's good. Yeah. Anyway, okay.
38:57
Well,
38:57
okay. So then he he also tweets this.
38:59
New Twitter policy is freedom of speech,
39:02
not freedom of reach. Where have
39:04
we heard that before, Jesse?
39:05
Yeah. That's like a common line from, like,
39:07
progressives who don't...
39:09
Yeah. Yeah. Yeah. Exactly. This is
39:11
just, like, a fucking stupid woke platitude.
39:13
In that same tweet, he said, negative slash
39:15
hate tweets will be max deboosted and demonetized,
39:18
so no ads or other revenue to Twitter,
39:20
you won't find a tweet unless you specifically seek
39:22
it out, which is no different from the rest of the Internet.
39:25
He is correct. That is no different from most of
39:27
the Internet. I don't know who's gonna decide
39:29
what's a hate tweet. Or, like,
39:31
you can't have negativity on Twitter? Since
39:33
when? Twitter
39:34
would not exist. Twitter
39:35
would not exist. But he's just repeating
39:37
the same platitudes that the very people who were
39:39
groaning about him buying the company
39:41
claim to believe in: freedom of speech, not freedom
39:43
of reach. And frankly, none of this
39:45
should be surprising. He has a documented history
39:48
of targeting people who criticize him with legal
39:50
action. He once tried to get an anonymous
39:52
blogger fired from his job for criticizing Tesla.
39:55
Yeah.
39:55
His track record's not exactly peachy.
39:56
Come on. I commend him on bringing
39:59
back some people who have been banned including Meghan
40:01
Murphy. He did this last week after Joe Rogan
40:03
asked him to. I am less enthralled about
40:05
the fact that he brought back James Lindsay just because
40:07
James Lindsay is so fucking aggravating.
40:09
Yeah. Still, I I do I
40:11
do think that he he brought back some people who deserved
40:14
to be on the platform. But when he was asked
40:16
if he would bring back Alex Jones, he
40:18
said no. Here's why he tweeted,
40:20
my first born child died in my arms.
40:23
I felt his last heartbeat. I have no
40:25
mercy for anyone who would use the deaths of children
40:27
for gain, politics, or fame. So I didn't
40:29
realize this before then, but his son died of SIDS
40:31
at ten weeks old. And of course, it's impossible
40:34
not to feel sympathy for Elon Musk in this
40:36
case. and for the parents of Sandy Hook victims
40:38
when you're talking about Alex Jones. But
40:40
again, this reveals what Elon
40:42
Musk actually thinks about free speech, which
40:44
is Free speech is good unless
40:46
if it personally offends Elon
40:49
Musk. That really stuck with me because
40:51
obviously I cannot imagine losing
40:53
a kid that's not the issue here. The issue
40:55
is he's clearly saying that whatever,
40:58
like... yeah. Whatever
41:00
most offends him or hurts him can't
41:02
be allowed on Twitter, which is like the exact opposite
41:04
of a principled stance on this
41:06
stuff. Exactly. And there's been this
41:08
spate of anarchist accounts being banned
41:10
or suspended Honestly, it's a lot of
41:12
people I can't stand, so it's hard to really
41:14
mourn them. But I think there is this this
41:17
element from Elon fans.
41:19
That's like, leftists are now experiencing
41:21
what conservatives have
41:23
always experienced on Twitter. I
41:26
understand that, but I don't think that he
41:28
has shown himself to have really any
41:30
principles beyond Elon Musk
41:32
first. Yeah. I mean, that's
41:34
yeah. I don't know. It's weird that I would think
41:36
otherwise. A lot of, like, really
41:39
loud free speechy types. Don't actually
41:41
care about free speech at the end of the day. They just care
41:43
about, like, sticking it to their enemies. Yeah. Totally.
41:45
Alright, Katie. Should we do housekeeping before you
41:47
finish up this tale? Let's do it. Okay. So,
41:50
oh, I guess, the announcement is merch is back.
41:51
Oh, yeah. Merch is back. Is our website live? It's
41:53
live. Oh, what's what's the URL?
41:55
barpod merch dot com. I already said that without
41:57
checking. Why don't you check? Bar,
41:59
bar... how do you spell that? B-A-R.
42:02
How do you spell barpod? Merch
42:04
dot com. It's barpodmerch dot com.
42:06
Did you say dot com? Yep.
42:08
Okay. Yes. It is back. We have our
42:10
Pervert for Nuance shirts and
42:12
our Park Slope Panther shirt. I'm gonna
42:14
get one of those.
42:15
I'm really happy with this. So, yeah, we're doing
42:17
we're doing things a little bit differently. We have a lot
42:19
of the same, you know, basic
42:21
items as before, if you wanna show your
42:23
support for the podcast, but we're gonna introduce
42:25
new, basically, inside jokes we think
42:27
are funny. I think pervert for Nuance will have
42:30
some legs, because there's a lot of you perverts out there.
42:32
And then the Park Slope Panther, if you
42:34
want to help defend Park Slope,
42:37
or Prospect Park, against crime.
42:38
You know, I think our Park Slope Panthers episode
42:40
was a premium episode, so a good
42:43
portion of our listeners aren't gonna get that one. Join at
42:45
blockedandreported dot org.
42:47
Alright. Join at block... whoops.
42:49
Yeah. It's blockedandreported dot org, so you will
42:51
know the inside jokes on the
42:53
shirts we're asking you to buy. For
42:56
just five dollars a month or more, you can become a premium
42:58
subscriber. That's three extra episodes a month
43:01
plus you get to be part of a growing community. Almost
43:03
ten thousand people. Someday we will
43:05
fill one of those creepy
43:07
Qatar stadiums built
43:10
by slaves. Yeah.
43:11
We have a weekly open comment thread.
43:13
Every Wednesday, people can just talk about whatever they
43:15
want. And the last time I looked at either
43:18
this Wednesday's or last one. There were, like,
43:20
literally a thousand comments on the thread, a thousand
43:23
comments.
43:23
Yeah. There's a lot of really good discussion. It's
43:25
a good community. You can also check out our
43:27
subreddit blocked and reported dot Reddit
43:29
dot com. But mostly check out the merch. Alright,
43:31
Katie. Where are we in our in our story?
43:33
Okay. So as people
43:35
get more inflamed over Elon Musk,
43:37
some of them, a certain number of them, are fleeing
43:40
to new sites. I should say, I don't think
43:42
Twitter is gonna die overnight. I think eventually
43:44
it will die because that's just the nature
43:46
of platforms. I think they all die
43:48
in the end, like us humans. Which
43:50
one? There's a new one called Post. Yeah.
43:52
I know it's weird. Yeah. So
43:54
there's a new one called Post that's currently in beta,
43:57
post dot news. I like the interface. That's really... Have
43:59
you looked at this one?
44:00
No. Why would you look at it? Is
44:02
it that you think you need more social media in your life?
44:04
Is that what it is?
44:05
Here's why. I'm joining
44:07
all these new platforms under a different name
44:09
so I can see the content of people who have blocked
44:11
me on Twitter? It's Hailey Curtis.
44:15
Jesse Single. I like the interface
44:17
on post dot news. It's nice and clean. And
44:19
they're very
44:19
Nobody's gonna be on Post
44:20
tomorrow. They're very explicit about
44:23
who they do and don't want on their platform,
44:25
and it's the standard, like, we want diversity, but
44:27
no, not etcetera. They are, however,
44:29
already getting shit because they say
44:31
that protected categories include gender,
44:34
religion, ethnicity, age, sexual orientation,
44:37
net worth, and beliefs, but they failed
44:39
to mention gender identity in there.
44:41
I suspect that's just an oversight or they
44:43
were lumping that under gender. Well,
44:45
what if my belief is that Nazis
44:47
are right? No. So that's protected?
44:48
That's not protected. No Nazis. My beliefs
44:51
are protected. No. Not that one. You'll lose
44:53
with that one.
44:54
Oh, I'll mess around on there later and figure
44:56
it out. But the one that is getting more attention
44:58
is Mastodon. Unfortunately,
45:00
Mastodon is a deeply confusing website.
45:03
I found it very hard to navigate, but it
45:05
does have a pretty interesting
45:06
history. Why don't you tell me about that
45:08
history, Katie?
45:09
Okay. So Mastodon launched
45:11
in twenty sixteen, and it's basically
45:13
open source software that anyone can
45:16
use and adapt for their own purposes. It
45:18
can be somewhat divided into three
45:20
massive, non-interacting parts.
45:23
First, there's a thriving right wing
45:25
corner; Gab and Truth Social are the headliners
45:27
of that. Then there's the Japanese
45:30
pedophile corner, and then there's the
45:32
sort of broadly leftist. We're looking for a
45:34
gentler Twitter corner. That's what
45:36
people are leaving Twitter for now. You're gonna
45:39
get to that in a little while. But first,
45:41
I'm gonna talk about the first two, which
45:43
is the right-wing corner and
45:45
the Japanese pedophile corner. Are you ready for this?
45:47
Okay.
45:47
And you say these don't overlap? They
45:50
don't
45:50
much. I would imagine in some
45:51
cases... I'm sure in some cases they do. Yes. Okay.
45:54
So this is key to understanding
45:56
Mastodon. There's no global
45:58
search, and none of these three ecosystems
46:01
can see each other unless they go out of
46:03
their way to look for it. Can
46:04
I just say we're off to a very good start about
46:06
this being a useful service that
46:08
there's no global search? So that sounds
46:11
good. Yeah.
46:11
Right. So people in each of these
46:13
separate corners generally remain pretty
46:16
blissfully unaware of the others. Okay.
46:18
So First, we're gonna talk about
46:21
how Mastodon works. So basically, when
46:23
you sign up, you choose a server also
46:25
called an instance. This can be based
46:27
on location or interest. So there are
46:29
servers for different countries and regions and
46:31
servers for things like... What
46:32
I love about this is they could have just
46:34
called it a server because that's what it is. But everything
46:36
on Mastodon needs to be fucking confusing and
46:38
dumb. So it's an instance, which is just
46:40
why?
46:41
Like, they called their posts, or they did until
46:43
recently, they called their posts toots,
46:45
which is literally the name that I call Moses's
46:47
farts. Like this: I tooted, everybody,
46:49
everybody! Check out my toot! I tooted!
46:51
Yeah. So as a user, it's pretty
46:53
confusing because, like, how do you know
46:55
what server to sign up for? But, ultimately,
46:58
the server... Obviously, the pedophile one. Obviously. Easy
47:00
call. But it also doesn't really matter which server
47:02
you sign up for because you can still follow people
47:05
on other servers. So I don't know why the server thing
47:07
exists. It's just, like, one
47:09
more barrier to get people to sign up. Maybe that's what they're
47:11
trying to do. I'm also going to immediately
47:13
contradict myself because I just said that it doesn't
47:15
matter what server you choose, but because there are
47:17
mass blockings of entire servers, it
47:19
kinda does matter what server
47:21
you choose. Okay. So once
47:24
you sign up and start following people, you
47:26
have a local timeline, which features
47:28
only posts from your server, and a public
47:30
timeline, which can feature posts from
47:32
anyone your server federates with. Are you following
47:34
me, Jesse?
47:34
It's very hard to, but just imagine
47:36
me nodding and drooling
47:37
a little bit. There's no way to view everything
47:39
on Mastodon all at once. And so it looks a
47:41
lot like Twitter, but there are other certain limitations.
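For anyone who wants to poke at this, the local-versus-public timeline split Katie describes maps onto Mastodon's public REST API. This is just a rough sketch; the instance name "mastodon.example" is a hypothetical placeholder, not a real server. Each instance hosts its own copy of the API, and the same public-timeline endpoint returns only locally originated posts when `local=true` is set; without it, you get the federated timeline, which still only covers servers that instance federates with, since there is no global index.

```python
# Sketch of Mastodon's local-vs-federated timeline distinction via its
# public REST API. "mastodon.example" is a hypothetical placeholder.

def timeline_url(instance: str, local: bool = False) -> str:
    """Build the public-timeline endpoint URL for a given instance."""
    url = f"https://{instance}/api/v1/timelines/public"
    if local:
        # local=true restricts results to posts that originated on this server
        url += "?local=true"
    return url

# Local timeline: only posts from accounts on this one server.
print(timeline_url("mastodon.example", local=True))

# Federated ("public") timeline: posts from this server plus the servers
# it federates with -- still not all of Mastodon.
print(timeline_url("mastodon.example"))
```

The "local timeline" in the conversation corresponds to the first call; the "public timeline" corresponds to the second.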
47:44
For instance, there's no quote-tweet
47:47
feature. Jesus.
47:48
Yeah.
47:50
And this inherently limits the virality
47:52
of posts, and this is intentional, because
47:54
the vibe on Mastodon is supposed to be
47:56
more about actual conversation than
47:58
dunking on your enemies.
47:59
And when it was started, it was thought
48:02
of as this haven for marginalized
48:03
identities like queer and trans people
48:05
who, of course, are always the victims of bullying and
48:08
never the perpetrators of it. The
48:10
writer, Sarah Jeong. Do you wanna remind people who
48:12
Sarah Jeong is?
48:12
Yeah. Sarah Jeong was... Jeong.
48:14
It's Jeong. I don't know. She
48:17
she was a so she was New York
48:19
Times, at one point, editorial
48:21
writer on my tech and the law and related
48:23
subjects, and people are mad at her
48:25
because she'd said, like, a lot of nasty things about
48:28
white people, so conservatives, tried to get
48:30
her, you know, basically fired for that, but
48:32
it didn't work. But then she left anyway. And,
48:35
yeah, She was like a really well,
48:36
maybe she's still a big name. I just feel like
48:39
she doesn't come up as much anymore.
48:40
Yeah. She doesn't. Okay.
48:42
So she wrote about Mastodon for Vice in
48:44
twenty seventeen. Her post was called
48:47
Mastodon is like Twitter without Nazis.
48:49
So why are we not using it? Subhead,
48:51
I quit Twitter to join a kinder, nicer,
48:54
decentralized open source version of Twitter.
48:56
She is, by the way, still on Twitter, so I guess it didn't
48:58
take. But in her piece, she hilariously
49:01
mentions Graham Linehan as
49:04
one of the people who recently joined Mastodon
49:07
as though it's a selling point, although
49:10
she says, and this is a quote, Linehan apparently
49:12
loses interest after finding that the existing
49:14
community discourages posting publicly about
49:16
Trump. She also noted that,
49:19
quote, a user gently suggests that if
49:21
I post political content, I should put it behind
49:23
a content warning. There's a lot of content warnings
49:25
on Mastodon. Hopefully, we'll get to that later.
49:28
Okay. So I mentioned Truth Social and Gab
49:30
a moment ago. Both of these are right
49:32
wing social networks. Truth Social is actually
49:34
owned in part by Donald Trump, and that might
49:36
be one of the reasons he has not actually gotten
49:38
back on Twitter because he
49:40
either has it in his contract that he needs to
49:43
only be on Truth Social or because he just
49:45
wants to pull people to Truth Social.
49:47
In twenty twenty one, Wired reported that
49:49
the guy who created Mastodon, Eugen Rochko,
49:52
was unhappy that Truth Social was running
49:54
on Mastodon software, but he couldn't really do anything
49:56
about it because no one including him owns
49:58
Mastodon. It's decentralized and that's a
50:00
big part of the appeal. but this also
50:03
means that there's no one person or entity
50:05
who can say, no, Truth Social, you can't
50:07
play with us. And I suspect that most
50:09
people who have fled to Mastodon over the Musk buyout
50:12
frankly don't realize they're sharing an ecosystem
50:14
with Donald Trump and literal Nazis, but they are.
50:16
Okay. So that's the first corner.
50:18
That's the red corner. Now we're going to talk
50:20
about the Japanese kiddie porn
50:23
corner. Jesse? My corner. Yeah.
50:25
Your corner. I like what I'm hearing so far. So
50:27
our background from this comes from a very interesting
50:29
twenty seventeen article called Mastodon
50:32
WTF timeline by a guy named
50:34
Matt Skala. We'll link to this in the show notes,
50:36
but Matt was on Mastodon early.
50:39
He also has an interest in Japanese culture.
50:42
And according to him, in Japan, there are
50:44
two different terms for what we in English
50:46
would call child pornography. One
50:48
of them is absolutely taboo and
50:50
one of them is just not. So the
50:53
first type is what we would think of as kiddie porn.
50:55
Pictures, Jesse. Pictures of kiddie porn. Hold
50:56
on. Hold on. Wait. I'm picturing. Yeah. Okay.
50:59
The second type is called lolicon.
51:01
That's short for Lolita complex.
51:04
And it's basically drawings and cartoons
51:06
of young people, including kids, in
51:09
sexualized situations, and this is
51:11
socially acceptable in Japan. So
51:14
I'm gonna read a bit from Matt's post here.
51:16
He writes, if you like lolicon,
51:18
you're a nerd and that's not a big deal.
51:20
It's legal and popular and sold in bookstores
51:22
everywhere. I cannot emphasize enough
51:25
that lolicon is not only legal but
51:27
broadly acceptable in Japan. It's merely
51:29
nerdy. On the other hand, if you like
51:31
kiddie porn, then you're an evil sicko
51:33
monster, and kiddie porn is highly illegal.
51:35
It's also unpopular. As a matter of statistics,
51:38
the number of people who are interested in
51:40
kiddie porn is vanishingly small as a fraction
51:43
of the Japanese population. Japanese
51:45
see these as two completely and obviously distinct
51:48
things. This doesn't totally surprise
51:50
me. I've never been to Japan, but I have
51:52
seen anime, and it does have
51:54
this sort of Lolita vibe. That's what lolicon's
51:57
short for, Lolita complex. And we're
51:59
talking about a country where you can buy used underpants
52:01
in vending machines. They have very different cultural
52:03
norms than we do. And whereas drawings
52:05
of child sex or written erotica has gotten
52:08
people jailed in some places, it's just totally
52:10
normal, if a bit dirty, in Japan. And
52:12
Matt says that Japanese people think that
52:14
the American inability to distinguish
52:17
between these two is further proof that Americans
52:19
are dumb as fuck. I am definitely
52:21
having trouble distinguishing them, but I think
52:23
we should move on. Well,
52:24
I mean, I don't think it's that. Like
52:26
like, think of one as like anime and
52:28
the other as like photographs of
52:30
kids getting abused. So it's just it's just less
52:32
explicit. Well, like one of them is, like, this stuff that
52:34
happens before the clothes come off, but it's just as
52:36
creepy and sexual. It
52:37
is. But in one, there's, like, real children
52:39
being harmed. In the other, there's no children.
52:41
You mean because it's cartoon images
52:43
of children. Because it's cartoons. Yeah. Cool.
52:46
Yeah. It seems good. Hey. Hey.
52:48
Hey.
52:48
everyone. Cool. Look, Jesse, if you wanna
52:50
become a protector of cartoon drawings,
52:52
that's that's your hobby. I'm
52:53
gonna go to Japan to become a vigilante.
52:55
Okay. So Twitter
52:57
is obviously an American company, and posting
53:00
lolicon is banned, and users who post
53:02
that can be banned as well. So lolicon
53:04
aficionados and artists were looking
53:06
for a platform where they could post their weird drawings
53:09
and they found it, surprise surprise, on Mastodon.
53:12
So in April twenty seventeen, a
53:14
Japanese site called Pixiv — it's
53:16
basically like a DeviantArt for lolicon —
53:18
started a Mastodon server.
53:20
And there is this massive influx of
53:22
Japanese users to Mastodon. And
53:25
in fact, so many Japanese users joined
53:27
the platform that it doubled in size in about
53:29
a day, and lolicon servers quickly
53:31
became among the most popular on Mastodon.
53:34
So this thing that was supposed to be the nicer
53:36
kinder Twitter for marginalized identities,
53:39
was also a massive hub for what to
53:43
Western eyes looks like kiddie porn. So
53:43
this is the the great pervert influx.
53:45
Yeah. Yes. The Japanese pervert
53:48
influx. I'm wondering what percentage of them
53:50
were female.
53:50
JPI?
53:52
Yeah. Yeah. And then so there's this rash of
53:54
think pieces and blog posts in twenty seventeen
53:56
about how Mastodon had a child pornography
53:58
problem. Of course,
53:59
the very nature of Mastodon makes this
54:02
impossible to police, and so
54:04
the thing that makes it appealing — that it's decentralized —
54:06
means that there's no one who can ban the lolicon
54:09
servers. They can't be kicked off, so it's
54:11
basically the antithesis of the whole safe space
54:13
vibe that liberal Mastodon users who've
54:15
left Twitter over Elon are trying
54:17
to create. So that said,
54:20
each server or instance can enforce its
54:22
own rules and norms. And as you can imagine,
54:24
this can be very fraught. And nowhere
54:26
is that more apparent, of course, than on
54:28
the new server for journalists.
54:32
Jesse, you wanna take it from
54:33
here? Yeah. This is journa
54:35
dot host. And, basically,
54:38
it centers on Adam Davidson. Davidson
54:41
is a guy who I've read and listened to for a long time.
54:44
And I always found him to be a smart commentator
54:46
on money stuff. I know nothing about money
54:48
stuff.
54:48
Yeah. He was — he created Planet
54:50
Money. Yeah. And he was on This
54:52
American Life for a
54:53
long time. New Yorker, New York Times --
54:55
Yeah. -- very very storied journalistic
54:57
career. If
54:59
you only knew him from his actual work
55:01
and not from Twitter, I can see how you would think
55:03
that this was a very smart and decent guy.
55:04
Yeah.
55:06
And then you look at his Twitter.
55:07
So my sense is that, like like lot
55:09
of other people, Trump and the protests and
55:11
the pandemic have all melted his brain,
55:14
or at least the part of his brain dedicated to talking
55:16
about, like,
55:17
internecine conflict on the left.
55:19
He's basically become the purest of, like,
55:21
the good white man. He'll talk a lot about
55:24
his privilege, how there's no such thing as cancel
55:26
culture. This is one of
55:28
the most annoying subset of people on social media.
55:30
It's like the white person who will tell you all about
55:33
how white they are, how many advantages
55:35
they have, how sorry they are about it, how bad
55:37
they feel. But at the end of the
55:39
day, it's often unclear what all this
55:41
performative flagellation does other
55:43
than, like, causing them to talk and think about
55:46
race far more than any healthy
55:48
person would. Oh,
55:48
it's good for engagement, Jesse. Come on.
55:51
It's good for engagement. And it also
55:53
seems to cause them to amplify
55:55
people of color, but they tend to be the tiny
55:57
subset of people of color who have the same
55:59
politics as highly educated white people now.
56:02
So I would find the act a lot more
56:04
believable if all
56:07
the self-flagellation was followed up
56:09
by, like, a genuine interest
56:11
in poor and working class black people. That,
56:13
I feel like, doesn't tend to be the
56:15
case.
56:15
You don't think the guy who ran
56:17
Planet Money, with his work for This American
56:19
Life and the New Yorker,
56:21
is really in touch with the working class?
56:23
Anyway, Adam
56:26
Davidson ends up launching
56:28
journa dot host, a
56:30
Mastodon — so this is an instance?
56:32
Right?
56:33
Yeah. It's a server. I'm just using
56:35
a server. It's so
56:36
fucking stupid. It's a server. He
56:38
launches it, and there's this sort of committee of
56:40
moderators. And if you go over there, you'll
56:42
find a mix of, like, random stuff, but also
56:44
the most boring, pointless, predictable,
56:46
political opining imaginable. You can go there —
56:48
just go to journa dot host. You
56:51
don't you don't need to sign up for Mastodon. So
56:53
just like imagine Twitter, but only the most
56:55
narrow, already overrepresented, progressive
56:57
side of everything. There appears
57:00
to be no appetite whatsoever there for
57:02
genuine disagreement. Or if there is, I
57:04
couldn't find any evidence of it. Like, The
57:06
things they do agree on include Trump bad,
57:09
Musk bad, Twitter bad, fascists
57:11
bad, all that. So you can imagine,
57:13
yeah, the discourse is totally scintillating.
57:15
And and Katie, you attempted to join
57:17
this walled garden, no? Well, how did that go?
57:19
I did attempt to join. I attempted
57:22
to join twice, first under a
57:24
pseudonym. I thought maybe that would
57:26
get me through the wall. It did not. And
57:28
then under my actual name, and
57:31
my application — I think it
57:33
got lost a bit. Pardon me? Yeah.
57:34
It's been forever pending. How long
57:36
has it been?
57:37
Probably two, three weeks. In,
57:39
like, the stuff I've read about
57:42
this, they don't say explicitly that there's
57:44
any, like, ideological component. You should be able
57:46
to join if you're a journalist with an established track
57:48
record writing and/or podcasting. Right?
57:50
Yeah. They basically say you need
57:52
to be a legitimate journalist. And Adam Davidson
57:55
did
57:56
this interview that I think
57:58
really tells you everything
58:00
you need to know about Adam Davidson. Let me find
58:03
a quote. We
58:03
had a conversation this morning about somebody
58:05
who has a blog about beer. We said, well,
58:07
this person does reporting. They actually interview
58:10
people. They look at statistics. They're
58:12
not just going to be sharing their opinion on
58:14
beer. And it felt like, yeah, that's journalism.
58:17
So he's a gatekeeper and he's
58:19
the one who's going to decide if your
58:21
particular beat
58:23
counts as journalism. However, apparently,
58:26
I don't count as a journalist. George Takei
58:28
is on journa dot host, so he counts as
58:30
a journalist. The Star Trek guy? Yeah. The Star
58:33
Trek guy. So he's obviously not
58:35
enforcing these with any sort of
58:37
consistency.
58:37
Did you DM him to ask him about your application?
58:40
Funny you should say that, Jesse. I can't DM
58:42
him because I am blocked. Okay.
58:43
I mean, that seems pretty gatekeeper to
58:45
me. Maybe that's the point.
58:47
Yeah. And this is sort of my annoyance with the project
58:49
is that, like, one of the good things about Twitter
58:52
is that it's public. It's all public.
58:54
Mastodon is not. You can sort of see
58:56
what's going on, but it's just It's like the
58:58
platform is so confusing, and I want journalists
59:01
to be having these idiotic conversations in
59:03
public where everyone can see them. The
59:05
people are being denied access to our dumbness,
59:07
which is not right. Right. So
59:09
yes, this is a pretty ideologically
59:11
narrow minded community and one way that manifested
59:13
was that there was immediately a heated controversy
59:16
almost as soon as this thing launched. This
59:18
controversy involved two very close friends
59:21
of the podcast, Mike Pesca and
59:23
Parker Molloy. Probably our two best friends in real
59:25
life too.
59:25
Right? Yeah. I like that their initials
59:27
are a palindrome.
59:28
Wow. MPPM. So the
59:30
controversy centers on a big important front
59:33
page New York Times article about puberty blockers,
59:35
by two of the top investigative journalists there, Megan
59:37
Twohey and Christina Jewett. They basically
59:40
just laid out many unknowns about puberty blockers,
59:42
a lack of evidence, the physical side effects, some
59:44
kids have experienced, and so on. It
59:46
was an important article on a hot button subject
59:48
on the front page of arguably the world's most
59:51
important newspaper, so you couldn't come
59:53
up with a better example of this sort of article
59:55
journalists like to discuss on social media.
59:57
So Mike Pesca posts a
59:59
link to it
59:59
on journa dot host on
1:00:02
November eighteenth. This seemed like
1:00:04
careful, thorough reporting here. Oh, Mike, Mike,
1:00:07
Mike, Mike, my god. Dude, whoa,
1:00:09
dude. Bad idea, bro.
1:00:12
Let's just pick up Joe Bernstein's reporting
1:00:15
in the Times. In response,
1:00:17
Parker Molloy, a journalist who writes the Substack
1:00:19
newsletter The Present Age, accused mister
1:00:21
Pesca of anti-trans bigotry and
1:00:23
then posted a reply at mister Davidson for
1:00:25
not removing the post. Quote:
1:00:28
Adam Davidson's decision not to take action
1:00:30
on anti trans content isn't inspiring
1:00:32
confidence, and I totally understand why other places
1:00:35
are doing instance-level blocking,
1:00:37
meaning other servers are blocking
1:00:39
the server, she wrote on journa dot host.
1:00:41
Zach Everson, one of the journa dot host administrators,
1:00:44
responded that he agreed with miss Molloy.
1:00:46
Then added, banning someone for posting
1:00:48
a link to an NYT article sets a precedent
1:00:51
that we really need to work through. On Saturday,
1:00:53
journa dot host suspended mister Pesca,
1:00:55
who was informed via text message from mister
1:00:57
Davidson, a longtime friend.
1:01:00
the two are currently writing an exchange of letters
1:01:02
hosted on Substack about the nature of cancel
1:01:04
culture. I will not be reading that by the way. And that's
1:01:06
not because I don't like Pesca's work. It's because
1:01:08
I — I could not stand Davidson's...
1:01:12
If
1:01:12
Adam Davidson ever wanted to come on, we would
1:01:14
have him on, but his views
1:01:16
on this I find to be very bad faith. According
1:01:19
to mister Pesca — this is back to the article — mister
1:01:22
Davidson told him he had been suspended for
1:01:24
referring to miss Molloy as an activist,
1:01:27
which was dismissive. The suspension
1:01:29
seemed arbitrary and ad hoc, mister Pesca
1:01:32
said in an interview. Miss Molloy didn't respond
1:01:34
to a message seeking comment. What'd you make of this,
1:01:36
Katie? This
1:01:36
whole thing is so inevitable.
1:01:39
I tweeted something like, this is going
1:01:41
to end with Adam Davidson
1:01:43
getting kicked off of his own server for transphobia, at
1:01:46
which point he comes out as non-binary. This
1:01:48
is always how this was going to end.
1:01:51
It seems like exactly what you would expect
1:01:53
to happen when you give a certain, like, censorious
1:01:55
type of person too much power. The
1:01:58
idea of banning Mike Pesca for calling
1:02:00
Molloy an activist is ridiculous.
1:02:03
I understand Molloy wants
1:02:05
to be seen as a — I understand she wants to be seen
1:02:07
as a journalist. She does exactly what the
1:02:09
worst activists do, which is scurry around in
1:02:12
back channels trying to get people fired. She's
1:02:14
an activist.
1:02:14
Yeah. I, uh, googled Parker Molloy
1:02:16
activist because I wanted to see if she is
1:02:18
commonly referred to as an activist. And I
1:02:20
landed on this blog post from Qwerty from
1:02:23
twenty fourteen. Let me just read you a section of
1:02:25
this.
1:02:26
Parker Molloy, someone I look to for advice
1:02:28
on occasion and supported through her fundraising for
1:02:30
surgery was telling me to kill myself,
1:02:33
not only was she telling another trans woman
1:02:35
to die, she was giving her instructions ranging
1:02:38
from cutting herself to drinking bleach and
1:02:40
what's more demanding that she do it.
1:02:42
She is quite the ally. What,
1:02:43
what year was that? She's
1:02:44
a pro-suicide activist. Twenty
1:02:46
fourteen, I think. Right. And so, we
1:02:48
should say I think she apologized for that. I mean, she
1:02:51
did. Did
1:02:51
she ever apologize for trying to get both of
1:02:53
us fired? I think
1:02:54
not. She apologized for that. She apologized
1:02:57
subsequently for this Mike Pesca thing.
1:02:59
She said
1:03:00
that her mental health, like, interacts
1:03:02
poorly with online stuff. I think maybe --
1:03:04
No. Sure. -- this is this is a fucking pattern,
1:03:06
like, eight years now. Like —
1:03:10
it's really gross, obviously, to tell people to
1:03:12
kill themselves. It's very gross to constantly
1:03:14
try to get people fired and banned from stuff, and
1:03:17
I don't know. I find it — I just can't really
1:03:19
get past this. This is a whole other level.
1:03:21
It's something that marks you
1:03:23
as — I don't know if she's a bad person. Maybe she's,
1:03:25
like, good to her dog
1:03:27
and her partner. Hitler
1:03:28
was good to his dog too. Hitler
1:03:29
was also good to his dog. So we're comparing her directly
1:03:31
to Hitler is what we're saying. No. It's just it's really
1:03:34
shitty behavior. And to suspend Mike
1:03:36
Pesca for calling her an activist suggests
1:03:38
this is already a very broken community.
1:03:40
Anyway, moving on from that.
1:03:44
This, I think, has the desired effect,
1:03:46
which is if you check out journa dot host, you'll
1:03:49
see no sign of dissent on controversial issues
1:03:51
like puberty blockers. Like, it's clear this is not a
1:03:53
community you should go anywhere near if you
1:03:55
want to engage in debate or discussion, which
1:03:57
used to be things journalists did
1:03:59
regularly. They would disagree about stuff and
1:04:01
say so. So you will
1:04:03
clearly get dogpiled or even suspended
1:04:06
if you, for example, post a link to a New
1:04:08
York Times article by two
1:04:10
celebrated investigative reporters. So that's gonna be
1:04:12
a very healthy community that's gonna do good stuff,
1:04:14
I think. Did you try to get in? I didn't.
1:04:16
It's like, my ego. I wanted you to, a
1:04:18
fair amount. Oh, you've got to. But my ego —
1:04:21
my ego wouldn't let me. I'm not gonna get rejected
1:04:23
by Adam Davidson from this shitty little community
1:04:25
that kicked out Mike Pesca. So
1:04:27
unlike a lot of people, I'm not gonna make confident
1:04:30
predictions about, like, the future of Twitter, the future
1:04:32
of Mastodon. because who knows what will happen? Maybe
1:04:34
Twitter will collapse. Although, I think people are vastly
1:04:36
overstating that likelihood. If I had
1:04:39
to guess, journa dot host won't
1:04:41
matter six months from now. A few
1:04:43
reasons for that. One is that, as you mentioned,
1:04:45
Mastodon's interface and structure just both
1:04:47
fucking suck. It's extremely confusing
1:04:49
compared to Twitter. The whole, like, federated
1:04:52
servers instance blah blah blah is not
1:04:54
intuitive and you can accuse me
1:04:57
of not putting much effort into trying to learn how
1:04:59
it works, because I haven't. But you know what else?
1:05:01
I've never put any effort in understanding Twitter
1:05:03
because you don't have to because it just basically works
1:05:05
even though it sucks and it's horrible and so on. Right?
1:05:08
So
1:05:09
I think the fact that, like, it's hard to even figure
1:05:11
out how to do mastodon is
1:05:13
one obstacle. But I think the biggest reason
1:05:16
journa dot host probably won't matter in six months is that
1:05:18
journalists are incredibly narcissistic.
1:05:20
And validation from the masses
1:05:22
is incredibly important to them. And so
1:05:24
on Twitter, you get that constantly. I can make a dick
1:05:27
joke and get fifty likes. On a gate
1:05:29
kept Mastodon server, you won't get that validation
1:05:31
except from the same circle of fellow journalists.
1:05:34
which will get old after a while. Plus,
1:05:36
journa dot host has already been blocked by a bunch
1:05:38
of other instances or servers, apparently,
1:05:41
for various reasons. Like, transphobia or
1:05:43
something, right? It's been — it's
1:05:45
been accused of transphobia for not
1:05:48
blocking access to a front page
1:05:51
New York Times article. And
1:05:53
the folks who want to ban everything and get
1:05:55
everyone fired, they're not shy
1:05:57
about showing who they are. Are they? Like, it's very
1:05:59
clear what they want.
1:05:59
They don't — they don't think you should be
1:06:01
allowed to disagree with them on their
1:06:04
pet subjects without getting in trouble. And
1:06:06
they're not shy about it. It's
1:06:08
crazy that we're still debating whether this is okay,
1:06:10
when it's obviously poison to journalism.
1:06:13
But end
1:06:13
of the day, journalists are not gonna hang out in
1:06:15
a place where they can't have their ego stroked. And you
1:06:17
can't have your ego stroked on Mastodon
1:06:19
when you're just instantly blocked.
1:06:22
So then some of them will surely break off and form
1:06:24
another instance, but that'll dilute everything. That one
1:06:26
will get blocked by other psychos. The project
1:06:28
just seems doomed to fail, I
1:06:29
think. I think you're probably right about that. I
1:06:31
mean, some people have been on Mastodon for six
1:06:33
years now and they love it and it's a totally different community
1:06:36
and they find what they what they are
1:06:38
looking for there. I don't think journalists are those
1:06:40
people, and I think the same things that
1:06:42
make Twitter toxic,
1:06:45
the quote tweet button, the dog piles,
1:06:47
all of that shit going viral — that's
1:06:49
exactly why people are on the platform.
1:06:52
And Twitter itself could do away with
1:06:54
that. Like, if you wanted to make Twitter less
1:06:56
toxic, get rid of the quote tweet button. Of course,
1:06:58
people will just take screenshots and find
1:07:00
ways to mimic it. They
1:07:01
could just ban you and hate speech would go down
1:07:03
thirty percent.
1:07:04
Yeah. And they might because I've been shit talking
1:07:06
Elon Musk. But
1:07:08
those are the things that drive traffic
1:07:11
and interest. And drama — what people want
1:07:13
is drama. And the only time that Mastodon
1:07:15
was interesting to me was
1:07:18
when this drama was going on with Pesca
1:07:20
and Parker Molloy. And there's this other,
1:07:22
like, weird usability thing
1:07:24
where content warnings are huge on
1:07:26
Mastodon. It's like part of the ethos
1:07:28
is to content warn everything.
1:07:30
And what this means is that
1:07:33
a lot of the posts have a little
1:07:35
tag, and it'll be, like, politics
1:07:37
or something like that. And so you have to click that
1:07:39
to see the actual text of the post.
1:07:41
And I just think in terms of usability, like, making
1:07:44
me click one more thing, like, an expand
1:07:46
button for articles or whatever. If you make
1:07:48
me click one more thing, I'm just gonna lose interest
1:07:50
pretty quickly. My fingers are only so strong.
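Another aside, not from the episode: the click-to-expand behavior Katie describes is visible in Mastodon's API, where each status object carries a `spoiler_text` field and clients hide the `content` behind it. A rough sketch of that client-side logic in Python, with the sample post invented for illustration:

```python
import json

# An invented, trimmed-down Mastodon status; the field names
# (spoiler_text, content) match the real API's status entity.
status_json = '{"spoiler_text": "politics", "content": "<p>A take about the news.</p>"}'

def render(status: dict) -> str:
    """Mimic a client: if a content warning (spoiler_text) is set,
    show only the warning until the reader clicks to expand."""
    if status.get("spoiler_text"):
        return f"[CW: {status['spoiler_text']}] (click to expand)"
    return status["content"]

print(render(json.loads(status_json)))  # -> [CW: politics] (click to expand)
```

An empty `spoiler_text` means the post displays normally, which is why CW-heavy servers feel like one extra click per post.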
1:07:52
Yeah. I — if you just go to journa dot host,
1:07:55
you will lose interest because it's like they badly want
1:07:57
this to be the new Twitter, the new thing. It's not going
1:07:59
to be. So I think we're
1:07:59
probably stuck with Twitter for now if it doesn't collapse,
1:08:02
which I don't think it will. But
1:08:03
Let me — let me tell you what's at the top of journa dot
1:08:05
host right now.
1:08:08
George Takei: it's Follow Friday,
1:08:10
so I hope folks utilize the hashtag
1:08:12
to find my account here. This is the kind of
1:08:14
scintillating content people get from the service,
1:08:16
journa dot host: the Star Trek dude
1:08:19
asking for people to follow them.
1:08:20
What if we just take the most boring
1:08:22
people on Twitter, folks where you already
1:08:24
know their view on everything, and
1:08:26
and put them on a server that will force you to make
1:08:28
another account just to follow them? Except you can't get
1:08:30
into journa dot host unless you're a journalist, and
1:08:33
some journalists can't get in. Also, it's mass blocked
1:08:35
by other servers. Like, it — it's -- Yeah. -- good luck,
1:08:37
Adam Davidson, but we would like you to come on
1:08:39
the podcast.
1:08:39
I'll be on Post dot news. I'm
1:08:39
not, Katie. Don't even know. Just because —
1:08:41
just because you can't
1:08:44
get into Post
1:08:47
dot news. I'll be on fucking fuck dot org.
1:08:50
Anything else, Katie?
1:08:52
I think that's it. As
1:08:54
always, thank you to tracing wood grains for
1:08:56
production help. This has been Blocked and Reported.
1:08:59
I'm Jesse Single, and remember, I
1:09:01
completely support Israelis and
1:09:03
Palestinians equally unless
1:09:05
that's offensive, in which case my apportionment
1:09:08
is different.
1:09:09
And I'm Katie Herzog and also remember.
1:09:11
I want a nice girlfriend. I
1:09:13
want a nice girlfriend. I want a nice girlfriend. I
1:09:15
want a nice girlfriend. I want a nice girlfriend. I
1:09:23
Yeah.
1:09:25
They'll play that at your bar mitzvah.