Episode Transcript
0:00
We made USAA Insurance to help you
0:03
save. Take advantage of discounts when you
0:05
cover your home and your ride.
0:08
Discover how we're helping members
0:10
save at usaa.com slash
0:12
bundle. USAA. Restrictions apply.
0:16
I had an AI experience this week.
0:19
I just started seeing a new doctor. And
0:22
I go to my first sort of intake
0:24
appointment with this doctor. And the doctor comes
0:26
into the room, nice guy. And
0:29
he puts this little thing down on the
0:31
table next to him. It's basically
0:33
like a phone, but the screen is off and it's got
0:35
a little microphone attached to it. And
0:37
I'm thinking, okay, he knows I'm a podcaster. He's
0:40
trying to make me feel comfortable. In
0:42
case you have anything you want to say and want to
0:44
kind of get it down, you can do it in
0:46
the office. Exactly. And then he tells me,
0:48
this is my AI scribe. A scribe? Yes.
0:52
We haven't seen scribes as a job in this country for thousands
0:54
of years. Since medieval times. No,
0:56
but the scribes are the people who accompany
0:58
doctors when they're meeting with patients
1:00
and write down what people say. Yeah. So
1:03
now this job is being taken over by AI. And
1:05
the doctor explained to me that he has this
1:07
app that just sits on this little phone that's
1:09
purpose built for this kind of thing. And
1:12
it will take notes on everything we say during
1:14
the meeting. And it'll condense our notes
1:16
down into a visit report and a summary.
1:19
And it'll put it into my file. And like that will
1:21
save him tons of paperwork. And
1:23
sort of based on everything that it
1:26
writes down about you, that's all fed
1:28
into Facebook so that it can show
1:30
you targeted ads on Instagram. Is that
1:32
right? I think HIPAA protects that, but we
1:34
should check on that. I didn't read the fine print. We should
1:36
change that rule. I'm
1:41
Kevin Roose, a tech columnist at the New York
1:43
Times. I'm Casey Newton from Platformer. And this
1:45
is Hard Fork. This week, Apple's got a
1:47
brand new mixed reality headset and Kevin and
1:50
I tried it. Then a dramatic
1:52
hearing over child safety in the
1:54
Senate. And finally, how a single
1:56
car accident took down the self-driving
1:58
car company, Cruise. Casey,
2:17
we went on a field trip last week. We
2:19
did, you know, as much as we
2:21
enjoy being in this studio, this beautiful environment that
2:23
we're in right now, it is awesome
2:25
from time to time to leave the confines
2:27
of the studio and go out into the
2:29
world and get our hands on some new
2:31
technology. Yeah, so we were both invited down
2:33
to Cupertino, California, to the headquarters of Apple
2:35
to try out the Vision
2:37
Pro, which goes on sale this week. We've
2:40
talked about it on the show before. This
2:42
is Apple's, what they're calling a spatial computing
2:44
device, essentially a VR or an AR headset,
2:47
although they don't use those terms. And
2:50
it's attracting a lot of attention, first and foremost,
2:52
because it costs $3,500 for the
2:55
base model. But this is,
2:57
I would say, the biggest hardware release
2:59
of the year. And so
3:01
I was very excited to go down to Cupertino and
3:04
try it out myself. Yeah, I was. Now, we should
3:06
say, you know, a bunch of journalists did get their
3:08
hands on this before we did. You know, I would
3:10
say that we were sort of brought in as the
3:12
kind of cleanup crew of the journalists. They were saving
3:14
the best for last. They saved the best for last.
3:17
We were the last ones to get our hands on
3:19
this. And I think we'd like to
3:21
share some impressions. I would say
3:23
that, you know, I think we have some positive things
3:25
to say about it. I know that sometimes when some
3:27
of our listeners hear journalists saying positive things about technology,
3:30
it sends them into a rage. So I just want
3:32
to say sort of preemptively that we will also continue
3:34
to criticize Apple. But sometimes
3:36
people do cool things. Yeah. Yeah.
3:39
So first, let's just talk about what this headset is. Yeah.
3:42
So Apple calls it a spatial computing
3:44
device. And in many ways, they've set
3:46
it up to be the eventual successor
3:48
to the Mac computer. So that is
3:50
sort of the level at which Apple
3:52
is thinking about it. Of course, it's
3:54
still very much in exploration mode. We
3:56
don't know if it will get all
3:58
the way there. But the Vision Pro is
4:00
an effort to see, can you
4:03
move away from laptops with keyboards
4:05
and this sort of one physical
4:07
display, put a helmet on your
4:10
face, stay in it all day,
4:12
have essentially infinite displays, manipulate
4:15
objects with your fingers, navigate the
4:17
device with your eyes, and invent
4:19
a kind of new paradigm of
4:21
computing. So this is like a
4:23
really big swing, Kevin, right? Like
4:25
Apple's got a lot of recent
4:27
successes, whether it's the watch,
4:30
the phone, the iPad, you name it, all
4:32
of those are pretty big businesses, but I
4:34
think all of them were easier to accomplish
4:36
in their own ways than what Apple is
4:38
trying to do with the Vision Pro. Totally,
4:40
it's a very ambitious kind of project. It's
4:42
been many, many years and billions of dollars
4:44
in R&D in the making, and
4:46
it really felt like it.
4:49
Like it's a beautiful device. It looks
4:51
like a pair of ski goggles. It's
4:54
not kind of bulky or clunky
4:56
like other VR headsets that I've
4:58
worn. It really looks like an Apple
5:00
product, and it has the
5:02
price tag to match. Yeah, and at
5:05
the same time, Kevin, I think the
5:07
most important question about this device, which
5:09
I do think remains mostly unanswered, is
5:12
what is it for or who is it for?
5:14
Where somewhere in there, I think, is the real
5:16
question about this device, and I think as Apple
5:18
guided us through some demos, they tried to answer
5:20
those questions for us. Yeah. Yeah. And I think
5:22
we should also just say, as sort of a
5:25
blanket caveat to this segment, this
5:27
is not going to be a full review because
5:29
we did not get a chance to really take
5:31
this thing for a full test drive. My
5:33
demo was about 45 minutes. I think yours
5:35
was about the same length. We were
5:37
not allowed to actually take one home and test
5:39
it in our own sort of home environment. We
5:42
couldn't throw it down a flight of stairs to see if it broke.
5:44
Exactly. And I would say
5:46
it was a heavily curated experience. Like Apple
5:49
definitely had a set of things that they
5:51
wanted to show, at least me,
5:53
and so it didn't feel like I actually got
5:55
to play around on my own terms. Yeah, that's
5:57
right. So these are impressions from a very guided,
6:00
very curated look at this new device. Yeah,
6:02
so we went down, you know, at least,
6:04
it was interesting. They had us come in
6:06
separately. It was like we were too powerful
6:08
if we had our demos at the same
6:10
time. They didn't want us to combine forces.
6:12
That was actually by my request. I want
6:14
to thank Apple for accommodating me on that
6:16
one. So
6:18
at least my experience, you know, I
6:20
meet the Apple sort of minder at
6:22
the gate. I get walked in, sort
6:24
of past this manicured lawn into the
6:26
Steve Jobs Theater, down a set of
6:28
stairs and into this, like, what is
6:30
essentially like a fake living room. So I
6:33
sit down. I have
6:35
basically two Apple employees in the room with me. One
6:37
is sort of giving me the demo and one is
6:39
just kind of there keeping tabs on the situation. And
6:42
the first thing you have to do is kind of do a
6:44
little setup tutorial thing to sort of calibrate
6:47
it to your eyes and your face. They
6:49
scan your face because there are no controllers
6:51
with this thing. The way that you control
6:53
your cursor, essentially, is just by looking around. So
6:56
once I got set up with the eye tracking
6:58
and all the gestures, then Apple
7:01
showed me these. What I thought was
7:03
the highlight of the demo, frankly, were
7:05
these things, spatial photos and video. Did
7:07
you see these? Yeah, this is really,
7:09
really cool. So
7:12
you can either take these photos with your
7:14
iPhone or you can take them with the
7:16
headset itself, although it seems like in just
7:18
about every case, you're better off taking these
7:20
things with a newer iPhone. Well, they only
7:22
work on the iPhone 15 Pro and Pro
7:24
Max. If you don't
7:26
have last year's iPhone, this doesn't work. So
7:29
these are essentially 3D photos and
7:31
videos and you can view
7:33
them in 3D in the
7:35
Vision Pro itself. And
7:38
I don't know, have you spent a lot of time playing
7:40
around with 3D photos and videos? No,
7:42
I mean, not really. I can remember going down
7:44
to Facebook years ago and having them show me
7:46
3D photos and sort of telling me this was
7:48
the future and you just kind of tilt your
7:50
phone around and be able to shift the perspective
7:52
a little bit. Not that impressive. On the Vision
7:54
Pro, it does feel a little
7:56
bit more of like a Black Mirror situation where
7:59
you're sitting inside somebody else's memory particularly with these
8:01
videos like they showed me this spatial video about
8:03
family having breakfast and of course everyone's like a
8:05
young family and everyone's very adorable and they're pouring
8:07
orange juice or whatever and you just like
8:09
feel like you are there with them and
8:12
as I was looking at this I thought like
8:14
I could see how this would be some family's
8:16
cherished memory and 20 years from now
8:18
the kids are all grown up and they strap on a
8:20
helmet and they revisit this and that that is like a
8:22
special thing for them. Yeah I mean this blew
8:24
my freakin mind I'll be totally honest I was
8:27
like very ready to be skeptical because I have
8:29
played around with a lot of 3d cameras in
8:31
the past I used to take 3d photos and
8:33
videos just sort of like because I wanted to
8:36
like relive I'm like a camera dad you know
8:38
and like I have a kid and yeah I
8:40
just take a lot of videos and so I've
8:42
wanted something that feels a little bit more immersive
8:45
I see and this thing is
8:47
incredible I mean I saw the same demo
8:49
you did it sounds like there were a
8:51
couple photos and videos there was you know
8:53
what looked like a birthday party there was
8:55
a mom sort of making bubbles with
8:57
her kid and we should explain it's
8:59
not these are not sort of like
9:01
just projected at you in like a
9:03
dark environment you can see these kind
9:05
of overlaid on the real world because
9:07
of this pass-through display that Apple has
9:09
built so I was in this living
9:11
room this fake living room you know
9:13
with these Apple employees around me with
9:15
sort of this coffee table in front
9:17
of me and this memory this video
9:19
just kind of popped up you know
9:21
amidst all of that and
9:24
you're right like it really did trick my
9:26
brain into thinking that I was there in
9:28
this scene it was I've never seen anything
9:30
like it and like I'll be honest I
9:32
got sort of a lump in my throat
9:34
because I was picturing like capturing my son
9:36
like first steps this way and like revisiting
9:38
it 20 years from now and like you
9:41
know that kind of thing I think will make
9:44
this more compelling for
9:46
especially parents yeah it is powerful and I think
9:48
particularly if you've already got a phone in your
9:50
pocket and you can take videos like this it
9:52
winds up being pretty easy to take the video
9:54
and maybe if you don't even buy the first
9:57
Vision Pro, which most people are absolutely not going
9:59
to do maybe three or four or five
10:01
generations from now, you still have these videos
10:03
saved in your iCloud somewhere, and you're able
10:05
to relive them again. So it's actually like
10:07
an interesting development. Right, so after this spatial
10:10
photo and video demo, I got
10:12
this demo of basically a movie theater
10:14
experience where they showed me a clip
10:17
from Super, did you get the Super
10:19
Mario Brothers 3D clip? And
10:22
then a clip from Star Wars, and
10:24
you could kind of transform your surroundings
10:27
in the headset so that it
10:29
looked like you were watching the
10:31
Star Wars trailer on Tatooine, or
10:33
you could be in a volcano
10:35
in Hawaii watching Super Mario Brothers
10:37
in 3D. Yeah, and an interesting
10:40
aspect of this is that like
10:42
the Apple Watch, the Vision Pro has
10:45
this little control feature that Apple calls
10:47
a Digital Crown, and this little wheel
10:49
that you can spin. And
10:51
on the Vision Pro, you can dial
10:53
the level of immersion up and down.
10:56
So if you want to watch a movie while
10:58
pretending that you're sitting in a volcano crater, you
11:00
can sort of crank that dial all the way
11:02
up and look all the way around you in
11:04
360 degrees and you see
11:06
a volcano crater. Or if you just kind
11:08
of want like a hint of that, you
11:10
can make it semi-transparent, and the cameras in
11:12
the Vision Pro will show you your surroundings
11:15
in really high definition. Yeah, and it actually
11:17
works pretty well. They call this feature pass-through,
11:19
which is a little bit of a misnomer
11:21
for reasons that maybe we should take a
11:23
minute to explain. It's not actually like the
11:26
display is not becoming transparent. It's just
11:28
it has cameras on the outside that
11:30
are sort of capturing the room around
11:32
you and then piping it into the
11:34
headset as your video feed. So it's
11:36
a cool trick. It didn't
11:39
bother me. Like it actually felt like I was looking
11:42
through a semi-transparent display, which
11:44
I just thought was a very impressive
11:46
technical accomplishment, and did actually feel like
11:48
I could kind of control the sort
11:50
of immersiveness of the experience. Yeah, so
11:52
you mentioned these videos
11:54
they showed us, and I want to
11:56
say that to me, it was
11:59
the entertainment-focused stuff that they
12:01
showed us, and mostly just watching video, that
12:03
was the most compelling stuff that I got
12:05
to try during this demo. You know, there's
12:07
a moment where we started, I started to
12:09
watch the Star Wars trailer, and you
12:11
have this like light seal that presses into the device
12:13
that blocks out all the light around you. And so
12:15
everything goes black in the exact same way when you're
12:17
like sitting in a movie theater, and right before they
12:19
start to show the movie, everything goes black, and you
12:21
sort of have that moment of like, okay, now I'm
12:23
just gonna focus on a movie for two hours. I
12:26
felt that way in the Vision Pro. And I like that
12:28
because I don't know about you, Kevin, I am somebody who
12:30
can like barely watch a movie on my TV without
12:32
just scrolling on my phone the entire time. And I
12:34
had that sense of relief of like, oh gosh, like
12:37
maybe if I had one of these things, I'd
12:39
actually watch a movie from start to finish. Wow,
12:41
yeah, this piece of technology will definitely save your
12:43
attention span. That's,
12:46
that always works. Okay, so that
12:48
was the entertainment piece. Then
12:50
what I was really curious to see was
12:52
the productivity piece, because Apple is not just
12:54
billing this as a very cool sort of
12:56
way to watch movies at home
12:58
or take these spatial photos and videos.
13:00
It really wants office workers to buy
13:02
this and use it for work. So
13:05
in a lot of the promotional material, they would
13:07
show these scenes of people at their desks, and
13:09
they've got their Mac monitor in front of them,
13:11
and then they put on their Vision Pro, and
13:14
they can sort of open up windows in space
13:16
and kind of move them around and resize them,
13:18
make them as big or small as they want.
13:21
And basically, people can create their own
13:23
ideal desk setup and take that with
13:25
them, whether they're at a coffee shop
13:27
or on a plane or sitting at
13:29
their desk. So I was
13:31
excited to try this stuff, and I wanna
13:33
hear your impressions. My impressions were this stuff
13:36
is not quite there. So
13:38
on the positive side, I would say
13:40
that this has the
13:42
best visual fidelity of any headset I've ever
13:44
used. These are really high-end displays. And so
13:47
at one point, they told me to open
13:49
Safari, the web browser, visit any website you
13:51
like. I, of course, entered Platformer to see
13:53
how that was gonna look in mixed reality,
13:57
the text was very crisp. You know, you
13:59
can scroll through. I visited some other
14:01
web pages, like the photos looked really great.
14:03
And yet I thought like the
14:06
amount of ingenuity that has gone
14:08
into recreating digitally an experience that
14:10
was already working perfectly fine for
14:12
me in a laptop feels a
14:14
little bit crazy here. And
14:17
so while I appreciate all of
14:19
the skill and the creativity that
14:21
went into making this thing work, I still couldn't
14:23
figure out why I would wanna do all my
14:25
web browsing and typing in the Vision Pro as
14:28
opposed to just like my MacBook. Well, I've always
14:30
had the thought while browsing Platformer and reading
14:32
your articles, like I wish I could have
14:34
this just way bigger and closer to my
14:37
face. If
14:39
this could just be on my face, that
14:41
would be the ideal way to read my
14:44
favorite tech newsletter. I could direct feed into
14:46
your digital cortex. I'm very excited for this
14:48
feature, but in seriousness, I thought this was
14:50
the least impressive part of the demo. I
14:53
also asked if I could see like something else
14:55
in the kind of productivity world, like what else
14:58
you got for the office worker? And
15:00
they showed me this part of the demo. I don't
15:02
think they showed you. That's right. Wow. But
15:05
it was a feature in Keynote,
15:07
the slideshow app. You got the
15:09
Keynote demo. I did. And
15:13
so this is their like sort of
15:15
PowerPoint equivalent. And they showed me this
15:17
thing where in the Vision Pro you
15:19
can practice giving a presentation to
15:22
like an empty conference room or an empty
15:24
theater. I would love to see you spending
15:26
more time just practicing talking. I think it
15:28
would have really improved the podcast. And
15:31
I didn't think it was that great a feature. It
15:33
felt sort of like a gimmick to me, but it
15:35
is exactly the kind of thing that office workers are
15:37
going to try to use to convince their bosses to
15:39
let them expense a Vision Pro, which I did appreciate. Well,
15:43
while you were getting that demo, I
15:45
was getting another productivity demo, which was
15:48
Algoriddim's DJ app. Now this
15:50
is an app I've used for many years. I've
15:52
DJ'd my friend's weddings using this app. Wait, really?
15:55
Yeah, there is. I didn't know you were a DJ. I'm
15:57
not a good one, but like for a close friend, I'll DJ your wedding. What's
16:00
your DJ name? Um, DJ TaskRabbit. No,
16:04
that really is. That's what I could be getting. Wait,
16:08
really? Yes. Wow.
16:11
So yeah, DJ TaskRabbit was the best around. I've got
16:13
my kid's birthday party coming up. Are you available?
16:15
Yeah, but DJ TaskRabbit was very expensive. Just keep that
16:18
in mind. It's an easy
16:20
assignment. All you have to do is play the Cocomelon version
16:22
of Wheels on the Bus 72 times. Okay,
16:24
I think I can do that. Okay,
16:27
so anyway, you were looking for
16:29
productivity enhancements for your side gig as
16:32
a DJ. Exactly. And so we pull
16:34
up this app and you sort of
16:36
immediately see a couple of turntables, a
16:39
couple of records. And
16:41
I should say, in 2022, Meta released its highest-end
16:43
headset to date, which is called the Meta Quest Pro. And
16:45
when I went down to try that, I also
16:48
got to use a DJ app. And, you know,
16:50
there was some cool stuff about it. It
16:53
was not the DJ app that I used when I was
16:55
in Cupertino. But one
16:57
of the things that I don't love about the MetaQuest
16:59
Pro is that the visual fidelity is like,
17:01
okay, but it's not great. It doesn't really
17:04
feel like you're standing in front of the
17:06
wheels of steel. Well, in
17:08
this DJ app on the
17:10
Vision Pro, the visual quality is so good
17:13
that you could just kind of pretend that
17:15
you were DJing. And
17:17
once I got the music going, I just
17:20
intuitively reached down to scratch, you know,
17:22
do a little bit of wika-wika, and
17:24
it worked perfectly. And like, you know,
17:27
just as you would want it to. And
17:29
so there was part of it that was like, you
17:31
know, just pure skeuomorphism of like, we will make a
17:33
DJ rig and you can DJ on it here in
17:36
mixed reality. Well, but they also had
17:38
this sort of enhancement, which I can only
17:40
describe as like a kind of 3D
17:45
box that sat on top of the wheels of
17:47
steel. And you could reach
17:49
your hand into it. And as you just sort of
17:52
moved your hand around, you can manipulate the sound. So
17:55
you sort of felt like a DJ
17:57
wizard that was just kind of distorting.
18:00
and changing the music that was playing. Did
18:02
it sound good? No. Would I be
18:05
asked to leave a wedding if I were using
18:07
this technology in DJing? Yes, I would. But did
18:09
I enjoy it during my demo? Kevin, yes, I
18:11
absolutely did. I'm so glad for you. Wow. I'm
18:14
still reeling from the revelation. You've been holding out
18:16
your secret side career as a DJ on me.
18:18
I love music. Music's very important to me. Wow.
18:21
Okay, so that's the demo that we got. We
18:23
also have to talk about what we didn't get
18:26
to try, which is I wanted
18:28
to FaceTime call you and see how
18:30
our personas would look. So personas. Yeah,
18:32
tell us about these personas. So personas
18:35
are this feature, Apple reminds people, it's
18:37
a beta feature for reasons that will
18:39
shortly become clear. Basically trying
18:41
to solve the problem of how do you
18:43
do a video call using one of these
18:45
headsets? Because if you are
18:48
just doing a normal Zoom call or a FaceTime
18:50
call, like your face is covered
18:52
by this headset. How are you gonna look
18:54
normal on a video call? And Apple's answer
18:56
to this question is that you basically create
18:58
a deep fake of your face called
19:00
a persona. I would probably
19:02
say an avatar over a deep fake. Well,
19:06
you know, potato, potahto. But basically,
19:08
you scan your face and then Apple
19:10
sort of renders this 3D model of
19:12
your face that it then uses
19:14
as kind of a virtual stand-in for
19:17
you on video calls. And we were
19:19
not able to try this, but
19:21
several other reviewers have been able to.
19:23
And I will just say, they look
19:26
very funny. They do. People,
19:28
I was reading on threads yesterday, people were
19:30
saying that your persona sort of makes you look like
19:32
the PlayStation 3 version of yourself. Where
19:35
it's like, it's recognizably
19:37
you, but like it's a
19:39
little blocky, you know, it feels like
19:41
some of the individual polygons are like
19:43
almost visible. And of course, you're
19:46
like less expressive in this form than you would
19:48
be like, you know, using a real face. And
19:50
so, yeah, people were having a lot of fun
19:52
yesterday looking at the personas. Yeah, one person I
19:54
saw compared them to like NPCs in like a
19:56
video game who are gonna offer you a side
19:58
quest. Yeah. And
20:00
I just think like, you know, clearly this is
20:02
a product that's going to get better over time. Like this
20:05
is the V1 but
20:07
I think you know if someone showed up in
20:09
a meeting with me as
20:13
one of these personas like the meeting agenda
20:15
is over, like, we're talking about that. And
20:17
it's like not going to go unnoticed by
20:20
anyone in the meeting at least at first
20:23
There's so much in the Vision
20:25
Pro that is like, it
20:27
is sincerely cool. There is some stuff in the
20:29
Vision Pro that I think might be the best
20:32
way to do things, particularly consuming video, although even that
20:34
is kind of still a question mark for me. And
20:36
then there is stuff that is so
20:38
much more complicated, so
20:41
much more expensive, and still obviously worse, and
20:43
the social features are really where I
20:45
think that is the most true. So again,
20:47
can all of this get better over time?
20:49
Yes, I'm sure it will. But at the
20:51
current state of the art, this feels a
20:54
little bit more like a party trick than
20:56
a way that people are gonna be. So
20:58
let's do a little summary. What was your favorite thing
21:01
about the Vision Pro and your least favorite thing?
21:03
So, and this
21:05
is another demo that a lot of people
21:07
have talked about already But we did this
21:09
little dinosaur demo where in the
21:11
mock beautiful living room, there was a blank
21:14
wall and as I looked up with my
21:16
Vision Pro, we started this app, and basically
21:18
imagine it's like the
21:20
wall opens up and all of a sudden you're
21:22
looking at this prehistoric landscape and a butterfly flies
21:25
out and you raise up your finger and the
21:27
butterfly just flies over and it just sits right
21:29
on your finger and you know, you move your
21:31
finger around the butterfly moves with you. So
21:33
that's using the eye tracking capabilities, right?
21:36
Is that gonna help you in
21:38
your career as your side career as
21:40
a butterfly tender? Yeah, as a huge
21:43
lepidopterist, let me just say. Yes,
21:46
it's the word. Oh, hey, what's
21:48
lepidoptery? That's the study of
21:51
butterflies, I hope.
21:53
You know, college. Anyways,
21:56
so after the butterfly Do
22:02
you know Vladimir Nabokov, in addition to being one
22:04
of the great writers, was also a lepidopterist and
22:07
named many species of butterfly. You
22:09
contain multitudes, my friend. So then
22:12
the dinosaur walks into the frame and
22:14
there's like a little, you
22:16
know, they're dancing around on the prehistoric rocks or
22:18
whatever and then, you know, one of them gets
22:21
chased away and then the, and I don't know
22:23
that these are velociraptors, but they basically look like
22:25
velociraptors, would you say? Yeah. And
22:27
then, you know, one kind of comes up and you can like sort of, you know,
22:29
try to pet it and like maybe it'll let you pet it
22:31
or maybe it'll roar at you. Why am I talking about
22:33
this at length? Well, to me,
22:36
I saw this, I thought like this
22:38
just feels like new storytelling experiences will
22:40
be enabled. And I truly did believe
22:42
after using the Vision Pro that as
22:44
the technology gets better, cheaper, more widely
22:46
adopted, it's going to enable all sorts
22:48
of new kinds of storytelling experiences and
22:50
you know, may eventually change like the
22:52
kinds of stories that get told and
22:54
how they are made. And
22:56
this is the stuff that I saw that I thought like,
22:59
oh, there's really something here and I believe
23:01
in it more now that I have tried the Vision
23:03
Pro than I did before. And what
23:05
about your least favorite thing? My
23:07
least favorite thing is I don't
23:10
know if this thing would be comfortable on
23:12
my face for even an hour and a
23:14
half. When I have used
23:16
VR headsets in the past, there was never
23:18
a day when I wasn't relieved to be
23:20
taking the headset off of me where I
23:22
didn't immediately feel better when I was taking
23:25
it off. And even though we
23:27
didn't use the Vision Pro that long, I
23:29
still did basically have that feeling. I
23:31
just kind of had like a little bit of tension like at the temples
23:33
of my head. It was not a
23:35
headache and it went away very soon after I took this
23:37
thing off. But it did make me wonder like, if I
23:40
just want to like watch, you know, Star Wars on this
23:42
thing, how am I going to feel at the end of
23:44
two hours? I don't know the answer to that question. And
23:46
I got to say, Kevin, until I know the answer to
23:48
that question, I don't know if I want to spend $3,500
23:50
for the stacks. Yeah. Yeah.
23:53
Okay. I loved the spatial
23:55
photos and videos. I thought, okay, if this
23:58
were $1,500, maybe
24:01
that's worth it just for the
24:03
home movie potential alone. At $3,500,
24:06
that's a very expensive home movie collection
24:08
to start building, but I
24:10
think that's by far the most compelling part of
24:12
the demo, at least for me. Yeah. And how
24:14
about the worst? And I would
24:16
say the worst was for me the productivity stuff.
24:19
Like, I understand that for some people, if
24:21
you are a person who works, you know,
24:23
with a multi-monitor setup and you like having
24:26
tons of windows all over the place while
24:28
you work, you are probably the
24:30
prime candidate for using something like the Vision
24:32
Pro, but I'm not a person who likes
24:34
a bunch of clutter and chaos in my
24:37
visual space when I work. I
24:39
like focus. I like full-screen windows. I like not
24:41
a lot of other stuff going on. And
24:43
so for me, it would just be too distracting, and that
24:45
would keep me from using it for work. All
24:48
right. So what do we think the prospects of
24:50
this device are? Like, a year from now, what
24:52
do you think we will be saying about how
24:54
the Vision Pro did and like where it goes
24:56
from here? I don't know. Like, it would not
24:59
surprise me if this is a fairly,
25:01
you know, slow launch for Apple. It
25:04
is a very expensive device. It's a brand
25:06
new category. You know, people,
25:09
it really is one of these things that you need to kind
25:11
of see for yourself to fully understand. You can't really just like
25:13
watch a video or read a couple of reviews and get a
25:15
sense of what it's going to be like to have one of
25:17
these things strapped to your head. The first
25:20
version might not be for everyone, but
25:22
pockets of people will kind of find it and
25:24
adapt it to their own use cases. Yeah, that
25:26
sounds right to me. You know, I think some
25:28
of the analysts before pre-orders went on sale
25:30
thought that maybe Apple will sell like
25:32
half a million of these in the
25:34
first year. Of course, you
25:36
know, Apple sells hundreds of millions of iPhones in a year.
25:39
So this is very small relative to
25:41
that much, much bigger business. And,
25:44
you know, on one hand, that's not a lot.
25:46
On the other hand, that's going to be over
25:48
a billion dollars in revenue. And that I think
25:50
is going to be enough for Apple to say,
25:52
we are going to continue to invest in this.
25:54
Obviously, Apple has essentially unlimited resources to keep investing
25:56
in it. And I think we
25:58
do have enough reason to believe that
26:00
there will probably be some sort of
26:02
successor platform to the laptop over time
26:05
and you know the best
26:07
way to control the future is to invent it so
26:09
I think Apple is gonna kind of keep going in
26:11
this direction what I am the most curious about is
26:14
what creative types do with this are
26:16
there filmmakers and designers that get their
26:19
hands on one of these and think
26:21
oh I could actually make a really
26:23
cool 10 or 15 minute
26:25
little miniature story and I'm gonna put
26:27
that in the Vision Pro App Store
26:30
and you know maybe see it see if kids like
26:32
it and just kind
26:34
of see where that goes maybe in this
26:36
first year the install base is just gonna
26:39
be too small for anyone to justify that
26:41
kind of investment but I do
26:43
believe the technology is good enough that if some
26:45
people took it really seriously and tried they would
26:47
make something really cool that people might spend a
26:50
lot of money to do yeah I can see
26:52
that I also think for me the most interesting
26:54
question about all this is like how is this
26:56
device going to be received like out in public
26:58
because if you remember the Google Glass period where
27:00
Google had just released this amazing like you know
27:02
computer that sat on your face and could do
27:04
all these things it was not that amazing but
27:06
they were
27:08
billing it as this amazing thing and then they you know
27:10
they sent it out to people and people started showing up
27:12
in the real world with them and it
27:14
was mocked and derided and one guy even
27:17
got punched for wearing Google Glass on the
27:19
streets of San Francisco and
27:21
people started calling them Glassholes and it became
27:23
this sort of social stigma like I don't think
27:25
that is probably going to happen with the Vision
27:27
Pro because, like, A, it looks a little bit
27:29
cooler than Google Glass, and B, like, it's not
27:31
the kind of thing that people are gonna want
27:33
to you know wear around on the street walking
27:35
to and from work all day no and if
27:37
you see somebody wearing one you snatch it off
27:39
their face and take it, you just made $3,500. But
27:43
that is like my question is like yeah you
27:45
sometime in the next few months you
27:47
will get on a plane and see
27:50
someone wearing a Vision Pro, probably in
27:52
business class and I think
27:54
a very open question and a very important
27:56
question is like how will you feel about
27:59
that six months from now? Will you be like,
28:01
oh, that's so cool. I want one of those or
28:03
will you be like that person is a loser? Yeah,
28:06
I mean, I don't know how anybody's gonna
28:08
feel when they see these things. Are
28:11
you gonna buy one? Not
28:14
right now, I think. The problem
28:16
I had with some of these previous headsets that I would try is
28:18
that they would seem compelling for a week or two And then I
28:20
would throw them in a drawer and I never turn them on ever
28:22
again. And I'm not willing to spend
28:24
almost four thousand dollars with tax to have that experience,
28:26
you know? If this thing were like $999, I
28:29
think it becomes like yeah, I'll just sort of
28:31
try that for like research purposes. At $3,500, $3,800,
28:35
I don't want to do that. My caveat is
28:37
in two or three months if some of these
28:39
experiences that I was talking about start to get
28:41
made, if maybe people actually find an interesting productivity
28:44
use, or people, like, rediscover a love of
28:46
movies because it truly turns out to be one
28:48
of the greatest sort of virtual home theaters around,
28:51
then I might do it. But this is one where
28:53
I'm just gonna let other people kind of take the
28:55
lead here and then they can tell me in a
28:57
couple months whether this thing is awesome or not. In
28:59
the meantime, I'm gonna save my money. I'm
29:02
tempted. Yeah, I gotta say, I gotta say, I'm
29:04
tempted. Yeah, like I'm not gonna buy one right
29:06
now because like I don't have $3,500
29:09
just like burning a hole in my pocket But
29:11
if I did have like a distant relative who
29:13
who died and unexpectedly left me $3,500 Like
29:18
I would be very tempted because it is cool When
29:23
we come back social media went to
29:25
Congress this week to talk about
29:27
child safety. I'll tell you what happened.
29:55
Hey, it's Anna Martin from the New York Times and
29:57
I'm here to tell you about something for New York
30:00
Times news subscribers. And
30:02
honestly, if you're a podcast fan, you're going to
30:04
want this. It's an app
30:06
called New York Times Audio,
30:08
where you can get the latest dispatch.
30:10
It's 10 a.m. in teeth, because it
30:12
really allows nice. Perfect your technique. A
30:14
splash of soy sauce, and then a lot
30:17
of red pepper flakes. I'll contemplate the future.
30:19
A computer program is passing the bar exam,
30:21
and we are over here pretending not to
30:24
be amazed by that. It has exclusive shows.
30:26
From the New York Times, it's the headlines.
30:28
Storytelling from Serial Productions and This American
30:30
Life, Act 2, a fiasco involving
30:32
a village, marauding zizadot, and some
30:35
oil, sports from The Athletic, and
30:37
those big moments she puts the
30:39
GMOs back, and narrated articles from
30:41
the Times and beyond. In recent
30:43
years, the unexpected sounds of ice
30:45
have periodically gone viral. New York
30:48
Times Audio. Download it now at
30:50
nytimes.com/audioapp. Casey,
30:56
this week, the CEOs of many of
30:59
the biggest tech companies in Silicon Valley
31:01
flew to Washington to be dragged before
31:03
Congress and grilled by a bunch of
31:05
senators. We've seen a number of these
31:07
kinds of hearings before, but this was
31:09
a big one. The CEOs
31:12
of Meta, TikTok, Snap, Discord,
31:14
and X all showed up
31:16
in person in front
31:18
of the Senate Judiciary Committee for a
31:20
hearing about harms to children on social
31:22
media. Yeah, this has been an issue
31:24
that has been burbling up for a
31:26
couple of years now. More and more
31:29
states are passing laws intended to improve
31:31
child safety online, and now Congress is
31:33
tackling it head on by
31:35
bringing all those CEOs you mentioned and
31:37
confronting them with some hard questions. Yeah,
31:40
I think you and I have both
31:42
gotten a little bit jaded about these
31:44
kind of tech hearings over the past
31:46
five or six years. We've seen it
31:48
just time and time again. A tech platform
31:50
screws up. Its executive is hauled before
31:52
Congress. A bunch of senators or congresspeople
31:54
sort of pepper them with angry questions
31:56
that are more like statements they
31:59
promise to look at it more and
32:01
pass some laws and then nothing happens. The
32:03
executives go back to Silicon Valley, Congress focuses
32:05
on other things, and nothing really changes. This
32:08
one I think is potentially
32:10
very different than that for a couple reasons.
32:12
One is it's about kids. Republicans
32:16
and Democrats disagree about all manner
32:18
of tech problems and solutions to
32:20
those problems, but I think on the issue
32:22
of child safety, this is one where there's
32:24
actually mostly bipartisan agreement that
32:26
this is a real problem. And
32:29
I also think it comes at a time where
32:31
there is actually the potential that something could come
32:33
out of this. We've seen Congress
32:35
trying to pass the Kids Online Safety
32:37
Act, KOSA. We talked a little bit
32:40
about that on the podcast last year.
32:42
That's a bill that would sort of
32:44
force social media platforms to be more
32:46
active in preventing harms to minors. We've
32:49
also seen a bunch of other laws
32:51
proposed and states that are taking action
32:54
to, for example, require social media companies
32:56
to get parental consent before permitting children
32:58
under 16 to use their platforms. So
33:00
Casey, I know that you, like me,
33:02
have seen a number of these hearings
33:05
come and go without much in the
33:07
way of action, but did this feel
33:09
different to you? Well, I think one
33:11
way that it felt different was the
33:14
way that it shook up the cast
33:16
of characters. So we're sort of used
33:18
to Congress focusing a lot
33:21
on Twitter and Facebook in
33:23
particular. Now we have
33:25
some newer platforms up and coming. Snap,
33:28
Discord, and X, the former Twitter,
33:31
all appeared before Congress for the first
33:33
time. Their CEOs appeared before Congress the
33:36
first time. And so it showed
33:38
that Congress is sort of probing
33:40
at new parts of the ecosystem in
33:42
an effort to kind of trace how
33:45
these problems are flowing across platforms. Yeah.
33:48
So let's go through some clips from the
33:50
hearing because it was pretty dramatic. The
33:52
first one that I think everyone is
33:54
talking about after this hearing was a
33:56
moment where Mark Zuckerberg from Meta was
33:59
asked to apologize. Apologize to a
34:01
number of parents in the room
34:03
whose children had been either victimized
34:06
or had in some cases taken their
34:08
own lives after being
34:11
bullied or harassed or otherwise
34:13
exploited on social media and
34:16
he did. He stood up from his chair,
34:18
he turned around, and he directly addressed the
34:20
parents in the room. Let's roll that clip
34:23
Let me ask you this there's families of victims here
34:25
today. Have you apologized to the victims?
34:29
Would you like to do so now? Well, they're
34:31
here. You're on national television. Would you
34:34
like now to apologize to the victims who have
34:36
been harmed by your product? Show them the pictures.
34:39
Would you like to apologize for what you've done to
34:41
these good people? Things
34:51
that your families have suffered. And this
34:54
is why we invest so much and are
34:56
going to continue doing industry leading efforts to
34:59
make sure that no one
35:01
has to go through the types of things that your family... If
35:07
you can't exactly hear him because he's off mic
35:09
he's basically apologizing. He's saying, I'm sorry for the
35:11
things people experienced. He says no one should have
35:13
to go through what you went through and then
35:15
he says this is why we're investing so much
35:17
in trying to prevent these harms to children. Casey,
35:20
What did you think of this? So this is
35:22
a really dramatic and a sort
35:24
of very rare moment when one of these sort
35:26
of titans of industry actually has
35:28
to be in the same room and be
35:31
confronted by people who feel like they've experienced
35:33
real harm as a direct result of the
35:35
software that Mark and his teams
35:37
have built. So I think that that's one that we'll
35:39
remember for a long time. Yeah. All right next
35:42
clip. This one was about KOSA,
35:44
the Kids Online Safety Act, and
35:47
Richard Blumenthal from Connecticut, who was one
35:49
of the creators of KOSA, basically is
35:51
going down the row of CEOs one
35:54
by one asking them whether they support
35:56
this bill. Yes or no, Mr.
35:58
Citron? There
36:01
are parts of the act that we think are great. No,
36:03
it's a yes or no question. I'm going to be running
36:05
out of time, so I'm assuming the
36:07
answer is no if you can't answer yet.
36:10
We very much think that the national privacy
36:12
standard would be great. Mr.
36:16
Spiegel? Senator, we strongly support
36:18
the Kids Online Safety Act and we've
36:20
already implemented many of its core provisions.
36:22
Thank you. I welcome that support along
36:24
with Microsoft's support. Mr. Chew? Senator, with
36:27
some changes, we can support it. In
36:30
its present form, do you support it? Yes
36:32
or no? We are aware that some groups have
36:34
raised some concerns. It's important to understand how this
36:36
is. I'll take that as
36:38
a no. Ms. Yaccarino? Senator,
36:41
we support KOSA. We'll continue to
36:43
make sure that it accelerates and make
36:45
sure it continues to offer community for
36:47
teens that are seeking that voice. Mr.
36:51
Zuckerberg? Senator, we
36:53
support the age-appropriate content standards
36:55
but would have some suggestions about
36:57
how to implement it. Do
37:00
you support the Kids Online Safety
37:02
Act? Senator, I think these are nuanced
37:04
things. And I'm just asking whether you'll
37:06
support it or not. These
37:08
are nuanced things. I think that the basic spirit is
37:11
right. I think the basic ideas in it are right.
37:13
And there are some ideas that I would debate how
37:15
to best out on them. Unfortunately. Okay.
37:18
So essentially, two of
37:20
the CEOs in the room, Evan
37:23
Spiegel of Snap and Linda Yaccarino
37:25
of X, have broken from the
37:27
other tech companies and decided to
37:29
support KOSA, this bill that has
37:31
become extremely controversial in the tech
37:33
industry. Why do you think they did that?
37:36
I think, you know, in Snap's case, they
37:38
just think it will cost them nothing because
37:40
a lot of KOSA has to do with
37:42
like algorithmic amplification and stuff that just doesn't
37:45
really concern Snap as a company that's primarily
37:47
focused around messaging. With X,
37:49
And this is not based on reporting. This is just
37:52
speculation. But I think X probably just wanted kind
37:54
of an easy win at a
37:56
time when they're facing some very serious
37:58
and important questions about why they let
38:01
deepfake synthetic nudes of Taylor Swift to
38:03
spread unchecked on the platform until they
38:05
were getting tens of millions of views
38:07
just this past weekend. This at least
38:09
enables them to say, well, look, we
38:11
did something. We supported your bill, Senator.
38:13
You know, I have a lot of
38:15
thoughts about this. But let me just
38:17
say wait until Elon Musk finds out
38:19
what is in KOSA, because I suspect
38:21
he's gonna have a strong disagreement with
38:23
Linda Yaccarino about X's support for that
38:25
bill. Okay, the last clip I want
38:27
to play is from Mark Zuckerberg, who
38:29
is talking with Amy Klobuchar, the
38:31
senator from Minnesota, about why he
38:33
thinks that the responsibility of verifying
38:36
users' ages should not fall to
38:38
Meta and other social platforms, but
38:40
should instead be handled by Google
38:43
and Apple who run the app stores.
38:47
I don't think that parents should have to upload
38:49
an ID or prove that they're the parent of
38:51
a child in every single app that their children
38:53
use. I think the right place to
38:55
do this, and a place where it would be actually
38:58
very easy for it to work, is within the app
39:00
stores themselves, where my
39:02
understanding is Apple and Google already, or
39:04
at least Apple, already requires parental consent
39:06
when a child does a payment with
39:08
an app. So it should be
39:10
pretty trivial to pass a law that requires
39:13
them to make it so
39:15
that parents have control any time
39:17
a child downloads an app, and
39:19
offers consent of that. And the
39:21
research that we've done
39:23
shows that the vast majority of
39:25
parents want that, and I think
39:28
that that's the type of legislation, in addition to
39:30
some of the other ideas that you all have,
39:32
that would make this a lot easier for parents.
39:34
Yeah, just to be clear, I remember one mom
39:36
telling me, with all these things she could maybe
39:38
do that she can't figure out, it's
39:40
like a faucet overflowing in a sink, and she's
39:42
out there with a mop while her kids are
39:44
getting addicted to more and more different apps and
39:46
being exposed to material. We've got to make this
39:48
simpler for parents, so they can protect their kids,
39:50
and I just don't think this is going to
39:52
be the way to do it. I
39:54
think the answer is what Senator Graham has been
39:57
talking about, which is opening up the halls of
39:59
the courtroom. So that puts
40:01
it on you guys to protect these parents
40:03
and protect these kids. And then
40:05
also to pass some of these laws that makes it
40:07
easier for law enforcement. Okay,
40:10
so that's the hearing in a nutshell. Congress
40:13
really wants these tech platforms to
40:15
do more to protect underage users.
40:18
And a bunch of senators sort of
40:20
don't think they're doing enough and want
40:22
to use new legislation or maybe the
40:24
courts to go after them
40:26
for not doing enough for
40:28
underage users. The tech platforms all say,
40:31
well, we're doing all these things already.
40:34
And some of them say we would support
40:36
legislation like COSA. Some of them say, well,
40:38
we don't think that's the right approach, but
40:40
we also agree that more needs to be
40:42
done to protect underage users. Basically, everyone is
40:44
agreeing that there is an important problem to
40:46
solve here. There's just some disagreements about how to
40:48
solve it. I think it
40:50
is tricky to talk about. It is very
40:52
emotional. Everyone wants children to
40:54
be safe online. I think we
40:57
have very little agreement at this
40:59
point about what does safety
41:01
mean. And I think the
41:03
fact is that there are just always
41:05
going to be some risks associated with
41:07
being on the internet. But if
41:09
you accept all of that, what is
41:11
a path forward? Well, the path,
41:13
I think, looks different depending on
41:15
what problem you're talking about. But
41:18
one place to start that I think would actually
41:20
be productive is talking about this
41:22
issue of age verification that Zuckerberg brings up in
41:25
the last clip. Well, let's talk about it. So
41:27
do you think this should be a responsibility
41:29
of tech platforms to verify how old their
41:31
users are? Or do you agree with Mark
41:34
Zuckerberg that Apple and Google should take that
41:36
on? So I wanted to
41:38
take a quick step back and say
41:40
like we have never been able to
41:42
mandate age verification in this country because
41:44
the Supreme Court has repeatedly held that
41:47
it is a violation of the First
41:49
Amendment, not because of what it does
41:51
to kids, but because it places too
41:53
high of a burden on every adult
41:55
user of the internet to have to
41:57
verify their age every time they want to use
41:59
a website, okay? So that's why we've
42:01
never had that in this country. So
42:04
that brings us to what Zuckerberg says, which is,
42:06
well, why don't you make Apple and Google do
42:08
it? And I have to say, I
42:10
think that this solution is obviously correct.
42:12
Because imagine that your child has come
42:15
of age where they now have a
42:17
smartphone, and they start downloading apps. And
42:19
it's not just one app. And it's
42:21
not just 10 apps, it's 40 apps.
42:24
And now all of a sudden, you're being asked
42:26
as a busy parent with a job to, in
42:28
40 different cases,
42:30
verify your child's identity. That
42:32
seems obviously way too onerous
42:35
for me to pass.
42:37
We talk so often on the show, Kevin,
42:40
about the dark side of living in a
42:42
world where there's only two major smartphone platforms.
42:44
This is actually a silver lining, because now
42:46
you have two gatekeepers that can say,
42:48
you know what, we can actually take it upon
42:50
ourselves to do the verification. And in fact, as
42:53
Zuckerberg points out, Apple is already doing this, if
42:55
you have an Apple wallet, it's going to ask,
42:57
Hey, are you older than 12? So I want
42:59
to quickly say,
43:01
it does not have to be Apple and
43:03
Google who are doing this, you could also
43:06
imagine some sort of industry body that is
43:08
passing along some sort of token through an
43:10
API that is made available through iOS and
43:12
Android, right? So I'm not saying 100% it
43:15
has to be Apple and Google. But that is
43:17
the basic level at which this needs to occur. Not
43:19
at the level of the app. What do you mean?
43:22
I mean, I think I agree with that.
43:24
I think it's just easier to run age verification
43:26
through the app stores. You can imagine, you know,
43:28
you take a new iPhone out of the box.
43:30
And while you're setting it up, it sort of
43:32
asks you to verify the age of the person
43:35
who's going to be using that. And
43:37
then, you know, that sort of
43:39
birth date kind of sticks and
43:41
is passed through to other applications
43:43
as you download and register for
43:45
them too. But I also think
43:47
like, it's, it's good for social
43:49
platforms to have, if not
43:51
an exact number, then just a
43:54
general sense of how young their users
43:56
are right now. You know, a
43:58
lot of people just lie when they register for
44:00
a new Instagram account or a new Facebook
44:02
account. If you're underage, you'll just say, you
44:04
know, I'm 18 and that gets you in.
44:07
And I think that it's good that tech platforms
44:09
will soon have to have a better idea of
44:11
how old the people using their apps are. Yeah,
44:14
and we should say, like, there is clearly an
44:16
element of passing the buck in what Zuckerberg says,
44:18
right? Because the moment that this becomes Apple and
44:20
Google's responsibility, he basically never has to think about
44:22
it again. He just flips a switch that says,
44:24
hey, iOS, if this user is under 13, they
44:27
don't get to download Instagram and move on with
44:29
his life. So I can understand why
44:31
they want this. The thing is, I just think
44:33
that would probably be a world where kids are
44:35
made more safe. Yeah, this is,
44:38
as you said, a very emotional topic and one
44:40
where people, especially parents, have a
44:43
lot of strong feelings. We
44:45
did a segment on the show last
44:47
year about KOSA in which we sort
44:49
of raised some potential objections to it.
44:51
And you especially said that you were
44:53
opposed to it because it can
44:55
allow state attorneys general to basically try
44:57
to crack down on any content
44:59
that they considered offensive or
45:02
harmful to children, whether that's
45:04
pro-LGBTQ content or gender-
45:06
affirming content targeted at kids
45:09
who may be experiencing some
45:11
gender dysphoria. You really
45:13
worried that KOSA would have all these unintended
45:15
consequences. After that
45:17
episode, we heard from a ton
45:19
of listeners who disagreed with us,
45:21
some of them agreed with us,
45:24
basically talking about what it's actually like
45:26
to have kids who use social media
45:29
platforms, which neither of us, frankly,
45:31
does. And I wanted to just
45:33
bring up two emails that we got in
45:35
the wake of the KOSA segment from listeners
45:38
and see if any of your views have shifted
45:40
on this stuff over time. One
45:43
was from a listener named James who
45:45
wrote to us and basically said, look,
45:48
you guys, you talk about
45:50
this stuff, you analyze this proposed
45:52
legislation, you do not understand how
45:54
bad the internet is for kids
45:57
and how many kids are being
45:59
hurt online. And this
46:02
is sort of a point that you'll hear from people
46:05
who are activists on this stuff. They'll say, you
46:07
know, this is not some small
46:09
number of teens who are being victimized
46:11
by horrible things online. This
46:14
is in fact, millions and millions of
46:16
teenagers. There are studies
46:18
that have been done by
46:20
Snap that found that two
46:22
thirds or 65% of teens and young
46:26
adults in six countries reported that they
46:28
or their friends had been targeted by
46:30
some of these online sextortion schemes. There
46:33
have also been internal studies that
46:35
have been published from
46:37
Meta claiming that one
46:40
in eight Instagram users under age
46:42
16 said they had
46:44
experienced unwanted sexual advances on the
46:46
platform in the past week. So
46:50
these are the kinds of numbers that I
46:52
think make parents really start to freak
46:54
out because all of a sudden, this is
46:57
not just a few isolated incidents. This is
46:59
a real epidemic. Yeah. And those are all
47:01
terrible things that are worthy of legislation that
47:04
reduces those problems. But that's not
47:06
what KOSA says. What KOSA says
47:08
is the platforms need to protect
47:10
minors from harmful content. Well, who
47:12
gets to define what harm is,
47:14
it is going to differ state
47:16
by state, attorney general by attorney
47:18
general, political party to political party.
47:21
And one of the reasons that the founders of
47:23
this country passed the First Amendment was to
47:25
prevent these questions from being put
47:27
into the hands of politicians. So
47:30
if KOSA said we want to
47:32
create a legal framework for ensuring
47:34
that platforms prevent sextortion, I would
47:37
be 100% in favor
47:39
of it. But Kevin, I think anybody who
47:41
thinks that KOSA is going to solve the
47:43
problems that you're describing is kidding themselves. Because
47:45
I think in practice, this is going to
47:47
turn out to be primarily a mechanism for
47:49
partisan state attorneys general to launch political stunts.
47:52
I believe Republican AGs are going to sue
47:54
platforms for showing information about abortion in states
47:56
where it is illegal and then they'll wash their
47:58
hands of the whole thing and say, see, we did
48:00
something to protect the kids. We prevented them from harm.
48:03
Well, I just want to clarify that
48:05
the text of COSA has gone through
48:08
a bunch of changes in response to
48:10
some of these concerns, like the one
48:12
that conservative attorneys general could potentially weaponize
48:15
it to sort of go after speech
48:17
they don't like or speech that is
48:19
sort of pro-LGBTQ+ in some way. And
48:23
they do seem to be open to further changes.
48:26
Earlier this month, Senator Blumenthal actually told
48:28
Politico that he would be open to
48:31
giving the Federal Trade Commission
48:33
the sort of enforcement authority when
48:35
it comes to COSA instead of putting
48:37
it in the hands of state AGs,
48:40
the idea sort of being that since the
48:43
FTC is federal, maybe it's
48:45
less prone to kind of politicization
48:47
than state attorneys general would be.
48:50
But I agree that the text of the bill,
48:52
as it is written today, does seem
48:54
to give state AGs some authority to
48:57
go after tech platforms if they deem
48:59
them to be sort of not taking
49:01
enough steps to prevent minor users from
49:04
encountering harmful content. And
49:06
I would just say that my views
49:08
on this stuff have been kind of
49:10
in flux recently because I agree with
49:12
you. I think there are problems with
49:14
COSA as it's currently written. I worry
49:16
about unanticipated consequences. We are
49:18
also starting to learn so much more
49:21
about the fact that these tech platforms
49:23
knew they had a problem with minors
49:25
being exploited, harassed, bullied, and harmed on
49:27
their platforms and that they didn't do
49:29
enough to prevent it. The
49:31
Wall Street Journal has been doing a series of really
49:33
great stories on this
49:36
stuff specifically related to Instagram
49:38
and Facebook. They reported
49:40
on this internal Meta presentation from 2021 that
49:42
was included in one of these state
49:46
lawsuits against the company that estimated that
49:48
100,000 minors every day receive photos of
49:53
adult genitalia or other sexually
49:55
abusive content on Facebook and
49:57
Instagram. They also describe an internal Meta presentation, a
50:00
document that was circulated
50:02
at Meta, noting that its own
50:04
recommendation algorithms, in particular this algorithm
50:06
called People You May Know, which
50:08
sort of tells you people that
50:11
you might want to connect with,
50:13
was known among employees to be
50:15
connecting child users with potential predators.
50:18
And it also says that Facebook
50:20
executives were alarmed about certain features
50:22
in their apps that were harming
50:24
teens, like these photo filters that
50:26
mimic the effects of plastic surgery,
50:28
and that some Meta executives actually
50:30
wanted to ban these filters, but Mark
50:32
Zuckerberg refused. So I will say as
50:35
we learn more about what was going
50:37
on inside these platforms over the past
50:39
few years when it comes to child
50:42
harms, I am becoming much more sympathetic
50:44
to the view that these platforms need
50:46
some regulation to force them
50:49
to pay more attention to this stuff.
50:51
Yeah, I think your points are well
50:53
taken. It is true that these companies
50:55
have a lot to answer for. They
50:57
have contributed to a lot of harm.
51:00
Often the voices inside these companies
51:02
that were raising these issues are
51:04
drowned out. And it
51:06
sucks. And I have become much more open
51:08
over the past several months to the class
51:10
action lawsuit that was filed by the attorneys
51:12
general seeking to hold Meta accountable
51:14
for everything that you just said. This is
51:17
just the point where I as a journalist
51:19
and somebody who relies on the First Amendment
51:21
to do my work, I just really don't
51:24
want to live in a world where the
51:26
government is writing laws that are so broadly written
51:28
as to put the government in charge of
51:30
deciding what speech is harmful. I think that's
51:33
an extremely dramatic step with a lot of
51:35
obvious downsides. And if we wanted to target
51:37
any of the number of very real problems
51:40
that you just described, we could take a
51:42
much more surgical approach. Yeah. So Casey, where
51:44
do you think we go from here? We've
51:46
had this hearing. We now know where the
51:49
tech companies stand on this one particular piece
51:51
of proposed legislation, COSA. What
51:53
do you think happens now? Well, I think
51:55
we're gonna see a continuation of what we've
51:57
already been seeing, which is more states passing
52:00
legislation on this subject. In Congress, we have
52:02
this perpetual gridlock, but there are many, many
52:04
states where there is single party control of
52:07
the entire legislature. That is why we are
52:09
seeing so much legislation being passed at the
52:11
state level, and more and more
52:13
states are taking a crack at a variety
52:16
of pretty restrictive things. Montana has tried to
52:18
just ban TikTok in the state altogether.
52:20
That has been blocked by a federal judge. But I
52:22
think you're just going to see state after state taking
52:24
swing after swing. All of this stuff is going to
52:27
go up to the Supreme Court. And then the question
52:29
is just going to be, will the
52:31
Supreme Court actually agree to
52:33
any of these things that
52:36
the previous Supreme Court absolutely
52:38
dismissed? This is where my jadedness
52:40
comes in. I do not think anything is going
52:42
to happen at the federal level. This stuff has
52:44
been in gridlock for years. I think that that
52:46
is going to continue. We're heading into an election.
52:49
Who knows what's going to happen in 2025? But
52:52
I do think you're still going to see
52:54
a lot of movement at the state level
52:56
and in the courtrooms. Remember, that class action
52:59
lawsuit against Meta is really starting to
53:01
pick up some steam. And I do
53:03
think it is going to cost that company in particular in the end.
53:05
But what do you think? Yeah, I think
53:07
this is going to be maybe the biggest
53:09
challenge that social platforms face over the next
53:12
few years. I think this is an area
53:14
where the big platforms are really vulnerable. I
53:16
think legislators and activists and lobbyists know that.
53:19
And that is why we're going to see them
53:21
continue to hammer the tech companies on these points.
53:23
Yeah, I just want to say, like, I do
53:25
worry. I read this great essay by danah boyd,
53:27
who's this researcher who's studied social media for a
53:29
long time. And she was saying that
53:31
there is such a risk in the
53:33
legislation that is now being proposed of
53:35
falling victim to this idea of techno-solutionism,
53:37
right? Which is that social networks cause
53:40
all of our problems with children and
53:42
tech companies can solve all of our
53:44
problems with children. Social
53:46
networks have a huge role to play. Let's
53:48
also remember, kids are suffering for a lot
53:50
of reasons that have nothing to do with
53:53
who said what to them on Instagram, right?
53:55
Kids get bullied in schools, actually. Schools have
53:57
not actually done a lot to solve the
54:00
bullying problem over the past three decades.
54:02
So I just want us to keep all
54:04
of that in mind too, because it is
54:06
so easy to fall victim to the same
54:09
thing that critics accused journalists of falling for:
54:11
the hype that, you know, tech can save
54:13
the world. If you think that tech can
54:15
solve all of our problems that tech has
54:18
created, I also think you're kidding yourself. Yeah,
54:20
I think that's right. I think there is
54:22
a danger of sort of being too deterministic
54:24
about how the decisions that platforms make affect
54:27
the experiences that kids have on them. And
54:30
I actually like, I don't blame
54:33
tech companies for having
54:35
underage users or even for, you know,
54:38
inadvertently exposing those users to some harms just
54:40
by virtue of the fact that they're massive
54:42
platforms that can't keep tabs on everything that's
54:44
happening on them. What I do
54:46
fault them for and what I do think a lot
54:49
of parents are going to fault them for is
54:51
that after they learned that teenagers
54:54
and young people were having bad
54:56
experiences on their platforms, they
54:58
just didn't do enough to stop it. And
55:00
in some cases, they rejected proposals that would
55:03
have helped kids on their platforms have a
55:05
safer experience because they cost too much money
55:07
or they involve too much bureaucracy or oversight,
55:09
or they just worried that calling attention to
55:12
the fact that there were underage users on
55:14
their platforms would open them up to new
55:16
forms of scrutiny. So I think this was
55:18
a huge mistake tactically that a lot of
55:21
these platforms made in not talking about this
55:23
sooner. And I think those days
55:25
are over because now parents are pissed off
55:27
and they want answers. Well, Kevin, what would
55:29
you say if I told you that there
55:31
was another company that knew about a huge
55:33
problem and when confronted with it did not
55:36
do the right thing? Come on.
55:38
That's right. Coming up after the break, we'll hear about
55:40
Cruise's accident. Hey
56:21
Kevin, remember that one time we interviewed
56:23
the CEO of Cruise? Yes, the self-driving
56:26
car company. Yeah, nice guy. Whatever happened
56:28
to that? Well,
56:30
it's interesting you ask. It's been a
56:32
very rocky few months for the Cruise
56:34
company, and Kyle Vogt, the CEO who
56:36
we interviewed on the show last year,
56:39
stepped down in November following
56:41
a big scandal involving regulators
56:44
and a serious accident and
56:46
a Cruise vehicle called Panini.
56:49
And it all culminated recently with
56:51
this big report that was prepared
56:54
about that incident and kind of
56:56
what has led Cruise to the
56:58
place it is now. So as
57:00
of today, the company has lost
57:02
its license to operate driverless vehicles
57:04
in California. It has suspended its
57:06
operations across the country. The company's
57:08
basically entire leadership has been replaced.
57:10
A quarter of the staff of
57:13
Cruise has been cut. It's
57:15
being investigated by the DOJ and the
57:17
SEC. And now, as
57:19
of this week, GM announced that
57:22
they are cutting their investment in Cruise
57:24
by half, about a billion dollars this
57:26
year. So what I'm hearing is that
57:28
this is a story about how one
57:31
car accident destroyed an entire company.
57:34
It's sort of about that, but I
57:36
think it's also about kind of self-driving
57:38
technology as a whole and
57:40
some of the tradeoffs that we're
57:43
going to see as these cars
57:45
become more widespread. And
57:47
I really think we should talk
57:49
about the cruise incident today because
57:51
it's not only just kind of like
57:53
a juicy story, but I think it
57:55
also really symbolizes the issues with this
57:57
kind of move fast and break things
57:59
mentality that a lot of tech
58:01
companies have, even in areas
58:03
like self-driving, where the costs of
58:05
a mistake are so grave.
58:08
Yes, and I should say, I have read the
58:10
report that came out of this, and it is,
58:13
so there's some really disturbing stuff in
58:15
there, and I think it is worth
58:17
going through it. Yep, and it really
58:19
matters what happens to Cruise. This was
58:22
one of only two companies that was
58:24
offering self-driving, sort of hailable rides in
58:26
San Francisco and other cities. The other
58:29
one, Waymo, is still
58:31
operating. You can take their cars today
58:33
in San Francisco, but Cruise was a
58:35
big player and a company that GM
58:37
in particular had bet a lot of money
58:40
on sort of being the first
58:42
to bring this technology to market.
58:45
And I'm a person who thinks that
58:47
driverless cars are one of the most
58:49
important technologies out there today. I think
58:51
that roads are tremendously unsafe, and self-driving
58:53
cars, if they work well, they could
58:55
make things a lot safer, but there
58:57
are also still some issues with them.
58:59
And so I think it's worth taking
59:01
sort of a closer look at what
59:03
happened at Cruise, because I think some
59:05
of these same dynamics could play out
59:07
across the industry in the coming years.
59:10
So let's get into it. So I
59:12
think to tell this story, we have
59:14
to rewind the clock to October 2nd
59:16
of last year. And
59:18
that night in San Francisco,
59:21
there was an accident. Now,
59:23
this accident was not caused
59:25
by a Cruise self-driving car.
59:27
It was caused by a human
59:29
driver who basically did a hit
59:32
and run on a pedestrian in
59:34
downtown San Francisco. That
59:36
pedestrian was flung by
59:39
the collision with this human driver into
59:41
the path of a Cruise
59:44
autonomous vehicle. The Cruise
59:46
vehicle tried to slam on the brakes,
59:48
but it ended up hitting her. And
59:51
then it tried to execute a pullover maneuver,
59:53
which is what it's programmed to do when
59:55
it- And when you say execute a pullover
59:58
maneuver, you mean pull over. Hee hee. I'm
1:00:01
just using the language that Cruise used in its report.
1:00:03
So it tries to pull over because that's
1:00:05
what it's supposed to do. And
1:00:08
in the process of pulling over, it drags
1:00:10
this poor woman about 20 feet
1:00:12
to the side of the road while pulling
1:00:14
over. Because critically, and
1:00:16
here's where the AI messed up, the
1:00:18
AI thought that the woman had hit
1:00:20
the car on its side, and so
1:00:23
that it would be safe to pull
1:00:25
over. In fact, the woman was in
1:00:27
the front of the car. And
1:00:29
so as the car pulled over, it dragged
1:00:31
the woman 20 feet. And
1:00:33
she was left in critical condition. Yes.
1:00:36
So very sad story. But what
1:00:39
happened next is that in responding
1:00:41
to this incident, Cruise was supposed
1:00:43
to meet with a bunch of
1:00:45
regulators to basically debrief the incident
1:00:48
and see what could be done to fix
1:00:50
this kind of thing in the future. And
1:00:54
in the process of those
1:00:56
meetings with government officials and
1:00:58
regulators, Cruise executives basically
1:01:01
left out the fact that their car,
1:01:03
while it had not caused the initial
1:01:05
injury, did drag this woman about 20
1:01:08
feet before coming to a stop. So
1:01:11
this winds up becoming the key
1:01:13
omission that winds up essentially causing
1:01:15
the company to collapse. In the
1:01:17
immediate aftermath of the accident, though,
1:01:19
Cruise really only wanted to communicate
1:01:21
one message, which was, hey, we
1:01:23
didn't cause this accident. Right. So
1:01:25
this is the narrative that they take out to
1:01:27
the media. This is the drum that they beat.
1:01:30
And they're so fixated on the fact that
1:01:32
they did not cause the initial accident that they
1:01:34
wind up making a bunch more mistakes. Right. So
1:01:37
the basic story is, you know, there's this accident. Cruise
1:01:39
responds, meets with all these regulators and
1:01:41
government officials to talk about what happened,
1:01:43
plays them footage from the cameras that
1:01:46
were on the car at the time
1:01:48
of the incident, but kind of glosses
1:01:50
over the fact that their car dragged
1:01:52
the woman about 20 feet before coming
1:01:54
to a stop. They get
1:01:56
caught sort of doing this kind
1:01:58
of selective presentation. And
1:02:00
as a result, GM, which
1:02:03
is the majority owner of Cruise,
1:02:05
sort of replaces the senior leadership
1:02:07
of the company and commissions a
1:02:09
law firm to basically investigate what
1:02:11
happened and come up with a
1:02:13
report. And this report
1:02:15
was made public just
1:02:17
recently and it is a wild
1:02:20
document. Did you read it? I did read
1:02:22
it and it was probably the most interesting
1:02:24
thing I've ever read that was written by
1:02:26
Quinn Emanuel Urquhart & Sullivan, the law firm
1:02:28
that did the report. Yeah,
1:02:30
I love these kind of lawyers reports because it's
1:02:32
like the precision in these documents, the time stamps
1:02:34
on all the Slack messages. It is just like,
1:02:37
you do not want this kind of thing to
1:02:39
happen to you. You don't and yet at the
1:02:41
end of the day, Cruise is their client and
1:02:43
they do pay for everything that's happening and so
1:02:46
I do feel like the law firm sort of
1:02:48
writes about it in a hilarious, letting-them-
1:02:51
off-the-hook way that I would like
1:02:53
to talk about. Yeah, so let's talk about
1:02:55
this report. So the report basically says that
1:02:58
in the immediate aftermath of this incident, Cruise
1:03:01
was sort of worried
1:03:03
because initial media
1:03:05
reports suggested, some of them,
1:03:07
that Cruise's self-driving car had in fact caused
1:03:09
this accident, that this was a case of
1:03:11
a driverless car hitting a pedestrian, which would
1:03:14
be a big story. Yes, and it is
1:03:16
the sort of the story that the entire
1:03:18
press corps has been waiting for ever since
1:03:20
these self-driving cars got on the road. When
1:03:22
would one of them cause a potentially fatal
1:03:25
accident? And so now it happens and so
1:03:27
Cruise, which I'm sure already had
1:03:29
a full plan ready to execute the moment that
1:03:31
this happened, they spring into action. Right,
1:03:33
so on October 3rd after
1:03:36
this incident happens, Cruise executives
1:03:38
and employees are kind of trying to piece
1:03:41
together what happened. They're looking at the footage
1:03:43
from the cameras on these vehicles and
1:03:46
at around 3:45 a.m., according
1:03:49
to this report, a Cruise employee
1:03:51
first sees the full video of the
1:03:53
incident and learns that in
1:03:56
addition to kind of hitting this pedestrian
1:03:58
and stopping, their car also
1:04:00
pulled over and dragged the woman
1:04:02
with it. So the company's
1:04:05
senior leadership, including Kyle Vogt, meets to
1:04:07
discuss this new video and
1:04:10
tries to decide whether or not they're going to
1:04:12
sort of update the media with
1:04:14
this new detail and they decide not
1:04:16
to. Oh, that seems like a
1:04:19
mistake in retrospect. Yeah. Yeah, so
1:04:21
they also meet with the
1:04:23
mayor of San Francisco's transportation advisor.
1:04:26
They don't mention the dragging part of
1:04:28
the incident then. They
1:04:30
meet with a bunch of regulators,
1:04:32
including NHTSA, the DMV, and the
1:04:35
California Highway Patrol, and
1:04:37
same thing. They talk about this incident.
1:04:39
They show this video of the
1:04:41
footage from some of the cameras, but they
1:04:43
do not sort of proactively bring up the
1:04:45
fact that their car dragged this woman after
1:04:47
hitting her. And here's how this is described
1:04:50
by Quinn Emanuel, by the way. Quote, in
1:04:52
each of those meetings, Cruise had the
1:04:54
intent to affirmatively disclose those material
1:04:56
facts by playing the full video
1:04:58
and letting, quote, the video speak
1:05:00
for itself. Because Cruise adopted
1:05:02
that approach, it did not verbally point
1:05:04
out these facts. It's like,
1:05:06
that is such a hilarious word salad to
1:05:09
explain why after you drag a woman 20
1:05:11
feet and you're meeting with regulators, you do
1:05:13
not mention that you dragged a woman 20
1:05:15
feet. But you're leaving out the best part
1:05:18
of this, which was that apparently Cruise tried
1:05:20
to play video of the crash to officials,
1:05:23
but the person who was playing it,
1:05:25
their wifi was not up to the
1:05:27
task. They had bandwidth issues and a
1:05:29
poor internet connection that prevented
1:05:32
the regulators from seeing the complete and clear
1:05:34
full video of the accident. And not only
1:05:36
did this happen in one meeting, it apparently
1:05:38
happened in three different meetings. And I would
1:05:41
just like to say to this Cruise employee,
1:05:44
please call Comcast, upgrade your plan.
1:05:46
When you are showing a crash
1:05:48
video to regulators, you're definitely
1:05:50
gonna want the gigabit fiber. This truly is
1:05:52
the most dog-ate-my-homework excuse for
1:05:55
not being able to show this video to
1:05:57
regulators that I can even imagine. And when
1:05:59
I saw... I really did
1:06:01
gag a bit. Unbelievable. So the
1:06:03
entire future of the driverless car
1:06:05
industry, we now know, may hinge
1:06:08
on one Cruise employee's shitty Wi-Fi. Wait, I just
1:06:10
want to share Quinn Emanuel's conclusion about this, which
1:06:12
is, quote, even after obtaining the full video, Cruise
1:06:14
did not correct the public narrative, but continued instead
1:06:16
to share incomplete facts and video about the accident
1:06:19
with the media and the public. This
1:06:21
conduct has caused both regulators and the media to
1:06:23
accuse Cruise of misleading them. Like, yeah, you think?
1:06:27
Yeah. So there's
1:06:30
a lot more in this report. It's very
1:06:32
detailed. It's many, many pages long.
1:06:35
And a lot of it is just kind of internal
1:06:37
fact finding. But basically
1:06:39
the overarching conclusion that this
1:06:41
investigation draws is that Cruise
1:06:44
executives and leadership materially misrepresented
1:06:46
what happened on this October
1:06:49
2nd night with this
1:06:51
pedestrian to regulators, which is a very bad
1:06:53
thing to do if you are a company
1:06:55
in a heavily regulated industry like transportation.
1:06:58
So the law firm that did the
1:07:00
report, I think, disagrees with the idea
1:07:02
that this misleading was intentional. The report
1:07:04
says, quote, despite the failure to discuss
1:07:06
the pullover maneuver or pedestrian dragging with
1:07:08
regulators, the evidence review to date does
1:07:11
not establish that Cruise leadership or employees
1:07:13
sought to intentionally mislead or hide from
1:07:15
regulators the details of the October 2nd
1:07:17
accident. Instead, they attempted to show the
1:07:19
full video of the accident in good
1:07:21
faith, but with varying degrees of success
1:07:24
due to technical issues. So I
1:07:26
would not say this rises to the level where you
1:07:28
can truly call it a cover up, but it does
1:07:30
sort of feel like a cover up by omission, right?
1:07:33
Where like the company did go out of its way
1:07:35
to just not say what had happened. And
1:07:37
when the inevitable happened, which was that
1:07:39
regulators finally see the full video, it finally becomes clear
1:07:42
to them what has truly happened. Well, of course, they
1:07:44
wind up being way more upset than they would have been
1:07:46
if Cruise had just been honest with them from
1:07:48
the beginning. Yeah. And
1:07:50
I've talked to some people who were involved in
1:07:52
this situation over the past few weeks. And
1:07:55
the folks on the kind of GM
1:07:57
and regulator side of this just basically
1:08:00
paint Cruise as this kind of reckless
1:08:02
startup that was more interested in kind
1:08:04
of scaling their business and getting as
1:08:06
many of their cars onto the road
1:08:08
as they can and sort of taking
1:08:10
market share away from Waymo than
1:08:13
they were in making sure that all
1:08:15
their vehicles were safe. And
1:08:18
people on the other side, the sort of
1:08:20
more Cruise-friendly part of this say, well,
1:08:22
we have our risk calculations all wrong when
1:08:25
it comes to driverless cars. You know, we
1:08:27
shouldn't be asking, are these things perfectly safe?
1:08:29
Are they never going to be involved in
1:08:31
an accident? Are they never going to hurt
1:08:33
someone? We should be comparing them against human
1:08:36
drivers who we know cause accidents and hurt
1:08:38
people all the time. And so in their
1:08:40
minds, this is just kind of like an
1:08:42
overly cautious set of regulators and an overly
1:08:44
cautious corporate parent kind of looking
1:08:47
over this with just the wrong set of
1:08:49
risk calculations in their brains. Well, I have
1:08:52
to say, I think that that really
1:08:54
misses the point. I think that the real
1:08:56
story here is even sadder because look, Kevin,
1:08:58
the moment you decide to create a
1:09:00
self-driving car company, you have to prepare
1:09:02
for the extreme likelihood that at some
1:09:05
point there is going to be a
1:09:07
serious injury accident or even a fatal
1:09:09
accident. Right. And I'm
1:09:11
sure Cruise had done a lot of thinking
1:09:13
leading up to this moment about how it
1:09:15
was going to handle that situation. Again, human
1:09:17
driven cars are killing people every day. This
1:09:19
should not surprise anyone. Our hope is that
1:09:21
these things become safer. And
1:09:24
what this incident is really about is that
1:09:26
the AI made a mistake and probably a
1:09:28
mistake that a human being would
1:09:30
not make. And that is tragic. And that
1:09:33
is worth exploring. But that's
1:09:35
not why Cruise barely exists anymore. It
1:09:37
barely exists anymore because in the aftermath
1:09:39
of this, the company could not just
1:09:41
say like, oh, we made a mistake.
1:09:44
Let's own up to our mistake and let's
1:09:46
fix it and move on. If they had
1:09:49
done that, I truly think Cruise vehicles would
1:09:51
still be on the road today. Yeah,
1:09:53
that's a good point. And we should say, like, Cruise,
1:09:55
it still exists. They're still
1:09:57
planning to expand their service.
1:09:59
I mean, they are, but
1:10:02
like, what happens? Like, is Cruise going to continue
1:10:04
to exist in any meaningful way, do you think?
1:10:07
Look, I think GM has sunk
1:10:09
a lot of money into this
1:10:11
company. These driverless sort of taxi
1:10:13
services, they are just hemorrhaging cash.
1:10:15
It's very expensive to build and
1:10:17
maintain and, you know,
1:10:20
keep these cars on the roads.
1:10:23
And you know, they're not really making money
1:10:25
from them right now. And so I think GM
1:10:27
probably sees this as an opportunity to kind of
1:10:29
cut back some of its losses. But
1:10:32
look, I do think driverless cars are
1:10:34
not going away. The technology is here.
1:10:37
So what do you think happens now
1:10:39
to the dream of driverless cars? Like,
1:10:41
does this push the kind of timeframe
1:10:44
out by several years because now we
1:10:46
only have one driverless car company offering
1:10:49
rides instead of two? Or like, what
1:10:51
do you think happens to Waymo, for
1:10:53
example, as a result of this? I
1:10:55
mean, I have to believe that it
1:10:57
does slow the progress of this industry.
1:10:59
I think that competition makes industries grow
1:11:01
faster. And now that as
1:11:04
you point out, Waymo is the only self-driving
1:11:06
car company on the streets, that progress probably
1:11:08
is going to slow a bit. So that's
1:11:10
kind of what I'm expecting to see. I'm
1:11:12
very curious to see where Cruise is
1:11:15
a year from now. Like on one hand,
1:11:17
I agree, GM has invested so much money,
1:11:19
you can't really imagine them pulling the plug.
1:11:21
But on the other hand, every action that
1:11:23
they've taken since this happened has given me
1:11:25
the impression that they really are not thrilled
1:11:27
with this crew. So we'll have
1:11:29
to see. But it's
1:11:31
just a terrible black eye for the industry. And then
1:11:34
I do worry it could cost people their lives. What do
1:11:36
you think? So I think this is
1:11:38
going to slow the self-driving car industry less
1:11:40
on the kind of consumer adoption
1:11:42
side than on the regulatory side.
1:11:44
We're already starting to see evidence
1:11:46
that this is sort of expanding
1:11:49
just beyond Cruise. Last week, the
1:11:51
city of San Francisco sued a
1:11:53
state commission for allowing Waymo
1:11:55
and Cruise to expand
1:11:57
to the city. This
1:12:00
is going to result in collateral damage to
1:12:02
Waymo, too, even though, uh, that company appears
1:12:04
to have done nothing wrong and in fact
1:12:07
their safety record is quite good. So they're
1:12:09
not going to go away, even in San
1:12:11
Francisco? Now, you know, that
1:12:13
remains to be seen. I think there'd be people
1:12:15
who would be upset if that happened, and I
1:12:17
also think that there are people who would probably be pretty
1:12:19
happy about it. Now, I will, as always, try
1:12:21
to leave people with a bit of hope, or really
1:12:24
something, you know, practical that they can use.
1:12:26
Here's a PSA: the next time that there is
1:12:28
one of these self-driving, uh, car accidents and investigators
1:12:30
are trying to get to the bottom of it
1:12:32
and the person trying to show you the video
1:12:34
says, I'm sorry, my internet just isn't working. So
1:12:36
you know what? We're just going to pause the
1:12:38
meeting, uh, while you go to a coffee shop
1:12:40
and just put the video in a Dropbox
1:12:42
folder, okay? And then we'll download it, and then we'll
1:12:44
look at the video. Then you can tell us what
1:12:47
you think happened in the video. And that way we can
1:12:49
sort of, you know, bypass the
1:13:30
The questions around retirement have gotten tiring.
1:13:32
Instead of, have you saved up enough, shouldn't
1:13:34
they be asking, what is it that you
1:13:36
love to do? You're not slowing
1:13:38
down, so your retirement plan should be more of
1:13:41
an action plan. A hiking
1:13:43
plan. A golf plan. Lincoln
1:13:45
Financial has the products to help protect and
1:13:47
grow your financial future. Make
1:13:49
your pastimes last a lifetime
1:13:52
at lincolnfinancial.com/actionplan. Lincoln
1:13:54
Financial Group, marketing name for Lincoln National Corporation and
1:13:57
its insurance companies and broker-dealer affiliate Lincoln Financial
1:13:59
Distributors Inc. 2024 Lincoln National
1:14:01
Corporation. Hard
1:14:03
Fork is produced by Davis Land and
1:14:05
Rachel Cohn. This episode was
1:14:08
edited by Paula Schumann. Today's show
1:14:10
was engineered by Chris Wood. Original
1:14:12
music by Rowan Niemisto and Dan
1:14:14
Powell. Our audience editor,
1:14:16
Nell Gallogly. Video production by Ryan
1:14:19
Manning and Dylan Bergeson. If you
1:14:21
haven't already, check us out on
1:14:24
YouTube at youtube.com/hardfork. Special thanks
1:14:26
to Pui-Wing Tam, Kate LoPresti and
1:14:28
Jeffrey Miranda. You can email us
1:14:31
at hardfork@nytimes.com with your full Vision Pro
1:14:33
review. We
1:15:22
made USAA insurance for veterans like James.
1:15:24
When he found out how much USAA
1:15:26
was helping members save, he said, It's
1:15:28
time to switch. We'll help you find
1:15:30
the right coverage at the right price.
1:15:32
USAA. What you're made of,
1:15:34
we're made for. Restrictions apply.