Meta Launches Multi-Modal AI Glasses with Ray-Ban

Released Thursday, 9th May 2024

Episode Transcript


0:00
Meta has just announced some major new AI integrations into the Ray-Bans, and the big thing here is that they are now multi-modal. So in my opinion, these AI Ray-Ban Meta glasses are probably where the Oculus should have gone. I know different people will debate that, but I honestly see this having a way higher user base than something like an Oculus, which might be fun for gaming but doesn't seem like something you're going to wear while walking around outside. Whereas these Meta Ray-Bans, if you've seen them, are actually phenomenal. They literally look like a pair of Ray-Ban sunglasses. I'm blown away; I'm not even sure where they're packing the tech into these things. Obviously it's in the frames and everything else. So it's amazing, and it looks very, very natural.

0:42
Now, the cool thing here is that these have always had a camera on them for you to take a picture and that kind of thing, but it's being brought to a whole new level now that Meta has come out with their new Llama model. Llama 3 is going to be integrated into these, and you're going to have this multimodal feature set.

1:04
So essentially, you'll be wearing these glasses, you'll be able to look at something, and the assistant is able to actually see it. In my opinion, it's kind of like what Siri or Alexa or the Google Assistant should have or could have been, but they never really had the functionality of the camera. Now this is built in. So they released a demo and showed a bunch of really cool things you can do, a few in particular. The first one is that Meta's AI assistant was previously only for audio interactions, right? So it was pretty much like Siri.

1:33
You could just talk to it and be like, "Hey Siri, what's blah, blah, blah," or, "Hey Meta, what do I do about this?" and it could give you an answer. It can now process the visual data from the glasses' built-in camera and give you insights on that. We get somewhat similar features from ChatGPT, where you can upload a picture, and they have actually really cool demos that they've done for a long time. The problem is, wait for it: it's kind of, in my opinion, not super natural to take the picture, send the picture, and type out your message. This is so much more seamless, where you literally say, "Hey Meta," or tap something on your glasses, and you're like, "Hey Meta, what am I supposed to be fixing here?" and then it can immediately tell you. So that's one thing that's amazing. The second is that users can now ask the glasses to translate text.

2:20
So if you're, let's say, in a country whose language you don't speak, you could be looking at the label on some food at the grocery store and be like, "Hey, what is this? What's the price on this?" This is phenomenal. This is so cool, and actually very, very useful, unlike what a lot of people say about VR: "Oh, there aren't that many use cases; it's just fun for video games, but whatever," right? This is incredibly useful to wear all day, every day.

2:45
I would be curious what people's perspectives are, but you have things like the Humane AI Pin, which is a pin that you wear on your shirt; it's got a camera and it can do some of the same things. It's interesting, though, because something like the Humane Pin is inventing a whole new device that is now on you, whereas tons of people just wear glasses all day, every day. So it's something you might already be doing, and you kind of tap into it. It's like if you added all these functionalities to the Apple Watch: it'd be a no-brainer, because people are already wearing their watches.

3:15
Now, the cool thing about all this to me is that, more than a smartwatch, the camera is right where your eyes are. So right where you are seeing things, it is seeing things. It's so much more seamless and natural, and I think glasses are 100% the way to go.

3:28
Not that Meta is the big revolutionary in this space; I want to make that clear. Obviously, we had Google Glass, which was perhaps ahead of its time, in that it was supposed to do a lot of these kinds of cool things, where it could see what you're doing, and then it was gone.

3:41
I think the big thing there was that they looked really dorky, so they were never going to be popular, whereas with the Meta Ray-Bans, they're partnering with a fashion brand. These Ray-Bans are super cool, so this is actually, I think, going to go somewhere.

3:51
The other thing I want to bring up is that I've also been researching the Snapchat Spectacles. This is a product that was super cool back in the day, but it's kind of been forgotten. I never really talked about the Snapchat Spectacles, but they're a real thing, right? Snapchat built these glasses with a built-in camera. Now, the thing that I think Ray-Ban and Meta have over them is that if you look at the Snapchat Spectacles, they are very obviously clunkier than normal glasses. They have these huge cameras built in; it just doesn't look like something you would naturally wear. The arms on these things are super fat, so it obviously looks like this futuristic thing. Meta and Ray-Ban have done a phenomenal job. You would honestly be hard pressed to see where the tech is actually built into this thing. Very, very cool; it looks just like a regular pair of glasses. So huge hats off here.

4:39
A couple of other cool features: if you're wearing these, you can also share what you're looking at during a video call, on WhatsApp or Messenger or wherever you're doing this in the Meta ecosystem. So it integrates with the app on your phone, which again I think is probably going to be a huge winning play for Meta.

4:55
Because you have people like Humane with the Humane Pin, and they're like, "No, we're replacing your smartphone. You wear this instead of a smartphone. You don't need a phone anymore now that you have the Humane Pin." But honestly, we're all used to using our phones; it's a screen we look at. I know the Humane Pin tries to have a projector that can show you some green graphics on your hand. So, not as good, right?

5:15
We're used to seeing things on a screen: if I'm doing a video call, I'm not doing a hologram video call on my hand with a projector, right? I want to see the person's actual face. So how cool is it if you can use your glasses as the camera while you're doing the video call on your phone: you can see them, but they can also see everything you're looking at. It's just so much more immersive. It's super, super cool, and it doesn't feel weird and intrusive like a VR headset.

5:38
And then, of course, the multimodal AI upgrade is going to be available for users in the US and Canada, so huge markets are getting unlocked right now. When we're talking about why all of this matters: I think Meta's multimodal integration is a really big step toward smart glasses. This is something that obviously isn't a hundred percent there yet; they're just getting started. But the fact that you can have a pair of glasses that looks completely normal, functions in this way, and has all this AI built into it, I think, is impressive.

6:04
The way you pair this with your phone, I think, is a good play on Meta's part. You could probably have a lot of the tech just running on your phone but synced to your glasses over Bluetooth, so it kind of makes them a new device, but more like a smartwatch that just gives you extra information. I could see people wearing both a smartwatch and these. I think they're phenomenal, and the ability to see things and get feedback instantly is going to be really hard for anyone to beat. So whether you're using this for something like sharing on a video call or asking it questions about everything you're doing, overall this is a fantastic new innovation out of Meta. I'm really excited to see the adoption and get my hands on a pair of these, and I'll keep you up to date on how that goes when I do.

6:45
If you enjoyed the episode today, I would really, really appreciate it if you could leave us a review wherever you listen to your podcasts, or subscribe and give us a like on YouTube if that's where you're at. We're super grateful that you could be here, and I hope you have a fantastic rest of your day.
