
Transcript: Mark Zuckerberg on AI Glasses, Superintelligence, Neural Control, and More

Read the full transcript of Meta founder and CEO Mark Zuckerberg in conversation with host Rowan Cheung, “AI Glasses, Superintelligence, Neural Control, and More,” following the announcements at Connect 2025 on September 18, 2025.

INTRODUCTION

ROWAN CHEUNG: Thanks so much for being here.

MARK ZUCKERBERG: Yeah, good to see you. Thanks for doing this.

ROWAN CHEUNG: So today we’re talking all things Meta Connect 2025. Can you give us the rundown of everything you’re announcing and what you’re personally most excited about?

The New Lineup of AI Glasses

MARK ZUCKERBERG: Yeah, so the main things that we announced at Connect were our fall 2025 line of glasses. And you have them here so we can just start and go through them.

I mean, the first one is the next generation of Ray-Ban Meta. That’s sort of the classic AI glasses that we’ve shipped so far. They’re some of the fastest-growing consumer electronics of all time, so we’re very happy with that. The big improvements here: we doubled the battery life, they now capture 3K video, and we’re introducing new AI features like Conversation Focus, which, if you’re in a loud place, lets you basically turn up the volume on the friends you’re talking to. So if you’re in a restaurant or something like that, I think that’s going to be really neat.

Then we’ve got this guy, the Oakley Meta Vanguards. This is our second collab with Oakley. They’re more of a performance pair of glasses, designed for sports, and I think they’re really cool for a number of things. The camera is centered, which is great for alignment; they have a wider field of view and louder speakers, and they’re water resistant. They can also connect with your Garmin watch. So you can be running a marathon and tell it to capture a video every mile, and at the end it’ll produce a video for you where it stitches them all together and puts your Garmin stats on top. I think that’s pretty neat.

The Revolutionary Ray-Ban Meta Display with Neural Control

But I think the most interesting thing by far that we announced is this one, which we call Ray-Ban Meta Display. That’s because it’s the first pair of AI glasses we’ve shipped with a high-resolution display in them. The other big breakthrough is that they pair with this guy, the Meta Neural Band, which is the first mainstream neural interface we’re shipping as the way to control them. And I think that’s pretty neat.

I mean, basically every computing platform has its own new input method, right? When you went from computers with the keyboard and mouse to phones with touch screens, you got a completely new input method. And the same, I think, is going to be true for glasses, with a neural interface where you can just send signals from your brain with micro-muscle movements like this. This is basically about as much movement as you need to make. You can enter text neurally, all the fun stuff that you got a chance to try out. I think this is a pretty big breakthrough, so I’m very excited about it. But overall, the whole lineup is good. It was a fun Connect.

ROWAN CHEUNG: So you’ve got the Ray-Ban Metas for kind of everyday, the Oakleys for athletes, and then the Display for power users. How do all these glasses tie into that personal superintelligence vision?

Glasses as the Ideal Form Factor for Personal Superintelligence

MARK ZUCKERBERG: Yeah. So I mean, our theory is that glasses are the ideal form factor for personal superintelligence because it’s the only real device that you can have that can see what you see, can hear what you hear, can talk to you throughout the day, and can generate a UI in your vision in real time.

And so there are other devices that people have. Obviously you can have some AI on your phone, you can do some AI on your watch, and you can do some of it with just an AirPods-type thing. But I think glasses are going to be the only thing that can do all of the pieces I just mentioned, visual and audio in and out, and that I think is going to be a really big deal.

ROWAN CHEUNG: So I got to try it for this interview.

MARK ZUCKERBERG: Yeah.

ROWAN CHEUNG: And I have to say, there were way more use cases than I initially thought there were going to be. The AirPods, for example, have live translation. But this has live translation, GPS, captions. There’s so much more.

MARK ZUCKERBERG: Yeah, the subtitles are awesome.

ROWAN CHEUNG: It’s awesome. So I’m curious, what are your favorite kind of use cases that you’ve been using them for?

Seamless Messaging: The Killer Feature

MARK ZUCKERBERG: I mean, the thing that I really had in mind when we were designing them is just sending messages, right? This is the thing I think we do most frequently on our phones. I would guess that we’re all sending dozens of messages a day. Actually, I don’t really need to guess: we run a lot of the messaging apps that people use, so we know that people send dozens of messages a day. And we wanted to make it so that the experience is just really great.

So now with the glasses, a friend texts you and it shows up in the corner of your eye for a few seconds. It’s off-center so it doesn’t block your view, it goes away quickly, and it’s not distracting. But if you want, you can respond as easily as moving your finger like this.