
The Inside Story of ChatGPT’s Astonishing Potential: Greg Brockman (Transcript)

Here is the full transcript and summary of Greg Brockman’s talk titled “The Inside Story of ChatGPT’s Astonishing Potential” at TED conference.

In this TED talk, Greg Brockman, President of OpenAI, demonstrates the current state of ChatGPT and the design principles behind it. Through a series of live demos, he shows how ChatGPT can use tools such as a DALL-E image generator, memory, and third-party apps like Instacart on a user’s behalf. He argues that keeping the AI’s use of these tools inspectable lets people review, correct, and provide feedback on its work, and that this kind of oversight is key to steering the technology in a positive direction.


TRANSCRIPT:

We started OpenAI seven years ago because we felt like something really interesting was happening in AI and we wanted to help steer it in a positive direction. It’s honestly just really amazing to see how far this whole field has come since then. And it’s really gratifying to hear from people like Raymond who are using the technology we are building, and others, for so many wonderful things.

We hear from people who are excited, we hear from people who are concerned, we hear from people who feel both those emotions at once. And honestly, that’s how we feel. Above all, it feels like we’re entering an historic period right now where we as a world are going to define a technology that will be so important for our society going forward. And I believe that we can manage this for good.

So today, I want to show you the current state of that technology and some of the underlying design principles that we hold dear.

So the first thing I’m going to show you is what it’s like to build a tool for an AI rather than building it for a human. So we have a new DALL-E model, which generates images, and we are exposing it as an app for ChatGPT to use on your behalf. And you can do things like ask, you know, suggest a nice post-TED meal and draw a picture of it.

Now you get all of the, sort of, ideation and creative back-and-forth and taking care of the details for you that you get out of ChatGPT. And here we go, it’s not just the idea for the meal, but a very, very detailed spread. So let’s see what we’re going to get. But ChatGPT doesn’t just generate text in this case, it also generates an image.

And that is something that really expands the power of what it can do on your behalf in terms of carrying out your intent. And I’ll point out, this is all a live demo. This is all generated by the AI as we speak. So I actually don’t even know what we’re going to see. This looks wonderful. I’m getting hungry just looking at it.

Now we’ve extended ChatGPT with other tools too, for example, memory. You can say “save this for later.” And the interesting thing about these tools is they’re very inspectable. So you get this little pop up here that says “use the DALL-E app.” And by the way, this is coming to you, all ChatGPT users, over upcoming months.

And you can look under the hood and see that what it actually did was write a prompt just like a human could. And so you sort of have this ability to inspect how the machine is using these tools, which allows us to provide feedback to them. Now it’s saved for later, and let me show you what it’s like to use that information and to integrate with other applications too.
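The pattern described here — the model writes a prompt for a tool, and the host application can inspect that call before running it — can be sketched roughly as follows. This is a hypothetical illustration of the general tool-use idea, not OpenAI’s actual API; all names (`ToolCall`, `dispatch`, the stand-in tools) are invented for the example.

```python
# Hypothetical sketch of inspectable tool use: the model emits a structured
# tool call (a tool name plus a prompt it wrote, "just like a human could"),
# and the host logs the call before executing it, so a person can review it.

from dataclasses import dataclass
from typing import Callable, Dict


@dataclass
class ToolCall:
    tool: str    # which tool the model chose, e.g. "dalle" or "memory"
    prompt: str  # the prompt the model wrote for that tool


def dispatch(call: ToolCall, tools: Dict[str, Callable[[str], str]]) -> str:
    """Run a model-issued tool call, surfacing it for inspection first."""
    # This is the "look under the hood" step: the call is visible to the user.
    print(f"[inspect] using tool {call.tool!r} with prompt {call.prompt!r}")
    return tools[call.tool](call.prompt)


# Stand-in tools for the sketch; real tools would call external services.
tools = {
    "dalle": lambda p: f"<image generated from: {p}>",
    "memory": lambda p: f"saved for later: {p}",
}

result = dispatch(ToolCall("dalle", "a detailed post-TED meal spread"), tools)
```

The key design point from the talk is that the tool call is data the user can see and veto, rather than an opaque internal action.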

You can say, “Now make a shopping list for the tasty thing I was suggesting earlier.” And make it a little tricky for the AI. “And tweet it out for all the TED viewers out there.” So if you do make this wonderful, wonderful meal, I definitely want to know how it tastes. But you can see that ChatGPT is selecting all these different tools without me having to tell it explicitly which ones to use in any situation.

And this, I think, shows a new way of thinking about the user interface. Like, we are so used to thinking of, well, we have these apps, we click between them, we copy/paste between them, and usually it’s a great experience within an app as long as you kind of know the menus and know all the options. Yes, I would like you to. Yes, please. Always good to be polite.

And by having this unified language interface on top of tools, the AI is able to sort of take away all those details from you. So you don’t have to be the one who spells out every single sort of little piece of what’s supposed to happen. And as I said, this is a live demo, so sometimes the unexpected will happen to us.

But let’s take a look at the Instacart shopping list while we’re at it. And you can see we sent a list of ingredients to Instacart. Here’s everything you need. And the thing that’s really interesting is that the traditional UI is still very valuable, right? If you look at this, you still can click through it and sort of modify the actual quantities.

And that’s something that I think shows that they’re not going away, traditional UIs. It’s just we have a new, augmented way to build them. And now we have a tweet that’s been drafted for our review, which is also a very important thing. We can click “run,” and there we are, we’re the manager, we’re able to inspect, we’re able to change the work of the AI if we want to.

And so after this talk, you will be able to access this yourself. And there we go. Cool. Thank you, everyone. So we’ll cut back to the slides. Now, the important thing about how we build this, it’s not just about building these tools. It’s about teaching the AI how to use them.

Like, what do we even want it to do when we ask these very high-level questions?