
Transcript of OpenAI’s Sam Altman on the Future of AI, Safety and Power — Live at TED2025

Read the full transcript of a conversation between OpenAI’s Sam Altman and head of TED Chris Anderson at the TED2025 conference on April 11, 2025.

The interview starts here:

Welcome to TED

CHRIS ANDERSON: Sam, welcome to TED. Thank you so much for coming.

SAM ALTMAN: Thank you. It’s an honor.

CHRIS ANDERSON: Your company has been releasing crazy, insane new models pretty much every other week, it feels like. I’ve been playing with a couple of them. I’d like to show you what I’ve been playing with. So, Sora, this is the image and video generator. I asked Sora this: “What will it look like when you share some shocking revelations here at TED?” You want to see how it imagined it? I mean, not bad, right? How would you grade that? Five fingers on all hands.

SAM ALTMAN: Very close to what I’m wearing.

CHRIS ANDERSON: You know, I’ve never seen you quite that animated. You’re not.

SAM ALTMAN: No, I don’t. I’m not that animated of a person.

AI’s Creative Capabilities

CHRIS ANDERSON: So maybe a B. This one genuinely astounded me when I asked it to come up with a diagram that shows the difference between intelligence and consciousness. How would you do that? This is what it did. I mean, this is so simple, but it’s incredible. What is the kind of process that would allow this? This is clearly not just image generation. It’s linking into the core intelligences that your overall model has.

SAM ALTMAN: The new image generation model is part of GPT-4o. So it’s got all of the intelligence in there. And I think that’s one of the reasons it’s been able to do these things that people really love.

CHRIS ANDERSON: I mean, if I’m a management consultant and I’m playing with some of this stuff, I’m thinking, uh oh, what does my future look like?

SAM ALTMAN: I mean, I think there are two views you can take. You can say, “Oh man, it’s doing everything I do. What’s going to happen to me?” Or you can say, like through every other technological revolution in history, “Okay, now there’s this new tool. I can do a lot more. What am I going to be able to do?” It is true that the expectation of what we’ll have for someone in a particular job increases, but the capabilities will increase so dramatically that I think it’ll be easy to rise to that occasion.

CHRIS ANDERSON: So this impressed me too. I asked it to imagine Charlie Brown as thinking of himself as an AI, came up with this. This was actually rather profound. What do you think? I mean, the writing quality of some of the new models, not just here, but in detail, is really going to a new level.

SAM ALTMAN: Yeah, I mean, this is an incredible meta answer, but there’s really no way to know if it is thinking that or it just saw that a lot of times in the training set. And of course, if you can’t tell the difference, how much do you care?

Intellectual Property and Creative Rights

CHRIS ANDERSON: So that’s really interesting. We don’t know. Isn’t there, though? At first glance, this looks like IP theft. You guys don’t have a deal with the Peanuts estate.

SAM ALTMAN: You can clap about that all you want. Enjoy. I will say that I think the creative spirit of humanity is an incredibly important thing. And we want to build tools that lift that up, that make it so that new people can create better art, better content, write better novels that we all enjoy. I believe very deeply that humans will be at the center of that.

I also believe that we probably do need to figure out some sort of new model around the economics of creative output. I think people have been building on the creativity of others for a long time. People take inspiration for a long time, but as the access to creativity gets incredibly democratized and people are building off of each other’s ideas all the time, I think there are incredible new business models that we and others are excited to explore.

Exactly what that’s going to look like, I’m not sure. Clearly there’s some cut and dry stuff, like you can’t copy someone else’s work. But how much inspiration can you take? If you say, “I want to generate art in the style of these seven people, all of whom have consented to that,” how do you divvy up how much money goes to each one? These are big questions. But every time throughout history, we have put better and more powerful technology in the hands of creators. I think we collectively get better creative output and people do just more amazing stuff.

CHRIS ANDERSON: I mean, an even bigger question is when they haven’t consented to it. In our opening session, Carole Cadwalladr showed ChatGPT giving a talk in the style of Carole Cadwalladr. And sure enough, it gave a talk that wasn’t quite as good as the talk she gave, but it was pretty impressive. And she said, “Okay, it’s great, but I did not consent to this.” How are we going to navigate this? Isn’t there a way? Should it just be people who consented, or shouldn’t there be a model that somehow says that any named individual in a prompt whose work is then used should get something for that?

SAM ALTMAN: So right now, if you use our image generation thing and say, “I want something in the style of a living artist,” it won’t do that. But if you say, “I want it in the style of this particular kind of vibe or this studio or this art movement or whatever,” it will. Obviously, if you’re like, “Output a song that is a copy of this song,” it won’t do that. The question of where that line should be, and how people say “this is too much,” is something we sorted out before with copyright law and what fair use looks like.