
Anthropic CEO Dario Amodei on Pentagon Feud (Transcript)

Editor’s Notes: In this detailed interview, Anthropic CEO Dario Amodei discusses the escalating conflict between his company and the Pentagon regarding the use of AI in national defense. He explains the two “red lines” his company has drawn—refusing to support domestic mass surveillance and fully autonomous weapons—in order to preserve democratic values and ensure human oversight. Amodei also addresses the administration’s decision to designate Anthropic as a supply chain risk and makes the case for why Congress needs to establish formal guardrails for technology that is outpacing current laws. This conversation offers a rare look at the tensions between Silicon Valley innovation and government oversight during a pivotal moment for national security. (Feb 28, 2026)

TRANSCRIPT:

Introduction

INTERVIEWER: We appreciate you taking the time. You are Dario Amodei, the CEO of Anthropic, is that right?

DARIO AMODEI: That’s correct, yeah.

INTERVIEWER: Great. Well, my first question to you is, why won’t you release Anthropic’s AI without restrictions to the US Government?

Anthropic’s Work With the US Government

DARIO AMODEI: So we should maybe back up a bit for a little context. Anthropic actually has been the most lean-forward of all the AI companies in working with the US government and working with the US military. We were the first company to put our models on the classified cloud. We were the first company to make custom models for national security purposes, which were deployed across the intelligence community and the military for applications like cyber, combat support operations, various things like this.

The reason we’ve done this is, I believe that we have to defend our country. I believe we have to defend our country from autocratic adversaries like China and like Russia. And so we’ve been very lean forward. We have a substantial public sector team.

But I have always believed that, as we defend ourselves against our autocratic adversaries, we have to do so in ways that defend and preserve our democratic values. And so we have said to the Department of War that we are okay with almost all use cases — basically 98 or 99% of the use cases they want to do — except for two that we're concerned about.

One is domestic mass surveillance. We're worried that things may become possible with AI that weren't possible before. An example of this is something like taking data collected by private firms, having it bought by the government, and analyzing it en masse via AI. That actually isn't illegal. It was just never useful before the era of AI. So there's this way in which domestic mass surveillance is getting ahead of the law. The technology is advancing so fast that it's out of step with the law. That's case number one.

Case number two is fully autonomous weapons. This is not the partially autonomous weapons that are used in Ukraine or could potentially be used in Taiwan today. This is the idea of making weapons that fire without any human involvement. Now, our adversaries may at some point have even those. So perhaps they may at some point be needed for the defense of democracy, but we have some concerns about them.

First, the AI systems of today are nowhere near reliable enough to make fully autonomous weapons. Anyone who’s worked with AI models understands that there’s a basic unpredictability to them that in a purely technical way, we have not solved. And there’s an oversight question, too. If you have a large army of drones or robots that can operate without any human oversight, where there aren’t human soldiers to make the decisions about who to target, who to shoot at, that presents concerns. And we need to have a conversation about how that’s overseen. And we haven’t had that conversation yet. And so we feel strongly that those two use cases should not be allowed.

The Failed Negotiations With the Pentagon

INTERVIEWER: The Pentagon has told us that they have agreed in principle to these two restrictions and they wanted to strike a deal. Why couldn’t an agreement be reached?

DARIO AMODEI: So there were several stages of this, all done quickly and all determined by the very limited three-day window that they gave us. They gave us an ultimatum to agree to their terms in three days or be designated a supply chain risk under the Defense Production Act. I guess we'll get to that later.

But during that time, there was some back and forth. At one point they sent us language that appeared on the surface to meet our terms, but it had all kinds of qualifiers, like "if the Pentagon deems it appropriate" or "to do anything in line with laws." So it didn't actually concede in any meaningful way. And there were further rounds that also did not concede in any meaningful way.

We have wanted to strike a deal since the beginning. If you want to get a sense of the Pentagon's position, the Pentagon spokesman, Sean Parnell, tweeted the day before and reiterated their position: "we only allow all lawful use." And this was the same as when they sent us their terms. So they have not agreed to our exceptions in any meaningful way.

Response to the President’s Statement

INTERVIEWER: The President posted today in response to the situation, “their selfishness,” referring to Anthropic, “is putting American lives at risk, our troops in danger, and our national security in jeopardy.” What’s your response?

DARIO AMODEI: So in the statement we issued yesterday, and also in the one we issued today, we said that even if the Department of War or the Trump administration takes these unprecedented measures against us — this kind of supply chain designation that's normally used against foreign adversaries — we'll do everything we can to support the Department of War and to provide our technology for as long as it takes to off-board us and onboard a competitor who's willing to do these things that we are not willing to do.

INTERVIEWER: Prepared to exit.

DARIO AMODEI: Yeah, so we have offered continuity.