Editor’s Notes: Is your phone secretly spying on you, and has Big Tech already crossed the line from convenience into control? In this explosive conversation, Tucker Carlson sits down with 25‑year‑old cryptography prodigy Yannik Schrade to expose how surveillance capitalism, “free” apps, and even your own devices are eroding your privacy and freedom. They break down how modern phones, Wi‑Fi routers, cameras, and even digital money are being used to track you, influence you, and potentially shut you out of the financial system. Stay tuned to learn what’s really happening behind the screen—and what cutting‑edge encryption technology could do to win your privacy back. (Feb 13, 2026)
TRANSCRIPT:
The Fundamental Nature of Privacy
TUCKER CARLSON: You’ve dedicated your life to preserving privacy. So let’s just start big picture. What is privacy and why is it important?
YANNIK SCHRADE: So I believe that privacy is core to freedom. I would even go as far as saying that it is synonymous with freedom. It protects you, protects your inner core, essentially protects your identity as a human being from forces that don’t want you to be an individual and a human being at the end of the day.
TUCKER CARLSON: That was so nicely put.
YANNIK SCHRADE: I think what it really boils down to is that, in that regard, privacy is relatively similar to what was originally intended with the Second Amendment in the United States. It is a tool for you as a human being to protect yourself against coercive force directed at your very soul, your inner core.
TUCKER CARLSON: So there are forces, and this has always been true at every time in history, that seek to make people less human, to turn human beings into slaves or animals or objects. And privacy is the thing that prevents that.
The Mathematical Foundation of Encryption
YANNIK SCHRADE: So the crazy principle that exists within this universe is that there’s this asymmetry baked right into the very fabric that we exist in. There are certain mathematical problems where the effort required to undo them doesn’t just scale linearly, it scales so violently that the universe itself prohibits anyone who doesn’t have access, who doesn’t have permission to undo that mathematical problem, from doing so. They literally cannot do it.
So what that means is that with a very little amount of energy, a minuscule amount of energy, a laptop, a battery, and a few milliseconds of computation, you can create a secret that not even the strongest imaginable superpower on earth is able to recover without your explicit granting of access.
That is the fundamental principle on top of which encryption, cryptography and privacy in the modern age are built. And it’s so fascinating that the universe itself allows for this computational asymmetry where I can create a secret, I can encrypt something, I can make something hidden. And you, with the most powerful imaginable coercive force, with all the violence you could imagine, with a continent-sized computer running for the entire lifespan of the universe, would not be able to apply that force to my secret, because I have encrypted it.
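To put rough numbers on that asymmetry, here is a back-of-envelope sketch in Python. The attacker figures are hypothetical and deliberately generous; the point is only the order of magnitude.

```python
# Back-of-envelope arithmetic for the asymmetry described above.
# Encrypting with a 256-bit key takes milliseconds on a laptop;
# guessing the key is another matter. Attacker numbers below are
# hypothetical and deliberately generous.
keyspace = 2**256                 # possible 256-bit keys

people = 8e9                      # everyone on Earth
machines_per_person = 1e9         # a billion machines each
keys_per_second = 1e12            # a trillion guesses/second/machine
guesses_per_second = people * machines_per_person * keys_per_second

seconds_needed = keyspace / guesses_per_second
age_of_universe_s = 4.35e17       # ~13.8 billion years in seconds

print(f"{seconds_needed / age_of_universe_s:.1e} lifetimes of the universe")
# => roughly 3e28 lifetimes, before even counting the energy per guess
```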
And the universe inherently sort of smiles upon encryption and appreciates that. So I always found that so intoxicating, this concept, that this is inherently baked into the universe. It is an interaction between mathematics and physics, sort of, and is a fundamental property, just like you could say nuclear weapons are a fundamental property of reality. Right?
And so encryption and privacy exist in this reality. And before we as humans have figured it out, that wasn’t necessarily clear. Right. It could also be that you can never hide something, encrypt something, keep something to yourself, but it turns out you actually can. And so that is fascinating, I think.
And what it conceptually allows you to do is to take something and move it into a different realm. The encrypted realm. Right. And if someone else wants to go into that realm, follow you there, they would need unlimited resources to do so. And I would say that’s what really got me into cryptography and privacy.
A Challenge to Global Authority
TUCKER CARLSON: Okay. I’m having all kinds of realizations simultaneously. For sure, you’re an extraordinary person. I think that’s clear from just the first three minutes. Okay. Who are you? Where are you from? And are you ready to suffer for your ideas? Because what you’ve just articulated is the most direct, subtle, but direct possible challenge to global authority anyone could ever articulate. But first, how did you come to this? Where are you from? Tell us about yourself for just a moment.
YANNIK SCHRADE: So I was born in Germany. I’m 25 years old. And I originally, actually in my life I studied law, and then later I studied mathematics and computer science. And then at some point I met a few people who also had these kinds of ideas about privacy technology, distributed technology, decentralization. And we then decided to found a company that builds this kind of technology. And that’s how I ended up here, I guess.
TUCKER CARLSON: So, Europe, you’re German. You’re a product of Europe and European culture, which, for all of its wonderful qualities, does not prize privacy. It built the world. I love Europe and the culture, but it’s not a privacy culture.
YANNIK SCHRADE: It doesn’t help. No, no.
TUCKER CARLSON: So, especially as a German, why did you come to this conclusion when all of your neighbors didn’t?
The European Privacy Paradox
YANNIK SCHRADE: So I think it’s interesting, right? If you view privacy as this inherent political thing that protects you as a human being, there are data protection laws, GDPR, there are fines against surveillance capitalist tech giants in Europe. But as you said, I feel like most of that stuff is a charade. It’s not really about protecting your privacy. And we are seeing that in the UK, in the European Union. I mean, there are so many cases that have already made significant movements this year.
So I would say for me personally, it has really been this technological and mathematical understanding of the power of this technology.
And what I realized sort of is that what humans have done in the past is allow information, right, any type of information that we now share with our mobile surveillance devices, to be encrypted and stored somewhere securely. That is how encryption has mainly been used. Or to do things like Signal is doing, end to end encrypted messaging, where we are able to send some message from one human being to another via some untrusted channel where there can be interceptors trying to get those messages.
But thanks to mathematics, we are able to send this message across the whole universe and it arrives at the end point with no intermediary being able to take a look at the message, because of this inherent property of the universe. What I realized is that there’s a missing piece, which is that whenever we are accessing this information, whenever we are interacting with it, whenever we want to utilize it, we have to decrypt it again, which then makes it accessible to whoever takes a look at it, right?
Whoever runs the machine that you decide to put that data on, which can be AWS, which can be cloud providers, big data, big AI, whoever, right? And so this idea that I had was, what if we can take this asymmetry that is a fact of reality and move it to computation itself, so that all of those computations can be executed in private as well.
And then we can do some amazing things. Then the two of us can decide to compute something together. Not just exchange information via some secure communication channel, but actually perform some mathematical function over something, produce an output from some inputs, but we can keep those inputs to ourselves.
So Tucker has a secret, Yannik has a secret. And with this technology, we can produce some value, some information. While you don’t have to share your secret, I don’t have to share my secret. And we can scale that to enormous sizes, where the entirety of humanity can do those things, where countries can do those things.
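A minimal Python sketch of the simplest version of that idea, additive secret sharing. The names and inputs are hypothetical, and this is a textbook warm-up, not Arcium’s actual protocol. (With only two parties, the output of a sum would reveal the other input, so the toy uses three.) Each share on its own is uniformly random, so no single party learns anything but the final total.

```python
import random

# Toy additive secret sharing: each party splits its secret into three
# random shares that sum to the secret mod p, keeps one, and sends one
# to each other party. Individual shares reveal nothing.
p = 2**61 - 1  # prime modulus; all shares live in Z_p

def share(secret, n=3):
    shares = [random.randrange(p) for _ in range(n - 1)]
    shares.append((secret - sum(shares)) % p)
    return shares

secrets_in = {"tucker": 42, "yannik": 1337, "third": 7}  # hypothetical
dealt = {name: share(s) for name, s in secrets_in.items()}

# Party i holds the i-th share from every participant and publishes
# only the sum of what it holds.
partials = [sum(dealt[name][i] for name in dealt) % p for i in range(3)]

total = sum(partials) % p
assert total == sum(secrets_in.values())  # only the aggregate is revealed
```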
But importantly, at its core, what we are doing is we are implementing this asymmetry that exists within the universe and bringing that to the next level, to the final form, sort of. And that’s how I ended up founding Arcium. Yeah.
TUCKER CARLSON: Getting older can make you realize you don’t actually want all the things you have. That’s why mini storage is so big. Consumerism kind of loses its appeal. What you really want is peace, peace of mind. And what’s the best way to get that? Well, keeping your home and family protected would be at the top of the list. Enter SimpliSafe.
This month you get a 50% discount on your first SimpliSafe system. SimpliSafe takes a much better approach to home security. The idea is, how about we stop intruders before they come into the house, not just try to scare them once they’re already inside your house. And they do that with cameras backed by live agents who keep watch over your property. If someone’s lurking outside, they will tell the person, get out of here. And then they’ll call the police if they don’t.
60 day satisfaction guarantee or your money back. So there really is no risk here. Not surprisingly, SimpliSafe has been named America’s Best Home Security System for five years running. Protect your home today. Enjoy 50% off a new SimpliSafe system with professional monitoring at SimpliSafe.com/Tucker. S I M P L I Safe.com/Tucker. There is no safe like SimpliSafe.
Preserving Humanity Through Privacy
TUCKER CARLSON: I can’t think of a more virtuous project. And you said it in the first minute. The point of the project is to preserve humanity, to keep human beings human. And they’re not just objects controlled by larger forces. They’re human beings with souls. And again, I don’t think there’s any more important thing that you could be doing with your life. So thank you for that. Can you be more specific about our current system and how it doesn’t protect privacy?
Surveillance Capitalism and the Extraction Economy
YANNIK SCHRADE: Yes. So I would say there’s, so I think there’s a lot of things to unravel if we take a look at the systems that we are interacting with every single day. What those, all those tools and applications, the social media networks, basically everything that we do in our digital lives and all of our lives have basically shifted from physical reality to this digital world.
So everything we basically do, everything we do in this room, everything we do when we are out on the street, because all of the technology has become part of physical reality, physical reality has been consumed, sort of. And so all of this has been built on top of what the former Harvard professor Shoshana Zuboff has called surveillance capitalism. Right. And I think that really lies at the core.
And it’s relatively straightforward to understand what those companies are doing. If you ask yourself, hey, why is this application that I’m using actually free? Right. Why is nobody charging me to ask this super intelligent chatbot questions every day? Why are they building data centers for trillions of dollars while I don’t have to pay anything for it, right? So that’s the question that you need to ask yourself, right?
And what you end up realizing is that all of those systems are basically built as rent extraction mechanisms. You’re not really a user, you’re sort of a subject of those platforms. You are being used, value is extracted from you without you noticing. And they’re able to extract value from you because all of your behavior, all of your interactions with those systems, are being taken.
And they perform mass surveillance, bulk surveillance, and it’s those companies, right? We’re just talking about companies. We’re not even talking about intelligence or governments or anything. We’re just talking about those companies that exist within our economy. And so they record everything they can. Because every single bit of information that I can take from your behavior allows me to predict your behavior.
And where I can predict your behavior, I can utilize that to, in the most simple case, do something like serving you ads, right? But in more complex cases, I can do things like I can steer your behavior, I can literally control you, I can turn you into a puppet that does whatever I want.
And so those are the systems that we are faced with right now. And the Internet has sort of been this amazing emancipator for humanity, right? This show is only possible because of the Internet, otherwise with traditional media we wouldn’t be able to speak about those topics. I feel like that’s right. But at the same time, sort of nowadays it has transformed into one of the biggest threats to human civilization.
The Reality of Private Communication
TUCKER CARLSON: At the user level, at my level, the level of the iPhone owner. Is it possible to communicate privately with assurance of privacy with another person?
YANNIK SCHRADE: That’s an interesting question. So we start with this concept of insecure communication channels. And since every communication channel is insecure, what we employ is end to end encryption. And end to end encryption allows us to take this information, take a message and lock it securely so that only Tucker and Yannik are able to unlock it and see what’s going on. And that is a fact.
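A condensed sketch of that lock-and-unlock idea, using the widely used Python cryptography package: a Diffie-Hellman key exchange over an untrusted channel, then authenticated encryption. Real messengers such as Signal layer much more on top (ratcheting, per-message forward secrecy), so treat this as the skeleton only.

```python
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each endpoint generates a keypair; only PUBLIC keys cross the wire.
tucker_priv = X25519PrivateKey.generate()
yannik_priv = X25519PrivateKey.generate()

# Both sides independently derive the same shared secret (Diffie-Hellman).
shared_t = tucker_priv.exchange(yannik_priv.public_key())
shared_y = yannik_priv.exchange(tucker_priv.public_key())
assert shared_t == shared_y

# Stretch the raw shared secret into a symmetric key.
key = HKDF(algorithm=hashes.SHA256(), length=32,
           salt=None, info=b"demo handshake").derive(shared_t)

# Authenticated encryption: an interceptor sees only ciphertext.
nonce = os.urandom(12)
ciphertext = ChaCha20Poly1305(key).encrypt(nonce, b"meet at noon", None)
print(ChaCha20Poly1305(key).decrypt(nonce, ciphertext, None))  # b'meet at noon'
```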
So there have been many cases where big players with big interests, I guess, have attempted to undermine cryptography, have attempted to get rid of end to end encryption, to install backdoors. There has been what is commonly called the crypto wars in the 1990s, right, where the cypherpunks fought for the right to publish open source encryption and cryptography and many, many more cases, I guess.
But at the end of the day, I would say as a realistic assessment, this kind of cryptography is secure and it works now that unfortunately is not the whole answer. Because what you have to think about is now what happens with those end devices. Right?
TUCKER CARLSON: Fair.
The Challenge of Device Security
YANNIK SCHRADE: I mean, the message, right, that is being sent from Yannik to Tucker might be secure. But now, if I cannot undermine and apply force to this message to understand what’s inside, well, I’m just going to apply force to your phone. And that’s sort of what’s happening.
So when we look at different applications, for sure, there is a whole variety of applications, messaging applications that do not employ encryption and security standards and might collect all of your messages and images and utilize them for those machines, right, that extract as much value as possible from you. But there’s applications like Signal that don’t do that, that are actual open source cryptography technology, that anyone can verify themselves, take this code and turn it into an actual application, install it on your phone. All of those things are possible, right? So that’s not the issue.
The underlying issue really is that you have this device in your hand that is sort of closed hardware. You don’t know how that thing works. Right. It is impossible to understand how that thing works. It is impossible to understand how the operating system on that thing works. And there’s flaws in those systems, right? Those are closed systems. There’s flaws in those systems for some reason because people don’t always have the best interests of others in mind. But also…
TUCKER CARLSON: Not always, not always.
YANNIK SCHRADE: But also because people make mistakes, right? Honest mistakes are not malicious. And so I think that in general also speaks for the importance for free, accessible hardware where people with technical skills can play around with and find issues.
But at its core, what you’re being subjected to right now, I would say, is tactical surveillance. And what that means is that there’s some actor, could be some state actor, could be someone else, that decides that Tucker Carlson is worth surveilling.
TUCKER CARLSON: I think that has been decided. Yeah.
YANNIK SCHRADE: You think so?
TUCKER CARLSON: I do, yeah. I’m getting that sense.
Tactical vs. Strategic Surveillance
YANNIK SCHRADE: So tactical surveillance, that means that you specifically are being targeted. And that is in contrast to strategic surveillance, which is this idea of everyone is being surveilled. Let’s just surveil everyone, collect every single bit of information and store that for the entirety of human history. And then someday maybe we’ll be able to use that. Right. So those are those two concepts.
And what we’ve seen over the last few years is sort of a shift away from tactical surveillance towards strategic surveillance, and surveillance capitalism has really helped this concept, because there’s so much data that is being logged that can be stored. There are so many new devices and applications that can be employed.
And so we see pushes like, for example, chat control within the European Union, which is sort of an attempt to implement backdoors within all of the messenger applications, to be able to scan your applications, to scan your messages, to take your messages somewhere else and decide whether or not those people like what you’re saying within your private messages.
So I would say in general, as a normal human being, with your iPhone, you are still able to privately communicate. That is still something that exists. However, this ability has greatly been limited. If there is someone who wants to see your message, I would say they can, unfortunately.
TUCKER CARLSON: How difficult is it for a determined, say, state actor, an intel agency, to say, I want to read this man’s communications, listen to his calls, watch his videos, read his texts. How hard is it for them to do that?
The Backdoor Problem
YANNIK SCHRADE: So I think that, and we can look at different court cases that have publicly emerged in regards to Apple, for example, where Apple has refused to give intelligence agencies backdoor access to their devices. And what’s so important about this discussion that we are having here is that every time you’re building a system where you add backdoor access, so that someone in the future can decide to get access and take a look at what you’re writing, what it invites is for everyone to do that, because a backdoor inherently is a security flaw in your system.
And it’s not just some specific intelligence agency that decides to read your messages. Right. It’s every intelligence agency at that point. Right. And so that’s why, as a nation, you cannot weaken security by getting rid of privacy without weakening your entire economy, cybersecurity, and also your social fabric at the end of the day, right, and the whole strategic positioning of you as a nation.
How difficult it is, I would say also from a practical operational security standpoint, depends on what are you doing with your phone, right. Is your phone this strict device that is only used for messaging, or is your phone also using different types of media? Are you sending images? Are you receiving messages?
I think two years ago there was this case where there was a zero-day exploit being used across Apple devices. Because when I sent you an image and your messenger had auto download on, I could get full access to your phone by sending you a message. And you’re probably not even my contact, right? I just figure out what your phone number is, I send you an image, the image gets automatically downloaded, some malicious code that I have injected gets executed, and now I own your phone and I can do whatever I want.
And then end to end encryption doesn’t help you. Right. Because I have literal access to the end device that decrypts this information. And so that’s very dangerous. That has been fixed. But I think what it highlights really is that complexity is the issue here. So complexity in the kinds of applications that you’re running, complexity in the underlying operating system that this device has, all of that complexity invites mistakes and also malicious security flaws to be installed in those systems. Of course, yeah.
TUCKER CARLSON: Human organizations are the same way. The bigger they are, the easier they are to subvert.
YANNIK SCHRADE: Yes, of course. Yeah.
TUCKER CARLSON: February is the perfect month to get cozy because it’s chilly outside. Our partners at Cozy Earth understand this and they’re helping Americans everywhere stay toasty throughout the frigid winter. We hope you’re seated because this detail may shock you. Cozy Earth offers bamboo pajamas. Lightweight, shockingly soft. These pajamas are a true upgrade. They sleep cooler than cotton, plus they’re made out of bamboo. That is just wild and awesome.
From pajamas and blankets to towels and sheets, Cozy Earth is something unusual and great for everybody and it’s entirely risk free. You get a hundred night sleep trial, 10 year warranty. There is no downside that we can see. So share love this February. Wrap yourself or someone you care for in comfort that feels special. Bamboo pajamas. Visit cozyearth.com and use the code Tucker for up to 20% off. That’s code Tucker for up to 20% off. And if you get a post purchase survey, make certain to mention you heard about Cozy Earth from us.
TUCKER CARLSON: So that’s very, I mean, that’s a very simple thing.
YANNIK SCHRADE: Yeah.
TUCKER CARLSON: To send someone, you know, to text him an image and all of a sudden you have control of his phone. I think we can be fairly confident that people who have adversaries are being surveilled, right?
Accepting Tactical Surveillance with Legal Safeguards
YANNIK SCHRADE: Yes, I think so. I would say that tactical surveillance really is something that exists. But in this battle for privacy, it is actually not the most important thing to focus on. Right. Because this kind of tactical surveillance, I feel like, to a certain degree we need to accept. Unfortunately. Not the tactical surveillance that says, Tucker Carlson is a journalist, I don’t like that, let me surveil him. Right. That’s not the kind of tactical surveillance I’m speaking of.
But if we have legal procedures and actual judicial warrants in place, right, I feel like as a society we could accept that, as long as those processes can be trusted.
TUCKER CARLSON: We could definitely accept that. Of course.
YANNIK SCHRADE: But the fundamental issue really is, and that’s sort of so ironic, right, that all of this surveillance needs to operate under secrecy in order to function, right? You should not know that you’re being surveilled. Nobody has oversight, not even the democratic processes are able to have oversight, because it’s all wrapped in secrecy.
So that really brings us to the fundamental issue here, also with strategic surveillance: surveilling everyone, just deciding, well, I’ll take a look at everyone’s phone, store everything, and maybe I won’t like someone in the future, then I have this backlog of information.
So the important question to consider here is: is there even a future where, from a legal standpoint, it is possible to implement procedures that guarantee that there is no secret surveillance in place? I think the answer to that question is pretty clear, and it is no, there is not.
So I think it is important to have these laws in place, right, laws that prohibit surveillance and that enable different kinds of processes with warrants, right? Literally the Fourth Amendment, allowing for that to be implemented in the 21st century. But what we’ve seen is that the tools that governments have access to are so powerful that it is impossible to make a law that prohibits their use, because whoever has access to this technology within a centralized architecture, and that’s always the case, basically becomes a single point of failure. And that single point of failure will necessarily be corrupted by the power that exists.
iPhone vs. Android Security
TUCKER CARLSON: Just a couple obvious low brow technical questions. Is the iPhone safer than the Android or less?
YANNIK SCHRADE: That’s a good question. So I would say a huge advantage that Android devices bring to the table, right, for a subset of those devices at least, not speaking for the entirety, is that the operating system, for example, is publicly viewable by anyone. You can understand it. And I think that is so important, not just for security, but also for technological innovation. And so I would say that is a huge advantage.
Now, the devices are manufactured by some manufacturer who you need to trust at the end of the day, based on how the hardware is built and how the firmware is compiled and then put on your device. So there have been interesting operating systems. I think there’s one called GrapheneOS, which is a secure open source operating system, as far as I know. Haven’t looked too deeply into that. But you could, on an Android device, theoretically say, I’m going to run my own operating system on that, which I think is a strong value proposition.
Now, I myself am also an Apple user. There is also a sort of element of institutional trust involved here, right, where you say, okay, I trust the manufacturing and software process that this company has. But in general, if I’m being honest, if I weren’t lazy, right, what I’d be doing is I would actually be looking for a minimalistic, secure open source operating system for my mobile phone, and I would build that myself and get some hardware and put it on there. So I would say that would be the smartest thing to do, if you are technically versed.
TUCKER CARLSON: I read that you use an iPad, not a Mac. Is there an advantage?
YANNIK SCHRADE: That’s what I did back in the day when I started. Yeah.
TUCKER CARLSON: Is there an advantage to the iPad over the Mac from a privacy standpoint?
Sandboxed Systems and Decentralization
YANNIK SCHRADE: I think what it boils down to there is what kind of applications can be installed on your system. I would say in general, devices like the iPhone or the iPad operate in a more sandboxed way where applications are actually isolated. Right. Rather than how it works on operating systems like macOS or Windows, where you could compromise the entire system way more easily. Right.
So on the iPhone, you just have an app store with applications, and the level of compromise that such an application can cause, theoretically at least, by design, is limited to just the single application. Right. An app you install doesn’t have access to your messenger, although it could, I guess, if there’s some flaw in the system, which always is the case. So you never have this absolute security.
I think what it really boils down to is this idea that really emerged in the 1990s of decentralization, moving away from central single points of failure towards decentralization, where we can mitigate a lot of these risks by not depending on one single type of computer, and not even depending on one single computer, but having many computers, which introduces redundancy, resilience, and, I guess, risk reduction and distribution to computer systems. Speaking more broadly, that’s how the Internet in a free society should be built, I guess, yeah.
The Hardware Problem and Secure Communication
TUCKER CARLSON: You’ve said a couple of times that the problem is the hardware. It’s not the software, it’s the device.
YANNIK SCHRADE: It is the device, right? It’s the union of the hardware and the software.
TUCKER CARLSON: Yes. So what’s the option? Is there an option at this point? If I am intent on sending a private message to someone else electronically, is there a way to do it? As of right now, that’s private, guaranteed private.
YANNIK SCHRADE: So I would say the way that I myself, at least, handle it is to have a dedicated phone for that specific use case, right, and then just have an encrypted messenger there that you can trust, because maybe you don’t even install it via the app store, but have built it yourself, and there are no other interactions taking place with that phone.
I would say from an operational security standpoint that is as good as it can get. Otherwise you can always do creative things, right. You could write your message and hand encrypt it and then type it into the phone. Right. The device doesn’t matter at that point. So maybe we need to get away from the devices altogether.
What’s interesting, what we’re doing with Arcium, is that we never have a single point of failure. Everything is encrypted, everything sits within a distributed network where, as long as you’re not able to get access to basically the entire globally distributed network, to every single participant, you have security. And it’s difficult to achieve that with your own phone.
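One classic building block behind the idea that no single participant can be the point of failure is threshold secret sharing. Here is a toy 2-of-3 Shamir sketch in Python, illustrative only and not Arcium’s construction: any two shares reconstruct the secret, while any single share is statistically useless.

```python
import random

# Toy 2-of-3 Shamir secret sharing: the secret is a random line's value
# at x = 0; any two points on the line determine it, one point does not.
p = 2**61 - 1  # prime field modulus

def make_shares(secret):
    a = random.randrange(p)  # random slope hides the secret
    return [(x, (secret + a * x) % p) for x in (1, 2, 3)]

def reconstruct(share_a, share_b):
    (x1, y1), (x2, y2) = share_a, share_b
    # Lagrange interpolation at x = 0 over the field.
    l1 = (-x2) * pow(x1 - x2, -1, p) % p
    l2 = (-x1) * pow(x2 - x1, -1, p) % p
    return (y1 * l1 + y2 * l2) % p

shares = make_shares(123456)                        # hypothetical secret
assert reconstruct(shares[0], shares[2]) == 123456  # any pair suffices
```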
But at the end of the day, I think over time those systems get more secure. However, what is important is to be certain that there are no backdoors explicitly installed, right, in those manufacturing processes. I think there are some countries where, if you’re buying a phone from there, you can be certain, okay, there might be something installed, because the company itself is owned by the government. And we need legal frameworks for that.
And also what we require, sort of, is that the manufacturing process itself mirrors distributed, decentralized systems, where there again is not a supply chain of single points of failure, where if one single worker decides to install some backdoor because they get paid off, right, they can do so, but instead there is oversight. And I think that Apple runs on that model already. So I would be relatively comfortable with these kinds of systems.
But there are also other interesting technologies. For example Solana, which is an American blockchain network; they actually offer their own phones. They have a very small manufacturer, and they manufacture those phones because they say, well, those phones need to be very secure, because you literally store your money on there now, because your money is digital and sits on top of a blockchain network.
And so I think those are very interesting approaches where I’m really looking forward to seeing more phones like this where there’s then again a competitive market emerging for who’s building the most secure phone.
Yeah, I actually think a friend of Julian Assange from Germany, I don’t remember his name, had a company manufacturing secure phones. The issue with explicitly built secure phones, however, is always that, I would say, many of these companies are honeypots.
TUCKER CARLSON: I’ve noticed.
YANNIK SCHRADE: Yeah. With EncroChat, or whatever it was called, there was this large scale police operation to stop drug cartels, which worked out nicely, I guess, in the end, but the company itself was just a facade to sell backdoored phones. Yeah, right.
TUCKER CARLSON: I mean, it’s the perfect honeypot. And by the way, Signal, which I’m not saying is a honeypot, of course, and I use it, as the authorities know, but it was created with CIA money. That doesn’t mean it’s a CIA operation. But why wouldn’t it be? I mean, honestly, I’m not accusing anybody, because I have no knowledge, but it’s a pretty obvious move, right?
Signal and Open Source Security
YANNIK SCHRADE: It would be. I think what’s important when we look at Signal, actually, is that we look at what Signal is. Signal is open source software that anyone can verify for themselves. And what that means is that we have this global community of mathematicians and cryptographers who invented those protocols, who independently, without getting funding from the CIA or whomever, thought of mathematical problems that they want to solve, that they are passionate about.
And all of those people look at those open source lines of code and mathematical formulas, and they find the flaws in those systems. And so that makes me confident in the design of Signal itself.
TUCKER CARLSON: Do you use it?
YANNIK SCHRADE: I use Signal, yes. I got my entire family to use Signal.
TUCKER CARLSON: Okay, good. So that’s, and I have to say, I know a lot of intel people use Signal a lot. All the ones I know. And so that tells you something right there.
YANNIK SCHRADE: Yes. So I think it would be highly unlikely that Signal itself would actually turn out to not be secure.
The Dual EC DRBG Backdoor
There was this interesting case in the early 2000s, an attempt to actually undermine strong encryption, with a very exotic name: Dual Elliptic Curve Deterministic Random Bit Generator. Dual EC DRBG. Right. No non-technical person understands what that means. Right.
And what you need to understand in order to comprehend what happened there is that when we encrypt information, as I said earlier, we take something and move it into this different realm, where you cannot follow this information into that realm, because that would require you to have literally infinite resources, more energy than the sun will emit over its lifespan. Isn’t that crazy? Right. So you cannot follow it there.
Well, how this asymmetry is fundamentally achieved in cryptography is that the universe runs on energy and uncertainty, right? Particles jitter, stars burst. And so there’s this randomness in the universe. If you look at the sky, or if you just look at how things are made up, there’s random noise everywhere.
And so when we encrypt something, we make use of that chaos and we inject it into a message that we are sending, for example. And the message can only resist unauthorized decryption if the randomness that has been injected into it is actually unpredictable. Now, if we think of random…
TUCKER CARLSON: Unless it’s truly random.
YANNIK SCHRADE: It has to be truly random. Yeah. I cannot figure out how you arrived at the random number.
TUCKER CARLSON: No pattern.
YANNIK SCHRADE: No pattern. Exactly. True randomness, true entropy, right?
TUCKER CARLSON: Yes.
YANNIK SCHRADE: That’s what cryptographers, I would say, spend most of their time thinking about. How can we achieve true randomness? Because if we are able to inject that using mathematics, then for you it becomes impossible to distinguish this message from randomness. You can’t find a pattern, hence you’re not able to apply any optimized algorithm to undermine it.
TUCKER CARLSON: Exactly.
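A concrete illustration, with hypothetical numbers, of why the randomness must be truly unpredictable: if a key is derived from a guessable seed such as a clock value, the attacker’s search shrinks from 2^256 keys to a few thousand seeds.

```python
import random

# Why predictable randomness is fatal. A key generated from a guessable
# seed (here: a timestamp) can be recovered by re-running the generator.
def weak_keygen(timestamp):
    random.seed(timestamp)            # entropy is just the clock
    return random.getrandbits(256)    # looks like a 256-bit key

victim_key = weak_keygen(1_700_000_000)  # hypothetical epoch seconds

# The attacker knows roughly when the key was made and tries each second.
for t in range(1_700_000_000 - 5000, 1_700_000_000 + 5000):
    if weak_keygen(t) == victim_key:
        print("key recovered from seed", t)
        break
```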
YANNIK SCHRADE: So if you think about it practically, what that means is, let’s say we have a deck of cards, 52 playing cards, right? And I randomly shuffle this deck of poker cards. With 52 cards, there are so many possible ways that a deck could be stacked that for truly randomly shuffled decks, it is very unlikely there have ever been two identical decks in the history of humanity. Which is hard to believe in general, but that’s how statistics and mathematics work, right?
So we take this deck and we use it as the randomness. Now if I play with a magician, the magician can pretend to shuffle the deck, but actually they have not shuffled the deck. They know what the cards look like.
What we’re doing with all of this randomness that we are injecting into information is we are basically determining what key is being used to unlock it. And if I don’t know what the randomness looks like, if I don’t know what the next playing card in the stack is, I have to try every single possible key and try to unlock the message with it.
So you could think of it as: I have this message. Now I want to apply violence to this message in order to recover it. What I’m doing is I take key number one, I try to unlock it, it doesn’t work, then I try key number two. And you do that for an inconceivably large number of keys. So that’s why you basically, practically speaking, cannot brute force these kinds of mechanisms.
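The deck arithmetic, and the key-by-key loop he describes, check out in a few lines of Python; the try_key function here is a hypothetical stand-in for an actual decryption attempt.

```python
import math

# A fairly shuffled deck is one of 52! equally likely orderings.
orderings = math.factorial(52)
print(f"{orderings:.3e}")   # ~8.066e+67, which is more than 2^225
assert orderings > 2**225

# Brute force really is "try key 1, try key 2, ...":
def brute_force(ciphertext, try_key, keyspace_size):
    for key in range(keyspace_size):       # hopeless at ~2^225 keys
        plaintext = try_key(ciphertext, key)
        if plaintext is not None:          # a successful unlock
            return key, plaintext
    return None
```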
Although you can, if you know where to start looking for the keys. If you know that you need to start looking at the millionth key, then you can recover it. And so if the deck is being manipulated, if the randomness is being manipulated, then you can undermine encryption while the process of encrypting itself remains sound. Right? You don’t notice it.
You actually do what you mathematically need to do to securely send your message, but the value that you use to do so, this randomness, is actually not random. And that’s what had been attempted with this specific algorithm, Dual EC DRBG.
What they did was they created this concept of kleptography, where they derive the randomness in a way that is deterministic: they have some secret value, and from that secret value they derive fake randomness that looks random but is not actually random.
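A toy analogue of that kleptographic trick in Python. Real Dual EC DRBG works over elliptic-curve points; this sketch uses modular exponentiation with tiny hypothetical constants so the trapdoor is visible. The published constant Q silently hides the relation Q = P^d, and whoever knows d can turn one observed output into the generator’s next state.

```python
# Toy analogue of the Dual EC DRBG backdoor (NOT the real algorithm).
p = 2**31 - 1        # prime modulus
P = 3                # public base
d = 65537            # designer's secret trapdoor, coprime to p - 1
Q = pow(P, d, p)     # published "random" constant; Q = P^d stays hidden

def drbg_step(state):
    output = pow(Q, state % (p - 1), p)      # what users see as randomness
    next_state = pow(P, state % (p - 1), p)  # internal, never published
    return output, next_state

s0 = 987654321                    # honest user's secret seed
out1, s1 = drbg_step(s0)
out2, s2 = drbg_step(s1)

# The trapdoor holder sees only out1 = P^(d*s0), yet predicts out2:
d_inv = pow(d, -1, p - 1)                 # possible only if you know d
recovered_s1 = pow(out1, d_inv, p)        # out1^(1/d) = P^s0 = next state
predicted_out2, _ = drbg_step(recovered_s1)
assert predicted_out2 == out2             # future "randomness" predicted
```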
And the NSA proposed this algorithm to NIST, the National Institute of Standards and Technology, in the early 2000s as the best state-of-the-art randomness derivation function, I guess. Right. And it got accepted as an official standard.
And then there were companies like RSA, actually a highly sophisticated and respected cryptography company, right, with founders who are some of the fathers of modern day cryptography, right, that then built products on it and distributed them to industry and to people using this technology.
Nobody knew about it. But it’s not actually true that nobody knew about it. There were a lot of cryptographers who raised questions a couple of years later, who were like: I don’t think this is actually random, it looks suspicious to me. They showed that if someone theoretically had access to some secret key, they could recover the randomness, and they actually mathematically proved that the insecurity was there.
TUCKER CARLSON: It was not random because they noticed a pattern.
YANNIK SCHRADE: They realized, sort of. So basically what they noticed is that there are just these numbers. The proposal said, “Hey, let’s use this algorithm,” and the algorithm contains some constant numbers. So there are those numbers written there. And the cryptographers asked, “Are those numbers random?” Because we are literally deriving our randomness from those numbers. And the answer was, “Yeah, those are random. We randomly generated them.”
It turns out there was some other key that could be used to mathematically recover whatever randomness you used. So there was this secret attempt to undermine cryptography by the US government.
TUCKER CARLSON: Yes, yes.
YANNIK SCHRADE: And I think what’s so striking about this again is that you’re not just undermining privacy. Right. You’re undermining the entire security of your economy, your country. Right. And banking.
TUCKER CARLSON: Banking, missile codes, everything.
YANNIK SCHRADE: Yes, everything. So the thing that then happened was, in 2013, Snowden revealed a number of documents, I guess, and one of those described Project Bullrun. And within Project Bullrun, they allocated funding to that specific project where they tried to undermine cryptography.
And so once that got published, the corresponding companies and standardization institutes dropped it. And it’s so striking that you get standardization, because once something is defined as a standard, you in industry need to implement it, right, to get certification. So it’s literally impossible to use some other alternative that is secure, because certification only gets provided for this backdoored technology. But it got uncovered thanks to Snowden. Then people stopped using it.
TUCKER CARLSON: Was he celebrated? Did he win the Presidential Medal of Freedom for this?
YANNIK SCHRADE: Yes, in an alternative reality, in a different realm, I guess.
TUCKER CARLSON: One of the great patriots of our time. Relentless. I mean, they’d murder him in a second. He’s still in exile. Not by choice. But…
YANNIK SCHRADE: What they also uncovered is that the NSA actually paid this company that built those products $10 million to use that as the standard. So, yeah, that’s why you cannot trust anyone.
The NSA’s Cryptographic Backdoors and National Security
TUCKER CARLSON: As you point out, it’s not simply—I mean, so this is an intel agency trying to spy on its own people, the ones you pay for it to exist. And that’s immoral and something that we should fight against. But they were also sabotaging the US economy and US national security. Because if your cryptography is fake, then that means you’re exposed on every level throughout your society.
YANNIK SCHRADE: You are. Yes. And it’s so interesting, because their task, the reason it was possible for them to do that, is to increase national security. Right. At that point they were the leading cryptography research organization in the world. And so that really is striking to me, that you’re willing to undermine the entire security of your nation, which at the end of the day puts you in a worse strategic position. I think many people don’t realize that.
TUCKER CARLSON: I’d never thought about it until you mentioned it. But it just highlights—I mean I love Ed Snowden and I’m not embarrassed of that, I’m proud. But it just highlights the suffering that he’s been through in order to help his own country. And he’s still slandered constantly. It drives me crazy.
But this is yet another example of why he did something more than almost anyone else to help this country. So it sounds like you’re convinced that the current state of the art in cryptography is actually secure.
Modern Cryptography and Global Standards
YANNIK SCHRADE: Yes, yeah, 100%. As I said, I think this is a great example to look at, where even with those backdoors that had been implemented, there were cryptographers within this global open source mathematics and cryptography community who rang the bell, but nobody was listening to them.
But they actually identified the issue years in advance and rang the bell and said, this is not secure, not random, even within those companies and standardization institutes. And nobody took it seriously. Or, I guess, they took it seriously, but it doesn’t matter if the law says you have to use this algorithm. Right.
So that makes me very confident that this system works: the global system of mathematicians and cryptographers.
TUCKER CARLSON: Which is to say, is Chinese cryptography different or stronger than European or American cryptography?
YANNIK SCHRADE: It is interesting. So you actually have specific encryption standards used by the militaries of the world. Right. The Chinese use different cryptography than the Russians and the Americans. It is, at the end of the day, the same thing from a mathematical standpoint, but there are some deviations in the level of security and the kinds of numbers used. Right.
So everyone builds their own standards because they mutually distrust each other. But at the end of the day, the underlying mathematics are the same. The cryptographic standards, the way that cryptography works. That is the same.
TUCKER CARLSON: So there’s no reason to think the Chinese or the Russians have stronger cryptography than the Europeans and the Americans.
The Holy Grail of Cryptography: Encrypted Computation
YANNIK SCHRADE: So I think, no, no. And I mean, it’s interesting to think about whether there is cryptography being developed in house within militaries, or whatever proprietary human organization, that is not publicly known and that is incredibly powerful.
I mean, what I’ve been doing with my team, and I’m so glad that I have those incredible cryptographers in my team that actually understand all of those things on a way more detailed level than I do, is build this protocol that allows us to literally take everyone’s data.
So you could imagine the entirety of the United States, we take everyone’s healthcare data, something like that, right? And then we say, well, we need to do something with the data. Let’s say we need to research a disease or whatever. Instead of taking that data and passing it to some company that will inevitably expose it, lose it, it will get leaked, or it will be used against those people, we encrypt it.
Nobody ever has to share any information, and we just run whatever computation we collectively decide we are going to run on this data. We do that, we get the result. We, I don’t know, figure out a cure for cancer or whatever. But at no point in time did you ever have to share your data. Your data never left your ownership.
And I think that’s really core, and it sort of is the holy grail of cryptography, I would say, being able to do these kinds of things, because you can now run any type of computer program in private instead of in public, and you can restructure the way that your entire economy and country work.
And that goes beyond just economic human interactions. It also touches on things like rethinking how we can actually improve democratic processes. Because what those computations inherently have as a property is so-called verifiability.
So the status quo, sort of, in the current Internet is that you task some cloud provider with running a computer program for you, right? Because you have limited resources, you want them to run that computer program for you. So you pass them some information, an algorithm, and you get an output back. But how do you know that this output is actually correct? Right.
Could be that there was an error. Could be that they maliciously tried to undermine the output that they have sent you. So this technology that we’ve built actually solves this. Right? Verifiability for computations. You can mathematically verify that a computation has been correctly executed.
And that itself is an amazing property. An amazing property that you want to see within every system. Right. But you don’t get that amazing property without implementing privacy for those systems. Isn’t that amazing? It is amazing.
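The flavor of verifying without re-executing shows up even in classical settings. A simple example is Freivalds’ algorithm, which checks a claimed matrix product in O(n^2) time per trial instead of recomputing it in O(n^3); this is only an illustration of the principle, not the proof system Arcium actually uses.

```python
import random

def matvec(M, v):
    return [sum(m * x for m, x in zip(row, v)) for row in M]

def freivalds(A, B, C, trials=20):
    """Probabilistically check the claim C == A @ B in O(n^2) per trial."""
    n = len(A)
    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        # A wrong C survives each random probe with probability <= 1/2.
        if matvec(A, matvec(B, r)) != matvec(C, r):
            return False
    return True

A = [[1, 2], [3, 4]]
B = [[5, 6], [7, 8]]
C_good = [[19, 22], [43, 50]]
C_bad  = [[19, 22], [43, 51]]      # one wrong entry
assert freivalds(A, B, C_good)
assert not freivalds(A, B, C_bad)  # caught except with prob ~2^-20
```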
TUCKER CARLSON: It is. How did you all create this?
Building Practical Privacy Technology
YANNIK SCHRADE: So I’m very lucky that within my company I have very experienced cryptographers who’ve literally worked more years on these specific issues than I have been in cryptography. And so I’m sort of building on the shoulders of giants, of course. Right.
And there has for a very long time been research in those areas, being able to run those encrypted computations, but it has never been practical enough where it is fast enough, cheap enough. Right. And versatile enough where you can actually do all of those things.
And so I think what really guided us, what really guided me in the way that I designed the system, is to think about, okay, how can I actually build this system so that people are going to use it and are going to build applications and are going to integrate it into systems? Right.
Because I think with privacy technology in general, in the past, what has been done is that it sort of has been created in an echo chamber, in a vacuum, almost, where you’re a smart cryptographer that builds amazing technology, but you maybe don’t understand how markets work and how to get product market fit, how to actually get those users. Right.
And so we’ve tried to build it in a different way, and that’s how we ended up here. But to be honest, it was an evolutionary process for us. We originally started with a different kind of cryptography, I would say, that was more limited, that didn’t allow for all of those interactions.
And then at some point we realized that that was not good enough. That was not enough. And at that point, basically everyone was still building with that technology. And we said, let’s do something different instead. Let’s think about how the future will look, how computation and privacy can converge into something bigger for the entirety of humanity.
And that’s then how we built it in very, very quick time, actually.
TUCKER CARLSON: How did you fund it?
YANNIK SCHRADE: So we got investor funding, and I’m incredibly thankful for all of the investors that I’ve gotten. Coinbase, for example. So, big names in the space of blockchain and distributed systems. All of those networks, like Bitcoin, are distributed in nature, decentralized.
And yeah, there are a lot of players within that space who truly believe in the value of privacy, that privacy is a human right, and that privacy is inevitable as a technology. They like to support it, and not just support it because it is something they believe in, right, but invest in it, because they have realized that this is one of the most powerful technologies that can exist for humanity. Right.
Being able to take information, move it into this realm, and then it can stay in this realm and it can be processed and everyone can do that. That is incredibly powerful. It is emancipating and it is powerful for businesses, but also nation states. At the end of the day, it is a neutral technology. And so we have investors that believe in that.
Privacy in Cryptocurrency Transactions
TUCKER CARLSON: So one of the applications, we were just talking off camera, one of the applications for this technology, well, one of the big ones is the movement of money in a way that’s private. How exactly does that work?
And let me just add one editorial comment. The great disappointment of the last 10 years for me is that crypto transactions don’t seem to be as private or beyond government control as I thought they would be. I hope they are someday. But watching the Canadian truckers have their crypto frozen was just such a shock. I’ve never gotten over it. Will this technology change that?
YANNIK SCHRADE: Yes. So if you think about Bitcoin as the original, not state of the art, but the original kind of blockchain network, right, what it is at the end of the day is a way for distributed people to find consensus over some unit of money, which is actually more like a commodity than a financial instrument. That’s right.
And they find consensus and they create this currency. And that’s why people think that it’s fake, non-existent, right, although it’s a way more real process of creating a currency than fiat currency. They mine it by taking energy and solving a mathematical problem. And once they correctly solve that mathematical problem, they get rewarded in that newly mined currency. Right.
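That take-energy-and-solve-a-problem step is hash-based proof of work. A stripped-down Python analogue follows; real Bitcoin double-hashes an 80-byte block header against a far harder target, and the block data here is hypothetical.

```python
import hashlib

# Toy proof of work: find a nonce whose SHA-256 hash falls below a target.
# Finding it costs energy; verifying it takes a single hash.
def mine(block_data: bytes, difficulty_bits: int = 20):
    target = 2 ** (256 - difficulty_bits)
    nonce = 0
    while True:
        digest = hashlib.sha256(block_data + nonce.to_bytes(8, "big")).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce, digest.hex()
        nonce += 1

nonce, digest = mine(b"block: tucker pays yannik 1 BTC")  # ~2^20 attempts
print(nonce, digest)
```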
So it’s a very, very elegant design. Most people think that these kinds of networks are anonymous and are dangerous. Right. I feel like that has actually been a narrative that media and different actors want people to believe.
TUCKER CARLSON: I just have to add, I would like them to be anonymous and dangerous.
YANNIK SCHRADE: Oh, yes. Yeah, yeah, yes.
TUCKER CARLSON: That’s what I was hoping for.
The Reality of Bitcoin’s Pseudonymity
YANNIK SCHRADE: So people believe that, which attracts some people, right, and also keeps other people from using these networks and drives attempts to outlaw them. In actuality, they’re not anonymous. What you have in Bitcoin specifically is pseudonymity.
So you don’t see on the blockchain that Tucker Carlson has 10 Bitcoin or whatever and sent Yannik 1 Bitcoin. You instead see that a random string of numbers and letters has sent something to another random string of letters and numbers. However, those strings are linked to this identity that you have.
So for every single transaction that you’ve performed in history, on top of this distributed ledger, you will see all of those transactions. So when you later after the show send me one Bitcoin, I guess, right?
TUCKER CARLSON: They’re cheaper today than they were yesterday, I noticed.
YANNIK SCHRADE: So when you send me something, what I’ll be able to see is all of the other transfers that you’ve performed in the past. Right. That’s unfortunately how Bitcoin works. And so it has this inherent full transparency. There is no privacy, because it’s so easy to link you via, I guess, the on- and off-ramps, the way you actually moved money in. Right.
Because you most likely don’t actually get this currency through work, by applying energy; you buy it with a different currency, fiat money. Right. So your identity is linked, and everything is public. And that’s a fundamental issue, actually a dystopian scenario we could end up in if this is adopted as the technology where all of your money now sits and where you send transactions. You have this big upside of cash-like properties, which is amazing, but you have this tremendous downside of literally everything being recorded for the conceivable future of humanity. Right.
And you have no privacy. And that inherently limits your freedom to use this technology. And so that is an issue that exists not just within Bitcoin, but also other blockchain networks.
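A toy illustration of that linkage, with hypothetical addresses: once any single address is tied to a real identity, for instance through an exchange’s know-your-customer records, the whole public history around it falls out of the ledger.

```python
# Toy transaction graph: pseudonymity is not anonymity.
ledger = [
    ("bc1qexchange", "bc1qtucker", 1.00),  # KYC payout ties address to a name
    ("bc1qtucker",   "bc1qcoffee", 0.01),
    ("bc1qtucker",   "bc1qyannik", 1.00),
]

def history(address):
    """Every payment touching an address is public, forever."""
    return [tx for tx in ledger if address in (tx[0], tx[1])]

# Anyone who learns who owns bc1qtucker can read its full history:
for sender, receiver, amount in history("bc1qtucker"):
    print(f"{sender} -> {receiver}: {amount} BTC")
```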
And Bitcoin is this pure form. That’s why within this crypto industry there’s a lot of competition also between different players. Let’s say Bitcoin is this pure form that only allows transfers of money. Right. And other networks allow execution as well.
And that has led to what is commonly called smart contracts: this concept of computer programs that simply exist in the ether, basically a computer program that can execute something that you tell it to do, and it is guaranteed to do so. And this amazing property that all of the founding fathers of those networks basically identified as important is so-called censorship resistance, which I think is also important in real life.
TUCKER CARLSON: Very.
The Censorship Resistance of Decentralized Networks
YANNIK SCHRADE: And so those networks provide censorship resistance. It doesn’t matter if one computer decides, well, I’m not going to accept Tucker’s transaction because I don’t like Tucker. Well, there’s going to be another computer that says, I will accept it. So that is censorship resistance that is inherently baked into those systems.
And what that means is if you interact with this as this invisible machine, you get guaranteed execution for whatever you tell it to do. Either send someone money or perform some other computational logic that is baked into the system.
And so there have been different kinds of pioneers on the front of adding cryptographic privacy to those systems. There has, for example, emerged a network called Zcash, which is basically Bitcoin with cryptographic privacy. And there have also been pioneers like the inventors of Tornado Cash, who built a smart contract that exists within this ledger and is unstoppable. Once you’ve uploaded it, you cannot stop it anymore. And the kind of code that they implemented there gave you privacy on top of this public network, which was, or is, the Ethereum Virtual Machine. So that’s what Tornado Cash did.
TUCKER CARLSON: Tornado Cash, did they win the Nobel Prize? Did they get the Presidential Medal of Freedom? What happened next when they offered privacy?
The Prosecution of Tornado Cash Founders
YANNIK SCHRADE: So there were, I think, three founders. Roman Storm, who’s an American citizen, Roman Semenov, who is a Russian national, and Alexey Pertsev, who is a Russian national as well and lives in the Netherlands. He has been convicted of assisting in money laundering and sentenced to five years.
TUCKER CARLSON: Five years in prison.
YANNIK SCHRADE: Five years in prison. And Roman Storm has been convicted in the United States of conspiring to run a money transmitter without a license. Now, why has this happened? Why did they suffer such grave consequences?
TUCKER CARLSON: They were arrested.
YANNIK SCHRADE: They were arrested and brought to trial. I mean, if you look at what Roman Storm faced, it was 40 years in prison for this.
TUCKER CARLSON: In the United States.
YANNIK SCHRADE: In the United States of America. And why has that happened? They built a privacy tool, and an illicit actor used their privacy tool. And that is a shame, because it was an actor that a lot of people agree is an illicit actor. I think the two of us also agree that North Korea laundering stolen, hacked funds is an illicit actor misusing a tool. So there’s no question about this.
TUCKER CARLSON: The underlying question, really: are we sure that actually happened?
YANNIK SCHRADE: We are sure that happened. Yes, for sure that has happened. And so they stole funds because they were able to hack different systems and then were able to utilize this platform to gain privacy, to then move those funds somewhere else.
TUCKER CARLSON: Did Roman Storm participate in the North Korean theft of those hacked funds?
YANNIK SCHRADE: He did not.
TUCKER CARLSON: So if I rob a bank and then jump into my Chevrolet and speed away, does the president of General Motors get arrested?
YANNIK SCHRADE: Usually he doesn’t, which is interesting, because he clearly provided this tool for you to get away. And he knows that people get away in cars, right?
TUCKER CARLSON: Yes, he does. Kind of weird how he’s dodged those obvious charges.
OFAC Sanctions and the Censorship of Code
YANNIK SCHRADE: That’s really what happened. Yeah. And he faced 40 years in jail, but the jury could not reach a unanimous decision on the main charges, which were, I guess, sanctions evasion and helping with money laundering.
Now, the interesting thing is what happened before they got arrested. OFAC, the Office of Foreign Assets Control in the United States, took the software that those developers had written and uploaded to Ethereum, where it was beyond anyone’s control, unstoppable by nature, where anyone can use it. They had essentially written a software tool for anyone to get privacy. That software tool got sanctioned. It got put on the SDN list, the list of Specially Designated Nationals, where you put the names of terrorists, and they put the Ethereum address of the software on it.
So the source code itself became illegal. It was deleted from the Internet. All of the companies closed their developer accounts. The software they wrote, the free speech they exercised by coming up with those ideas and publishing them to the world, got censored, because they were added to a list they don’t even belong on.
TUCKER CARLSON: Without any vote in Congress, by the way. This is just part of the Treasury, I think, I can’t even remember. But they have enormous power. They’ve destroyed the lives of many thousands of people without any democratic oversight at all. And it’s pretty shocking.
YANNIK SCHRADE: Yeah. And so it got added onto this list, and I think last year a court in the state of Texas actually ruled that OFAC does not have the statutory authority to do any of that. They then silently removed Tornado Cash from the SDN list again. However, nobody is able to use the tool now, because every company, for compliance reasons, casts you out of its user base if you’ve ever touched anything related to it.
TUCKER CARLSON: And Roman Storm is, he was convicted. You said there was a hung jury on the strongest charges, but on other charges, he was convicted.
YANNIK SCHRADE: He was convicted on one charge, I think it’s called conspiracy to operate a money-transmitting business without a license, essentially conspiracy to run a bank without a banking license. So they actually put him in jail. It is a one-year jail sentence on that charge. But he’s currently in the process of appealing it.
So Roman Storm didn’t run a bank. He didn’t create a bank; he created software. He made use of his inherent right to freedom of speech to build something that enables others to exercise theirs, because that is, at the end of the day, the freedom of economic interaction. That is what he helped others protect for themselves. He never processed a transaction for anyone. He’s not an intermediary. He specifically built technology that is disintermediated, where you yourself use the software.
TUCKER CARLSON: And so the remarkable thing is I pay some attention. Obviously not enough. I was not aware of this story until I was reading up on you. Where’s all the coverage on Roman Storm? He doesn’t even have a Wikipedia page, I’ve noticed.
YANNIK SCHRADE: So there are, I think, incredible institutions like the Electronic Frontier Foundation, the EFF, and the DeFi Education Fund, but also companies like Coinbase, who have actually invested a substantial amount of money into defending Roman Storm. And, yeah, Alexey Pertsev as well. I think Pertsev also doesn’t get enough attention. I mean, he’s now under house arrest in the Netherlands and preparing to appeal his conviction.
The Russian Connection to Privacy Technology
TUCKER CARLSON: Why are so many of the players in this Russian?
YANNIK SCHRADE: I think it really boils down to them having a deep understanding, maybe historically, maybe culturally, of the importance of privacy in a society for upholding freedom, and it’s a shame how they came by that understanding.
TUCKER CARLSON: Well, they’ve suffered for that knowledge for 70 years or more. So, yeah, it’s very striking. It’s 140 million people, a relatively small country, and yet they are way overrepresented, from Pavel Durov on down, for sure.
YANNIK SCHRADE: Yeah, that is true. And I think it’s interesting how all of us take it for granted that these people step outside their everyday lives and put a target on their backs by shipping this technology to enable you to gain privacy. And simply the knowledge that bad actors exist in the world has made them victims and put them in jail, which is insane.
TUCKER CARLSON: Well, I mean, it’s something the rest of us should push back against, I think. But the hurdle for me is not knowing. Again, I didn’t even know this was happening. I should have guessed. So could you be more precise about what you think the real motive was behind going after Tornado Cash and Roman Storm? Like, why was the US Government prosecuting Roman Storm instead of prosecuting drug cartels?
The Two Possible Futures of Money
YANNIK SCHRADE: So, I think that took place under the previous administration. I think President Trump and his administration have done tremendous work in pushing the adoption of decentralized technology, in really allowing all of the people in that space to rethink the financial system and build this technology, because they’ve realized that technological innovation runs at a faster pace than legislative processes. Under the previous administration, that looked different. So I think that has helped this technology spread a lot.
It is, however, important to consider privacy. When the executive order banning CBDCs, Central Bank Digital Currencies, was signed, an explicit reason given for why CBDCs should never be adopted in the United States was the privacy concern. Because if we look at all of those shiny new digital currencies being built in Europe and all around the world, besides the U.S., which is great, which is actually amazing, all of them are surveillance machines to an even higher degree than the current financial system, which is already a surveillance system.
But what’s so important about this next generation of money is that we are at a crossroads. Do we want our money to enable freedom, freedom of economic interaction, and at the end of the day freedom of thought, because we put our money where our mouth is? Or do we want a monetary system that enables automatic consequences based on whatever activity you perform in your digital life?
Which can mean things like: now all of your money is frozen and you don’t have access to it anymore, because whatever you just did was deemed undesirable by Big Brother, I guess. Those are literally the two possible futures we have. Two extremes, with no possible future in between. And what the architects of this…
TUCKER CARLSON: So you’re assuming cash is over.
YANNIK SCHRADE: Cash is already being heavily surveilled too. Your banknote has a serial number. If you think about something like Tornado Cash, or the many applications that, for example, utilize Arcium to bring this level of privacy, those systems are, in my mind, even superior to cash, as long as you have an Internet connection; if you don’t have an Internet connection, maybe you cannot spend your money right now. Because you don’t have any serial numbers anymore, right?
The Surveillance of Physical Cash
TUCKER CARLSON: Wait, so you say cash is being surveilled?
YANNIK SCHRADE: Sure. I mean, when I go to the ATM and withdraw money, the serial numbers are recorded in some database. And when a merchant at Walmart, I guess, or wherever, puts that into their cash register, they can also record the serial number.
TUCKER CARLSON: Is that true?
YANNIK SCHRADE: Yeah, I read an article a few months ago about a tracking system like that in Europe. So this is very much done in practice.
TUCKER CARLSON: I’m going to take a magic marker, a pen, and distort the serial numbers on all my cash now.
YANNIK SCHRADE: Yeah. So, I mean, it should still be legal tender, right?
TUCKER CARLSON: I would think so, yeah. I’d never heard of that.
The Power of Code and Surveillance
YANNIK SCHRADE: I mean, there could be other tracking mechanisms, I don’t know. But I’ve read about this technology, which clearly exists and is being used to turn even the cash system into a surveillance system. And again, I think all of this is not just someone with governmental authority deciding to surveil people. It is also companies seeing economic value in surveilling you and then utilizing this new technology, utilizing the Internet, to do that.
And it boils down to power, I would say control. If you have access to as much information as possible, you can better prepare for the future and you can predict behaviors of your users or different actors. And so that’s why those systems get implemented.
So we are at this fork in the path toward the future. And what the people architecting those central bank digital currency systems have realized, and that’s so interesting to me, is this old concept that the cypherpunks in the 1990s came up with, “code is law,” which I think expresses nicely what happened with Tornado Cash, where it is the ultimate law.
When you have a network that nobody controls, and there’s some piece of software on it, it just executes whatever is written in that software. The code executes. There’s no way of stopping it. There’s no way of doing anything about it. And so that’s what I mean when I say code is law.
And the architects of those alternative systems have realized that there’s so much power in being able to, let’s say, take your chat messages and see that you have said something against Big Brother. And Big Brother doesn’t appreciate that, and so automatically your money is frozen. That is code is law, in the utopian sense and in the dystopian sense, where software can automatically lock you out of all of those systems.
And I would much rather have a utopian future than dystopian future. But at the end of the day, from a technological standpoint, those things are similar. The only difference really is cryptography, privacy.
TUCKER CARLSON: Privacy, because you’re offering that on a scale even larger than anything Tornado Cash or Roman Storm attempted. It has to have occurred to you that whether or not you have prominent investors, you face some risk.
Building Privacy Technology at Scale
YANNIK SCHRADE: Sure. So I think what I’m doing with Arcium, at the end of the day, is providing the most versatile and superior form of this. You can execute a computer program within encryption, you can have many people contribute encrypted data, and you can do all sorts of things, starting with financial transfers. You can add privacy to financial systems.
But that doesn’t just mean we are adding privacy to me and you, Tucker, interacting with each other. We can also add privacy to entire markets.
TUCKER CARLSON: Right.
YANNIK SCHRADE: Which, again, can also have downsides. I’m not arguing that there are only upsides to this technology. There might be actors that utilize it, and I’m not just talking about criminal activity, but unethical activity, in the way people interact. So at its core, it is neutral technology.
But the use cases I’m really focused on enabling are also use cases like enabling the healthcare system to actually utilize data that is currently being stored, but stored in a very inefficient, isolated way. With my technology, we can take this data and use it without ever risking that data being exploited, without ever taking ownership of your data, because you’re the patient, you’re the human. I have no right to take ownership of it. And with this technology, I don’t need to.
Because you can consent and say, let’s improve healthcare or whatever with my data, but you’re not getting my data, because it’s encrypted. I know it’s a crazy concept to wrap your head around, I get that. But it enables so much, also on a national security level, that it is strictly superior technology.
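One standard building block for the kind of computation he’s describing is additive secret sharing, used in multi-party computation. A minimal sketch, assuming a simple three-server setup (Arcium’s actual protocol is far more involved than this): each patient splits a private reading into random-looking shares, no single server ever sees a real value, yet the servers can jointly compute the true aggregate.

```python
import secrets

P = 2**61 - 1  # all arithmetic is done modulo a prime

def share(value: int, n_servers: int = 3) -> list[int]:
    """Split `value` into n random shares that sum to it mod P."""
    shares = [secrets.randbelow(P) for _ in range(n_servers - 1)]
    shares.append((value - sum(shares)) % P)
    return shares

# Three patients, each with a private blood-pressure reading.
readings = [118, 135, 142]
per_server = [[], [], []]
for r in readings:
    for server, s in zip(per_server, share(r)):
        server.append(s)  # each server receives one meaningless share

# Each server sums its shares locally (it sees only random numbers),
# then the partial sums are combined to reveal ONLY the total.
partials = [sum(s) % P for s in per_server]
total = sum(partials) % P
print(total == sum(readings))  # True: 395, with no reading revealed
```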
And I think this example I told you earlier about verifiability, being mathematically convinced that a computation executed in privacy has been executed correctly, is such an amazing concept. The way I think about it is really as opening up a new design space altogether, allowing companies to do actual innovation instead of innovating only on how to extract as much value as possible from their users by surveilling them.
So I don’t really think about it the way that you framed it. I’m building this generalized computing platform that can be used by anyone because I don’t have any control over it. I’m not building a controlled infrastructure. I’m building open software that is used for good.
TUCKER CARLSON: And I’m grateful that you are. And I don’t at all mean to make you pessimistic or paranoid, but in so doing, you’re threatening current stakeholders.
Disruption and Risk
YANNIK SCHRADE: Sure, but I think that’s always the case with new technology. I mean, when cars first came along, there were unions of horse-drawn carriage and taxi providers. They did not want to see cars on the road.
TUCKER CARLSON: Of course.
YANNIK SCHRADE: So there are always interests that try to utilize both technology and law to prevent others from competing and to keep a monopoly in place.
TUCKER CARLSON: Of course, the stakes depend entirely on how disruptive the new technology is. Ask Nikola Tesla. So it’s not a concern?
YANNIK SCHRADE: It is not a concern for me, no.
TUCKER CARLSON: I wonder if that’s just a quirk of your personality where you’re just not afraid of stuff.
YANNIK SCHRADE: That’s actually an issue. I would say I suffer from sometimes not being afraid of things, but I think it’s…
TUCKER CARLSON: I think you need that in order to proceed. So from the perspective of the average American consumer who’s not following this carefully, when does your life begin to look different as a result of this kind of technology? When will you see this sort of thing in action? How will you experience it?
YANNIK SCHRADE: That’s actually a brilliant question. I’m just trying to run numbers in my head here and predict the future.
TUCKER CARLSON: That’s something I’ve never done, by the way. I’ve never paused in mid conversation that I’ve got to run some numbers in my head.
YANNIK SCHRADE: I do this all the time.
TUCKER CARLSON: I never have.
The Future of Privacy Technology
YANNIK SCHRADE: So, I think it will affect your everyday life positively once, I guess, an inflection point is reached on multiple fronts. I was talking about healthcare, national security, also the financial system. But, I mean, there’s a criticism I actually have of Signal, which is that there exists a single point of failure within Signal’s technology stack that I’ve been vocal about and that I dislike: what they call private contact discovery. I have a set of contacts on my phone. You do the same.
And if there is an intersection between the two sets we have, where I have you as a contact and you have me as a contact, I get Tucker suggested on Signal, and only in that case. How does that work? How does Signal ensure that those contacts stay encrypted and secure? They use trusted hardware for that, and that is a critical flaw in their infrastructure.
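To see why this is genuinely hard to do without trusting someone, consider the naive approach below: compare hashed phone numbers. As the last loop shows, phone numbers live in such a small space that the hashes are trivially reversible, which is exactly the problem that pushed Signal toward trusted hardware. A toy sketch with made-up numbers:

```python
import hashlib

def h(number: str) -> str:
    return hashlib.sha256(number.encode()).hexdigest()

# Naive "private" contact discovery: both sides upload hashes,
# and the server intersects the two sets.
tucker_contacts = {h("+15550001111"), h("+15550002222")}
yannik_contacts = {h("+15550002222"), h("+15550003333")}
mutual = tucker_contacts & yannik_contacts
print(len(mutual), "mutual contact(s) found")

# Why this fails: the phone-number space is tiny, so the server can
# precompute the hash of every possible number and invert them all.
rainbow = {h(f"+1555000{i:04d}"): f"+1555000{i:04d}" for i in range(10_000)}
for digest in mutual:
    print("server recovers:", rainbow[digest])
```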
So there’s this technology, trusted execution environments they’re called, manufactured by Intel, for example. And it comes with the promise of being secure, of basically doing what we are doing with mathematics, but with trust instead. They say: we built a secure machine.
TUCKER CARLSON: You think we shouldn’t trust Intel?
YANNIK SCHRADE: I think so, yes.
TUCKER CARLSON: I think they’re required to trust Intel.
YANNIK SCHRADE: Yeah. I think it’s an insane idea to begin with. And it’s been funny: not just last year, but over the last 10 years, there have been a myriad of exploits of this technology. It has always been sold as: here’s this technology, it does verifiability and privacy, just put your data in it. There’s no backdoor. Of course not. Why would there be a backdoor?
TUCKER CARLSON: Why would Intel cooperate with anyone?
The Problem with Trusted Hardware
YANNIK SCHRADE: Sure, right. And you would do that. And then last year there were researchers who said, well, if you have physical access to this computer, you can just read out all of the data. And not only can you read out all of the data, you can fake keys and then perform fake computations on behalf of other people.
So if you’re building a financial system with a computer like this, I can just change numbers: I know what your numbers are, and I can change them. And that’s not even the core issue I have in the case of Signal. Signal is, I think, still relying on that hardware. I mean, I hope they run the hardware themselves, because at least then I have a little remaining trust assumption that, okay, they will not try to hack those machines, which is relatively straightforward. You just connect a few cables, at the end of the day, and then you can exfiltrate the information, which is the interactions: Is Tucker my contact? Is Yannik Tucker’s contact? That’s very sensitive information.
And so that is a single point of failure, where they, or whoever gains access, could get at that information. And we’re not even thinking yet about potential backdoors within that hardware, within the manufacturing process. I mean, I think it would be very naive to assume that there is no backdoor, similar to what we talked about earlier with Dual EC, or something like the Clipper Chip that was attempted in the 90s.
So it’s very likely, I would say, that there’s some randomness tampering, let’s call it that, in place, because you are literally also getting keys from the manufacturing process. It’s this proprietary supply chain: they ship the computer to you, and it comes with “random” keys that were generated on that proprietary production line.
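A hypothetical illustration of why tampered randomness is fatal: if a device only ever draws its keys from a seed space the manufacturer can enumerate (an absurdly small 16-bit space below, purely for demonstration), anyone in on the trick recovers every key by brute force, no matter how strong the surrounding cryptography is.

```python
import hashlib

def keygen(seed: int) -> bytes:
    """Derive a 256-bit key from a seed; the derivation itself is sound."""
    return hashlib.sha256(seed.to_bytes(4, "big")).digest()

# A backdoored device that secretly limits seeds to 2**16 values
# still produces keys that LOOK like strong 256-bit keys...
device_key = keygen(seed=51966)

# ...but whoever knows the restricted seed space recovers them instantly.
for guess in range(2**16):
    if keygen(guess) == device_key:
        print("key recovered from seed", guess)
        break
```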
So there are many single points of failure. And that’s what I don’t like about Signal, because I don’t want this information, what my address book looks like, out there. They can fix that. They can fix it with technology that we’ve built. They can use our technology; I’m more than happy to just give it to them. I mean, it’s open source. Then they could build this thing without a single point of failure, and without giving a state a reasonable way to say: well, you actually have this data, give us this data.
Right now they cannot really argue that they don’t have that data, because they could connect a few cables to that computer and get it. So it’s not the secure device people claimed it was in the past. I think that is important to resolve. I actually don’t recall how I got onto that tangent.
The Challenge of Secure Hardware
TUCKER CARLSON: I wonder if any big hardware manufacturer will begin to offer truly secure devices for sale. It’s not worth it, probably, right?
YANNIK SCHRADE: So I think it is worth it. You as a military want to have secure devices. Everyone, I think everyone would rather compute on a secure device than an insecure device.
TUCKER CARLSON: But the manufacturers aren’t making their money from the devices. I mean, they’re making money. I don’t know what it costs to make an iPhone, less than 900 bucks. But it’s an annuity: the second you buy an iPhone, you’re making money for the company every day you use it, right?
YANNIK SCHRADE: Sure, sure. But I think it is impossible to build secure hardware in that regard, where those claims of full privacy and security are factually true. That is impossible. There are so many techniques and tools for playing around with those devices that it is literally impossible to implement secure and verifiable systems, because even to verify them, you need to take them apart, destroying them in the process.
So that does not exist. What does exist, however, is this concept of decentralization, and here’s why it’s so powerful: it doesn’t really matter if this manufacturer here creates a backdoor, as long as I have 10 or 100 computers from different manufacturers and there is one that does not have a full system-level backdoor. I am secure under the trust model that we’ve developed at our company. That’s why decentralization is so important.
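The trust model he’s gesturing at can be illustrated with the simplest possible scheme, XOR secret splitting: spread a secret across machines from different manufacturers, and every single one of them must be compromised before anything leaks. A sketch (real deployments use richer threshold schemes than this):

```python
import secrets

def xor_all(chunks: list[bytes]) -> bytes:
    """XOR byte strings of equal length together."""
    out = bytes(len(chunks[0]))
    for c in chunks:
        out = bytes(a ^ b for a, b in zip(out, c))
    return out

def split(secret: bytes, n: int) -> list[bytes]:
    """Split into n shares; any n-1 of them look like pure noise."""
    shares = [secrets.token_bytes(len(secret)) for _ in range(n - 1)]
    shares.append(bytes(a ^ b for a, b in zip(secret, xor_all(shares))))
    return shares

secret = b"launch codes"
shares = split(secret, n=4)           # one share per manufacturer
assert xor_all(shares) == secret      # all four together: recovered
assert xor_all(shares[:3]) != secret  # any three: random noise
```

One honest machine that keeps its share to itself keeps the whole secret safe, no matter how backdoored the other boxes are.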
TUCKER CARLSON: That was the basis of our political system when it was created. That same concept, power is dangerous and so it has to be spread among different holders, different entities, so it doesn’t concentrate and kill everybody and enslave them. That’s obviously going away. But that was the concept of the American Republic.
The Power and Danger of Surveillance Infrastructure
YANNIK SCHRADE: Yeah, exactly. And I think it’s important to look at surveillance in the same way: if you have access to surveillance, you basically have access to unlimited power. So consider whatever surveillance system we implement, be it chat control in the European Union, which I’ve been very vocal in opposing on X. And I actually just learned last week that the UK implemented its version of chat control on 8 January, a censorship machine and surveillance vector installed within all of your messaging applications.
And it comes with this claim of, “Well, we are implementing this because we need to fight child exploitation.” Right. It’s always child exploitation.
TUCKER CARLSON: They care about their children.
YANNIK SCHRADE: Yeah, I strongly believe that. So there are basically four reasons given for implementing surveillance: child exploitation, terrorism, money laundering, and the war on drugs.
TUCKER CARLSON: Oh, war on drugs.
YANNIK SCHRADE: Those are the four reasons. Right. And they always rotate.
TUCKER CARLSON: The people engaged in importing drugs into our country, laundering the money, exploiting the children and committing serial acts of terror against their own population, they’re all very concerned.
YANNIK SCHRADE: Oh, man. I really think we now need surveillance.
TUCKER CARLSON: Now that you say it, not of us.
YANNIK SCHRADE: So what’s so funny is that in 1999, there was a transcript of the discussions of some policing working group of the European Commission, and literally within the transcript, when they were talking about implementing digital surveillance systems, they said something like: I think we should switch our arguments over to child exploitation, because that is more emotionally charged. That convinces people. So it’s not just us; it is obvious that that’s not what’s going on. Right.
TUCKER CARLSON: When the people who covered up the grooming gangs are making that case, it’s like, I don’t think it’s sincere at this point.
Why Surveillance Infrastructure Will Always Be Abused
YANNIK SCHRADE: Exactly right. So there is a reason why we don’t believe that’s the actual reason. But what I’m arguing is that it doesn’t even matter, even if the politicians are convinced it’s about protecting the children and that surveilling all of the chats is the most effective measure to do that.
What’s going to happen, because this is implemented as infrastructure that exists everywhere, and because there’s a small circle of people with access to it, is that it will get abused. It is very easy to abuse those systems, because the abuse itself happens in secrecy. So there’s no oversight, of course.
TUCKER CARLSON: And instantaneously, because of the rising computational power. It’s not like someone has to go to the Stasi archives to read all the files. It’s like…
YANNIK SCHRADE: And Sam Altman will gladly help you to sift through all of your data.
TUCKER CARLSON: Oh, he’s a good guy. By the way, a lot of these businesses draw the worst. Like, the most unethical people have the most power, in case you haven’t noticed. It’s wild.
YANNIK SCHRADE: It is wild. Yeah. I mean, there’s an economic function that sort of rewards this. Because if I build an application and you build an application, and we just provide some value to our users and the users pay for it, that’s basically capitalism. All of that works out nicely.
But then you decide: what if I take all of this information from my user and use it to extract additional value from him? You’re way more profitable that way.
TUCKER CARLSON: So the incentives.
YANNIK SCHRADE: And so the incentives shift toward that setup. And those kinds of applications are the ones that receive investment. And so the trend accelerates, and unethical behavior gets rewarded in the system.
The UK’s Online Safety Act
TUCKER CARLSON: Just to be clear about what you’re saying, are you saying that all texts sent within the UK are now monitored by the UK government?
YANNIK SCHRADE: I’m not 100% familiar with all of the intricacies of the Online Safety Act, I think it’s called, in the UK. What is happening there is that censorship is being applied to messages: you receive some unsolicited image, and it gets censored.
What’s important to understand is that censorship is a byproduct of surveillance, generally speaking. You need to look at all messages in order to censor anything. So that’s what’s happening there. And even if we assume only the best of intentions, you now have this infrastructure in place that tomorrow can simply be abused by someone.
TUCKER CARLSON: Well, we should test it. I’m in the UK all the time. I have family there. And I’m going to do a double-blind study with my wife. I’m going to text every person in my contact list, “Overthrow Keir Starmer.”
YANNIK SCHRADE: Okay. Yeah.
TUCKER CARLSON: And to thousands of people, exclamation point. And she won’t. And we’ll see who gets arrested.
YANNIK SCHRADE: Yeah, that’s a great experiment, actually. I need to attend a conference in the UK this year, and it’s funny, because a month ago there was, I think, some proposal that basically specifies that people who work on encryption are sort of persona non grata in the UK. Something like that. I think it’s not yet implemented, but I saw it on X.
TUCKER CARLSON: You can’t get in the country if you’re for privacy?
YANNIK SCHRADE: Something like that, yeah.
Where Can People Escape the Control Grid?
TUCKER CARLSON: Where are we going to, big picture, where is everyone going to end up, do you think? If the control grid snaps into place and it is snapping into place, where do people go? US? Is that the only place?
YANNIK SCHRADE: So, I mean, we are basically, I would say, not just sliding in that direction but galloping. And it has been quite a while since they started trying to implement those in-your-face things, where you literally call it chat control. I mean, imagine how crazy that is.
Literally stating: every single messaging platform, email, whatever, we need to scan it for this made-up reason. But trust us, we will only do that for this made-up reason and no other reason. And it happens on your device, so supposedly end-to-end encryption is not undermined, because the scanning happens on your device. Right.
TUCKER CARLSON: And that’s very different from putting microphones in your bedroom. Trust us, very, very different.
The Extent of Modern Surveillance
YANNIK SCHRADE: Yes. Yeah. I mean, I think people don’t realize the extent to which surveillance is possible nowadays. With Wi-Fi routers, you can determine movements within your apartment. And there wasn’t even a big scandal about this one case. I don’t know if you’re familiar with him, Louis Rossmann, I think he’s called, a YouTuber from New York who has been fighting for the right to repair devices. He’s always been educating people about those efforts.
And he made this video where he went through the privacy policy of some Internet service provider, and the privacy policy explicitly stated that they’re allowed to monetize the movement data they get from those devices you put in your home. And the funny thing about the case he was highlighting is that, as a person living in that building, you didn’t even have the option to choose a different Internet service provider, because with, I guess, bulk agreements between the landlord and the Internet service provider, you are forced to have those routers, and the routers aren’t even inside your apartment, they’re in the walls or somewhere.
And so you’re just being scanned within your most intimate area of life, your home, by your Internet service provider.
TUCKER CARLSON: And what about phones? Listening to people, the microphone on the phone or the camera on the phone taping you?
Ultrasound Tracking and Phone Surveillance
YANNIK SCHRADE: So there’s this interesting concept of ultrasound listening on those phones. Basically, you have a TV advertisement, and we don’t hear ultrasound, but your phone, with its microphone, can. I don’t know if it’s strictly ultrasound or whatever frequency. So within that advertisement, they play a sound your phone can pick up.
And then, when you go to their fast food restaurant on the same day, they know the advertisement worked, because your phone previously registered it. There have been a lot of attempts like this; it surfaced a couple of years ago. I don’t recall what the technology was called, but there were actually court cases against it, where a company offering the technology was required to make users aware that this was happening.
Because a lot of apps had this technology installed: they had microphone permissions, and they bundled this library, because maybe the library pays the app developer some money. And at the end of the day, it is tracking you. What I’m trying to say is that there’s an almost infinite number of ways you can be tracked.
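The beacon mechanism he’s describing, often reported under the name “ultrasonic cross-device tracking,” is simple enough to sketch: embed a near-ultrasonic tone that a TV speaker can emit and most adult ears can’t hear, and any app with microphone permission can detect it with a single Fourier transform. The 18.5 kHz carrier and the threshold below are made-up values for illustration:

```python
import numpy as np

RATE = 44_100    # a standard microphone sample rate, in Hz
BEACON = 18_500  # near-ultrasonic tone most adults can't hear

# The "TV ad": one second of normal audio with a faint beacon mixed in.
t = np.arange(RATE) / RATE
audible = np.sin(2 * np.pi * 440 * t)           # the program audio
beacon = 0.02 * np.sin(2 * np.pi * BEACON * t)  # the tracking tone
recording = audible + beacon                    # what the mic hears

# The "app": one FFT over a second of microphone audio is enough.
spectrum = np.abs(np.fft.rfft(recording))
freqs = np.fft.rfftfreq(len(recording), d=1 / RATE)
band = (freqs > 18_000) & (freqs < 19_000)
if spectrum[band].max() > 50:  # crude detection threshold
    print("beacon heard: report 'user saw the ad' to the tracker")
```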
I mean, just last year in the US, there were cases surfacing around city surveillance cameras. Around 40,000 of these exist in the US, I think, those cameras and also license plate readers. All of them are incredibly smart, equipped with artificial intelligence to directly track human faces.
And there was this one YouTuber, Benn Jordan, who actually exposed that, and funnily enough, after exposing it, had private investigators from said company show up at his home to, I guess, fully destroy his privacy. But he helped expose that none of these cameras were encrypted. They were recording cities across the US permanently, 24/7, storing it all, everything being mass surveilled, while anyone could, via a Google search with a specific query, get access to the camera feeds and see what was going on.
And he showed videos of playgrounds where children were playing. Right. And so that’s what I mean when I say that surveillance does not bring us safety or security. It is in most cases doing the opposite.
Networked Surveillance and Facial Recognition
TUCKER CARLSON: It’s also all networked. It’s digital and it’s networked. So that means that companies can pull up CCTV cameras from around the world.
YANNIK SCHRADE: Oh yeah. Anyone can.
TUCKER CARLSON: Facial recognition?
YANNIK SCHRADE: Yeah, anyone can. And what I found so striking about this story is him outlining how he was able to follow people around. He could say, “Oh yeah, they went to church here on Sunday and then they went there for shopping.” That is insane. Right.
And, I don’t know, there was this one video of an adult man going onto a completely empty playground, hopping onto the swing, and just swinging there. If that person had known he was being watched, he would never have done that. And so this idea of escape is entirely impossible in a world like this.
TUCKER CARLSON: Because there is no escape.
YANNIK SCHRADE: There’s no escape.
TUCKER CARLSON: Yeah.
License Plate Readers and Abuse of Surveillance
YANNIK SCHRADE: Also with license plate readers, which aren’t license plate readers, they are surveillance cameras that pretend to only do a specific function.
TUCKER CARLSON: What other functions do they do?
YANNIK SCHRADE: I mean, they record everything and can track cars even if they don’t have a license plate. You cannot be trusted as a license plate reader if one of your capabilities is identifying cars that don’t have a license plate. Right.
TUCKER CARLSON: Fair.
YANNIK SCHRADE: So I recall one case where a police officer used his access to this technology to stalk his ex-girlfriend, which is inevitable with this kind of technology. You put that power into the hands of individuals who can use it in secrecy. It’s not like dropping a nuclear bomb on a country, where people will notice. Mass surveillance, nobody notices.
Contact Information
TUCKER CARLSON: Can you, so people have made it two hours into this interview. They’re obviously interested in you. First, can you pronounce and spell your name?
YANNIK SCHRADE: Yannik Schrade. Y-A-N-N-I-K, S-C-H-R-A-D-E.
TUCKER CARLSON: The name of your company and its spelling?
YANNIK SCHRADE: Arcium. A-R-C-I-U-M.
TUCKER CARLSON: How do you speak English as fluently as you do since it’s your second language?
YANNIK SCHRADE: I would say it’s funny because as a child, when I was in high school, there were phases because I was consuming so much English content on the Internet that I was consciously thinking in English as a child.
TUCKER CARLSON: Yeah, I would say so. You’re on Twitter. Where else can people go to read your views on technology and privacy?
YANNIK SCHRADE: Mainly on my Twitter, Yrtrade. And I also have a small website, just my personal website, I guess. I don’t have a blog there; I write all of my articles basically on Twitter. Sometimes I get the chance to publish my views in some very niche news outlets in Germany. But most news outlets don’t really care about privacy, so I stick with X. And I really like talking on X, sharing my thoughts on X, writing articles there.
I talked about chat control specifically on X, and it’s funny, we haven’t even touched on how chat control is aimed to be implemented in the European Union with the current proposal. What happened is that there was this proposal saying all providers need to have chat control, which is so-called client-side scanning: Tucker’s phone is going to check the message Tucker is sending right now, and if that message is illicit under some definition, it’s going to send a report to the police. That is what client-side scanning is.
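Mechanically, client-side scanning is as simple as it sounds. Below is a toy sketch of the flow he describes; the real proposals use perceptual image hashes or classifiers rather than exact text hashes, but the architecture is the same. Note that the check runs before encryption, which is how “end-to-end encryption” stays technically intact while every message is inspected. (The blocklisted phrase is borrowed from Tucker’s joke above.)

```python
import hashlib

def digest(message: str) -> str:
    return hashlib.sha256(message.encode()).hexdigest()

# A blocklist pushed to every phone, contents chosen by... someone.
BLOCKLIST = {digest("Overthrow Keir Starmer")}

def encrypt(message: str) -> str:
    return f"<ciphertext of {len(message)} chars>"  # stand-in for E2E

def report_to_authorities(d: str) -> None:
    print("silently forwarding match", d[:12], "to the police")

def send(message: str) -> str:
    if digest(message) in BLOCKLIST:  # the scan runs BEFORE encryption
        report_to_authorities(digest(message))
        return "[message blocked]"
    return encrypt(message)           # end-to-end encryption "intact"

print(send("hello"))
print(send("Overthrow Keir Starmer"))
```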
And in its most innocent form, it would just be: we’re going to censor the message, because of child exploitation or whatever made-up reason. In the worst case: we’re going to forward that message. And that’s what their law was. It received a lot of backlash, also thanks to Elon Musk, and didn’t pass.
And then as you would expect, shortly after, I think it was less than a month, they came back with a new proposal and the new proposal made it voluntary. So the new proposal basically states, hey, Mark Zuckerberg, do you want to voluntarily add a surveillance mechanism to your applications? Which is insane, right? Because of course companies will voluntarily implement those surveillance mechanisms.
But if you go through the different paragraphs of that proposal, what you realize is that it is in fact not voluntary. In order to combat child exploitation, terrorism, money laundering, yes, yes, they’re going to introduce a new bureaucratic agency tasked with risk-assessing the different platforms.
So we’re going to look at Signal, we’re going to look at WhatsApp, we’re going to look at Gmail, every single platform, we’re going to risk-assess it: how risky is that platform? If it’s risky, then coercive measures apply, and the platform needs to implement all measures to combat whatever illicit activity is targeted, which in the case of child exploitation explicitly means, because it’s the only thing you can do, scanning those messages.
And so it is not voluntary after all, because it explicitly says: if you don’t want to land in the high-risk category, just voluntarily scan, and then you’re not in that category.
TUCKER CARLSON: In the US that’s called extortion. You don’t have to give me your money, but I’ll shoot you.
YANNIK SCHRADE: Yeah. But feel free to not give me your money.
TUCKER CARLSON: It’s your choice.
YANNIK SCHRADE: Yeah.
Looking Ahead
TUCKER CARLSON: Last question. You’re 25 years old, which is remarkable. Where do you imagine you’ll be at 45?
YANNIK SCHRADE: At 45? You mean, what will I be doing?
TUCKER CARLSON: What will the world look like?
YANNIK SCHRADE: What will the world look like? I’m a very optimistic person. So while there are those two trajectories, and I think not just the United States but humanity in general will take one of them, I strongly believe that we will be able to move in the utopian direction instead of the dystopian one.
And so what that means for what I need to achieve is that I can’t just tell people about the importance of this. People sort of know that privacy is important. I think most of your audience realizes that; otherwise, I feel like they wouldn’t be listening to you. So it is, of course, about education and such, but more importantly, and this is the core realization I’ve had, privacy is only going to get adopted if it enables strictly superior technology.
And so that’s what I’m doing. That’s the mission. That’s what I’m doing with Arcium: enabling a situation in which you sort of have to adopt it, because it would be foolish not to. And so that’s what I’m trying to do. And I think we can end up in a world like this where—
TUCKER CARLSON: Because that’s what it needs. You’re exactly right. It’s not enough to say we’re not fully human without it. The board of directors is going to say, well, yeah, but look at the returns.
YANNIK SCHRADE: Exactly.
TUCKER CARLSON: Right.
YANNIK SCHRADE: Yeah.
TUCKER CARLSON: I can’t thank you enough. If our viewers knew how this interview came about, I don’t think they would believe it. So I’m not even going to say how this interview came about, but it was through a series of chance encounters that just really felt like the hand of God said so. Thank you very much for doing this, Yannik.
YANNIK SCHRADE: Thanks for having me, Tucker. I appreciate it.