I’m going to start with a quote from a dead white guy, Plato. This is a quote in which he says, “The world’s not going to be in good shape until,” get this, “Philosophers become kings, and kings become philosophers.”
Now you might think that as a philosopher I really like that statement, right, because I should be a king. But I’m a huge fan of democracy, huge fan of democracy. Plato, however, was not at all a fan of democracy. It’s not like he thought it was the worst form of government, but he did think it was the second worst form of government. The only thing worse is absolute tyranny where you are ruled by a despot.
But he compared democracy, the democratic state, to a ship at sea. It's crewed by a bunch of sailors. The sailors are fighting with each other, and each sailor has one overriding goal: to somehow get the owner of the ship, that would be the people, to hand over the rudder of the ship to him. Not because he knows anything about navigation, but because he wants to plunder the ship. That's what he thought of politicians in a democracy.
Now, you might think, well, the politicians, yeah, who likes politicians? Plato also did not think much of the people. He compared the people to a beast, moved by appetite and passion rather than reason. And here's what he said about the art of political persuasion in a democracy: "Trying to persuade the people in a democracy is like trying to wrestle and soothe the savage beast." And that's not reason, that's not reasoned discourse, that's just manipulation and persuasion.
Now here’s something I would really like to believe. I would like to believe that Plato got it absolutely wrong about democracy. I would like to believe that, but you know what, I don’t. And neither did our founding fathers, so I’ve got a problem, right? And I ask Plato’s question, “Are human beings fit for self-governance, rational self-governance, either individually or collectively?” He’ll say, “No, look at the human mind, most humans are not fit. Only an elite few are and so democracy is completely unworkable.” Think that’s a terrible argument? I think it’s a troubling argument.
So, I said I want to defend democracy, I'm here to defend democracy, but I'm going to make it harder. Because I'm going to show you modern cognitive science. We're going from Plato, 2,500 years ago, to modern cognitive science. One thing modern cognitive science has done is probe the human mind a great deal. And one of the things it's found is that the human mind is shot through with irrationality, in its belief formation and in its decision making. All kinds of irrationalities.
Here's one kind of irrationality. It's called the Endowment Effect, and it actually has to do with both goods and beliefs. Basically, the idea of an Endowment Effect is that if you have something, or if you believe something, you'll place greater value on the thing you have than on an equivalent thing you could get. Like I said, that applies to both goods and beliefs. I could do a little thought experiment.
Suppose I were to give this half of the room a cool thing, a plaque, a cup, whatever, and I said, "It's yours to keep. You may have it." And suppose I were to say to this half of the room, "I'm going to give you the same cool thing, but only if you're willing to pay the right price." I ask you what price you're willing to pay, and you write down a price. Then I ask you guys what price you're willing to sell your cool thing to them for. You're going to set a price something like twice as high as the price they're willing to pay. Why? Because you've got it. It doesn't just apply to goods, it also applies to beliefs. People fall in love with their beliefs. There's a phenomenon that, because it was discovered by some Canadians – they talk funny – is called perSEVerance of discredited beliefs, rather than perseVERance of discredited beliefs.
So, here I'm going to do a little experiment. I'm going to ask you to form a belief based on some evidence. Do risk-takers or cautious people make better firefighters? I'm going to give this side, Group A, the following evidence: there's John, a risk-taking firefighter. He rushes into a burning building, puts himself at great risk, but in the process saves a family from death. What do you think? Do risk-taking people or cautious people make better firefighters? Group A, you're going to conclude, "Oh, risk-taking people make better firefighters." Because you're going to say something like, "Being a firefighter takes courage, so you've got to be brave."
OK, Group B, I'm going to give you a different story. I'm going to say, "Look, John the cautious firefighter decides it's too risky to allow his unit to enter the burning building," and immediately afterwards it collapses; the whole thing is engulfed in flames and smoke. If he had sent his guys in there, they would have died. Group B says, "Oh, cautious people make better firefighters, because you've got to have good judgment." And here's the problem, though: it's all a setup, and I'm now going to debrief you. I'm going to say, "You know what? I just made that all up. It's all a setup, I was just playing with you. But you know what? Do you think you're going to change your mind about what you believe?" On the basis of this discrediting evidence? No chance. You're going to hold on to what you believe. Even though the evidence on which you based your belief is completely discredited, you still believe it. You think that's rational? No way.

We make bad decisions; we're really averse to loss. We're averse to losing goods, we're averse to losing our cherished beliefs. But that's about beliefs you already had. Suppose I wanted to seek out a belief? Sometimes you have to seek out a belief, you know, decide what to believe. And to decide what to believe, you form a hypothesis and you test it.