Why Don’t Scientists Have More Authority in Government? by Robert Crease at TEDxCERN (Transcript)

There’s a cartoon by Randall Munroe, the xkcd artist, that shows two people speaking and one says to the other, “That person over there believes silly things, like that fossils are fakes, and the world is only 6,000 years old.” And the other person goes, “Not a problem, the Universe doesn’t care what people believe.”

And the first person goes, “But that’s our congressman.” And the second person says, “OK, we have a problem.” I love that joke because we do have a problem, we have congressmen who don’t believe in things like fossils and evolution. But what’s wrong with that? After all, they were elected. I’m going to say what’s wrong with that, and what we can do about it, if anything.

First of all, this is not what I thought 21st-century politics was going to be like. When I was a graduate student in the Humanities in the 1970s – the late 1970s – my professors thundered against what they called the coming technocratic state. "Politicians," they said, "would soon not care about human values but only about efficiency." "Politicians," they said, "would soon not listen to citizens but only to scientists and engineers." If only! Never before have there been so many issues that required so much scientific input to solve.

Issues involving energy, the environment, infectious diseases, pollution, global warming, and so forth. But never before has the required scientific input been so sabotaged, misused, or ignored. Politicians sometimes even view scientists as the enemy. Is that over the top?

A few years ago, a US Congressman, Paul Broun of Georgia, declared that evolution, embryology, and the Big Bang theory were lies straight from the pit of hell, and said that he knew the Universe was only a few thousand years old. And what do you suppose happened to him? He not only got reelected but was put on the House Committee in charge of the United States' Science, Space and Technology program.

How does science denial work? I’m fascinated by stories, both real and fictional, which illustrate the dynamics of the collision between science and social, economic, or religious values. And one of my favorites is in the movie Jaws. Has anyone seen it? A small seaside town that depends for its livelihood on tourism. The day before the first major holiday of the season, a woman’s badly mangled body washes up onshore. A scientist from the Oceanographic Institute, played by the nerdy Richard Dreyfuss, says, “It’s a shark!” The town’s mayor, who is terrified at the prospect of closing the beaches, says, “We have to be reasonable, we have to act in the town’s best interest. It was probably a boating accident.” And, by the way, isn’t Richard Dreyfuss acting in his own self-interest? Isn’t he really interested in getting into the pages of National Geographic?


Now, we in the audience, watching the film, are in a special position. Unlike anyone in the film at that point, we have actually seen the shark. So we know what’s up, and we know whom to believe. But what about the people in the film? What about the people in the town? To them, it seems like just a question of the judgment of one person, Richard Dreyfuss, versus that of another, the town’s mayor.

Now, when science denial happens, it’s really easy – whoops, I forgot to show you my picture of the shark – when science denial happens, it’s really easy to try to find a villain to blame it on. The press, scientific illiteracy, maybe what sociologists call “amoral calculators,” that is, people who know what the right thing to do is but are swayed by political, economic, or religious factors. Or villains, people who know what the good is but don’t do it. But really, it’s a question of authority. Why is the authority of science in government so low? Someone who thought about that an awful lot was Jack Marburger, the former US Presidential Science Adviser.

And Marburger liked to tell the following story. Shortly after the 9/11 terrorist attacks – you might recall – someone sent letters containing deadly anthrax spores to a number of congressmen and to some news agencies; five people died and more were injured. And mail began piling up that might or might not contain anthrax. Marburger was asked to come up with a method to neutralize the anthrax so that the letters could be read. He convened a team of scientists; they did some research, consulted the literature, and came up with a recommendation involving electron beam irradiation.

He turned the method over to the government, and it looked like a triumph of the use of science for the public good. But a funny thing happened: when the method was first tried, it didn’t work. It burned the mail to a crisp. Marburger looked into it and found that the government officials had second-guessed the scientists. They had reasoned that if the scientists said X was the right dose, wouldn’t it be a lot safer to up the dose? To make it 5X or 10X?


And when he had the dose scaled back, the method worked just fine. Marburger called this a relatively benign instance of a potentially disastrous behavior, namely the tendency of government officials to ignore or alter scientific advice. And he had more serious examples, such as the Bush administration’s claim in 2002 that the Iraqi government was looking for a certain kind of aluminium tubes because it wanted to produce nuclear weapons, which scientists said was wrong.

But after Marburger stepped down as science adviser, he began to investigate why science is such a weak force in government circles. He consulted the writings of Max Weber, a German sociologist and historian who is well known for his writings on the nature of authority – the reasons why we obey commands issued by others.
