Henry Bauer
Kyrie Irving is an amazing basketball player who made quite a stir earlier this year regarding the flat earth theory.

Ben Nichols, this is a shout-out to you. He came to us with this super conspiracy theory. He said, "The earth is flat."

No, the earth is flat.

Oh here we go.

No the earth is flat.

Now, don't get me wrong, there's no way a bright, Duke University-educated guy like Kyrie Irving should fall for something anyone can debunk with a United Airlines timetable and some common sense. But take a minute to understand how he formed his opinion.

Hopefully they'll either back my belief or they'll throw it in the water. I think it's interesting for people to find out on their own.

You've seen pictures of the planet though, right? Like it's a circle?

I've seen a lot of things that have been... and my educational system has said that was real and it turned out to be completely fake. I don't mind, I don't mind going against the grain in terms of my thoughts and what I believe.

And with that, you might begin to understand why today's guest, Dr. Henry Bauer, might cut Kyrie some slack.

The popular view of science has not caught up with the present situation where people should be as skeptical about what official science says as they are about what the experts say about any other aspect of society.

Now, like I said, the Kyrie Irving flat earth thing caused quite a stir.

Did you know that the earth is flat?

Kyrie Irving is unfortunately helping to spread it, the conspiracy...

Here's a meme that was played over and over again by the mainstream science media.

Middle school teacher, Nick Gurol, said that his students, thanks to Irving, staunchly believe that the world is flat. Gurol remarked, "And immediately I started to panic. How have I failed these kids so badly...?"

Yes, panic indeed. How else should a science teacher respond to students who don't just, well, believe everything they're told?

"It's definitely hard for me because it feels like science isn't real to them." The educator said that he has tried to get the students to understand that the world is indeed round, to no avail. "The influence of Irving was just too strong."

I did ask Henry about this:

Listen to the interview here.

Read Excerpts:


Alex Tsakiris: Henry, welcome back to Skeptiko. Thanks so much for joining me.

Henry Bauer: Thank you Alex, it's good to be here.

Alex Tsakiris: So, I briefly sketched out your background, but tell folks more about who you are and what led you to write this book, which may sound a little bit like the last book about dogmatic science, but has some interesting new twists to it as well.

Henry Bauer: Right, thanks. I was turned on to science in high school and I studied chemistry and I taught chemistry and did research in chemistry [he is emeritus professor of chemistry and science studies, and emeritus dean of the College of Arts and Sciences at Virginia Polytechnic Institute and State University, "Virginia Tech"]

Alex Tsakiris: I asked you for a synopsis of Science Is Not What You Think, your latest book, and you gave me what I think is a pretty good one. Here's how I heard it.

You said: "Science is pervasive in our society, but it's also widely misunderstood, in terms of how science works and this has led the manipulation of public policy."

So, I was hoping you could kind of elaborate on that, kind of encapsulating where you're really coming from.

Henry Bauer: Exactly, thank you. The thing is that even as science is accepted as being the hallmark of a true understanding, the fact of the matter is, the history of science shows quite clearly that scientific consensus, at any point in time, can't be accepted as absolutely correct.

Alex Tsakiris: Early on in the book you identify some mistaken beliefs about science, that I really think get to the heart of what you're trying to convey. Let's start with this one, "Science is not skeptical."

Now, that's one that's going to catch a lot of people off guard. Tell folks what you mean by that.

Henry Bauer: I saw this firsthand when I was doing chemistry. When you're first fooling around and looking into new things, you gather data, you look at stuff, and then you develop a theory. While this is going on, you could maybe say that people are pretty skeptical, in the sense that we argue with one another about what the best explanation is. But once an explanation has been largely accepted within the research community, people are no longer skeptical when they do experiments. They take that theory as correct; they don't question it anymore.

So, in the book I say science has a love/hate relationship with the evidence. We love it as we are gathering it, and then afterwards we hate any evidence that goes against the theory that we've accepted.

Alex Tsakiris: That's well said. The other way I kind of took it from the book is that science isn't particularly any more skeptical than any other field. I mean, they're skeptical when it suits them, and they're not skeptical when it suits them not to be. So this idea that science is, at every turn, questioning every aspect of every experiment and every conclusion is a mistaken belief that leads to this overconfidence that we have in science.

Henry Bauer: Yeah, exactly. And really, science is done by human beings, and human beings are not inherently skeptical all the time, so science reflects human characteristics and social characteristics.

Alex Tsakiris: Here's another one of those mistaken beliefs from your book, and I think it's related to the first one, "Science is self-correcting." And you hear this all the time, when you pin somebody down and you really say, "Look, clearly we've gone off the rails in this direction, the data..."

I've run into this time and time again with parapsychology and psi experiments where I go, "Okay, the guy repeated it. He repeated it 9 times. Every different lab. 27 labs now across the world." Well, you know what, if it's true it will correct itself because science is self-correcting. Do you want to elaborate on how that is not really that meaningful?

Henry Bauer: I love that one because on the one hand we have the scientific method, which supposedly says that we don't accept theories until the evidence has proved them. On the other hand we say, science is self-correcting. Well, if it was right the first time, why does it need self-correction? And if it is self-correcting, that is an admission that scientific consensus, at any particular time, can't be given too much credence.

Alex Tsakiris: You know, that reminds me of the other extreme of that, which I always bring up because it always just amazes me when I hear it, and that's the old 'extraordinary claims require extraordinary proof.' I always want to go, "Do you realize how completely unscientific that saying is?" This idea that there should be this meta body that decides what's extraordinary in terms of claims, and what's extraordinary in terms of proof, completely undermines the whole purpose of science, which was to remove these biases. Do you have any thoughts on that?

Henry Bauer: Well, I mean, extraordinary is a matter of judgement isn't it? So it's a meaningless statement to start with.

Alex Tsakiris: Exactly, but the fact that it resonates with people, I think, gets back to your point and the point of the book, Science Is Not What You Think: people think that's what science is supposed to do, kind of police thought and regulate what's extraordinary and what's not. What I hear you saying is that the real purpose of science is something completely different than that. It's to find a way that we can navigate, in as unbiased a way as possible, these new areas of understanding and these new aspects of our environment that we find, and we can't do that if we let all these biases in.

Henry Bauer: Well, that's right, and one of the main points I make in the book is how drastically scientific activity has changed, in particular around the time of the Second World War, mid-twentieth century. I think up until then, it wouldn't have been all that wrong to say that science is a fairly independent activity that tries to discover or get the best understanding of how the world works. But the Second World War, the atomic bomb, and all the other things like radar started this era in which science is involved in just about everything. I mean, medicine is said to be scientifically based, or seeks to be scientifically based, and there's been an enormous expansion of money going into scientific research. So nowadays, what you find is that the activity of doing research is actually being controlled from the outside; it is no longer independently curious about how the world works, looking into stuff.

Alex Tsakiris: I want to build on that point a little bit by going back to the first part where you started, because I think it's really interesting. We were chatting just a tiny bit before the show about the history of science and how that can be a turnoff to people; it can just turn them off to say, "I'm not really that interested, I'm more interested in what's coming next," and we're all excited about advances and stuff like that. But I love how you've tried to identify the root cause, or some of the root causes, of this problem, in that the problem, as you identify it, is dysfunctional excessive growth. And if that is the problem, I think you've done an awesome job of putting your finger on this whole turn of history that you just alluded to: it's World War II, we're all unbelievably excited and proud that we've conquered the world, radar has been key to keeping England from being completely destroyed, that's the only way they survived, and then this atom bomb thing that these scientists got together and figured out, that actually won the war. This is the narrative that we're all told.

Henry Bauer: Right.

Alex Tsakiris: So we can understand the enthusiasm that people had to say, "Well heck, let's just give all the power and the glory to science and see where it goes." But take us through how that led to this dysfunctional excessive growth that you're talking about, how did that lead to that?

Henry Bauer: A key one was that Vannevar Bush wrote a report to the president about the marvels that science had accomplished during the war, pointing out that if this were applied to society as a whole, think what we could do. That's when the National Science Foundation was founded and when the National Institutes of Health experienced an enormous increase in budget. Federal funds flowed into science, and when I came to the States in the mid-60s, it was wonderful to be a scientist because all you had to do was write a grant proposal, it would definitely be funded, and away you go.

Well, the government was also funding scholarships for people to study science and get into science and they wanted them to take up positions doing research and in the space of about ten years there had been so many science PhDs produced that the demand for research funds began to exceed the supply.

Alex Tsakiris: Now, I really like and I want to point out to people, how you're using these economic market force terms as it applies to science, because it takes us in a different place that makes this so much more understandable. So continue with the story of what happens when we have an oversupply in the market. What happens?

Henry Bauer: You get competition becoming more and more fierce and as competition gets fierce, human beings start to, maybe fudge a little bit, cut corners and so on.

Around 1980 there was a book published by two science journalists, Broad and Wade, Betrayers of the Truth: Fraud and Deceit in the Halls of Science, and they claimed that dishonesty was an inherent part of scientific activity.

Now imagine: traditionally, science had been regarded as about as honest an endeavor as you could ever find and here were people saying the absolute contrary, that fraud and deceit were inherent in it.

The scientific community, and that includes me, did not take kindly to that book, but as it turns out, they had noticed something that was really going on.

Alex Tsakiris: Which we may agree with. I mean, I agree, I'm an environmentalist, I think we have to be very careful with the planet. I'm very concerned with pollution, with overpopulation, with all the things that could significantly reduce our quality of life. I just think the global warming thing is bullshit because the data doesn't support it, that's my thing.

Henry Bauer: Well, I also am very keen to preserve as much of nature as possible, and I think that burning fossil fuels is a very bad idea, not because of the CO2, but because of something much more pernicious: every time you burn coal and oil, you're spreading tiny traces of heavy metals into the environment, mercury amongst them. That stuff doesn't remain in trace amounts in the environment; it gets accumulated in living organisms.

You know, we've had some scares about mercury in fish for example and lead coming from... when we used to add lead additives to gasoline, but these are only examples of what happens inevitably as we burn the fossil fuels.

So, I'm all in favor of getting rid of fossil fuel energy sources as soon as is feasible, but I don't think we should do it by telling lies about the danger coming from carbon dioxide.

Alex Tsakiris: And that leads to a really interesting, important, and difficult point that I wanted to dive into, and that is the science-policy link, which you talk about throughout the book. When you look at how science informs people who make policy decisions, it gets really mucky, cloudy, and in a lot of ways very depressing. Because if you look at the social justice warrior, they've doubled down, they've doubled their problem. One, they don't seem to be able or willing to sort through the scientific data, like we just said with global warming; it's really not that hard. The best measure we have of temperature is these satellites, and that's what everyone relied on, including Al Gore, who stood there in front of us and said, "Hey, here's the best measure we have, it's these satellites, and they're telling us that the temperature is rising," right? And then 20 years later those same satellites told us, "Well, the temperature has leveled off and carbon has continued to rise." So anyone with a 5th grade science understanding says, "Okay, well that theory turned out not to be true. That's okay, theories come and go, but that theory clearly needs some work."

Again, the social justice warrior manages to just, kind of, blind themselves to all that, can't sort through the data. But what I really wanted to get to is the double-down thing: they also aren't willing to think through the policy side of things.

As I mentioned in this interview that I had with Daniel Pinchbeck, here's his policy. His policy idea is that everyone should have a plant-based diet in order to cut down on greenhouse gases. Now, I'm mostly vegetarian, I eat very little meat, but I would never, in a million years, believe that from a policy standpoint, I could talk everyone into turning into a vegetarian or a vegan or that I would make that kind of a policy thing. I'd say that is never going to happen or it's going to happen lifetimes and lifetimes in the future.

So, you know, there's this gap, I think, between not only understanding the science but then understanding how science might inform reasonable policies and when I say 'reasonable', policies that would pass the test of our republican system, democracy, where we vote on stuff.

Henry Bauer: Right and I mean, the most important point really is, how can we make it possible for policy makers to benefit from the best possible scientific advice, objective scientific advice?

A fellow has written a book called The Honest Broker, in which he says the role of science ought to be to present policy makers with the pros and cons, because it's rarely, if ever, totally cut and dried.

I suggest, at the end of my book, that it's time for something like a science court, in which the pros and cons of scientific issues - where there is disagreement - should be publicly argued, so that the general public and the policy makers can see for themselves how much and what sort of evidence there is on the two sides. The present system we have, where dissenting views from the scientific consensus are suppressed and many people don't even know that there is any dissent at all, is very dangerous.

The idea for a science court goes back about 50 years, to the time when people were arguing whether nuclear power reactors could be safe or not. There were experts arguing both sides, and the issue was never resolved to the extent that policy makers could be fairly clear about what the probabilities of the risks were. We need that nowadays, because policy makers are often unaware that there are serious scientific issues with the carbon dioxide hypothesis. People nowadays base their opinions about carbon dioxide not on the science but on their political views.

Alex Tsakiris: But that's by design. I love the idea, I mean, anyone loves the idea because it suggests that we could actually rise above these special interests and other forces that are in play.

You know, I recently had a chance, and I sent you a clip of this, but I recently had a chance to interview Dr. Jacques Vallée, who's best know, of course, for his UFO research but is in fact a very well-rounded scientist, highly respected. I don't want to get into the UFO research; it's a huge topic to try and cover and it's not really your thing, although I know you've looked into it with your work at the Society for Scientific Investigation. But what I wanted to do was point to it as an example of the disinformation and misinformation factor, because it's dark.

It's just clearly well documented that there was an organized, concerted effort to spread disinformation and misinformation about these UFOs. So set aside what you think about UFOs and just look at how science and scientists were handled; this is one of Vallée's points in the interview, the main point really, that scientists were intentionally misled.

Again, I think that is so clearly documented that if anyone looks into it, what I hope they'd ask is the larger question that relates to your book, and I think also maybe challenges this idea of a science court, and that is: if they can do it, and if they're doing it that completely in this area, why would we suspect that they're not doing it in every area that matters, AIDS, medicine, whatever? They're messing with the system in this very covert way, with these refined tactics of misinformation and disinformation, aren't they?

Henry Bauer: Sure, and you know, people in power can find all sorts of rationalizations as to why they suppress other people's points of view. For that matter, the story of Galileo and the Church is often presented as science versus the Church, but actually the Pope didn't like what Galileo was doing because he was challenging authority. If you challenge the authority of our scientific establishment, they will react just as badly as the Pope or the Catholic Church reacted against challenges to its doctrines on everything, not just theological doctrines but what they were saying about how the world works, about the science.

That's what you find nowadays in the same sort of way: the people who are in charge of the National Institutes of Health, for example, are vested, their careers, their lives, their status, their prestige are vested in what the institutes claim is the truth about medical matters, and they will resort to anything, any tactic at all, to maintain that position.

I have a quote somewhere or other from Anthony Fauci and from Robert Gallo telling journalists that they may lose any access to official sources if they don't toe the ideological line of what they're putting out.