Recent polls suggest that fewer than 40 percent of Americans believe in Darwin’s theory of evolution, despite it being one of science’s best-established findings. Parents still refuse to vaccinate their children for fear it causes autism, though this link has been consistently disproved. And about 40 percent of Americans believe that the threat of global warming is exaggerated, including many political leaders. In this era of fake news and alternative facts, there is more bunk than ever. But why do people believe in it? And what causes them to embrace such pseudoscientific beliefs and practices?

In this fully revised second edition, noted skeptic Massimo Pigliucci sets out to separate the fact from the fantasy in an entertaining exploration of the nature of science, the borderlands of fringe science, and—borrowing a famous phrase from philosopher Jeremy Bentham—the nonsense on stilts. Presenting case studies on a number of controversial topics, Pigliucci cuts through the ambiguity surrounding science to look more closely at how science is conducted, how it is disseminated, how it is interpreted, and what it means to our society. The result is in many ways a “taxonomy of bunk” that explores the intersection of science and culture at large.

No one—neither the public intellectuals in the culture wars between defenders and detractors of science nor the believers of pseudoscience themselves—is spared Pigliucci’s incisive analysis in this timely reminder of the need to maintain a line between expertise and assumption. Broad in scope and implication, Nonsense on Stilts is a captivating guide for the intelligent citizen who wishes to make up her own mind while navigating the perilous debates that will shape the future of our planet.
Publisher: University of Chicago Press
Edition description: Second Edition
Product dimensions: 6.00(w) x 8.90(h) x 0.80(d)
About the Author
Massimo Pigliucci is the K. D. Irani Professor of Philosophy at the City College of New York. He is the author, editor, or coeditor of many books, including How to Be a Stoic: Using Ancient Philosophy to Live a Modern Life and, most recently, Science Unlimited?: The Challenges of Scientism, the latter also published by the University of Chicago Press.
Read an Excerpt
Aristotle, so far as I know, was the first man to proclaim explicitly that man is a rational animal. His reason for this view was one which does not now seem very impressive; it was, that some people can do sums.
— Bertrand Russell, A History of Western Philosophy
The greatness of Reason is not measured by length or height, but by the resolves of the mind. Place then thy happiness in that wherein thou art equal to the Gods.
— Epictetus, Discourses
I am a professional scientist (an evolutionary biologist) as well as a philosopher (of science). You could say that I value rational discourse and believe it to be the only way forward open to humanity. That's why I wrote this book. But I don't want readers to be under the mistaken impression that one can simply explain, say, the difference between astronomy (a science) and astrology (a pseudoscience), pepper it with a few references to Karl Popper and the demarcation problem (see Introduction), and be done with it. Human beings may be rational animals, as Aristotle famously said, but often enough our rationality goes straight out the window, especially when it comes into conflict with cherished or psychologically comforting beliefs. Before we get into the nuts and bolts of science and pseudoscience, therefore, I'd like to recount a couple of conversations that will serve as cautionary tales. They may help us calibrate our expectations, dispelling the illusion that we can quickly and effectively dispatch nonsense and make the world a more reasonable place. That is the ultimate goal, but it is harder than one might think.
The first example recounts a somewhat surreal discussion I had with one of my relatives — let's call him Ostinato — about pseudoscience (specifically, the nonexistent connection between vaccines and autism), conspiracy theories (about the 9/11 attacks on New York's Twin Towers), politics, and much, much more. Of course, I should have known better than to start the discussion in the first place, especially with a relative who I knew subscribed to such notions. Blame it on the nice bottle of Aglianico wine we had been sharing during the evening.
The pattern of Ostinato's arguments is all too familiar to me: he denied relevant expertise (you know, scientists often get it wrong!), while at the same time vigorously — and apparently oblivious to the patent contradiction — invoking someone else's doubtful expertise (the guy is an engineer!). He continually sidetracked the conversation, bringing up irrelevant or unconnected points (an informal logical fallacy known as a red herring) and insisting we should look "beyond logic," whatever that means. The usual fun. I was getting more and more frustrated, the wine was running out, and neither I nor Ostinato had learned anything or even hinted at changing our minds. Why was I not persuading him? There must have been something I was missing.
It was at that point that another of my relatives, observing the discussion and very much amused by it, hit the nail right on the head. He invited me to consider whether Ostinato was simply confusing probability with possibility. I stopped dead in my tracks, pondered the suggestion, and had a Eureka! moment. That was exactly what was happening. Pretty much all of Ostinato's arguments were along the lines of "you say so, but isn't it possible that ..." or "but you can't exclude the possibility that ..." And of course he was right. It is possible (though very, very unlikely) that the 9/11 attacks were an inside job. And no, I cannot categorically state that vaccines never, ever cause autism. But so what?
I changed strategy and explained to Ostinato that he was racking up a number of rhetorical victories, nothing of substance. Yes, I conceded, it is true that for most things (in fact, for any statement that is not mathematical or purely logical) there is always the possibility one is wrong. But usually we don't make decisions based on possibility; instead, we use the much more refined tool of probability (estimated to the best of our abilities).
I tried to make the point by drawing two diagrams, like this:
The graphs illustrate two hypothetical probability distributions for a set of events, with the probability estimate on the vertical axis and the type of event on the horizontal one. The top diagram represents my relative's view of the world: he is acting as if all events have equal probability. Not literally — he does understand that some outcomes are more likely than others — but in practice, since he considers mere logical possibilities, however improbable in reality, to be worthy of the same attention as outcomes that are much more likely. It would be as if you asked someone to join you for dinner, and she replied, in all seriousness, "I'd love to, assuming the earth doesn't fall to an alien attack before then." The lower diagram, by contrast, shows how the world actually behaves. Some outcomes have much higher probabilities than others, and the resulting distribution (which doesn't have to take the shape I drew, obviously) is far from flat. Aliens may attack earth before dinner time, but the possibility is remote, far too slight to preclude your making firm plans.
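The two diagrams themselves don't reproduce in this excerpt, but the contrast can be sketched numerically. (The events and numbers below are my own illustrative stand-ins, not the book's.)

```python
# Two hypothetical probability distributions over the same five events.
# "flat" treats every outcome as equally worthy of attention, which is how
# someone reasoning from mere logical possibility behaves in practice;
# "peaked" reflects how the world actually works: a few outcomes dominate,
# and remote possibilities carry almost no weight.

events = ["dinner happens", "car trouble", "power outage", "earthquake", "alien attack"]
flat   = [0.20, 0.20, 0.20, 0.20, 0.20]     # every event treated alike
peaked = [0.96, 0.02, 0.015, 0.004, 0.001]  # probabilities estimated from experience

# Under the peaked (realistic) view, the rational bet is overwhelmingly on the
# ordinary outcome; under the flat view, no bet stands out from any other.
best = events[peaked.index(max(peaked))]
print(best)  # "dinner happens"
```

The point of the lower diagram is exactly this: once outcomes are weighted by estimated probability rather than bare possibility, making firm dinner plans is the only reasonable choice.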
I therefore resumed my discussion with Ostinato by mentioning Enlightenment philosopher David Hume's famous statement, to the effect that a reasonable person proportions her beliefs to the evidence, a principle restated two centuries later by astronomer Carl Sagan, in the context of discussions of pseudoscience: extraordinary claims require extraordinary evidence.
A modern version of this principle is what is known as Bayes's theorem, which we will consider in more detail in chapter 10. For now, suffice it to say that the theorem proves (mathematically) that the probability of a theory T, given the available evidence E, is proportional to the product of two factors: the probability of observing evidence E if theory T were true, multiplied by the probability that T is true based on initial considerations (the "priors").
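In standard notation (my rendering of the verbal statement above, not the book's own formula), Bayes's theorem reads:

```latex
P(T \mid E) \;=\; \frac{P(E \mid T)\, P(T)}{P(E)} \;\propto\; P(E \mid T)\, P(T)
```

The denominator P(E) is the same for every competing theory, which is why only the two factors in the numerator — the likelihood and the prior — matter when theories are compared against one another.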
The beauty of Bayes's theorem is that it updates its results in a recursive fashion, as new evidence becomes available. The result one gets each time one applies the theorem is called the posterior probability and is obtained — conceptually speaking — by updating the priors in proportion to the newly available evidence. Not only that, people have proven that no matter what the initial priors are (i.e., your initial assessment of the likelihood that theory T is right), after a sufficient number of iterations the posteriors converge toward the true value of T. This makes Bayes's theorem a formidable tool for practical decision making and, indeed, for the rational assessment of pretty much everything. As metaphor, it serves as a good guide for assessing beliefs — which, as Hume advises, should stand in proportion to the (ever changing) evidence.
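A minimal sketch of that recursive updating, under an assumed two-hypothesis coin example of my own construction (nothing in the code comes from the book):

```python
import random

def update(prior_t, heads, p_heads_t=0.8, p_heads_alt=0.5):
    """One application of Bayes's theorem: posterior is proportional to likelihood times prior."""
    like_t = p_heads_t if heads else 1 - p_heads_t      # P(E | T)
    like_alt = p_heads_alt if heads else 1 - p_heads_alt  # P(E | not-T)
    numerator = like_t * prior_t
    return numerator / (numerator + like_alt * (1 - prior_t))

# Hypothesis T: "the coin is biased, P(heads) = 0.8"; the alternative: "the coin is fair."
# The coin really is biased. We feed the same 200 flips to two observers who start
# from wildly different priors on T.
random.seed(0)
flips = [random.random() < 0.8 for _ in range(200)]

skeptic, believer = 0.01, 0.99  # very different initial assessments of T
for heads in flips:
    skeptic = update(skeptic, heads)   # yesterday's posterior becomes today's prior
    believer = update(believer, heads)

# After enough evidence, the two posteriors converge: the data swamp the priors.
print(round(skeptic, 3), round(believer, 3))
```

However small the skeptic's starting prior, repeated evidence drives both posteriors toward the same value — the convergence result mentioned above.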
I concluded my explanation to Ostinato — inspired by Bayes's theorem and probability theory more generally — by suggesting that when we make an assessment of any given notion we are basically placing a bet. Given the best understanding I have of the vaccine-autism controversy, for instance, I bet (heavily) that vaccines do not actually cause autism. Do I know this for certain? No, because it isn't an a priori truth of mathematics or logic. Is it possible that vaccines do cause autism? Yes, that scenario does not involve a logical contradiction, so it is possible. But those are the wrong questions. The right question is, is it likely, on the basis of the available evidence? If you had to bet (with money, or with the health of your kids), which way should you bet? I'm not sure I made a dent in the convictions of my relative, but I did my best.
As Fate wished, I had a second chance to observe how people fail to think properly a few months later, in the course of another conversation about science and pseudoscience. This exchange lasted days, on and off on social media, interacting with someone I've never met and likely never will meet. The range of topics this time was much narrower than with Ostinato, and far closer to my own areas of expertise: evolutionary biology and philosophy of science. I felt, therefore, that I really knew what I was talking about, providing not just a reasonably intelligent and somewhat informed opinion but an expert one, based on more than three decades of studying the subject matter at a professional level.
Predictably, it didn't help. Not in the least. My interlocutor — let's call her Curiosa — is an intelligent woman who has read a lot of stuff on evolution in particular and science more generally. She has also read several of my blog posts, watched some of my debates, and even bought one of my books on evolution. She discovered me by way of reading creationist Michael Denton's Evolution: A Theory in Crisis, which cites me several times as a "reluctant" critic of evolutionary theory — one of those people who know that there is something seriously wrong with Darwinism, yet somehow can't let go of the orthodoxy and embrace the revolution.
My actual position on the topic is easy to check online, in several places, and it boils down to this: evolutionary theory has evolved by way of several episodes, from 1859 (original Darwinism) to the 1930s and '40s (the so-called Evolutionary Synthesis) through current times (what is known as the Extended Synthesis), and it will likely continue to do so. There is nothing wrong with Darwin's original twin ideas of natural selection and common descent, but in the subsequent century and a half we have added a number of other areas of inquiry, explanatory concepts, and of course empirical results. End of story.
Not according to Curiosa. She explained to me that Darwinism is a "reductionist" theory, apparently meaning something really bad by that term. I explained that reductionism is a successful strategy throughout the sciences and that when it is properly done, it is pretty much the only game in town to advance our knowledge of the world. It really amounts to saying that the best way to tackle big problems is by dividing them into smaller chunks and addressing one chunk at a time, properly aligning small pieces of the puzzle until the full picture comes back into view.
But, countered Curiosa, how do you then explain the bacterial flagellum? This was a reference to "Darwin's black box," a notion advanced by intelligent design creationist Michael Behe. You know, Behe is a scientist! With a PhD!! Working at a legitimate university!!! How do you explain that, Professor Pigliucci?
Well, I said. If you wish I can walk you through several papers that have proposed likely, empirically based scenarios for the evolution of the bacterial flagellum. As for Behe himself, you will always find legitimate academics who position themselves outside of the mainstream. It's a healthy aspect of the social enterprise we call science, as we will see later in this book. Occasionally, some of these people range far from consensus opinion, into territory that is highly questionable, or even downright pseudoscientific. Some consider themselves rebels or mavericks. Some tend to put their ideology (usually religious, but sometimes political) ahead of reason and evidence. The latter is the case for Behe, a fervent Catholic who simply can't wrap his mind around the conclusion that life originated and differentiated through purely natural means, no gods required.
Ah!, continued Curiosa, if that's the case, how come there is so much disagreement among scientists about evolution, and even the origin of life? Well, I replied, let's begin by distinguishing those two issues. First, there is not widespread disagreement about Darwinism among evolutionary biologists. Pretty much all professionals I know accept the idea. There is disagreement, but it is over the shape of the current theory, just as in other disciplines. Physicists, too, disagree on cutting-edge questions — but not about Newton or even Einstein.
Second, the reason there are indeed many theories about the origin of life, and truly no consensus, is that the information available is not sufficient for us to zero in on one or a small subset of hypotheses. (We'll talk about this in chapter 2.) We don't have, and likely never will have, fossils documenting what happened at the onset of life. The historical traces are, unfortunately, forever erased, which means that our ideas about those events will remain speculative. Even if we are one day able to recreate life from scratch in a laboratory, we will have no guarantee that the path we followed under controlled conditions was the one historically taken by nature on our planet. But so what? Science never promised to answer every question, only to do its best. Sometimes its best is not good enough, and the wise thing to do is to accept human epistemic limitations and move on.
Not at all satisfied, Curiosa shifted topic again: haven't you heard about Roger Penrose's quantum mechanical explanation of consciousness? Doesn't that imply that consciousness is everywhere, that it is a holistic property of the universe? Hmm, I said, with all due respect to Sir Roger (a top-notch scientist), I doubt physicists have a clue about consciousness, which so far as I can see is a biological phenomenon, whose explanation is hence best left to biologists. Besides, I told her, beware of any "explanation" that invokes quantum mechanics for anything other than quantum phenomena, even when proffered by a credentialed physicist like Penrose. At any rate, I concluded, even if Penrose is right, what does that have to do with Darwinism and its alleged failures?
I think you get the idea. My exchanges with Curiosa continued, becoming increasingly frustrating and eventually downright useless, until I politely pointed out that we were going in circles and perhaps it was time to call it a day.
What did I learn from this exchange? A number of things, none of them boding well for the advancement of rational discourse and public understanding of science. But we need to face reality for what it is — it is the rational thing to do.
First, let me remind you that Curiosa is a smart, well-read, and genuinely curious person. Second, because she reads widely, she is exposed not only to what I write — and what truly eminent evolutionary biologists like Stephen Jay Gould write — but to fluff put out by the Behes and Dentons of the world. And she has no way to discriminate, since all of these people have PhDs and affiliations with reputable universities. (This is an issue of trust and expertise, the topics of chapter 12.)
Third, while we always assume that knowledge is an unqualified good, it turns out that a bit of knowledge may do more harm than complete ignorance. When someone as intelligent as Curiosa thinks she understands enough to draw conclusions, she will not hesitate to do so, rejecting expert opinion outright in the name of making up her own mind as an independent thinker. When this has to do with the status of evolutionary theory, not much harm is done. But when it has to do with, say, climate change or the safety of vaccines, that's an altogether different, and far more dire, story.
Fourth, Curiosa has fallen for the well-known technique of spreading doubt about mainstream science, to the extent that people genuinely cannot make up their minds about what is going on. This was the deliberate strategy of the tobacco industry in its absurd (and lethal, for many people) denial of a link between smoking and cancer, well described in the book and documentary Merchants of Doubt. The same approach has been used by other partisans to sow doubts about climate change, vaccines, and so forth. And of course it has also been the main strategy behind the so-called intelligent design movement.
Fifth, and rather ironically, Curiosa has absorbed and internalized the vocabulary of skeptical (i.e., pro-science) organizations, accusing me and others of perpetrating all sorts of logical fallacies, a convenient shortcut that saves her the trouble of actually engaging with my arguments. For instance, when I pointed out — reasonably, it seemed to me — that Discovery Institute fellow Jonathan Wells is a member of Sun Myung Moon's Unification Church, and that his antipathy toward evolution is entirely ideological in nature, I was accused of committing an ad hominem attack. When I pointed out plenty of reliable sources on evolutionary theory, I was demonstrating confirmation bias. And so on.
Lastly, Curiosa's spirited discussion with me was clearly fueled by her pride in taking on Big Science and its Orthodoxy, in favor of open-mindedness and revolution. She saw herself as David, and I was the Goliath to be slain.
I'm afraid there is little I or anyone else can do for the Curiosas of the world. If — and it's a big if — they ever manage to get their heads clear about what is and is not legitimate science, they will have to do it on their own, painfully and slowly. The resources are readily available (this book being one of them). But they often have no psychological incentive to do so.
Excerpted from "Nonsense on Stilts"
Copyright © 2018 The University of Chicago.
Excerpted by permission of The University of Chicago Press.
All rights reserved. No part of this excerpt may be reproduced or reprinted without permission in writing from the publisher.
Table of Contents
Introduction Science versus Pseudoscience and the "Demarcation Problem"
CHAPTER 1 Frustrating Conversations
CHAPTER 2 Hard Science, Soft Science
CHAPTER 3 Almost Science
CHAPTER 4 Pseudoscience
CHAPTER 5 Blame the Media?
CHAPTER 6 Debates on Science: The Rise of Think Tanks and the Decline of Public Intellectuals
CHAPTER 7 From Superstition to Natural Philosophy
CHAPTER 8 From Natural Philosophy to Modern Science
CHAPTER 9 The Science Wars I: Do We Trust Science Too Much?
CHAPTER 10 The Science Wars II: Do We Trust Science Too Little?
CHAPTER 11 The Problem and the (Possible) Cure: Scientism and Virtue
CHAPTER 12 Who's Your Expert?
Conclusion So, What Is Science after All?