A Field Guide to Lies: Critical Thinking in the Information Age

by Daniel J. Levitin

Hardcover

$28.00 


Overview

Winner of the National Business Book Award

From the New York Times bestselling author of The Organized Mind and This Is Your Brain on Music, a primer on the critical thinking that is more necessary now than ever

We are bombarded with more information each day than our brains can process—especially in election season. It's raining bad data, half-truths, and even outright lies. New York Times bestselling author Daniel J. Levitin shows how to recognize misleading announcements, statistics, graphs, and written reports, revealing the ways lying weasels can use them.

It's becoming harder to separate the wheat from the digital chaff. How do we distinguish misinformation, pseudo-facts, and distortions from reliable information? Levitin groups his field guide into two categories—statistical information and faulty arguments—ultimately showing how science is the bedrock of critical thinking. Infoliteracy means understanding that there are hierarchies of source quality and bias that variously distort our information feeds via every media channel, including social media. We may expect newspapers, bloggers, the government, and Wikipedia to be factually and logically correct, but they so often aren't. We need to think critically about the words and numbers we encounter if we want to be successful at work, at play, and in making the most of our lives. This means checking the plausibility and reasoning—not passively accepting information, repeating it, and making decisions based on it. Readers learn to avoid the extremes of passive gullibility and cynical rejection. Levitin's charming, entertaining, accessible guide can help anyone wake up to a whole lot of things that aren't so. And catch some weasels in their tracks! 

Product Details

ISBN-13: 9780525955221
Publisher: Penguin Publishing Group
Publication date: 09/06/2016
Pages: 304
Product dimensions: 6.20(w) x 9.10(h) x 1.20(d)

About the Author

Daniel J. Levitin, Ph.D., is Founding Dean of Arts & Humanities at the Minerva Schools at KGI, a Distinguished Faculty Fellow at the Haas School of Business, UC Berkeley, and the James McGill Professor Emeritus of Psychology and Music at McGill University, Montreal, where he also holds appointments in the Program in Behavioural Neuroscience, the School of Computer Science, and the Faculty of Education. An award-winning scientist and teacher, he now adds best-selling author to his list of accomplishments: This Is Your Brain on Music, The World in Six Songs, and The Organized Mind were all #1 best-sellers. His work has been translated into 21 languages. Before becoming a neuroscientist, he worked as a session musician, sound engineer, and record producer with artists such as Stevie Wonder and Blue Oyster Cult. He has published extensively in scientific journals as well as music magazines such as Grammy and Billboard. Recent musical performances include playing guitar and saxophone with Sting, Bobby McFerrin, Rosanne Cash, David Byrne, Cris Williamson, Victor Wooten, and Rodney Crowell.

Read an Excerpt

Plausibility

Statistics, because they are numbers, appear to us to be cold, hard facts. It seems that they represent facts given to us by nature and it's just a matter of finding them. But it's important to remember that people gather statistics. People choose what to count, how to go about counting, which of the resulting numbers they will share with us, and which words they will use to describe and interpret those numbers. Statistics are not facts. They are interpretations. And your interpretation may be just as good as, or better than, that of the person reporting them to you.

Sometimes, the numbers are simply wrong, and it's often easiest to start out by conducting some quick plausibility checks. After that, even if the numbers pass plausibility, three kinds of errors can lead you to believe things that aren't so: how the numbers were collected, how they were interpreted, and how they were presented graphically.

In your head or on the back of an envelope you can quickly determine whether a claim is plausible (most of the time). Don't just accept a claim at face value; work through it a bit.

When conducting plausibility checks, we don't care about the exact numbers. That might seem counterintuitive, but precision isn't important here. We can use common sense to reckon a lot of these: If Bert tells you that a crystal wineglass fell off a table and hit a thick carpet without breaking, that seems plausible. If Ernie says it fell off the top of a forty-story building and hit the pavement without breaking, that's not plausible. Your real-world knowledge, observations acquired over a lifetime, tells you so. Similarly, if someone says they are two hundred years old, or that they can consistently beat the roulette wheel in Vegas, or that they can run forty miles an hour, these are not plausible claims.

What would you do with this claim?

In the thirty-five years since marijuana laws stopped being enforced in California, the number of marijuana smokers has doubled every year.

Plausible? Where do we start? Let's assume there was only one marijuana smoker in California thirty-five years ago, a very conservative estimate (there were half a million marijuana arrests nationwide in 1982). Doubling that number every year for thirty-five years would yield more than 17 billion, larger than the population of the entire world. (Try it yourself and you'll see that doubling every year for twenty-one years gets you to over a million: 1; 2; 4; 8; 16; 32; 64; 128; 256; 512; 1,024; 2,048; 4,096; 8,192; 16,384; 32,768; 65,536; 131,072; 262,144; 524,288; 1,048,576.) This claim isn't just implausible, then, it's impossible. Unfortunately, many people have trouble thinking clearly about numbers because they're intimidated by them. But as you see, nothing here requires more than elementary school arithmetic and some reasonable assumptions.
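
The arithmetic is easy to verify for yourself. Here is a minimal Python sketch of the check, using the text's own starting assumption of a single smoker:

    # Plausibility check: start from one smoker and double every year.
    smokers = 1
    for _ in range(35):
        smokers *= 2

    print(f"{smokers:,}")           # 34,359,738,368
    print(smokers > 7_500_000_000)  # True: far more people than exist on Earth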

Here's another. You've just taken on a position as a telemarketer, where agents telephone unsuspecting (and no doubt irritated) prospects. Your boss, trying to motivate you, claims:

Our best salesperson made 1,000 sales a day.

Is this plausible? Try dialing a phone number yourself; the fastest you can probably do it is five seconds. Allow another five seconds for the phone to ring. Now let's assume that every call ends in a sale; clearly this isn't realistic, but let's give every advantage to this claim to see if it works out. Figure a minimum of ten seconds to make a pitch and have it accepted, then forty seconds to get the buyer's credit card number and address. That's one call per minute (5 + 5 + 10 + 40 = 60 seconds), or 60 sales in an hour, or 480 sales in a very hectic eight-hour workday with no breaks. The 1,000 just isn't plausible, allowing even the most optimistic estimates.
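
The same best-case arithmetic, sketched in Python with the timing assumptions the text grants the claim:

    # Best-case seconds per sale, granting every advantage to the claim.
    dial, ring, pitch, payment = 5, 5, 10, 40
    seconds_per_sale = dial + ring + pitch + payment  # 60 seconds

    workday = 8 * 60 * 60                             # eight hours, no breaks
    print(workday // seconds_per_sale)                # 480, far short of 1,000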

Some claims are more difficult to evaluate. Here's a headline from Time magazine in 2013:

More people have cell phones than toilets.

What to do with this? We can consider the number of people in the developing world who lack plumbing and the observation that many people in prosperous countries have more than one cell phone. The claim seems plausible. That doesn't mean we should accept it, just that we can't reject it out of hand as being ridiculous; we'll have to use other techniques to evaluate the claim, but it passes the plausibility test.

Sometimes you can't easily evaluate a claim without doing a bit of research on your own. Yes, newspapers and websites really ought to be doing this for you, but they don't always, and that's how runaway statistics take hold. A widely reported statistic some years ago was this:

In the U.S., 150,000 girls and young women die of anorexia each year.

Okay, let's check its plausibility. We have to do some digging. According to the U.S. Centers for Disease Control, the annual number of deaths from all causes for girls and women between the ages of fifteen and twenty-four is about 8,500. Add in women from twenty-five to forty-four and you still only get 55,000. The anorexia deaths in one year cannot be three times the number of all deaths.

In an article in Science, Louis Pollack and Hans Weiss reported that since the formation of the Communications Satellite Corp.,

The cost of a telephone call has decreased by 12,000 percent.

If a cost decreases by 100 percent, it drops to zero (no matter what the initial cost was). If a cost decreases by 200 percent, someone is paying you the same amount you used to pay them, just for you to take the product. A decrease of 100 percent is very rare; one of 12,000 percent seems wildly unlikely. An article in the peer-reviewed Journal of Management Development claimed a 200 percent reduction in customer complaints following a new customer care strategy. Author Dan Keppel even titled his book Get What You Pay For: Save 200% on Stocks, Mutual Funds, Every Financial Need. He has an MBA. He should know better.
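
To see why such figures fail plausibility, apply the percentage to a hypothetical cost; the $10 starting price below is made up purely for illustration:

    def after_decrease(cost, percent):
        """Price remaining after a `percent` decrease from `cost`."""
        return cost * (1 - percent / 100)

    print(after_decrease(10.00, 100))    # 0.0: the product is now free
    print(after_decrease(10.00, 200))    # -10.0: they pay you to take it
    print(after_decrease(10.00, 12000))  # -1190.0: nonsense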

Of course, you have to apply percentages to the same baseline in order for them to be equivalent. A 50 percent reduction in salary cannot be restored by increasing your new, lower salary by 50 percent, because the baselines have shifted. If you were getting $1,000/week and took a 50 percent reduction in pay, to $500, a 50 percent increase in that pay only brings you to $750.

Percentages seem so simple and incorruptible, but they are often confusing. If interest rates rise from 3 percent to 4 percent, that is an increase of 1 percentage point, or 33 percent (because the 1-point rise is taken against the baseline of 3, so 1/3 = .33). If interest rates fall from 4 percent to 3 percent, that is a decrease of 1 percentage point, but not a decrease of 33 percent; it's a decrease of 25 percent (because the 1-percentage-point drop is now taken against the baseline of 4). Researchers and journalists are not always scrupulous about making the distinction between percentage points and percentages clear, but you should be.
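
A short sketch of both pitfalls: the percentage-point asymmetry from the interest-rate example, and the shifted baseline from the salary example:

    def relative_change(old, new):
        """Percent change of `new` measured against the `old` baseline."""
        return (new - old) / old * 100

    print(relative_change(3, 4))  # 33.33...: up 1 point is a 33% increase
    print(relative_change(4, 3))  # -25.0: down 1 point is only a 25% decrease

    # Shifted baseline: a 50% cut followed by a 50% raise.
    salary = 1000 * 0.5           # $500 after the cut
    print(salary * 1.5)           # 750.0, not the original $1,000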

The New York Times reported on the closing of a Connecticut textile mill and its move to Virginia due to high employment costs. The Times reported that employment costs ("wages, worker's compensation and unemployment insurance") "are 20 times higher in Connecticut than in Virginia." Is this plausible? If it were true, you'd think that there would be a mass migration of companies out of Connecticut and into Virginia, not just this one mill, and that you would have heard of it by now. In fact, this was not true and the Times had to issue a correction. How did this happen? The reporter simply misread a company report. One cost, unemployment insurance, was in fact twenty times higher in Connecticut than in Virginia, but when factored in with other costs, total employment costs were really only 1.3 times higher in Connecticut, not 20 times higher. The reporter did not have training in business administration and we shouldn't expect her to. To catch these kinds of errors requires taking a step back and thinking for ourselves, which anyone can do (and she and her editors should have done).

New Jersey adopted legislation that denied additional benefits to mothers who have children while already on welfare. Some legislators believed that women were having babies in New Jersey simply to increase the amount of their monthly welfare checks. Within two months, legislators were declaring the "family cap" law a great success because births had already fallen by 16 percent. According to the New York Times:

After only two months, the state released numbers suggesting that births to welfare mothers had already fallen by 16 percent, and officials began congratulating themselves on their overnight success.

Note that they're not counting pregnancies, but births. What's wrong here? Because it takes nine months for a pregnancy to come to term, any effect in the first two months cannot be attributed to the law itself but is probably due to normal fluctuations in the birth rate (birth rates are known to be seasonal).

Even so, there were other problems with this report that can't be caught with plausibility checks:

. . . over time, that 16 percent drop dwindled to about 10 percent as the state belatedly became aware of births that had not been reported earlier. It appeared that many mothers saw no reason to report the new births since their welfare benefits were not being increased.

This is an example of a problem in the way statistics were collected: we're not actually surveying all the people we think we are. Some errors in reasoning are harder to see coming than others, but we get better with practice. To start, let's look at a basic, often misused tool.

The pie chart is an easy way to visualize percentages: how the different parts of a whole are allocated. You might want to know what percentage of a school district's budget is spent on things like salaries, instructional materials, and maintenance. Or you might want to know what percentage of the money spent on instructional materials goes toward math, science, language arts, athletics, music, and so on. The cardinal rule of a pie chart is that the percentages have to add up to 100. Think about an actual pie: if there are nine people who each want an equal-sized piece, you can't cut it into eight. After you've reached the end of the pie, that's all there is. Still, this didn't stop Fox News from publishing this pie chart:

First rule of pie charts: the percentages have to add up to 100. (Fox News, 2010)

You can imagine how something like this could happen. Voters are given the option to report that they support more than one candidate. But then, the results shouldn't be presented as a pie chart.
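
The sanity check takes one line of arithmetic. The slice values below are hypothetical, since the chart itself isn't reproduced here; the point is only that overlapping-support percentages can't share a pie:

    slices = [70, 63, 60]  # hypothetical "support" figures from a multiple-response poll
    total = sum(slices)
    print(total)           # 193: the slices of a pie can't exceed 100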

Fun with Averages

An average can be a helpful summary statistic, even easier to digest than a pie chart, allowing us to characterize a very large amount of information with a single number. We might want to know the average wealth of the people in a room to know whether our fund-raisers or sales managers will benefit from meeting with them. Or we might want to know the average price of gas to estimate how much it will cost to drive from Vancouver to Banff. But averages can be deceptively complex.

There are three ways of calculating an average, and they often yield different numbers, so people with statistical acumen usually avoid the word average in favor of the more precise terms mean, median, and mode. We don't say "mean average" or "median average" or simply just "average"; we say mean, median, or mode. In some cases, these will be identical, but in many they are not. If you see the word average all by itself, it's usually indicating the mean, but you can't be certain.

The mean is the most commonly used of the three and is calculated by adding up all the observations or reports you have and dividing by the number of observations or reports. For example, the average wealth of the people in a room is simply the total wealth divided by the number of people. If the room has ten people whose net worth is $100,000 each, the room has a total net worth of $1 million, and you can figure the mean without having to pull out a calculator: it is $100,000. If a different room has ten people whose net worth varies from $50,000 to $150,000 each, but totals $1 million, the mean is still $100,000 (because we simply take the total $1 million and divide by the ten people, regardless of what any individual makes).

The median is the middle number in a set of numbers (statisticians call this set a "distribution"): half the observations are above it and half are below. Remember, the point of an average is to be able to represent a whole lot of data with a single number. The median does a better job of this when some of your observations are very, very different from the majority of them, what statisticians call outliers.

If we visit a room with nine people, suppose eight of them have a net worth of near $100,000 and one person is on the verge of bankruptcy with a net worth of negative $500,000, owing to his debts. Here's the makeup of the room:

Person 1:    -$500,000

Person 2:    $96,000

Person 3:    $97,000

Person 4:    $99,000

Person 5:    $100,000

Person 6:    $101,000

Person 7:    $101,000

Person 8:    $101,000

Person 9:    $104,000

Now we take the sum and obtain a total of $299,000. Divide by the total number of observations, nine, and the mean is $33,222 per person. But the mean doesn't seem to do a very good job of characterizing the room. It suggests that your fund-raiser might not want to visit these people, when it's really only one odd person, one outlier, bringing down the average. This is the problem with the mean: It is sensitive to outliers.

The median here would be $100,000: four people have less than that amount, and four people have more. The mode is $101,000, the value that appears most often. Both the median and the mode are more helpful in this particular example.
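
Python's standard statistics module computes all three; here is the room's data run through it (Person 1's net worth entered as negative, per the text):

    from statistics import mean, median, mode

    # Net worths of the nine people in the room; Person 1 is $500,000 in debt.
    net_worth = [-500_000, 96_000, 97_000, 99_000, 100_000,
                 101_000, 101_000, 101_000, 104_000]

    print(f"mean:   ${mean(net_worth):,.2f}")  # $33,222.22, dragged down by the outlier
    print(f"median: ${median(net_worth):,}")   # $100,000
    print(f"mode:   ${mode(net_worth):,}")     # $101,000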

There are many ways that averages can be used to manipulate what you want others to see in your data.

Let's suppose that you and two friends founded a small start-up company with five employees. It's the end of the year and you want to report your finances to your employees, so that they can feel good about all the long hours and cold pizzas they've eaten, and so that you can attract investors. Let's say that four employees, the programmers, each earned $70,000 per year, and one employee, a receptionist/office manager, earned $50,000 per year. That's an average (mean) employee salary of $66,000 per year: (4 × $70,000) + (1 × $50,000), divided by 5. You and your two friends each took home $100,000 per year in salary. Your payroll costs were therefore (4 × $70,000) + (1 × $50,000) + (3 × $100,000) = $630,000. Now, let's say your company brought in $210,000 in profits and you divided it equally among you and your co-founders as bonuses, giving you $100,000 + $70,000 each. How are you going to report this?
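
Which mean you report is the whole game. Here is a sketch of the two candidate numbers, using the figures from the example above; how to frame them is the question the excerpt leaves you with:

    employees = [70_000] * 4 + [50_000]   # four programmers, one office manager
    founder_pay = [100_000 + 70_000] * 3  # salary plus an equal share of $210,000 in bonuses

    print(sum(employees) / len(employees))  # 66000.0: the "average employee salary"
    everyone = employees + founder_pay
    print(sum(everyone) / len(everyone))    # 105000.0: the average across all eight people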

Table of Contents

Introduction: Thinking, Critically ix

Part 1 Evaluating Numbers

Plausibility 3

Fun with Averages 11

Axis Shenanigans 26

Hijinks with How Numbers Are Reported 43

How Numbers Are Collected 75

Probabilities 97

Part 2 Evaluating Words

How Do We Know? 123

Identifying Expertise 129

Overlooked, Undervalued Alternative Explanations 152

Counterknowledge 168

Part 3 Evaluating the World

How Science Works 181

Logical Fallacies 198

Knowing What You Don't Know 211

Bayesian Thinking in Science and in Court 216

Four Case Studies 222

Conclusion: Discovering Your Own 251

Appendix: Application of Bayes's Rule 255

Glossary 257

Notes 263

Acknowledgments 283

Index 285

What People are Saying About This

Biomedical Informatics, Ohio State University - Gregg Gascon

Insightful and entertaining—an excellent work.

author of The Power of Habit and Smarter, Faster, Better - Charles Duhigg

Daniel Levitin's field guide is a critical thinking primer for our shrill, data-drenched age. It's an essential tool for really understanding the texts, posts, tweets, magazines, newspapers, podcasts, op-eds, interviews and speeches that bombard us every day. From the way averages befuddle to the logical fallacies that sneak by us, every page is enlightening.

Professor of Genetics, Genomics, and Development, UC Berkeley - Jasper Rine

Just as Strunk and White taught us how to communicate better, the Field Guide to Lies is an indispensable guide to thinking better. As Big Data becomes a dominant theme in our culture, we are all obliged to sharpen our critical thinking so as to thwart the forces of obfuscation. Levitin has done a great service here.

Magician - Patrick Martin

Well researched, and provides a valuable guide to assist the public with a methodology for evaluating the truth behind the cacophony of information that constantly inundates us.

former Vice President, Market Research and Analysis, Prudential Financial, Statistician, U.S.D.A. - Morris Olitsky

Hits on the most important issues around statistical literacy, and uses good examples to illustrate its points. I could not put this book down. Reading it has been a pleasure, believe me. I am so impressed with Levitin's writing style, which is clear and simple, unlike much of the murky stuff that is written by statisticians and many others.

Tarkington Chair of Teaching Excellence and professor of law at Vanderbilt Law School - Edward K. Cheng

This is a wonderful book. It covers so many of the insights of science, logic, and statistics that the public needs to know, yet are sadly neglected in the education that most of us receive.

Senior Lecturer and Policy Fellow, Rockefeller Center, Dartmouth College, author of Naked Economics - Charles Wheelan

The world is awash with data, but not always with accurate information. A Field Guide to Lies does a terrific job of illustrating the difference between the two with precision—and delightful good humor.
