Yves here. One of my pet peeves is how people treat arguments that incorporate figures as being more solid than ones that are based on qualitative analysis…even though the data were typically developed using methods that make them far less solid than they seem (see an article I wrote on that issue in 2006).
This post sets forth some of the most commonly-used “lying with figures” tricks and should help you learn to recognize them and not be taken in.
By Larry Schwartz, a Brooklyn-based freelance writer with a focus on health, science and American history. Originally published at Alternet
Americans, as P.T. Barnum once noted, are not all that difficult to fool, and our nation’s somewhat weak math skills don’t help. A Pew Research Center report issued last year, which studied test results of 15-year-olds, ranked the United States 35th in the world in math. Not only has this weakness in understanding numbers created opportunities for mass exploitation by Big Pharma and other industries, it has also led to needless and mostly unwarranted fear. While Americans don’t understand math, be assured that corporations do, and they happily use it to mislead and obfuscate in the name of selling their products.
Big Pharma doesn’t only target consumers with its misleading advertising; it also targets your doctor. And why not? Sadly, a medical degree doesn’t necessarily mean your doctor is a numbers whiz. In a report in the journal Psychological Science in the Public Interest on doctors’ ability to analyze relevant statistics, doctors were asked, “If my mammogram is positive, what are the odds that I actually have cancer?” They were given all the information needed to answer that question accurately, and a startling number of them still got it wrong. In fact, only 20 percent got it right. (The answer, by the way, is about a 10 percent chance.) Of those who got it wrong, 60 percent erred drastically on the side of doom, saying the chances of having cancer were 80 to 90 percent. So if your doctor was in that group and you got a positive mammogram result, she would have told you that you almost certainly have cancer.
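The arithmetic behind that answer is Bayes’ theorem. A minimal sketch in Python, using illustrative screening numbers of the kind such studies present (the prevalence, sensitivity, and false-positive rate below are assumptions for illustration, not figures from the article):

```python
# Chance of actually having cancer given a positive mammogram.
# All three inputs are illustrative assumptions, not the study's figures.
prevalence = 0.01       # 1% of women screened actually have breast cancer
sensitivity = 0.90      # 90% of cancers yield a positive mammogram
false_positive = 0.09   # 9% of healthy women also test positive

true_positives = prevalence * sensitivity
all_positives = true_positives + (1 - prevalence) * false_positive
p_cancer_given_positive = true_positives / all_positives

print(f"P(cancer | positive) = {p_cancer_given_positive:.1%}")
```

With these inputs the result is about 9 percent: close to the roughly 10 percent chance the article cites, and nowhere near the 80 to 90 percent many doctors guessed. The low prevalence means false positives swamp true positives.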
The pharmaceutical business, no surprise, is a big numbers abuser. How many advertisements have we seen touting the wonders of a particular drug? “Lowers risk of heart attack by 50 percent!” Well, yes it does lower the risk by half. Dig a little deeper and you discover that your risk of heart attack has dropped from two in a million all the way down to one in a million. That’s a 50 percent drop! Of course, your original two-in-a-million risk wasn’t all that risky, and side effects from the drug might include a few nasty things, but hey, details.
This is known as reporting test results in relative, rather than absolute, numbers. Big Pharma is well aware that saying your risk drops from two in a million to one in a million isn’t so remarkable. They also know that using phrases like “50 percent less risk” will fool most of the people most of the time. They can defend themselves by pointing out that they aren’t outright lying, after all. An osteoporosis drug once claimed to reduce hip fractures by the same whopping 50 percent. Again, technically true, and it sounds impressive. Unmentioned was that out of all untreated osteoporosis sufferers, only about two percent are at risk for hip fractures. So the drug reduced the risk of hip fractures from two percent of all osteoporosis victims to one percent of all of them, from two in 100 to one in 100. Doesn’t sound all that fabulous when stated in those terms, especially when taking into account the often higher cost of many of these drugs.
The concept of relative and absolute risk is important. Big Pharma loves relative risk and hates absolute risk. Relative sells pills. Absolute, not so much. Any medication can claim to cut your relative risk of getting a disease by huge percentages: 10%, 50%, even 100%. But if your absolute chance of getting the disease is tiny in the first place, then the benefit, no matter how impressive the relative number sounds, is also tiny. Why take the medicine if you are probably not getting much benefit from it? Or why not take the lower-cost generic that offers more or less the same results?
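The distinction is easy to compute. A quick Python sketch using the article’s two drug examples:

```python
def risk_change(baseline, treated):
    """Return (relative change, absolute change) between two risks."""
    relative = (baseline - treated) / baseline  # fraction of the baseline risk removed
    absolute = baseline - treated               # change in the risk itself
    return relative, absolute

# Heart-attack drug: 2 in a million down to 1 in a million.
rel, abs_ = risk_change(2 / 1_000_000, 1 / 1_000_000)
print(f"heart drug: relative drop {rel:.0%}, absolute drop {abs_:.4%}")

# Osteoporosis drug: 2 in 100 down to 1 in 100.
rel, abs_ = risk_change(2 / 100, 1 / 100)
print(f"bone drug:  relative drop {rel:.0%}, absolute drop {abs_:.0%}")
```

Both drugs honestly claim a 50 percent relative reduction, yet the heart drug moves your absolute risk by a ten-thousandth of a percent while the bone drug moves it by one percentage point. Only the absolute column tells you what you actually gain.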
Cooking the numbers isn’t confined to prescription drug marketing. It wasn’t that long ago that Colgate was advertising that 80 percent of dentists recommended its toothpaste. And it sure seemed convincing! Eight out of 10 dentists recommended Colgate, and one was supposed to extrapolate that only two out of 10 recommended other brands. That’s a landslide. Better buy Colgate!
Hold on. A closer look at the study which yielded that result showed dentists were asked what brands they would recommend and were allowed to choose as many as they wanted. So yes, eight out of 10 dentists chose Colgate. But eight out of 10 may also have chosen Crest or Aim or any number of other brands. No surprise, the ad was eventually banned as being misleading.
In the late 1990s, Centrum, the vitamin maker, made the alarming claim that nine out of 10 Americans were not getting all the nutrients they needed from what they were eating. Ninety percent of us were deficient! We better buy Centrum to make sure we get the proper nutrition. And that is essentially the message of all vitamin manufacturers.
The numbers, however, obscured the truth. Centrum got those numbers from a completely unscientific survey taken between 1976 and 1980 in which Americans were asked what they ate on the day of the survey. Only nine percent of the participants said they remembered eating their recommended daily allowance of fruit and vegetables. So nine out of 10, or a whopping 90 percent, did not eat their daily allowance that day. Left unsaid was that any one of the 90 percent “deficient” may have eaten more than their daily allowance the day before, or would eat more the day after. A one-day survey is not an adequate sample to indicate overall diet. Not to mention the fact that adequate nutrition can be obtained from other sources and need not be measured in recommended daily servings. For instance, a 15-minute walk in the sun might get you plenty of vitamin D. But information like that doesn’t sell any vitamins. And so far no one has figured out how to charge for sun rays.
A misunderstanding of numbers can also lead to fear-motivated behavior, and you can be sure the marketers are aware of that fact. In 1995, a warning was sent to almost 200,000 doctors and pharmacists in the U.K. that a new iteration of a popular birth control pill could increase the risk of life-threatening blood clots by 100 percent. That sounds terrifying, right? 100 percent! It was enough to get many women to discontinue using the pill, which helped contribute to 13,000 additional abortions the following year. The actual risk? The older generation of the pill carried a one in 7,000 risk of blood clot; the new generation, two in 7,000. A 100 percent increase, yes, but in absolute terms the risk went up from about 0.014 percent to about 0.029 percent.
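Put as a computation, using the figures in the paragraph above:

```python
# UK pill scare: clot risk for the old vs. new generation of the pill.
old_risk = 1 / 7000
new_risk = 2 / 7000

relative_increase = (new_risk - old_risk) / old_risk
print(f"relative increase: {relative_increase:.0%}")       # the scary headline number
print(f"absolute risk: {old_risk:.3%} -> {new_risk:.3%}")  # what actually changed
```

The same facts, honestly stated two ways: one sounds like an emergency, the other like a rounding error.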
This past year it was announced that the World Health Organization was classifying processed meat, bacon included, as a Group 1 carcinogen. Bacon-lovers despaired. It seems that eating two slices of bacon a day increases the risk of colorectal cancer by 18%. What the public misunderstood was that this was the relative risk, not the absolute risk. More plainly, across the entire population, the risk of getting colorectal cancer at all is about 5%. The 18% increase applies to that 5% baseline. If you are a typical math-averse American, your head is spinning right now, but what this means is that your overall risk goes from about five in 100 all the way up to about six in 100. Not exactly a jaw-dropper when put in those terms.
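Converting the headline’s relative figure back into absolute terms takes one line of arithmetic. A sketch using the figures above:

```python
# Colorectal cancer: an 18% relative increase on a ~5% absolute baseline.
baseline = 0.05           # lifetime absolute risk, per the article
relative_increase = 0.18  # relative increase attributed to daily bacon

new_risk = baseline * (1 + relative_increase)
print(f"absolute risk: {baseline:.1%} -> {new_risk:.1%}")  # 5.0% -> 5.9%
```

An 18 percent relative increase moves the absolute risk by less than one percentage point, from roughly five in 100 to roughly six in 100.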
Politicians and their marketers have learned their business lessons well when it comes to using numbers to strike fear in the populace and sway opinion. Politicians routinely spew numbers that, under scrutiny, either don’t add up or are just plain wrong. They know most voters can’t, or won’t bother to, check them, and that when people hear numbers, they naively believe they are hearing something scientific, or based in reality. Michele Bachmann told us in 2013 that “70 cents of every dollar spent on food stamps goes to bureaucrats.” The fact that only about one-third of one percent goes to bureaucrats shouldn’t stand in the way of a good statistic. Rand Paul told us last year that “nine out of 10 businesses fail,” as he sought to blame President Obama for wasting tax money on businesses like Solyndra, the solar panel maker. The actual failure rate is about 50 percent after five years.
And then there is the Donald, whose supporters tend to believe him because he is a successful and, as he says, “very rich” businessman. In trying to appeal to his conservative base, Trump ran off a series of murder statistics in a tweet this year that was striking in its claims: only 16 percent of Caucasians were killed by other Caucasians (wrong; 82 percent is the correct figure); 97 percent of African Americans were killed by other African Americans (wrong; 90 percent); 81 percent of Caucasians were killed by African Americans (wrong; only 15 percent); and just two percent of African Americans were killed by Caucasians (wrong; 8 percent). Trump used these false statistics to cast black people as the culprits in white murders when the truth is that white people are mostly killed by other white people. Indeed, most murders happen between people of the same race.
In another statement, Trump called for a halt to all Muslim immigration, based on a survey he cited which said that 25 percent of those polled agreed that “violence against Americans here in the United States was justified as part of the global jihad,” and that 51 percent agreed that “Muslims in America should have the choice of being governed according to Shariah.” Utter hogwash. That “poll” was conducted by a virulently anti-Muslim organization, the Center for Security Policy. Beyond that, the poll itself was fatally flawed: it was an opt-in online survey with a small sample of just 600; it used agree/disagree questions, a format people have been shown to disproportionately answer “agree” to; and it targeted a population (U.S. Muslims) many of whom are first-generation with limited English proficiency. Moreover, the survey participants were U.S. Muslims, meaning their answers, accurate or not, had no bearing on the Muslims Trump was proposing to bar.
If it sounds like we’re picking on Republicans, it’s because there is cause. A study from the non-partisan Center for Media Studies in 2013 concluded that Republicans lie three times as often as Democrats, and a favorite method of political lying is the misstatement of numbers. Politicians, industry and charlatans get away with it because numbers sound impressive to people who don’t understand them, and that includes the mainstream media, which report the numbers unchallenged, as if they were fact. If we hear fancy-sounding numbers and math intimidates us, we tend to accept what we are hearing as true, because math has street cred. Mathematicians are smart, and if someone strings together a bunch of numbers and looks confident, we will, more often than not, accept their “smartness.” To paraphrase a famous wizard: “Pay no attention to those numbers behind the curtain!” And so we don’t, especially if the alternative is to do our homework and better understand the math.