
WEBLOG

January 11th, 2022 (Permalink)

The Pandemic of Pseudoknowledge1

You've probably heard about the false claims about COVID-19 made by Associate Justice Sonia Sotomayor during oral argument before the United States Supreme Court a few days ago. If not, here are relevant passages from the transcript:

…[W]e are now having deaths at an unprecedented amount. … Counsel, those numbers show that Omicron is as deadly and causes as much serious disease in the unvaccinated as Delta did. Look at the hospitalization rates that are going on. We have more affected people in the country today than we had a year ago in January. We have hospitals that are almost at full capacity with people severely ill on ventilators. We have over 100,000 children, which we've never had before, in serious condition and many on ventilators.2

Almost none of this is true, but I'm not going into details since it's already been fact checked by others3. Moreover, it's not just incorrect, but some of it is "wildly incorrect", to quote The Washington Post's Fact Checker, who gives her four Pinocchios, the maximum number.

Where did this misinformation come from? The claim that "we have over 100,000 children…in serious condition and many on ventilators" may come from the fact that there are over 100K cases of COVID-19 among children, but few of these are serious and it's doubtful that any, let alone "many", are on ventilators.

Clearly, Sotomayor is not simply ignorant; if she were, she would either have remained silent or simply said that she didn't know. Instead, she uttered egregious falsehoods. Nor was she lying: what motive would she have for doing so, especially given that the falsehoods were quickly debunked by fact checkers? What's left? She genuinely believed factual claims that were not just false, but outrageous.

Sotomayor is not the only offender on the court; here's Associate Justice Stephen Breyer during the same hearing:

I mean, you know, 750 million new cases yesterday or close to that is a lot. I don't mean to be facetious.2

That is a lot: a lot more than it is. Presumably, Breyer meant to say "thousand" instead of "million"―he got the number correct earlier and later. A thousand, a million, a billion―what's the difference? It's a lot! And I do mean to be facetious. Breyer's innumerate claim is not as bad as Sotomayor's because it appears to have been a slip. Still, to confuse thousands and millions and not be corrected by anyone present seems to show a lack of number sense, that is, a feel for the relative sizes of different numbers.
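To see just how far off the slip was, here's a minimal back-of-the-envelope check, sketched in Python; the population figure of roughly 330 million is my assumption, not something in the transcript:

    # Number-sense check: could there have been 750 million new U.S. cases in one day?
    us_population = 330_000_000       # roughly a third of a billion people (assumed)
    claimed_new_cases = 750_000_000   # Breyer's figure, taken literally
    intended_new_cases = 750_000      # the figure he presumably meant

    print(claimed_new_cases / us_population)    # ~2.3 cases per resident in a single day: impossible
    print(intended_new_cases / us_population)   # ~0.002, about 1 in 440 residents: a lot, but possible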

Nonetheless, this degree of innumeracy and pseudoknowledge at the highest levels of American government is alarming. I don't know why neither counsel nor the other justices challenged these falsehoods. Was it because they didn't realize that the claims were false, or were they afraid that challenging them would publicly embarrass Sotomayor or Breyer?

For almost two years now we've had a pandemic of misinformation about the coronavirus and its variants, despite―or rather because of―the constant coverage from the major news media. As a result, pseudoknowledge about COVID-19 reaches the highest levels of the U.S. government. Nine people will make a decision about how the government handles this disease, one that will affect the lives of a third of a billion Americans. Given that the court is being asked to adjudicate a case involving COVID-19, you might expect the justices to do their homework and be well-informed about it. Instead, they're well-misinformed.

I've been pointing out for the last year and a half that public health officials and the major news media have been doing a terrible job of informing the American people about the coronavirus and its variants4. What little survey evidence I've seen indicates that the effect has been largely to grossly misinform the general public. The current incident, as minor as it is in other ways, shows that the misinformation goes to the highest levels of the U.S. government and to our "best and brightest".

I don't know the solution to this problem, but the first step in solving a problem is to recognize that you've got one. Houston, we have a problem!


Notes:

  1. What is pseudoknowledge? It is a false belief that masquerades as knowledge.
  2. "Nat. Fed'n of Indep. Bus. v. Dept. of Labor", Supreme Court of the United States, 1/7/2022
  3. I've put the following in alphabetical order by first author's last name, but if you're in a hurry, read the Annenberg first or only:
  4. See:

January 7th, 2022 (Permalink)

Credibility Checking, Part 1: Compare & Contrast

What is credibility checking? It's a process that lies halfway between simply questioning a factual claim and giving it a complete factual check-up. Since a thorough fact check takes time and effort, a credibility check is a way to test whether a claim is likely to be worth the investment. Unlike a fact check, a credibility check uses only two things: what you already know and ballpark estimation. As a result, checking credibility usually takes no more than a few minutes.

What is credibility?

The following words are synonyms: credible, plausible, and believable. I will use all three interchangeably, mainly for the sake of variety, but I call the technique I'm describing "credibility checking." All three words can be used to describe both claims and the people who make them. A credible witness is one whose testimony is plausible, and plausible testimony enhances the believability of a witness. In the following, I will be examining only the credibility of claims.

Literally, a claim is credible, plausible, or believable, if it is able to be believed. However, perhaps the best way to understand it is by contrast with its antonym, namely, "incredible". Often, the word "incredible" is used in the sense of "surprisingly good" or just "very good", but I mean it in its literal sense of "unbelievable" or "implausible". A claim is incredible if there is good reason in logic or common knowledge to think that it cannot be true.

For instance, the claim that 7 × 13 = 27 is incredible1. If I multiplied and got that result, I would be sure that I had made a mistake. A feeling for what is credible and incredible in mathematics is called "number sense", which is an aspect of numeracy. Number sense is an important skill because mistakes in calculation are unavoidable, even if you use a calculator, since a misplaced decimal point or an incorrect leading digit will cause a miscalculation. A seemingly small error in data entry can cause an enormous mistake in the result of a calculation, and catching such mistakes requires at least an order-of-magnitude (OoM) estimate of what the result should be.
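One way to make the idea concrete is to compare a claimed result against a rough estimate and flag anything wildly out of line. The following Python sketch is only an illustration; the factor-of-two tolerance is an arbitrary assumption:

    def looks_plausible(claimed, rough_estimate, tolerance=2.0):
        """Return True if the claimed result is within a factor of
        `tolerance` of the rough estimate, in either direction."""
        ratio = claimed / rough_estimate
        return 1 / tolerance <= ratio <= tolerance

    # 7 * 13 is roughly 7 * 10 = 70, so a result of 27 fails the smell test:
    print(looks_plausible(27, 70))   # False: recheck the arithmetic
    print(looks_plausible(91, 70))   # True: consistent with the estimate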

A claim is plausible if it is not incredible, that is, if there's no prima facie reason to think that it cannot be true. That a claim is incredible does not automatically mean that it's false, since some seemingly incredible things have turned out to be true. However, an incredible claim requires very strong evidence to establish its truth. For instance, if my great aunt Bertha claimed to have a mermaid in her backyard swimming pool, I would have to see it in person to believe it. "Extraordinary claims require extraordinary evidence", as Carl Sagan liked to say2, and incredible claims are extraordinary. Similarly, that a claim is credible is no reason by itself to think that it is true, since many credible claims are false. For example, it's plausible that I have a great aunt Bertha, but I don't.

Healthy Skepticism

A healthy skepticism is a prerequisite for plausibility checking. What is a healthy skepticism? It's a golden mean3 between gullibility and cynicism. When confronted with a factual claim, the gullible person will accept it without question, at least if it fits in with that person's prejudices. In contrast, a cynic will reject the claim out of hand without bothering to check it, especially if it conflicts with the cynic's prejudices. Confronting the same claim, the healthy skeptic will question it, but not reject it without further examination. The skeptic asks: "Is that right? What's the evidence for it?" Moreover, unlike the unhealthy cynic, the skeptic is willing to accept the claim if sufficient evidence is found for it.

Given that your skepticism is healthy, the first thing you will do is a credibility check. Since you're confronted with hundreds, or even thousands, of new factual claims every day, there's no way that you can fact check them all. At best, you may be able to check a few claims each day. So, how do you decide which to check? Credibility checking.

An Example

In 1983, Senator Orrin Hatch made the following claim: "…[L]ast year the Federal Bureau of Investigation [FBI] approximated the number of children abducted by strangers to be 50,000.4" This claim is backed by impressive credentials: it comes from a United States Senator who attributes it to the FBI, a government agency that collects crime statistics among other things. So, you'd expect it to be right, and you may understandably be inclined to accept it without checking. However, Hatch was a politician who may have had political motives for making the claim, and there's always the possibility of a mistake, such as an accidentally added zero or a misunderstanding of what the number quantified5. For this reason, it's a good idea to give it a quick credibility check. But how do you do that?

Use What You Know

You probably know more about many things than you think you do, but unless you are an expert in historical crime statistics, it's unlikely that you'll know whether 50,000 children were kidnapped by strangers in the U.S. in 1982. However, you can use what you do know to estimate what you don't.

For instance, do you have some idea of how many people die in automobile accidents every year in the U.S.? Is it hundreds? Is it thousands? Tens of thousands? Hundreds of thousands? You don't need to know the exact number; all you need is an estimate of its OoM, that is, whether it is in the tens, hundreds, thousands, etc. If you haven't already done so, mentally estimate the number of deaths from car accidents in a typical year.

I don't know what you estimated, but here's one way to think it through. We'll estimate both a minimum and a maximum OoM; the number we want should fall somewhere in between.

Judging from experience and media reports, the number of fatalities in my own state―which is a typical state―must be at least in the hundreds, and maybe as much as a thousand per year. So, nationwide the number will be at least in the thousands, and probably the tens of thousands, given that there are fifty states. For this reason, let's set the minimum at tens of thousands.

Could the number be as high as the millions? Surely not; there would be much more concern about road safety than there is now if it were. What about hundreds of thousands? This is not so clear, but consider the fact that COVID-19 killed roughly 400,000 people both last year and the year before6. Surely, if the number of traffic deaths was near this or greater, we would be hearing more about it than we do. So, let's set the maximum at tens of thousands.

This means that both our minimum and maximum are in the tens of thousands, that is, at least 10K and at most 100K. How close is this estimate to your own? More importantly, how close is it to the actual statistics on deaths from automobile accidents?
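Before looking at the actual figures, here is the bracketing argument written out as a small Python sketch; every figure in it is a rough assumption drawn from the reasoning above, not a statistic:

    # Bracketing annual U.S. traffic deaths by order of magnitude.
    states = 50
    per_state_low, per_state_high = 100, 1_000   # "at least in the hundreds, maybe as much as a thousand"

    low_total = states * per_state_low           # 5,000: in the thousands
    high_total = states * per_state_high         # 50,000: in the tens of thousands

    # The maximum is capped by comparison: COVID-19 killed roughly 400,000 a year (note 6),
    # and traffic deaths draw far less attention, so cap the estimate at 100,000.
    minimum, maximum = 10_000, 100_000
    print(low_total, high_total, minimum, maximum)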

Nowadays, the number of automobile fatalities tends to range between 30K and 35K a year; for instance, it was over 32K in 2013, according to the C.D.C.7. In the late twentieth century, it tended to be somewhat higher, usually running between 40 and 50 thousand8. So, this was a good ballpark estimate: while it's not precise, it's at the right OoM, and that's all we need.

Compare & Contrast

One way to check the credibility of what you don't know is to compare it to what you do know. You don't know how many American children were abducted by strangers in 1982, but you now know approximately how many people of all ages died in traffic accidents in that year.

The number of traffic fatalities that year was in the tens of thousands, which means that the claimed number of child abductions is of the same OoM. How does this affect its credibility? Have you ever noticed that every major car accident in your area is reported by the local news media? Wouldn't the news media report every missing child and give it even more attention than a car accident, even a fatal one? How many such stories have you noticed in your local press?
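The comparison itself can be sketched in the same way; the traffic-death figure below is just a stand-in for the 40 to 50 thousand range cited above:

    from math import floor, log10

    def oom(n):
        """Order of magnitude: 50,000 -> 4, i.e. tens of thousands."""
        return floor(log10(n))

    claimed_abductions = 50_000   # Hatch's figure for 1982
    traffic_deaths = 45_000       # stand-in for the 40-50 thousand annual deaths back then

    print(oom(claimed_abductions), oom(traffic_deaths))   # 4 4: the same order of magnitude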

Do you know anyone who was killed in an automobile accident? Have you had any friends or family members who died in this way? Was anyone you went to school with, or worked with, a traffic fatality? Compare this with kidnapping by strangers. Was any friend of yours, family member, fellow student, or co-worker kidnapped by a stranger as a child?

Do you remember AMBER alerts9? These are alerts broadcast to the community when a child goes missing, and they're still a thing. Now, not every missing child has been kidnapped by a stranger, but every child abducted by a stranger will be missed at some point by parents or other adults, so there ought to be an alert. How many AMBER alerts did you notice in the past year?

Conclusion

This credibility check suggests that the claim that 50,000 children were abducted by strangers in 1982 is implausibly high, but it does not prove that it's wrong. However, that's not the purpose of a credibility check; rather, the failure of the claim to pass this check is a reason to go ahead with a full fact check.

Next Month in Part 2: Divide & Conquer


Notes:

  1. See: "Abbott & Costello 7 × 13 = 27", YouTube, 9/6/2015.
  2. Carl Sagan, Broca's Brain: Reflections on the Romance of Science (1980) p. 73. Sagan adapted this saying from one of Marcello Truzzi's; see: Quote…Unquote.
  3. The "golden mean" is the name given by the Roman poet Horace to Aristotle's doctrine that virtue is a mean between two extremes, both of which are vices; for instance, bravery is a mean between rashness and cowardice. See: Antony Flew, A Dictionary of Philosophy (Revised second edition, 1984).
  4. Senator Orrin G. Hatch, "Statement on Child Kidnapping & Victimization", in "Child Kidnaping. Oversight Hearing Inquiry into the Priorities and Practices of the FBI in Child Kidnaping Cases before the Subcommittee on Juvenile Justice of the Committee on the Judiciary, United States Senate, Ninety-Eighth Congress, First Session", 2/2/1983, p. 14.
  5. Another possible source of confusion comes from the fact that the scope of "last year" is ambiguous: is it last year that the 50,000 children were abducted, or last year that the F.B.I. made the claim? If the latter, then what was the time frame of the statistic? Was it also last year, or some longer period of time? In the text above, I assume that 50,000 children were supposed to have been abducted in 1982, that is, "last year".
  6. Julie Bosman, Amy Harmon, Albert Sun, Chloe Reynolds & Sarah Cahalan, "Covid deaths in the United States surpass 800,000.", The New York Times, 12/15/2021.
  7. "Motor Vehicle Crash Deaths", Centers for Disease Control and Prevention, 7/6/2016.
  8. "Motor Vehicle Safety Data", Bureau of Transportation Statistics, accessed: 1/7/2022.
  9. "About AMBER Alert", United States Department of Justice, 10/20/2019.
