Appeal to Misleading Authority

Alias:
  • Appeal to Authority
  • Argument from Authority
  • Argumentum ad Verecundiam
    Translation: "Argument from respect/modesty" (Latin)
  • Ipse Dixit
    Translation: "He, himself, said it" (Latin)


[I]t is not what the man of science believes that distinguishes him, but how and why he believes it. His beliefs are tentative, not dogmatic; they are based on evidence, not on authority or intuition.


Source: Bertrand Russell, A History of Western Philosophy (Book-of-the-Month Club, 1995), p. 527.


Authority A believes that P is true.
Therefore, P is true.


Cheating by the Soviets

Barry Schweid of the Associated Press, in his efforts to criticize President Reagan's space-based defense against Soviet missiles, came up with a report from some Stanford University group that claimed to find little evidence of cheating by the Soviet Union on arms-control treaties.

Where were they when Secretary of Defense Caspar Weinberger and George Shultz, secretary of state, and several members of our military forces went on TV and described and enumerated the different times and ways that the Soviet Union has cheated on the 1972 Anti-Ballistic Missile Treaty?

Does Schweid really believe that the group at Stanford is more knowledgeable about U.S. arms-control policy than all our military experts, with Congress thrown in for good measure? If I thought that was true, I wouldn't sleep much tonight. And I doubt if he would either.

Source: Middleton B. Freeman, Louisville, "Letters From Readers", The Courier-Journal, April 1, 1987.



We must often rely upon expert opinion when drawing conclusions about technical matters where we lack the time or expertise to form an informed opinion. For instance, those of us who are not physicians usually rely upon those who are when making medical decisions, and we are not wrong to do so. There are, however, four major ways in which such arguments can go wrong:

  1. An appeal to authority may be inappropriate in a couple of ways:
    • It is unnecessary. If a question can be answered by observation or calculation, an argument from authority is not needed. Since arguments from authority are weaker than more direct evidence, go look or figure it out for yourself.

      The Renaissance rebellion against the authority of Aristotle and the Bible played an important role in the scientific revolution. Aristotle was so respected in the Middle Ages that his word was taken on empirical issues that were easily decidable by observation. The scientific revolution moved away from this over-reliance on authority toward the use of observation and experiment.

      Similarly, the Bible has been invoked as an authority on empirical or mathematical questions. A particularly amusing example is the claim that the value of pi can be determined to be 3 based on certain passages in the Old Testament. The value of pi, however, is a mathematical question which can be answered by calculation, and appeal to authority is irrelevant.

    • It is impossible. About some issues there simply is no expert opinion, and an appeal to authority is bound to commit the next type of mistake. For example, many self-help books are written every year by self-proclaimed "experts" on matters for which there is no expertise.
  2. The "authority" cited is not an expert on the issue; that is, the person who supplies the opinion is either not an expert at all, or is an expert but in an unrelated area. The now-classic example is the old television commercial which began: "I'm not a doctor, but I play one on TV...." The actor then proceeded to recommend a brand of medicine.
  3. The authority is an expert, but is not disinterested. That is, the expert is biased towards one side of the issue, and his opinion is thereby untrustworthy.

    For example, suppose that a medical scientist testifies that ambient cigarette smoke does not pose a hazard to the health of non-smokers exposed to it. Suppose, further, that it turns out that the scientist is an employee of a cigarette company. Clearly, the scientist has a powerful bias in favor of the position that he is taking which calls into question his objectivity.

    There is an old saying: "A doctor who treats himself has a fool for a patient," and a similar version for attorneys: "A lawyer who defends himself has a fool for a client." Why should these be true if the doctor or lawyer is an expert on medicine or the law? The answer is that we are all biased in our own causes. A physician who tries to diagnose his own illness is more likely to make a mistake out of wishful thinking, or out of fear, than another physician would be.

  4. While the authority is an expert, his opinion is unrepresentative of expert opinion on the subject. The fact is that if one looks hard enough, it is possible to find an expert who supports virtually any position that one wishes to take. "Such is human perversity", to quote Lewis Carroll. This is a great boon for debaters, who can easily find expert opinion on their side of a question, whatever that side is, but it is confusing for those of us listening to debates and trying to form an opinion.

    Experts are human beings, after all, and human beings err, even in their area of expertise. This is one reason why it is a good idea to get a second opinion about major medical matters, and even a third if the first two disagree. While most people understand the sense behind seeking a second opinion when their life or health is at stake, they are frequently willing to accept a single, unrepresentative opinion on other matters, especially when that opinion agrees with their own bias.

    Bias (problem 3) is one source of unrepresentativeness. For instance, the opinions of cigarette company scientists tend to be unrepresentative of expert opinion on the health consequences of smoking because they are biased to minimize such consequences. For the general problem of judging the opinion of a population based upon a sample, see the Fallacy of Unrepresentative Sample.
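The pi example under point 1 really is decidable by pure calculation, with no appeal to authority needed. As a minimal, purely illustrative sketch (the Leibniz series is just one of many ways to compute pi), a few lines of arithmetic already rule out the value 3:

```python
def leibniz_pi(terms: int) -> float:
    """Approximate pi via the Leibniz series: pi/4 = 1 - 1/3 + 1/5 - 1/7 + ..."""
    return 4 * sum((-1) ** k / (2 * k + 1) for k in range(terms))

approx = leibniz_pi(1_000_000)
print(round(approx, 5))  # 3.14159 -- calculation alone settles the question
```

With a million terms the series is within a few millionths of the true value, which is more than enough precision to show that no authority, scriptural or otherwise, can make pi equal 3.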

To sum up these points in a positive manner, before relying upon expert opinion, go through the following checklist:

  • Is this a matter which I can decide without appeal to expert opinion? If the answer is "yes", then do so. If "no", go to the next question:
  • Is this a matter upon which expert opinion is available? If not, then your opinion will be as good as anyone else's. If so, proceed to the next question:
  • Is the authority an expert on the matter? If not, then why listen? If so, go on:
  • Is the authority biased towards one side? If so, the authority may be untrustworthy. At the very least, before accepting the authority's word seek a second, unbiased opinion. That is, go to the last question:
  • Is the authority's opinion representative of expert opinion? If not, then find out what the expert consensus is and rely on that. If so, then you may rationally rely upon the authority's opinion.

If an argument from authority cannot pass these five tests, then it commits the fallacy of appeal to misleading authority.
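The five-question checklist is, in effect, a short decision procedure: each question is asked in order, and the first failure determines the verdict. The function below is a hypothetical sketch of that procedure; the parameter names and verdict strings are illustrative choices of mine, not part of the checklist itself:

```python
def evaluate_appeal_to_authority(
    decidable_directly: bool,       # Q1: can I decide this without expert opinion?
    expertise_exists: bool,         # Q2: is expert opinion available on this matter?
    cited_is_expert: bool,          # Q3: is the cited authority an expert on it?
    cited_is_disinterested: bool,   # Q4: is the authority free of bias?
    opinion_is_representative: bool # Q5: does the opinion reflect expert consensus?
) -> str:
    """Walk the five-question checklist in order; the first failure decides."""
    if decidable_directly:
        return "No appeal needed: decide by observation or calculation."
    if not expertise_exists:
        return "Fallacious: no expert opinion exists on this matter."
    if not cited_is_expert:
        return "Fallacious: the cited authority is not an expert on the matter."
    if not cited_is_disinterested:
        return "Suspect: seek a second, unbiased opinion."
    if not opinion_is_representative:
        return "Suspect: find and rely on the expert consensus instead."
    return "Acceptable: you may rationally rely on this authority's opinion."

# The letter's appeal to Weinberger and Shultz stalls at the bias question:
print(evaluate_appeal_to_authority(False, True, True, False, True))
```

Note the ordering matters: a biased or unrepresentative authority is only "suspect" (prompting further checking), whereas failing the earlier questions makes the appeal fallacious outright.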


Since not all arguments from expert opinion are fallacious, some authorities on logic have taken to labelling this fallacy as "appeal to inappropriate or irrelevant or questionable authority", rather than the traditional name "appeal to authority". For the same reason, I use the name "appeal to misleading authority" to distinguish fallacious from non-fallacious arguments from authority.



Reader Response:

John Congdon writes:

A thought about the "Appeal to Misleading Authority" fallacy. In your section on this fallacy, you propose a five-point checklist for determining if an appeal to authority is appropriate. I would suggest a sixth: "Is there opinion available from this expert on this subject?". In other words, does the authority actually support the appeal, or are the authority's words being taken out of context or otherwise misunderstood?

I have seen similar checklists for evaluating arguments from authority which include such a point. However, there are two reasons why there is no such point on my list:

  1. Certainly, when evaluating an appeal to authority for cogency, the first step one should take is to verify that the authority is cited correctly. If the authority's position is either misquoted, misrepresented, or misunderstood, then the argument will be uncogent due to a false premiss. However, having a false premiss is not, in my view, a logical fault, but is a factual fault. Factual errors can be just as important as logical errors, but they are distinct types of error.
  2. Quoting out of context is a fallacy in its own right. It is uncommon to treat quoting out of context as a separate fallacy, but I do so because it is a more common error than a number of traditional fallacies. Because I treat it separately, I don't include it in the checklist for appeal to misleading authority. However, it is true that quoting out of context often occurs in appeals to authority, so it is something to watch out for.


My thanks to Dr. Gary Foulk for critiquing this entry, though I did not take all of his advice. Also, thanks to readers Stephen Beecroft, Brandon Milam, and Sarah Natividad for criticizing the Analysis of the Example, which led me to revise it. An additional thank you to Theo Clark for calling the Russell quote to my attention.

Analysis of the Example:

The example commits the fallacy of Ad Verecundiam because most of the authorities cited are not disinterested (problem 3). Weinberger and Shultz were members of Reagan's cabinet, and could be counted on to support his proposals. Similarly, members of the armed forces are not encouraged to disagree with the Commander-in-Chief, especially when the services stand to benefit from the proposal. The one exception in this letter is Congress, which was controlled by the opposition party, but this evidence is added as an afterthought and is difficult to assess. In contrast, the Stanford University group cited by Schweid was disinterested, so far as we can tell from the letter.

Some readers have suggested that the Stanford University group might be ideologically biased against a space-based missile defense. However, even if true—and there is no evidence from the letter that it is—members of the administration were likely to be ideologically biased in favor of such a system. So, ideological bias is a tie between the two sources. The problem with the administration authorities is not ideological bias, but institutional bias in favor of an employer. The Stanford University group lacks that kind of bias.

Reader Responses:

  • "The analysis author misses a few critical points, specifically the question of 'Is the authority an expert on the matter? If not, then why listen?' and 'Is the authority's opinion representative of expert opinion?,' focusing only on the fourth item on your checklist, the matter of bias. The writer in the example given specifically asks why one should consider the Stanford Group (who is not named specifically, nor their members described, unfortunately) to be more 'expert' than a few different branches of the US government. As such, he ignores the writer's argument and attempts to make it one of bias, which is all well and good assuming that one considers both Stanford and the US government equally expert on the topic of Soviet weapon practices.

    "This is one of my pet peeves considering how often one sees on TV and in print 'That man is biased, thus his word cannot be trusted even though he is a noted researcher in the field. This layman, however, is perfectly innocent, and happens to agree with me, so let's listen to him.' Particularly since it presumes that any interest begets bias enough to spin falsehoods, and that someone can do all sorts of fact finding and research without being the least interested (in the bias sense) in their theories. In other words, people argue over who is getting paid by whom, not actual facts or data."―Eric J. Hammer

    You're right that we don't know anything from the letter about the Stanford group other than that they were from Stanford and they were a group. It's possible that they were a group of Stanford students or, more likely, a politicized group of Stanford faculty who were opining outside of their areas of expertise. However, the point of the example is not that the Stanford group was correct or should be believed, but that the letter to the editor makes a poor argument against them. If the Stanford group were in fact unqualified, or were not disinterested, then the letter writer should have challenged their qualifications or disinterestedness, rather than citing other experts who were clearly not disinterested.

  • "I think this is missing from your checklist: If the military is wrong about an opinion such as 'the Soviets are cheating' bad things happen to the military (careers ruined, etc.). If the Stanford group is wrong, nothing bad happens to Stanford. So when giving an opinion the military risks a lot of prestige but Stanford risks nothing. What this boils down to is the military would be held accountable where Stanford would not. This seems like it would be a separate checklist question. Because if someone is very accountable it would make an expert opinion more important."―Mitch Hunt

    Accountability is certainly something to think about when evaluating expert opinion, especially when trying to decide between "dueling experts". That said, I don't see why accountability would be all on one side in the example. If the Stanford group were publicly shown to be wrong, then that could damage its reputation, the reputations of its individual members, as well as the reputation of Stanford University by association.

    Moreover, the accountability of the military is a two-edged sword. The military is ultimately accountable to the civilian government, specifically the executive branch, and the head of that branch is also the Commander-in-Chief. So, a military career could certainly be ruined by taking a public position in opposition to the President's stated policy. In this case, I think that the issue of accountability gives additional support to the Stanford group.