A Taxonomy of Fallacies

As anyone who reads this blog regularly knows, I’m a big fan of Venn diagrams. Lately, I’ve been thinking quite a lot about cognitive errors, errors in reasoning, and logical fallacies, for reasons which only coincidentally happen to coincide with the political primary season; far be it from me to suggest that the one might be in any way whatsoever connected to the other.

Anyway, I’ve put together a simple taxonomy of common fallacies. This is not, of course, an exhaustive list of fallacies; compiling such a list would surely try the patience of the most saintly. It is, however, intended to show the overlap of argumentative fallacies (arguments which by their nature and structure are invalid), logical fallacies (errors in logical reasoning), and cognitive biases (errors of human reason and our general cognitive processes).

As usual, you can clicky on the picture to embiggen it.

A quick and dirty overview of the various fallacies on this chart:

Ad Hominem: A personal attack on the person making an argument, offered in place of a rebuttal of the argument itself. “You’re such a moron! Only an idiot would think something like that.”

Loaded Question: An argument which presupposes its own answer, presupposes one of its own premises, or presupposes some unsupported assumption in the way it’s phrased. “Have you stopped beating your wife yet?”

Appeal Tu Quoque: Tu quoque literally means “you also.” It attempts to discredit an argument not on the basis of the argument’s validity, but on the basis of some perceived inconsistency or hypocrisy in the person making it. “You say that a vegetarian diet is more healthy than a diet that is rich in red meats, but I’ve seen you eat steak, so you clearly don’t even believe your own argument. Why should I?”

Guilt By Association: Also called the “association fallacy,” this is an argument which asserts that because an association exists between two things, they must belong to the same class. It can be used to discredit an argument by attacking the person making it (“Bob says that we should not eat meat; the Radical Animal Liberation Terror Front supports Bob’s argument; therefore, Bob’s argument is invalid”) or to create an association to support an assertion that cannot be supported on its own merits (“John is black; I was mugged by a black person; therefore, John cannot be trusted”).

Straw Man: An argumentative technique that ignores a person’s actual argument and instead rebuts a much weaker argument that seems related to the original argument in some way (“Bob thinks we should treat animals with respect; the idea that animals are exactly the same as people is clearly nonsense”).

False Analogy: An argumentative technique that creates an analogy between two unrelated things and then uses the analogy to attempt to make an assertion (“The government is like a business. Since the function of a business is to make money, the government should not enact policies that do not generate revenue”).

Cherry Picking: A tactic which presents only information that supports an argument and omits information that undercuts it, or which presents information out of context to make it appear to support the argument (“Vaccination causes autism. Andrew Wakefield published one paper that shows vaccination causes autism, so it must be so–even though hundreds of other experiments and published papers show no connection, and Wakefield’s paper was determined to be fraudulent and retracted”).

Just World Fallacy: The tendency to believe that the world must be just, so that when bad things happen the people who they happen to must have done something wrong to bring them about, and when good things happen, the person who they happened to must have earned them. It’s both a cognitive bias (we tend to see the world this way on an emotional level even if we consciously know better) and an argumentative tactic (for example, a defense attorney defending a rapist might say that the victim was doing something wrong by being out at night in a short dress, and therefore brought the attack upon herself). Part of what makes this so cognitively powerful is the illusion of control it brings about; when we believe that bad things happen because the people they happened to were doing something wrong, we can reassure ourselves that as long as we don’t do anything wrong, those things won’t happen to us.

Appeal to Probability: An argumentative tactic in which a person argues that because something could happen, that means it will happen. Effective in large part because the human brain is remarkably poor at understanding probability. “I might win the lottery; therefore, I simply need to play often enough and I am sure to win, which will solve all my money problems.”
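
If you want to see just how far “could” is from “will,” the arithmetic is simple: the chance of winning at least once in n independent plays is 1 − (1 − p)^n. A quick Python sketch (the per-ticket odds here are made up, but roughly lottery-sized):

```python
# Chance of winning at least once in n independent plays: 1 - (1 - p)**n
# The per-ticket odds below are hypothetical, but roughly lottery-sized.
p = 1 / 292_000_000                # chance that a single ticket wins

for n in (1, 1_000, 52 * 40):      # one ticket; a thousand; weekly for 40 years
    p_ever = 1 - (1 - p) ** n
    print(f"{n:>5} plays -> P(at least one win) = {p_ever:.9f}")
```

Even playing every week for forty years, the chance of ever winning stays around seven in a million; “could happen” never quietly becomes “will happen.”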

Fallacy of False Dichotomy: Also called the “fallacy of false choice” or the “fallacy of false dilemma,” this is an argumentative fallacy that sets up the false premise that there are only two possibilities which need to be considered when in fact there are more. “Either we cut spending on education or we rack up a huge budget deficit. We don’t want a deficit, so we have to cut spending on education.”

Fallacy of Exclusive Premises: Also called the “fallacy of illicit negative,” this is a logical and argumentative fallacy that attempts to draw a conclusion from two negative premises, which can never validly support one: “No registered Democrats are registered Independents. No registered Independents vote in a closed primary. Therefore, no registered Democrats vote in a closed primary.”

Appeal to Ignorance: Also called the “argument from ignorance,” this is a rhetorical device which asserts that an argument must be true because it hasn’t been proven to be false, or that it must be false because it hasn’t been proven to be true (“we can’t prove that there is life in the universe other than on our own planet, so it must be true that life exists only on earth”). Many arguments for the existence of a god or of supernatural forces take this form.

Affirming the Consequent: A logical fallacy which asserts that a premise must be true if a consequence of the premise is true. Formally, it takes the form “If P, then Q; Q; therefore P” (for example, “If this animal is a dog, then it has fleas; this animal has fleas; therefore, this animal is a dog”).

Denying the Antecedent: A logical fallacy which asserts that a consequent must be false because its antecedent is false. Formally, it takes the form “If P, then Q; not P; therefore, not Q.” For example: “If there is a fire in this room, there must be oxygen in the air. There is no fire in this room. Therefore, there is no oxygen in the air.”

Affirming the Disjunct: Sometimes called the “fallacy of false exclusion,” this logical fallacy asserts that if one thing or another thing might be true, and the first one is true, that must mean the second one is false. For example, “Bob could be a police officer or Bob could be a liar. Bob is a police officer; therefore, Bob is not a liar.” The fallacy asserts that exactly one or the other must be true; it ignores the fact that they might both be true or both be false. (Note that Boolean logic has an operator called “exclusive or,” or “XOR,” which does mean that one thing or the other, but not both, is true; the fallacy lies in treating an ordinary, inclusive “or” as though it were exclusive.)
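
All three of these conditional and disjunctive forms can be checked mechanically: a form is invalid if some assignment of truth values makes every premise true while the conclusion comes out false. Here’s a minimal brute-force sketch in Python (the encodings are mine, not something from the chart):

```python
from itertools import product

# An argument form is valid iff no assignment of truth values makes
# every premise true while the conclusion comes out false.
def is_valid(premises, conclusion):
    return all(conclusion(p, q)
               for p, q in product((True, False), repeat=2)
               if all(prem(p, q) for prem in premises))

implies = lambda a, b: (not a) or b  # the material conditional "if a then b"

forms = {
    "modus ponens (valid control)": ([lambda p, q: implies(p, q),
                                      lambda p, q: p],
                                     lambda p, q: q),
    "affirming the consequent":     ([lambda p, q: implies(p, q),
                                      lambda p, q: q],
                                     lambda p, q: p),
    "denying the antecedent":       ([lambda p, q: implies(p, q),
                                      lambda p, q: not p],
                                     lambda p, q: not q),
    "affirming the disjunct":       ([lambda p, q: p or q,
                                      lambda p, q: p],
                                     lambda p, q: not q),
}

for name, (premises, conclusion) in forms.items():
    verdict = "valid" if is_valid(premises, conclusion) else "INVALID"
    print(f"{name}: {verdict}")
```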

Fallacy of Illicit Affirmative: This is the flip side of the fallacy of exclusive premises. It draws a negative conclusion from two affirmative premises. “All true Americans are patriots; some patriots are willing to fight for their country; therefore, there must be some true Americans who aren’t willing to fight for their country.”

Fallacy of Undistributed Middle: A logical fallacy which reasons that all X are Y; some thing is a Y; therefore, that thing is an X. For example, “All Southern Baptists are Christians; Bob is a Christian; therefore, Bob is a Southern Baptist.” This fallacy ignores the fact that “all X are Y” does not imply that all Y must be X.
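
The same brute-force idea works for syllogistic forms like this one (and for the exclusive-premises and illicit-affirmative fallacies above): enumerate every way the classes could be populated over a tiny domain and hunt for a world where the premises hold but the conclusion fails. A minimal sketch, using my own encoding:

```python
from itertools import product

# Undistributed middle: "All X are Y; z is a Y; therefore z is an X."
# Search a two-element domain for a counterexample: premises true,
# conclusion false.
DOMAIN = (0, 1)
subsets = [frozenset(s) for s in [(), (0,), (1,), (0, 1)]]

X, Y, z = next(
    (X, Y, z)
    for X, Y in product(subsets, repeat=2)
    for z in DOMAIN
    if X <= Y and z in Y      # premises: all X are Y; z is a Y
    and z not in X            # ...but the conclusion "z is an X" fails
)
print(f"counterexample: X={set(X)}, Y={set(Y)}, z={z}")
```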

Base Rate Fallacy: A logical fallacy that involves failing to apply general information about some statistical probability (the “base rate” of something being true) to a specific example or case. For example, suppose you are told that HIV is three times more prevalent among homosexuals than heterosexuals, and that homosexuals make up 10% of the population. Told that “Bob has HIV,” most people will erroneously conclude that Bob is quite likely gay, because they consider only the fact that gays are more likely to have HIV and not the “base rate” fact that gays make up a relatively small percentage of the population. This fallacy is extremely easy to make because the human brain is so poor at understanding statistics and probability.
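
The correct reasoning is just Bayes’ theorem. A quick sketch using the figures above, plus a made-up absolute prevalence (it cancels out; only the 3:1 ratio matters):

```python
# Bayes' theorem with the figures above: 10% of the population is gay,
# and HIV is three times more prevalent among gay people. The absolute
# rate below is a hypothetical placeholder; it cancels out of the result.
p_gay, p_straight = 0.10, 0.90
rate_straight = 0.001
rate_gay = 3 * rate_straight          # "three times more prevalent"

p_hiv = p_gay * rate_gay + p_straight * rate_straight
p_gay_given_hiv = p_gay * rate_gay / p_hiv
print(f"P(gay | HIV) = {p_gay_given_hiv:.2f}")   # 0.25
```

Despite the threefold prevalence, the answer is only 25%: the small base rate dominates.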

Post Hoc Ergo Propter Hoc: This is Latin for “after this, therefore because of this.” Sometimes called the “fallacy of false cause,” it’s a logical fallacy which asserts that if one thing happens and then something else happens, the first thing caused the second thing to happen (“My child had a measles vaccine; my child was diagnosed with autism; therefore, the vaccine caused the autism”). Our brains are highly tuned to find patterns and to seek causation, to the point where we often see it even when it does not exist.

Regression Bias: This is a fallacy that’s closely related to the post hoc, ergo propter hoc fallacy in that it ascribes a false cause to an event. In this case, quantities which fluctuate statistically tend to return to a mean, and a person may see a cause in that regression to the mean even where none exists. For example, “Bob had an amazing string of successes when he was playing basketball. Then he appeared on the cover of Sports Illustrated. Afterward, his performance was more mediocre. Therefore, appearing on the cover of the magazine must have caused him to perform more poorly.” Since even good athletes will generally return to their baseline after a particularly exceptional (or particularly poor) stretch of performance, appearing on the cover of the magazine is likely to be unconnected with the athlete’s return to his normal baseline.
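
Regression to the mean is easy to see in a simulation. A sketch with arbitrary skill and luck numbers (mine, not from any real data):

```python
import random

random.seed(1)   # make the run reproducible

# Each "athlete" has a fixed skill baseline; a single game's score is
# baseline plus luck. Pick the top scorers from game 1 and watch them
# "decline" in game 2 -- no magazine cover required.
skills = [random.gauss(50, 5) for _ in range(1000)]
game1 = [s + random.gauss(0, 10) for s in skills]
game2 = [s + random.gauss(0, 10) for s in skills]

top = sorted(range(1000), key=game1.__getitem__, reverse=True)[:50]
mean = lambda xs: sum(xs) / len(xs)
print(f"top 50 in game 1: {mean([game1[i] for i in top]):.1f}")
print(f"same 50 in game 2: {mean([game2[i] for i in top]):.1f}")
```

The game-1 leaders were mostly athletes who got lucky that day; in game 2 their scores drift back toward their baselines with no cause involved at all.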

Argumentum Ad Nauseam: A rhetorical strategy in which a person continues to repeat something as true over and over again, even after it has been shown to be false. Some radio commentators are particularly prone to doing this: “Sandra Fluke wants the taxpayers to pay for contraception. She argues that it is the responsibility of the taxpayer to pay for her contraception. Sandra Fluke believes that contraception should be paid for by the taxpayer.”

Argument from Scripture: An argument which states that if some element in a source being cited is true, then the entire source must be true. This fallacy does not apply exclusively to holy texts or Biblical scriptures, though it is very often committed in religious arguments.

Begging the Question: Similar to the loaded question fallacy, this is an argument which assumes the very conclusion it sets out to prove. Formally, it is an argument in which the conclusion the argument claims to demonstrate appears among the argument’s premises. “We know that God exists because we see in nature examples of God’s design.” The premise of this argument assumes that nature is designed by God, which is the conclusion that the argument claims to support.

Circular Argument: This argumentative tactic is related to begging the question, and a lot of folks consider the two to be the same thing. They differ slightly: begging the question contains the conclusion of an argument among its premises, whereas circular reasoning uses argument A to prove argument B and then, having “proven” argument B, uses argument B to prove argument A.

Appeal to Emotion, Force, or Threat: An argumentative tactic in which, rather than supplying evidence to show that an argument is correct, the person making the argument attempts to manipulate the audience’s emotions (“You must find Bob guilty of this murder. If you do not find him guilty, then you will set a dangerous murderer free to prey on your children”).

False Attribution: An argument in which a person attempts to make a position sound more credible either by attributing it to a well-known or respected source, or using a well-known and respected source’s comments out of context so as to create a false impression that that source supports the argument. As Abraham Lincoln said, more than 90% of the quotes used to support arguments on the Internet can’t be trusted!

Association Fallacy: A generalized form of the fallacy of guilt by association, an association fallacy is any argument asserting that some irrelevant similarity between two things demonstrates that those two things are related. “Bob is good at crossword puzzles. Bob also likes puns. Therefore, we can expect that Jane, who is also good at crossword puzzles, must like puns too.” Because our brains are efficient at categorizing things into groups, we are often prone to believing that categorizations are valid even when they are not.

Vividness Fallacy: Also called the “fallacy of misleading vividness,” this is the tendency to believe that especially vivid, dramatic, or exceptional events are more relevant or more statistically common than they actually are, and to pay special attention or attach special weight to such vivid, dramatic events when evaluating arguments. A common rhetorical strategy is to use vivid examples to create the impression that something is commonplace when it is not: “In New Jersey, a Vietnam veteran was assaulted in a bar. In Vermont, an Iraq War veteran was mugged at knifepoint. American citizens hate veterans!” It is effective because of a cognitive bias called the “availability heuristic,” which causes us to misjudge the statistical importance of an event if we can easily think of examples of that event.

Entrenchment Effect: Also called the “backfire effect,” this is the tendency of people, when presented with evidence that disproves something they think is true, to form an even greater attachment to the idea that it must be true. I’ve written an essay about framing and entrenchment here.

Sunk Cost Fallacy: An error in reasoning or argument which holds that if a certain investment has been made in some course of action, then the proper thing to do is continue on that course of action so as not to waste that investment, even in the face of evidence showing that course of action to be unlikely to succeed. In rhetoric, people will often defend a tenuous position on the basis of sunk cost rather than on the merits of the position: “We should continue to invest in this weapons project even though the engineers say it is unlikely to work, because we have already spent billions of dollars on it, and you don’t want that money to be wasted, do you?” These arguments often succeed because people form emotional attachments to positions they feel invested in, attachments that are completely detached from the value of the positions themselves.

Appeal to Authority: Also known as the argument from authority, this is an argument that claims that something must be true on the basis that a person who is generally respected or revered says it is true, rather than on the strength of the arguments supporting that thing. As social animals, we tend to give disproportionate weight to arguments which come from sources we like, respect, or admire.

Black Swan Effect: Also called the black swan fallacy, this is the tendency to discount or discredit information or evidence which falls outside a person’s particular range of experience or knowledge. It can take the form of “I have never seen an example of X; therefore, X does not exist;” or it can take a more subtle form (called the “confirmation fallacy”) in which a statement is held to be true because no counterexamples have been demonstrated (“I believe that black swans do not exist. Here is a swan. It is white. Here is another swan. It is also white. I have examined millions of swans, and they have all been white; with all these examples that support the idea that black swans do not exist, it must be a very reliable statement!”).

Confirmation Bias: The tendency to notice, remember, and/or give particular weight to things that fit our pre-existing beliefs; and to not notice, not remember, and/or not give weight to anything that contradicts our pre-existing beliefs. The more strongly we believe something, the more we notice and the more clearly we remember things which support that belief, and the less we notice things which contradict that belief. This is one of the most powerful of all cognitive biases.

Attention Bias: A cognitive bias in which we tend to pay particular attention to things which have some sort of emotional or cognitive resonance, and to ignore data which are relevant but which don’t have that resonance. For example, people may make decisions based on information which causes them to feel fear but ignore information that does not provoke an emotional response; a person who believes “Muslims are terrorists” may become hyperaware of perceived threatening behavior from someone he knows to be Muslim, especially when that perception reinforces his belief that Muslims are terrorists, and ignore evidence which indicates that that person is not a threat.

Choice Supportive Bias: The tendency, when remembering a choice or explaining why one made it, to believe that the choice was better than it actually was, or that the other options were worse than they actually were. For example, when choosing one of two job offers, a person may describe the job she chose as being clearly superior to the job she did not accept, even when both offers were essentially identical.

Expectation Bias: Also sometimes called “experimenter’s bias,” this is the tendency of people to put greater trust or credence in experimental results which confirm their expectations than in results which don’t match the expectations. It also shows in the tendency of people to accept without question evidence which is offered up that tends to support their ideas, but to question, challenge, doubt, or dismiss evidence which contradicts their beliefs or expectations.

Pareidolia: The tendency to see patterns, such as faces or words, in random stimuli. Examples include people who claim to see the face of Jesus in a piece of toast, or who hear Satanic messages in music albums that are played backwards.

Rhyming Effect: The tendency of people to find statements more credible if they rhyme than if they don’t. Yes, this is a real, demonstrated cognitive bias. “If the glove don’t fit, you must acquit!”

Framing Effect: The tendency to evaluate evidence or to make choices differently depending on how the evidence or choice is framed. I’ve written an essay about framing and entrenchment here.

Ambiguity Effect: The tendency of people to prefer a course of action in which the probability of a positive outcome is known exactly over one in which the probability is unknown, even when the unknown probability is likely to be about the same, or the possible payoff is better. There’s an interactive demonstration of this effect here.
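
The classic demonstration is Ellsberg’s two-urn experiment. A small sketch, assuming a uniform prior over the unknown urn’s composition (my encoding):

```python
from fractions import Fraction

# Urn A: 50 red and 50 black balls, so P(red) is known to be exactly 1/2.
# Urn B: 100 balls in an unknown red/black mix. Averaging over all 101
# possible compositions (a uniform prior), P(red) is still exactly 1/2.
p_known = Fraction(1, 2)
p_ambiguous = sum(Fraction(r, 100) for r in range(101)) / 101

print(f"P(red | known urn):     {p_known}")      # 1/2
print(f"P(red | ambiguous urn): {p_ambiguous}")  # 1/2
# Identical odds, yet most people strongly prefer betting on urn A.
```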

Fortunetelling: The tendency to make predictions about the outcome of a choice, and then assume that the prediction is true, and use the prediction as a premise in arguments to support that choice.

44 thoughts on “A Taxonomy of Fallacies”

  1. I would suggest a different example of the ‘sunk cost fallacy’: the often-cited-in-wars ‘We must continue so that they didn’t die in vain!’… for which you would get a lot of ‘but, (insert your favorite other false argument)’

  2. I will freely admit that I was working on this at about 1 AM and it was getting harder and harder to come up with examples for the fallacies. 🙂 Some of them do need better examples. I know I’ve heard folks make the fallacy of illicit affirmative in real-world arguments, but for the life of me I couldn’t think of an example. (If you’ve got one, I’m all ears!)

    The Wakefield example, unfortunately, I hear all…the…time. In the last year or two I’ve probably heard five or six folks claim that Wakefield’s paper proves that there is a link between vaccination and autism, and then express absolute confidence that every other published study is unreliable because they are all part of a coverup. I’ve even heard it argued that Wakefield’s fraud conviction and the retraction of his paper are part of that orchestrated coverup. So it’s an extreme example, but it still reflects my experience in real-world arguments.

    If I recast this as a poster, I’ll be able to get rid of the lines completely and list the names of the fallacies right inside the diagram. I couldn’t do that here and have it be readable on the Web.

  3. Great list!

    A note about the ad hominem fallacy. It is more than a personal attack; it is an attack against an argument made by calling attention to some characteristic of the person making it. As I once posted to a climate change forum:

    Saying that the Heartland Institute is a corporate front group is not an ad hominem attack; it is a statement of fact.

    Saying that the Heartland Institute’s claims that global warming is a myth are wrong because it is a corporate front group is an ad hominem attack.

    However, saying that the Heartland Institute’s claims that global warming is a myth are wrong because they’ve been debunked many, many times by climate scientists is not an ad hominem attack; it is a statement of fact.

  4. Nicely done!

    Having grown up in Australia, I was familiar with black swans long before I encountered the Black Swan Effect. It continues to amuse me a great deal – not least because white swans are still the weird-looking ones to me!

  5. I love it, but it makes my eyes hurt. I can’t track the lines to their appropriate x-spots. I actually can’t–I try to focus in and track them back, and the lines go all wiggly before I get there. Maybe this is just some peculiar characteristic of my vision, but if you do a future edition, I wonder if there’s a better way to mark the locations of the fallacies.

  6. I would suggest color-coding the textual name of the fallacy to match the region of the Venn diagram it is in. That way, tracing the lines back is not quite as needed to figure out where things go.

  7. I’ve made a new example for the “All dogs are mammals; all mammals are animals; therefore dogs aren’t animals” that I think makes a closer real-world match.

  8. The Black Swan Effect

    There is a tension between the Black Swan Effect and at least two other logical fallacies on your list. I have seen people using the Black Swan Effect as part of an Appeal to Ignorance and Appeal to Probability.

    One way I’ve gotten around an excessive emotional attachment to my arguments is to try to have a very minimal set of principles that I think are important. Sometimes there are subsidiary principles that seem to derive from the others. But generally there are special cases in which they do not apply.

    This makes it easier to discard ideas that are demonstrably wrong. As long as the new idea doesn’t conflict with the carefully chosen set of principles I’ve decided are important there is no good reason to hold onto old ideas.

  9. I wish more people got rhetoric training. You need to post this in every LJ community, and then re-post it about once a month, until people stop doing 90% of them, at least, in every debate.

    K.
