Tuesday, October 07, 2014
Analytical Thinking — Logic Errors 101
Improving learning and memory entails developing learning and thinking competencies. Chief among these competencies are logical analysis, insightfulness, and the ability to remember what has been learned. Most of my posts have dealt with improving memory. Here, I begin a two-part series on the other two basic learning competencies: critical thinking and creativity. In this post, I summarize the more common thinking errors that corrupt analysis. Most of the thinking errors below can be described as "specious," that is, superficially plausible but actually wrong.
AD HOMINEM ARGUMENT: discounting a position or conclusion on the basis of the credentials or character of the person who makes it, rather than the merits of the argument itself. Example: "Your professor has no credibility to teach cognitive neuroscience, because he is a veterinarian." Such statements not only fail to acknowledge whatever truth is in the teaching, but also fail to consider that the person may have more experience and knowledge than his or her label would imply. See "all-or-nothing thinking" below.
Two related argumentation tactics for discounting an opponent's credibility are politicization and ridicule. In politicization, the tactic is to align one's position with the majority, as if popularity conferred logical validity. With ridicule, the tactic aims to prevent serious consideration of the ridiculed position and to create a default endorsement of the ridiculer's own position.
ALL-OR-NOTHING THINKING: thinking of things in absolute terms, like “always”, “every” or “never”. Nuance is absent.
ANTHROPOMORPHISM: attributing qualities and properties to non-human things that only people can have. Example: “the purpose of evolution is to ….” Evolution happens, but not because it has a purpose.
APPEAL TO AUTHORITY: attempting to justify a conclusion by quoting an authority in its support. This also includes a greater readiness to accept ideas because they come from someone with the appropriate credentials rather than on the intrinsic merit of the ideas.
APPEAL TO CONSENSUS: defending a position on the basis that many people hold the same view. This is sometimes called the “Bandwagon Fallacy.” The correctness of a position does not depend on who holds it or how many do. See also the comments above about politicization.
APPEAL TO IGNORANCE: using an opponent’s inability to disprove a conclusion as proof of the conclusion’s correctness. Absence of evidence is not evidence of absence.
APPEAL TO FINAL CONSEQUENCES: claiming validity for one’s position on the basis of the expected outcome or consequence (also known as a "teleological" argument or an "appeal to consequences"). Example: people have free will because otherwise they can’t be held responsible for bad behavior. This is not evidence for the assertion, but merely a statement of a supposed consequence.
ARGUMENT SELECTIVITY: using arguments supporting your position while glossing over the weaknesses and leaving out important alternative arguments. This is often called “cherry picking.” A related argumentation error is to ignore the valid ideas of another while focusing on their ideas that are easier to attack. A related inappropriate selectivity is rejecting an idea altogether just because some part of it is wrong.
A variation of this error is “false dichotomy,” where a set of valid possibilities is reduced to only two.
BEGGING THE QUESTION: an argument that simply reasserts the conclusion in another form. This is a common fallacy where one tries to explain an idea or position by restating a description in a different way. The restatement is still a description, giving the illusion that this new way of stating something explains it.
BIASED LABELING: how one labels a position can prejudice objective consideration of it. For example, calling a position “science-based” does not necessarily make it true. Conversely, calling a position “colloquial” does not necessarily invalidate it.
CIRCULAR REASONING: reasoning where a belief in a central claim is both the starting point and the goal of the argument.
CLOUDING THE ISSUE (OBFUSCATION): using multiple complex ideas and data of unclear relevance to overwhelm the capacity for reason, yet giving the impression of authority and reason— in other words, "baffling them with B. S." Simple language in the service of lucid thought is the sign of superior intelligence.
COGNITIVE SHORTCUT BIAS (Einstellung). The dogged tendency to stick with a favored view or argument, ignoring other, more fruitful possibilities in the process. Even chess masters, for example, may use an established gambit when a better tactic is available.
CONFIRMATION BIAS. People have a natural tendency to notice only the facts that support their position while discounting those that do not — in other words, believing what you want to believe.
CONFUSING CORRELATION WITH CAUSATION. When two things happen together, and especially when one occurs just before the other, people commonly think that one thing causes the other. Without other more direct evidence of causation, this assumption is invalid. Both events could be caused by something else. In case people need convincing, just remind them of this example: rain and lightning go together, but neither causes the other.
CONFUSING FORCE OF ARGUMENT WITH ITS VALIDITY: repeating an erroneous argument does not validate it. Saying it louder doesn’t help either.
DEDUCTION FALLACIES: in a valid deductive argument, the conclusion must follow necessarily from the premises; if the premises are true, the conclusion must be true. Failure to maintain this consistency produces “non sequiturs,” that is, conclusions that are not logical extensions of the premises.
EMOTIONAL REASONING: Making decisions and arguments based on how you feel rather than objective reality. This is an emotional "knee jerk" kind of thinking, often the first thing that comes to mind, which often precludes or overwhelms rational analysis. This error is common in political discourse. People who allow themselves to get caught up in emotional reasoning can become completely blinded to the difference between feelings and facts. For example, scientists sometimes unduly value a position because it is “parsimonious,” or elegant, or easily understood (or even complex and sophisticated).
EXCLUSIVITY CONFUSION. When several ideas or claims are examined, it is important to know whether they are independent, compatible, or mutually exclusive. Example: the concepts of evolution and creationism, as they are typically used, are mutually exclusive. Stated in other ways, however, they might be more compatible.
FALSE ANALOGY: explaining an idea with an analogy that is not parallel, as in comparing apples and oranges.
HALO EFFECT: attributing merit to an idea or thought on the basis of the irrelevant merit of something else. For example, if you like someone, you are more likely to accept their thinking and minimize its defects. Similarly, favorable first impressions are more likely to yield positive impressions later.
INTUITIVE THINKING: relying on a "gut feeling" without fact checking. Example:
A bat and a ball together cost $1.10.
The bat costs $1.00 more than the ball.
How much does the ball cost?
Most people, even students at the most selective universities, give the wrong intuitive answer of 10 cents. The right answer is 5 cents. Do the math.
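The "do the math" step is just simple algebra, and it can be checked in a few lines. Here is a minimal sketch in Python (the variable names are mine) showing why the intuitive answer of 10 cents fails and 5 cents succeeds:

```python
# The two stated conditions:
#   ball + bat == 1.10
#   bat == ball + 1.00
# Substituting: ball + (ball + 1.00) == 1.10  ->  2*ball == 0.10
ball = 0.10 / 2          # ball costs $0.05
bat = ball + 1.00        # bat costs $1.05

print(f"ball = ${ball:.2f}, bat = ${bat:.2f}")

# Verify both conditions hold (within floating-point tolerance):
assert abs((ball + bat) - 1.10) < 1e-9
assert abs((bat - ball) - 1.00) < 1e-9

# The intuitive answer of 10 cents violates the second condition:
wrong_ball = 0.10
wrong_bat = 1.00         # only $0.90 more than the ball, not $1.00
assert abs((wrong_bat - wrong_ball) - 1.00) > 1e-9
```

With a 10-cent ball, the bat would have to cost $1.10 to be a dollar more, and the total would then be $1.20, not $1.10.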
JUMPING TO CONCLUSIONS. This error occurs under a variety of situations. The most common cause is failure to consider alternatives. An associated cause is failure to question and test assumptions used to arrive at a conclusion.
MAGNIFICATION & MINIMIZATION. Exaggerating some facts while understating others. Be aware of how easy it is for you and others to exaggerate the strengths of a position and understate its weaknesses, or the reverse.
MISSING THE POINT. Sometimes this happens unintentionally. But frequently recognition that one’s argument is weak creates the temptation to shift focus away from the central issue to related areas where one can make a stronger argument.
NOT LISTENING. Have a clear notion of the issue and the stance that others are taking. If you have to read another’s mind or “read between the lines,” seek clarification lest you end up putting your words in somebody else’s mouth.
OVER-GENERALIZATION. It is illogical to assume that what is true for one thing is true for something else. Example: some scientists studying free will claim that the decision-making process for making a button press is the same for more complex decisions.
Read reviews of my books at http://WRKlemm.com. Follow me on Twitter @wrklemm.