Thinking Errors
We would all benefit from thinking more about how we think.

Everybody makes thinking errors. But two things have happened recently that prompt me to write on this subject. One is politics: in the U.S., it seems that almost everything has become politicized. Politicians, of whatever stripe, seemingly can't speak for more than 30 seconds without making a thinking error. Sometimes it is deliberate; sometimes they are just clueless.

The other prompt comes from the new approach to science-education standards promoted by "A Framework for K-12 Science Education" and the "Next Generation Science Standards" (NRC 2012, Achieve, Inc. 2013). Practicing scientists know that the really important part of "doing" science is creative and critical thinking, and it is refreshing to see science-education policy makers trying to make such thinking a more prominent feature of K-12 science education. The new standards require students to perform such thinking tasks as "define problems, plan investigations, analyze and interpret data, construct explanations, engage in argument from evidence, and communicate information." These can be demanding challenges for students, who are not trained in formal logic and have probably had little explicit instruction in how to think. Teachers may not be ready for this new approach to teaching science, because teacher-education programs typically do not include formal logic courses.

Thinking errors in science also apply to ordinary life. So, in the spirit of my own 50 years of experience as a researcher, let me summarize some of the more common and important thinking errors that even my colleagues and I sometimes make. I hope this listing will help curb the sloppy thinking of politicians, help teachers recognize weaknesses in student argumentation, and help students learn to recognize and avoid such errors. All of us could benefit from thinking more about how we think.

Common Thinking Errors

AD HOMINEM ARGUMENT: discounting a position or conclusion on the basis of the person who makes it, rather than on the merits of the argument itself.

ALL-OR-NOTHING THINKING: thinking of things in absolute terms, like "always," "all," or "never."

ANTHROPOMORPHISM: attributing qualities and properties that only people can have to non-people. Example: "the purpose of evolution is to ...." Evolution happens, but not because it has a purpose. Only people have purposes.

APPEAL TO AUTHORITY: attempting to justify a conclusion by quoting an authority in its support. Even experts can be wrong. Conclusions should be affirmed by evidence.

APPEAL TO CONSENSUS: arguments defended on the basis that many people hold the same view. This is sometimes called the "Bandwagon Fallacy." Correctness of a position does not depend on who or how many hold it.

APPEAL TO FINAL CONSEQUENCES: claiming validity for one's position on the basis of its expected outcome or consequence (also known as a teleological argument). Example: people must have free will because otherwise they could not be held responsible for bad behavior.

APPEAL TO IGNORANCE: using an opponent's inability to disprove a conclusion as proof of the conclusion's correctness. Sometimes wrong ideas are so entrenched, or so hard to disprove, that people of special ability are needed to make the case against them.

APPEAL TO LACK OF EVIDENCE: evidence is needed to affirm a concept, but a lack of evidence may simply mean that the relevant evidence has not yet been discovered. Absence of evidence is not evidence of absence.

ARGUMENT SELECTIVITY: using arguments supporting your position while glossing over the weaknesses and leaving out important alternative arguments. This is often called "cherry picking." A variation of this error is "false dichotomy," where a set of valid possibilities is reduced to only two. A related inappropriate selectivity is rejecting an idea altogether just because some part of it is wrong.

BEGGING THE QUESTION: an argument that simply reasserts the conclusion in another form. This usually occurs when there is a lack of good evidence.

BIASED LABELING: how one labels a position can prejudice objective consideration of it. For example, calling a position "science-based" does not necessarily make it true. Conversely, calling a position "colloquial" does not necessarily invalidate it.

CIRCULAR REASONING: reasoning where a belief in a central claim is both the starting point and the goal of the argument.

CONFIRMATION BIAS: People have a natural tendency to notice only the facts that support their position while discounting those that do not; in other words, they believe what they want to believe.

CONFUSING CORRELATION WITH CAUSATION: When two things happen together, especially when one occurs just before the other, students commonly assume that one causes the other. Without other, more direct evidence of causation, this assumption is invalid; both events could be caused by something else. In case students need convincing, just remind them of this example: rain and lightning go together, but neither causes the other. The simulation sketch below makes the same point with numbers.
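
To see how a hidden common cause produces correlation without causation, here is a minimal simulation sketch in Python (the variable names and noise level are illustrative assumptions, not meteorological data): two quantities driven by the same underlying "storm intensity" come out strongly correlated even though neither influences the other.

    # Two effects of a common cause are correlated even though
    # neither causes the other (illustrative toy model).
    import random

    random.seed(42)

    storm = [random.random() for _ in range(10_000)]  # hidden common cause
    rain      = [s + 0.1 * random.gauss(0, 1) for s in storm]
    lightning = [s + 0.1 * random.gauss(0, 1) for s in storm]

    def correlation(xs, ys):
        # Pearson correlation coefficient, computed from scratch.
        n = len(xs)
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
        vx = sum((x - mx) ** 2 for x in xs)
        vy = sum((y - my) ** 2 for y in ys)
        return cov / (vx * vy) ** 0.5

    # Strong correlation (near 0.9), yet no causal link between the two:
    print(f"corr(rain, lightning) = {correlation(rain, lightning):.2f}")

A student shown only the rain and lightning columns might conclude that one causes the other; only the hidden "storm" variable explains the association.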

CONFUSING FORCE OF ARGUMENT WITH ITS VALIDITY: repeating an erroneous argument does not validate it. Saying it more elegantly or more loudly doesn't help either.

DEDUCTION FALLACIES: in a valid deductive argument, the conclusion follows necessarily from the premises; if the premises are true, the conclusion cannot be false. Arguments that fail this test produce "non sequiturs," that is, conclusions that are not logical extensions of the premises. A small test of this idea appears in the sketch below.
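
As a concrete illustration, one can check an argument form for validity by brute force over truth assignments. The following Python sketch (the function names are illustrative, not from any standard library) shows that modus ponens passes the test while "affirming the consequent," a classic non sequitur, fails.

    # Brute-force truth tables: an argument form is deductively valid only if
    # no truth assignment makes all premises true and the conclusion false.
    from itertools import product

    def is_valid(premises, conclusion):
        for p, q in product([True, False], repeat=2):
            if all(prem(p, q) for prem in premises) and not conclusion(p, q):
                return False  # counterexample found: a non sequitur
        return True

    implies = lambda a, b: (not a) or b

    # Modus ponens ("if p then q; p; therefore q") is valid:
    print(is_valid([lambda p, q: implies(p, q), lambda p, q: p],
                   lambda p, q: q))  # True

    # Affirming the consequent ("if p then q; q; therefore p") is not:
    print(is_valid([lambda p, q: implies(p, q), lambda p, q: q],
                   lambda p, q: p))  # False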

EMOTIONAL REASONING: Making decisions and arguments based on how you feel rather than on objective reality. People who get caught up in emotional reasoning can become blind to the difference between feelings and facts. For example, scientists sometimes unduly value a position because it is "parsimonious," or elegant, or easily understood (or even because it is complex and sophisticated).

EXCLUSIVITY CONFUSION: When several ideas or apparent facts are examined, it is important to know whether they are independent, compatible, or mutually exclusive. Example: the concepts of evolution and creationism, as they are typically used, are mutually exclusive. However, stated in other ways, they might be more compatible.

FALSE ANALOGY: explaining an idea with an analogy that is not parallel, as in comparing apples and oranges.

JUMPING TO CONCLUSIONS: This error occurs in a variety of situations. The most common cause is failure to consider alternatives. An associated cause is failure to question and test the assumptions used to arrive at a conclusion.

MAGNIFICATION & MINIMIZATION: exaggerating negatives and understating positives, or the reverse. Be aware of how easy it is to exaggerate the positives of a favored position and understate its negatives.

MISSING THE POINT: Sometimes this happens unintentionally. But frequently, recognizing that one's argument is weak creates the temptation to shift focus away from the central issue to related areas where one can make a stronger argument.

NOT LISTENING: Have a clear notion of the issue and the stance that others are taking. If you have to read another's mind or "read between the lines," seek clarification lest you end up putting your words in somebody else's mouth.

OVER-GENERALIZATION: It is illogical to assume that what is true for one case is true for all. Example: some scientists studying free will claim that the decision-making process for a simple button press is the same as for more complex decisions.

UNSUPPORTED ASSERTION: Recognize when a claim is made without supporting evidence. This error also occurs when one mistakes a judgment or opinion for a fact.

References

National Research Council (NRC). 2012. A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academies Press.

Achieve, Inc. 2013. Next generation science standards. www.nextgenscience.org/next-generation-science-standards.