Psychological Dividends: On the Necessity of Critical Thinking


Knowledge can produce any change in the universe that’s compatible with its laws. — David Deutsch

Logic is a virtue only when it is maintained as a method for reasoning. Reasoning, moreover, is a process rather than an abstraction; in other words, the rigorous application of logic is not exclusive to philosophical idealism.

Knowing how to think is invariably more important than knowing what to think. Processes matter. Likewise, amid today’s onslaught of information, knowing what to discard can be as essential as knowing what to keep (e.g., scrutinizing the landscape of our minds to eradicate what neuroscientist and psychologist Dean Buonomano calls “brain bugs”).

Formal logic offers three basic modes of reasoning that are operationally independent but mutually cohesive when analyzing propositions to develop a reliable epistemological framework.

1. Inductive reasoning: Specific premise to a general conclusion.
2. Deductive reasoning: General premise to a specific conclusion.
3. Abductive reasoning: Most likely explanation given all available data.

However, within the seemingly infinite abyss of logical fallacies, and given their increasing regularity in daily conversation, there are five particular travesties of cognition that I encounter as a clinician more often than I care to document during any given session. Moreover, given today’s inauspicious trend of factual relativity and a blatant disregard for expertise, the need for intellectual vigilance has become something of a moral emergency among those still concerned with the concept of truth.

1. The fallacy of illicit transference is an informal fallacy committed when an argument assumes there is no difference between a term in the distributive sense (referring to every member of a class) and the collective sense (referring to the class itself as a whole). This fallacy manifests as two categorical errors: assuming that what is true of the part is true of the whole (composition), or that what is true of the whole is true of the part (division).

Examples: {A} This politician is corrupt; therefore all politicians are corrupt (composition). {B} This agency is known for malfeasance; therefore any employee of this agency is untrustworthy (division).

*Anomaly hunting is a common companion to this fallacy, in which an individual searches for confirmation of a belief while ignoring information that refutes it.

2. Post hoc, ergo propter hoc is a logical fallacy that assumes that if something occurs after an event, it must have been caused by that event; the phrase is used to indicate that a causal relationship has been erroneously inferred from a merely sequential one.

Example: The WTC 7 building in New York City (north of the Twin Towers) was known to contain private financial banking records and collapsed shortly after the initial 9/11 attacks; therefore, an attempted cover-up of fraudulent banking practices explains why 9/11 was an inside job orchestrated by the government via controlled demolition. Obviously, correlation does not prove causation. However, efforts to preoccupy oneself with erroneous associations often persist long after additional evidence has been produced to falsify such claims (e.g., the Backfire Effect).

3. The Just-World Hypothesis (aka the Just-World Fallacy) is the assumption that a person’s actions are inherently inclined to bring morally fair and fitting consequences to that person, such that all noble actions are eventually rewarded and all evil actions eventually punished.

Example: People get what they deserve. This idea also derives from the presupposition that the world is a “level playing field,” or that a person has unmitigated free will to “choose otherwise” (related to the fundamental attribution error).

4. Argumentum ad populum is a logical fallacy that occurs when something is considered to be true or good solely because it is popular.

Example: Millions of people agree with my viewpoint; therefore, it must be right.

5. The Nirvana Fallacy is the informal fallacy of comparing actual things with unrealistic, idealized alternatives. It can also refer to the tendency to assume that there is a perfect solution to a particular problem (aka the perfect solution fallacy).

Examples: {A} Seat belts are a bad idea because people are still going to die in car crashes; therefore wearing a seat belt is an unnecessary precaution. {B} Either there is a perfect solution to ending gun violence, or we shouldn’t do anything about it at all.

Alleviating the tyranny of confirmation bias prevents us from assuming the answers before investigating the questions. The facile satisfaction of asserting a comfortable narrative to explain complex or uncomfortable circumstances may be alluring, but it’s not a reliable way to understand the world and can result in the collateral damage of equal-opportunity credulity. In contrast, the psychological dividends available from exercising critical thinking skills allow us to remain honest while providing the most effective strategies for comprehending, accepting, and adapting to the nature of reality.

*Recommended reading: Crimes Against Logic by Jamie Whyte

Nocebo Left Behind

The first principle is that you must not fool yourself, and you are the easiest person to fool. ― Richard Feynman

Nearly everyone is familiar with the placebo effect (e.g., taking a sugar pill will make your symptoms diminish if you believe the drug is real). During a placebo experiment, the patient’s expectation, rather than the nature of the drug, is the mechanism that affects the outcome of treatment. The power of suggestion alone can strongly influence our experience regardless of the content of the proposed intervention. Conversely, but less familiar, is the nocebo effect (e.g., taking a sugar pill will make you feel worse if you believe the drug is real and has consequential side-effects). The Latin word nocebo essentially means “I will be harmful.” For example, if a discussion between a doctor and patient includes the topic of negative side-effects from ingesting a substance, the likelihood that those symptoms will manifest increases. There are valid public-safety reasons for clinical trials, peer-reviewed data, and understanding the chemistry involved whenever medication is administered (e.g., pharmacodynamics, pharmacokinetics) in order to minimize undesirable side-effects. Likewise, comprehensive studies are also conducted on the perceived efficacy of less invasive, non-pharmaceutical treatment modalities. However, outcomes based on objective treatment measurements can be subjectively or suggestively influenced. How much does personal opinion impact the safety of treatment, and how much does the power of suggestion dictate the experience of treatment?

As a psychotherapist who does not specialize in biochemistry, pharmacology, or internal medicine, I realize that many treatment dilemmas are beyond my domain of expertise (hence, armchair deductions). Nonetheless, what remains interesting to me is the psychology of motivated reasoning, anecdotes as evidence, suggestive priming, subjectivity, correlation/causation errors, and logical fallacies. In fact, one of my pet peeves in the museum of logical fallacies is known as the fallacy of illicit transference, which contains two subsets of fallacious reasoning. The first is the fallacy of composition, which involves believing that what is true of the parts is also true of the whole (e.g., negative experiences for a few people will equal negative experiences for all). In the opposite direction, but on the same continuum, the fallacy of division refers to the belief that what is true of the whole is also true of the parts (e.g., treatment facilities are corrupt and clinicians lie … therefore, this facility is corrupt and that clinician lied to me). I cannot tell you how frequently these fallacies are generated and how appealing they can be when the mind seeks resolution in the face of cognitive dissonance. Throw in a bit of anecdotal allure and you’ve got a two-for-one in the department of erroneous reasoning (e.g., my aunt received treatment last year for her arthritis and then complained of experiencing severe migraines for several months … so I’ll never trust a doctor again). Understandably, people want to feel empowered as consumers and value their health just as much as their time, but jumping to conclusions based on limited information is evidence of yet another fallacy: hasty generalization. Correlation does not equal causation, any more than snoring implies lucid dreaming. Asking critical questions and being an advocate for your health is admirable, but asserting unfounded claims while irresponsibly influencing others is not. Of course, as we will see, most controversies are not born in a vacuum.

Sturgeon’s law, named for the science fiction author Theodore Sturgeon, is an amusing proposition that boldly states “ninety percent of everything is crap.” Despite the dubious mathematical accuracy of this summation, its critique can usefully be applied to the ways in which people seek information online (i.e., Google University Syndrome). With enough motivated reasoning, a person can uncover selective sources and tendentious articles via data mining that reinforce a previous conviction without the requisite fact-checking. To make matters worse, most individuals do not know what qualifies as empirically valid information and can easily be seduced by “mavericks” who emphasize their academic or occupational qualifications as experts only to mislead anyone who lacks a sufficient background in their field of interest (often motivated by selling bogus products, exposing “the system,” selling books, and promulgating “natural” remedies). In other words, conspiracies are easy to manufacture, but long-term studies based on collective evidence and peer-reviewed data are challenging. With the world at our keyboard, the likelihood of being persuaded by specious claims can render us exponentially gullible. In addition, the local consensus of laypersons should never be as persuasive as the global consensus of highly trained professionals regarding any science-based inquiry. For example, the opinion of most citizens in a small town regarding alternative forms of cancer treatment does nothing to invalidate the facts about legitimate treatment compiled by the World Oncology Network. Research takes time, and “miracle cures” are probably less than miraculous once you take the time to do the research. Similarly, a handful of op-ed articles written about vaccines for The Huffington Post are not going to obliterate peer-reviewed findings in Oxford’s International Immunology medical journal or dismantle the CDC’s data collection systems for vaccination statistics.

Certainly, we shouldn’t be dismissive of physiological subjectivity, and it’s often necessary to compare evidential information with personal experience while being mindful of our bodies. After all, no one can tell you what it feels like to be in pain, or when to trust your intuition when it’s symptomatically obvious that you should (there have been plenty of false negatives during diagnostic examinations). Nonetheless, there remains no dispassionate control group of the mind once we have committed ourselves uncritically to a controversial narrative. How can we remain vigilant about factual data in the face of suggestive influences, or learn to wait until all the evidence is available before assuming epistemic victory? Not every association is significant enough to warrant causation, and correlations should serve only as a guide for ruling out other possible (often more likely) variables. Wanting to believe something is not the best compass for truth, and personal experience does not equal universality.

Publicly constructed fear-mongering that encourages unenlightened self-interest is a pernicious form of cultural propaganda, and it can have dire ethical consequences. Remember, there is a difference between informed speculation and unhinged credulity. We must also be willing to relinquish false propositions or behavioral trends when faced with a comprehensive analysis that indicates a different conclusion, despite our emotional investment in particular outcomes or our discomfort with a current paradigm. Most importantly, we must be willing to listen carefully to those who have dedicated their entire lives to research and education in science-based subject matter and make sure that their findings have been supported on an international level. Specialty alone is not enough to validate knowledge, but overwhelming agreement among many specialists will benefit the reliability of any methodology when that process is also made verifiable, falsifiable, and repeatable.

When a psychic tells you to beware of bad karma, your pattern-seeking mind goes on high alert. If anything goes wrong, the psychic has been substantiated and you feel your money was well spent. Of course, this psychic knows there ain’t no business like the nocebo business.