We have a two-system way of thinking: System 1 (Thinking Fast) and System 2 (Thinking Slow).
System 1 is the intuitive, "gut reaction" way of thinking and making decisions. System 2 is the analytical, "critical thinking" way of making decisions. System 1 forms "first impressions" and is often the reason we jump to conclusions. System 2 does reflection, problem-solving, and analysis.
We spend most of our time in System 1.
Most of us identify with System 2 thinking. We consider ourselves rational, analytical human beings. Thus, we think we spend most of our time engaged in System 2 thinking.
Actually, we spend almost all of our daily lives engaged in System 1 (Thinking Fast). Only if we encounter something unexpected, or if we make a conscious effort, do we engage System 2 (Thinking Slow). Kahneman wrote:
Systems 1 and 2 are both active whenever we are awake. System 1 runs automatically and System 2 is normally in comfortable low-effort mode, in which only a fraction of its capacity is engaged. System 1 continuously generates suggestions for System 2: impressions, intuitions, intentions, and feelings. If endorsed by System 2, impressions and intuitions turn into beliefs, and impulses turn into voluntary actions. When all goes smoothly, which is most of the time, System 2 adopts the suggestions of System 1 with little or no modification. You generally believe your impressions and act on your desires, and that is fine – usually.

So System 1 is continuously creating impressions, intuitions, and judgments based on everything we are sensing. In most cases, we just go with the impression or intuition that System 1 generates. System 2 only gets involved when we encounter something unexpected that System 1 can't automatically process.
When System 1 runs into difficulty, it calls on System 2 to support more detailed and specific processing that may solve the problem of the moment. System 2 is mobilized when a question arises for which System 1 does not offer an answer... System 2 is activated when an event is detected that violates the model of the world that System 1 maintains.
System 1 thinking seeks a coherent story above all else, and often leads us to jump to conclusions.
While System 1 is generally very accurate, there are situations where it can make errors of bias. System 1 sometimes answers easier questions than it was asked, and it has little knowledge of logic and statistics.
One of the biggest problems with System 1 is that it seeks to quickly create a coherent, plausible story – an explanation for what is happening – by relying on associations and memories, pattern-matching, and assumptions. And System 1 will default to that plausible, convenient story – even if that story is based on incorrect information.
The measure of success for System 1 is the coherence of the story it manages to create. The amount and quality of the data on which the story is based are largely irrelevant. When information is scarce, which is a common occurrence, System 1 operates as a machine for jumping to conclusions.

WYSIATI: What you see is all there is.
Kahneman writes extensively about the phenomenon of how people jump to conclusions on the basis of limited information. He has an abbreviation for this phenomenon – WYSIATI – "what you see is all there is." WYSIATI causes us to "focus on existing evidence and ignore absent evidence." As a result of WYSIATI, System 1 often quickly creates a coherent and believable story based on limited evidence. These impressions and intuitions can then be endorsed by System 2 and turn into deep-rooted values and beliefs. WYSIATI can cause System 1 to "infer and invent causes and intentions," whether or not those causes or intentions are true.
System 1 is highly adept in one form of thinking – it automatically and effortlessly identifies causal connections between events, sometimes even when the connection is spurious.

This is the reason why people jump to conclusions, assume bad intentions, give in to prejudices or biases, and buy into conspiracy theories. They focus on limited available evidence and do not consider absent evidence. They invent a coherent story, causal relationships, or underlying intentions. And then their System 1 quickly forms a judgment or impression, which in turn gets quickly endorsed by System 2.
As a result of WYSIATI and System 1 thinking, people may make wrong judgments and decisions due to biases and heuristics.
There are several potential errors in judgment that people may make when they over-rely on System 1 thinking:
- Law of small numbers: People don't understand statistics very well. As a result, they may look at the results of a small sample – e.g. 100 people responding to a survey – and conclude that it's representative of the population. This also explains why people jump to conclusions from just a few data points or limited evidence. If three people said something, then maybe it's true? If you personally observe one incident, you are more likely to generalize it to the whole population. (A quick simulation after this list makes the point concrete.)
- Assigning cause to random chance: As Kahneman wrote, "statistics produce many observations that appear to beg for causal explanations but do not lend themselves to such explanations. Many facts of the world are due to chance, including accidents of sampling. Causal explanations of chance events are inevitably wrong."
- Illusion of understanding: People often create flawed explanations for past events, a phenomenon known as narrative fallacy. These "explanatory stories that people find compelling are simple; are concrete rather than abstract; assign a larger role to talent, stupidity, and intentions than to luck; and focus on a few striking events that happened rather than on the countless events that failed to happen... Good stories provide a simple and coherent account of people's actions and intentions. You are always ready to interpret behavior as a manifestation of general propensities and personality traits – causes that you can readily match to effects."
- Hindsight bias: People will reconstruct a story around past events to underestimate the extent to which they were surprised by those events. This is the "I-knew-it-all-along" bias. If an event comes to pass, people exaggerate the probability that they knew it was going to occur. If an event does not occur, people erroneously recall that they thought it was unlikely.
Hindsight bias has pernicious effects on the evaluations of decision makers. It leads observers to assess the quality of a decision not by whether the process was sound, but by whether its outcome was good or bad... We are prone to blame decision makers for good decisions that worked out badly and to give them too little credit for successful moves that appear obvious only after the fact... When the outcomes are bad, [people] often blame [decision makers] for not seeing the handwriting on the wall... Actions that seemed prudent in foresight can look irresponsibly negligent in hindsight.
- Confirmation bias: Within WYSIATI, people will be quick to seize on limited evidence that confirms their existing perspective. And they will ignore or fail to seek evidence that runs contrary to the coherent story they have already created in their mind.
- Overconfidence: Due to the illusion of understanding and WYSIATI, people may become overconfident in their predictions, judgments, and intuitions. "We are confident when the story we tell ourselves comes easily to mind, with no contradiction and no competing scenario... A mind that follows WYSIATI will achieve high confidence much too easily by ignoring what it does not know. It is therefore not surprising that many of us are prone to have high confidence in unfounded intuitions."
- Over-optimism: People have a tendency to create plans and forecasts that are "unrealistically close to best-case scenarios." When forecasting the outcomes of risky projects, people tend to make decisions "based on delusional optimism rather than on a rational weighting of gains, losses, and probabilities. They overestimate benefits and underestimate costs. They spin scenarios of success while overlooking the potential for mistakes and miscalculations... In this view, people often (but not always) take on risky projects because they are overly optimistic about the odds."
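Here is the quick simulation promised in the law-of-small-numbers item above – a minimal Python sketch, assuming a yes/no survey question whose true "yes" rate is 50% (an illustrative figure, not from the book). Small surveys scatter widely around the truth; large ones barely move:

```python
# Law of small numbers: small samples routinely produce extreme results.
# Illustrative assumptions: a yes/no question with a true "yes" rate of 50%,
# surveyed with samples of 100 vs. 10,000 people.
import random

random.seed(0)
TRUE_RATE = 0.5  # assumed true "yes" rate in the population

def survey(sample_size: int) -> float:
    """Simulate one survey and return the observed 'yes' proportion."""
    yes = sum(random.random() < TRUE_RATE for _ in range(sample_size))
    return yes / sample_size

for size in (100, 10_000):
    results = [survey(size) for _ in range(1_000)]
    print(f"n={size:>6}: min={min(results):.3f}  max={max(results):.3f}")

# Typical output: the n=100 surveys swing from roughly 0.35 to 0.65,
# while the n=10,000 surveys stay within about 0.015 of the true 0.500.
```

A single 100-person survey can easily report 40% or 60% when the truth is 50% – exactly the kind of result System 1 happily treats as representative.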
What did we learn?
- System 1 (Thinking Fast) often leads individuals to make snap judgments, jump to conclusions, and make erroneous decisions based on biases and heuristics.
- System 1 is always on, constantly producing fast impressions, intuitions, and judgments. System 2 is used for analysis, problem-solving, and deeper evaluations.
- Most of the time, we go with System 1 recommendations because of cognitive ease. Sometimes we engage System 2, when we encounter something unexpected or when we make a conscious effort to slow down our thinking and take a critical view.
- System 1 seeks to produce a coherent and believable story based on available information. This often leads us to WYSIATI – focusing on the limited available evidence, and ignoring important but absent evidence. WYSIATI can lead us to jump to conclusions about people's intentions, to assign causal relationships when there were none, and to form snap (but incorrect) judgments and impressions.
- WYSIATI and System 1 thinking can lead to a number of judgment biases, including the law of small numbers, assigning cause to chance, hindsight bias, and overconfidence.
I can now apply some of this knowledge to situations where I see others (or catch myself) relying too much on System 1 thinking. We will never be able to avoid relying on System 1 for most of our daily lives. The important thing is to recognize when I or others are relying on it too much, and to force more System 2 thinking into the situation.
Once you've learned about them (hopefully by age 15), they will serve you well.
BUT the problem is that for US citizens under age 48, these theories mostly haven't been run through life and up against REALITY! Lose $1K in Vegas like a young fool? That was how the dumb folks from my generation learned these simple truths. But that doesn't apply to those who've grown up in the reality-denying world that has existed since at least 4/19/95, when unexploded explosives could still be seen on videotape in the days of lies after the OKC Bombing – but which was promptly dropped into the memory hole by the MSM and the dumbed-down idiot proles. (Is it 4/19/20? Then it's been 25 years.) OKC was the prep that showed the sorry controlling FUKUSraHell Bastards that they could get away with 9/11/2001, too. (Yes, IsraHell was all over OKC, also.)
The current Covid1984 Team FUKUSraHell experiment has taught us (and the scum controllers) that people can be convinced by mere hysterical media reports that they can learn enough to become statisticians and epidemiologists by watching TV for one hour a day, for a week or two, despite the fact that they've never learned common sense (which is required long before one can handle statistics).
Here's a quick reposted example of how counter-intuitive statistics can be. Imagine this: you invite random people to a party, and they arrive one per minute. How many people will have to arrive (or minutes will have to pass) before the odds are better than 50/50 ("more likely than not") that two of them will have the same birthday? (NOT birthdate.) Well, that total must be less than 367, as there are only 366 possible birthdays, and it must be at least two, since the first two arrivals could happen to share a birthday.
So, think about it: how many people must have arrived before the odds are better than 50% that two of them will have the same birthday? Come up with your guess. The answer is below, written backwards.
R.C.
eerht ytnewt (That is the exact number: at that count, the probability of a shared birthday first exceeds 50%.)
P.S., you can test this without having a house, throwing a party, and having people walk in one per minute. (I'm confident that nearly half of Americans would be unaware of that obvious, common-sense point.)
For example, look up the birthdays of any truly random group online, such as property owners or people in other public documents.
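If you'd rather compute the answer than guess, here is a minimal Python sketch of the standard calculation, assuming 365 equally likely birthdays (including Feb 29 barely changes the result):

```python
# Birthday problem: chance that at least two of n partygoers share a birthday.
# P(all distinct) = (365/365) * (364/365) * ... * ((365-n+1)/365),
# and the shared-birthday probability is its complement.

def p_shared_birthday(n: int) -> float:
    p_all_distinct = 1.0
    for k in range(n):
        p_all_distinct *= (365 - k) / 365
    return 1.0 - p_all_distinct

# Find the smallest group where the odds pass 50/50.
n = 1
while p_shared_birthday(n) <= 0.5:
    n += 1
print(n, round(p_shared_birthday(n), 3))  # prints: 23 0.507
```

So the 50/50 threshold is crossed at the 23rd arrival, where the probability of a shared birthday is about 50.7%.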