Thinking, Fast and Slow




Your thoughts determine how you view life, and that's why you need to learn how to control them

Your brain is always looking for the easiest way out, and that can be the death of sound judgment. Thanks to System 1, your brain substitutes easy shortcuts for real mental effort.

Several factors, such as illusions, availability, and past experience, influence your decisions, and these influences lead you to biased conclusions.

Thinking is not as easy as it seems; it is an art that must be mastered. Mastering your thoughts starts with understanding what shapes your intuitions and assumptions. Experiments and facts also shape the way you think, which is why you should know when to switch between intuition and empirical evidence.

Daniel Kahneman is a master of his trade. His command of behavioral economics, cognitive science, and psychology puts him among the very best, and he makes the behavioral patterns of people easy to grasp through relatable examples and instances.

To find out how and when to switch between thinking fast and thinking slow, read on and follow this detailed summary of “Thinking, Fast and Slow” by Daniel Kahneman.




The two systems associated with your brain are responsible for how and when you think 

Psychologists Keith Stanovich and Richard West have established that humans have two primary systems for thinking:

• System 1 operates automatically and quickly without voluntary control, and with little effort;

• System 2 allocates attention to effortful mental activities of any kind.

Humans avoid overloading their brains with information by consciously or subconsciously breaking tasks into smaller steps. This is because we naturally prefer solutions that require the least mental effort. The moment a solution demands a lot of effort, the attention paid to it starts to decrease.

In a bid to balance the functions of the mind, one of the primary jobs of System 2 is to control, monitor, and oversee the suggestions and actions that come from System 1. In practice, however, this check rarely rests on facts; it leans on faith and intuition.

The information System 1 provides to System 2 usually arrives as impressions. These impressions harden into beliefs, and those beliefs turn into actions that create an identity for you.

The human brain is programmed to adjust to speed and stress. Cognitive ease matters because you behave differently when a task feels easy than when it feels strained. Under strain your brain adjusts and you will likely still do the job well, but you will lack the creativity it requires. Taking your time and working in a state of cognitive ease increases your creativity.

Another function of System 1 is to shape your sense of what is normal, what is surprising, and what causes what. It feeds off the information sent to your brain and makes assessments based on your reactions to things.

It should be noted, though, that System 1 does not necessarily draw accurate conclusions from the information it receives. It does not check whether the information is true; it simply jumps to its own conclusions.

In making judgments, however, evaluating people is an automatic process that happens whether you intend it or not.

Your mind works hard to find an escape route when you are stuck on a problem. When you cannot find a good answer to a hard question, System 1 automatically substitutes an easier, related question and feeds you the answer to that one instead.

When System 1 faces a tough problem that it can’t find a solution for, it’ll seek the help of System 2, calling it into action to find a way around the problem immediately.

However, your brain sometimes deceives you, and itself, into believing that a task is simple. System 1 then steps in, thinking it can handle the task when it cannot, and as a result you end up making mistakes.

This happens because your brain wants to save as much energy as possible when executing a task, so it takes the easiest route available and minimizes effort. The moment your brain miscalculates the effort a task requires, System 1 fails to perform its function.

So, when System 1 feels it can handle a situation or problem, it will not call for the help of System 2. This means you will likely end up applying less creativity than the task needs, because it should have been handled by System 2 in the first place.

Some of the characteristics of System 1 are:

• It neglects ambiguity and suppresses doubts;
• It distinguishes the surprising from the normal;
• It creates a coherent pattern of activated ideas in associative memory;
• It is biased to believe and confirm.



You are bound to make judgments based on bias and intuition instead of facts and statistics

There is a tendency to jump to conclusions about a whole scenario from the sample provided, and this logic applies to numbers too: you treat small samples as if they spoke for the entire population. Told that five million people work in Bulgaria's public sector, you are inclined to draw conclusions about the size of Bulgaria's whole population from that single figure. These are some of the things that shape how you view and evaluate your environment.

The anchoring effect occurs when you consider a particular value for an unknown quantity before estimating that quantity; your estimate then stays close to the value you considered. This explains how humans place value on something according to the number that anchors it.

Amos Tversky and Daniel Kahneman rigged a wheel of fortune to stop only at 10 or 65 and recruited students at the University of Oregon as participants. After spinning the wheel and writing down the number it stopped on, each participant was asked two questions:


• Is the percentage of African nations among UN members larger or smaller than the number you just wrote?

• What is your best guess of the percentage of African nations in the UN?

Although the number on the wheel had no bearing on the questions (even if the wheel had not been rigged), the participants still based their answers on it.

The number from the wheel anchored their estimates: those who saw 10 gave lower estimates of the percentage of African nations, and those who saw 65 gave higher ones.
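To see how an anchor can drag otherwise independent estimates toward itself, here is a minimal simulation sketch in Python. The anchoring-as-insufficient-adjustment model, the `anchored_estimate` helper, and every number in it are illustrative assumptions, not data from the experiment.

```python
import random

def anchored_estimate(anchor, unanchored_guess, adjustment=0.4):
    """Toy model of anchoring: start at the anchor and adjust only part of
    the way toward the guess you would have made without it."""
    return anchor + adjustment * (unanchored_guess - anchor)

random.seed(0)
# Assume unanchored guesses about the UN percentage cluster around 25%.
guesses = [random.gauss(25, 5) for _ in range(1000)]

low_group = [anchored_estimate(10, g) for g in guesses]    # wheel showed 10
high_group = [anchored_estimate(65, g) for g in guesses]   # wheel showed 65

print(sum(low_group) / len(low_group))    # pulled down toward 10, roughly 16
print(sum(high_group) / len(high_group))  # pulled up toward 65, roughly 49
```

Under this assumed model, the two groups arrive at very different averages even though their underlying opinions were identical; only the anchor differed.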

The ease with which you can call examples to mind is often used to judge how frequently an event occurs. As soon as an example comes to mind, your brain pieces together instances of similar occurrences, and you are tricked into believing that such situations are frequent.

A dramatic event temporarily increases the availability of its category. A train wreck that attracts media coverage will temporarily alter your feelings about traveling by rail: your mind is led to believe that train wrecks are frequent, and this changes the way you view train travel.

If left unchecked, availability bias causes problems. It is not easy to stop, but you can resist its pull by training yourself to recognize it. By staying vigilant against these biases, you can keep better control of your thinking.




An inadequate and inaccurate picture of the past tends to distort the present and the future

There is a belief that the worst case experienced so far is the worst that can happen, so preventive measures are judged against that scenario. If, for example, a wall is being built to hold back floods, its strength and height will most likely be set by the heaviest flood on record. Economist Howard Kunreuther observes that protective actions are usually designed to be adequate only for the worst disaster actually experienced.

No thanks to the media, there are several misconceptions about the true scale of things and how important they really are. Humans try to simplify life by believing in the tidy world they hold in their heads.

Examining the two, evidence and base rates, shows that humans tend to focus far more on evidence than on base rates. Base rates are what you fall back on when information is scarce: when the available information is not specific or detailed, the base rate tells your brain to draw its conclusion from general frequencies alone.

The moment more specific information is provided, the brain shifts its focus and concludes from the evidence rather than the base rate.

Placing absolute belief in conclusions drawn from detailed descriptions can be misleading. A tall, thin athlete may well play basketball, but not always. As long as you let descriptions alone drive your conclusions, there will be times when you end up making the wrong assumptions.
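To make the tension between evidence and base rates concrete, here is a minimal Bayes' rule sketch in Python. The probabilities are assumptions invented for the basketball example above, not figures from the book.

```python
# All probabilities below are assumed purely for illustration.
p_player = 0.001            # base rate: very few people are basketball players
p_tall_given_player = 0.90  # most players fit the "tall and thin" description
p_tall_given_other = 0.05   # a few non-players fit it too

# Total probability of meeting someone who fits the description.
p_tall = p_tall_given_player * p_player + p_tall_given_other * (1 - p_player)

# Probability that the person plays basketball, given the description.
p_player_given_tall = p_tall_given_player * p_player / p_tall
print(round(p_player_given_tall, 3))  # about 0.018: the tiny base rate still dominates
```

Even though the description fits almost every basketball player, the person described is still very unlikely to be one, because basketball players are rare to begin with.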

When it comes to descriptions, adding detail is different from being specific. Simply adding details to a piece of information makes it more believable, but not more accurate. Good information should above all be precise, and precision can be brief when describing a scenario. The ability to be specific helps you think fast.




Statistics drawn from data help you reach conclusions, but they are not as persuasive to you as real-life experience

If you come from a very peaceful place where people do not eat animals and are then told that humans are responsible for the deaths of 65% of animals across the world, you will find it hard to believe. Also, regardless of existing facts and statistics, a situation is not bound to follow a regular pattern: that a tall serial killer has terrorized a community for a year does not mean the next death is his doing. Existing facts help mainly in the absence of any other information.

A change in performance does not necessarily come from praise or blame. When you scold someone for a bad performance, the next performance will likely be better; when you praise someone for a good performance, the next one will likely be worse. But these swings are not the result of the scolding or the praise. Performance fluctuates naturally: someone who has just performed badly will tend to return to form, and someone who has just performed exceptionally well was probably also lucky. Kahneman argues that it is still better to reward improved performance than to punish a mistake.
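This pattern is regression to the mean rather than an effect of the feedback itself. The short Python sketch below illustrates the statistical point with made-up numbers (a fixed skill level plus random luck); it is an illustration of the idea, not a reconstruction of any study.

```python
import random

random.seed(1)

def performance(skill=50, luck_sd=10):
    """Observed performance = stable skill + random luck."""
    return skill + random.gauss(0, luck_sd)

# Simulate many pairs of consecutive performances by the same person.
pairs = [(performance(), performance()) for _ in range(100_000)]

after_bad = [b for a, b in pairs if a < 40]    # attempts that would attract scolding
after_good = [b for a, b in pairs if a > 60]   # attempts that would attract praise

# Both groups drift back toward the underlying skill of 50 on the next attempt,
# regardless of whether anyone scolded or praised them.
print(sum(after_bad) / len(after_bad))    # roughly 50: "improvement" after a bad day
print(sum(after_good) / len(after_good))  # roughly 50: "decline" after a good day
```

The scolded group improves and the praised group declines purely because luck washes out, which is exactly why feedback so easily gets the credit or the blame.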

Predictions and forecasts are part of everyday life. Across all fields of study, experts and professionals make predictions based on the calculations and facts available. Some predictions are intuitive, drawn from recent occurrences, experience, or familiarity: certain cues let you sense what is likely to happen next. There are many other things, however, that you cannot predict or forecast without bias creeping in. To produce unbiased predictions, start from the average, the baseline, and then gradually adjust as you gather more data from existing guidelines, statistics, and facts.

Correcting your intuitive predictions is the work of System 2. System 2 does the hard work of gathering information from what you have experienced and what you are experiencing now.
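One common way to express this correction, paraphrasing Kahneman's recipe for taming intuitive predictions, is to regress the intuitive estimate toward the baseline in proportion to how predictive the evidence really is. The function and the GPA numbers below are illustrative assumptions, not examples taken from the book.

```python
def corrected_prediction(baseline, intuitive, correlation):
    """Move from the baseline toward the intuitive prediction by a distance
    proportional to how well the evidence actually predicts the outcome."""
    return baseline + correlation * (intuitive - baseline)

# Assumed example: the average GPA is 3.0, a glowing interview suggests 3.8,
# but interviews only correlate with later GPA at about 0.3.
print(round(corrected_prediction(baseline=3.0, intuitive=3.8, correlation=0.3), 2))  # 3.24
```

When the evidence is weakly predictive, the corrected estimate stays close to the average; only strongly predictive evidence justifies a forecast far from the baseline.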



The illusions of understanding, availability, and validity deceive you into believing that you know much about the present and the future


According to Nassim Taleb, humans continually fool themselves by constructing flimsy accounts of the past and believing they are true. He holds that the stories we tell about the past shape the present, and inevitably the future as well. The illusion of understanding is a major determinant of the way people think and approach life. Knowledge of the past, rewritten to suit certain narratives, speaks only of things that happened and neglects things that did not. These narratives have created the belief that as long as we know the past, we can shape the future. But how well can the future be shaped when knowledge of the past rests on an inaccurate description?



A successful event gets people to believe that the process behind it must have been equally sound

As much as you believe you know your past, most of the stories you have heard describe only the outcome and neglect the process.

Human validation, most of the time, comes from belief. Even with little or no evidence, people tend to validate what they believe in because of influence, pressure, and blind faith. Being confident about something does not make it right or true; it only means that your System 1 found the information easy to process and jumped to a conclusion. “What You See Is All There Is” (WYSIATI) is the shortcut System 1 takes when it does lazy work, and this illusion of validity keeps human judgment biased.

Structured judgments built on a well-planned formula are better than intuition and assumption. When you have a roadmap that buttresses your impressions, always take it over your intuition: intuition gives you biased judgments, while a formula gives you a guideline for approaching the task.

In normal situations and conditions, it is fine to trust your basic intuition. The moment the environment changes, however, you should become more careful and rely on facts. A change of environment means you have to learn new things, and since your intuition is built primarily on stored information, it is not advisable to rely on it.

The best-case scenario is not the only scenario worth studying. Humans tend to plan, initiate, and execute projects based on the facts and stories of the best case. To plan successfully, evaluate many scenarios, taking into account similar cases that succeeded and similar cases that failed.

Using optimism as a defense against doubt is an art that needs to be mastered. Disregarding doubt breeds confidence, but also overconfidence. The best response is to moderate your optimism with the premortem technique: assume that the project has already failed, and your brain will work hard to discover the reasons why. In doing so, you expose the probable faults and loopholes so that you can fix them.



The value you place on properties and happiness is determined by whether you own them or not

The value of something is determined by the current situation of the person who wants the thing. When you’re rich, the things you can afford hold value to you, but when you’re not, their value decreases because you can’t afford them.

The things you own hold value to you simply because you own them. The value you attach to something you own is greater than the value you would place on the same thing if you saw it elsewhere.

When weighing risk against loss, humans look for the safest possible bet regardless of the probable outcome. In gambling, you are unwilling to take risks because the prospect of a loss weighs more heavily on you than the prospect of an equal win, and you instinctively want to avoid it. This is why humans care far more about avoiding losses than about working hard to secure gains. As long as there is a chance of a loss, the brain starts working toward avoiding it.
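One simple way to picture this asymmetry, in the spirit of prospect theory, is to weight losses more heavily than gains of the same size. The sketch below uses the commonly cited loss-aversion coefficient of about 2.25; both the function and the coin-flip example are illustrative assumptions, not the book's own calculation.

```python
def subjective_value(outcome, loss_aversion=2.25):
    """Gains count at face value; losses are felt more than twice as strongly."""
    return outcome if outcome >= 0 else loss_aversion * outcome

# A fair coin flip: win $100 or lose $100. The expected amount of money is $0,
# but the felt value is negative, which is why most people turn the gamble down.
felt_value = 0.5 * subjective_value(100) + 0.5 * subjective_value(-100)
print(felt_value)  # -62.5
```

Under this assumed weighting, even a perfectly fair gamble feels like a bad deal, which matches the instinct to avoid any option that contains a possible loss.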

The possibility of a future occurrence is one way humans think ahead of time. When these possibilities are overestimated, however, the brain fixates on events that are very unlikely to happen, known as rare events. It is extremely rare for money to drop into your lap from the sky, yet you still believe in the possibility because it has probably happened before, and that belief influences your way of life.

Adopting risk policies to guard against loss is a smart thing to do, but it should not be taken to extremes. See each decision as one of the many options available to you, and do not automatically dismiss an option just because it involves a possible loss.

Since rewards and punishments mold your reactions and motivate your actions, you should pay attention to them often.



The human mind evaluates statements based on how they are framed and laid down

A well-framed statement can go a long way in appealing to your mind, while a badly framed statement can cause a big issue.

As has been established, simple, one-off evaluations of a situation are the work of System 1, while careful assessment and comparison are the work of System 2.

There are two selves in humans; they are:
• The experiencing self;
• The remembering self.

These two selves work hand in hand to help you make decisions. The experiencing self registers how an experience feels while it is happening, and the remembering self makes decisions based on the memory it keeps of that experience.

Most people do not pay much attention to their experiencing self; they care more about the memories they have collected over time and base their judgments on those.

Happiness is relative, and it has a wide range of definitions that can never be ignored. Human judgments are different, and so is the definition of happiness. The best way to live a good life is to understand that there’s no limit to what can make you, or anyone, feel happy.



Conclusion

Your thoughts are a product of past, current, and anticipated events, and they shape your life. When you control the way you think and how well you think, you will find it easier to make better decisions. Understanding your two systems and training yourself to control your thoughts gives you an advantage: you can make better judgments about life and people. All you need to do is understand the way you think.

Discover your thought process, evaluate your judgments about people, and check whether you are reaching sound conclusions. If you are, maintain them; if not, work on your biases by seeing beyond your intuition.
