The Undoing Project : The Friendship That Changed Our Minds



The inner workings of the human mind caused inefficiencies in picking basketball players

The Houston Rockets hired Daryl Morey because the team's owner, Leslie Alexander, was tired of the gut instincts basketball experts used to draft players. He believed Morey's statistical approach to decision making was what the team needed. Morey went on to build a model for exploring which attributes in an amateur basketball player led to professional success. The model let you set personal opinions aside and ask the right questions. Down the line, he discovered the effect labels have on our perception of performance, when a good player's shirtless picture earned him the nickname "Man Boobs." It turned out the staff valued the player less because of the name they called him, and Morey then banned nicknames.

Merging human judgment with his model was also risky because of "confirmation bias": once you form a near-instant impression, all other data organizes itself around it. You look for data that confirms your belief and ignore anything that doesn't fit your view. He also became aware of what behavioral economists call the endowment effect.

This effect means that once you own something, you value it more than you did before you owned it. He also noted "present bias" and "hindsight bias." Present bias is the tendency to undervalue the future relative to the present when making a decision. Hindsight bias is the tendency to look at an outcome and assume it was predictable all along, ignoring facts that were obscure at the time.


Daniel Kahneman and Amos Tversky, although similar, were different in many ways

Both men joined the army at a very young age before pursuing, respectively, psychology and the humanities at university. Daniel Kahneman had a theoretical interest in other people, and at university he was drawn to "behaviorism," the study of how living creatures behave. After graduating with a psychology degree from Hebrew University, he was required to join the Israeli army. There he created a personality test, which gave birth to the "Kahneman score," that made it easier to select recruits who would fare well as soldiers.

From this test, he deduced that if you remove gut feelings from judges, their judgments improve. Danny suffered from self-doubt and was pessimistic most of the time, paying attention to what wouldn't work rather than what did. He also believed there were too many ideas to stay glued to one: "if one doesn't work out, don't fight hard to save it; find another."

Amos Tversky was physically brave, even though he looked quite small and fragile. Others thought it was more that he was afraid to be called weak, so he made himself brave. In school, Amos had a gift for math and science. Despite this, and much to everyone's surprise, he went into the humanities. He kept vampire hours, sleeping during the day, and never did anything he felt didn't matter. Amos was always the life of the party, an optimist who shrugged off social obligations he saw no point in.


Humans are not rational in their choices

The study of behavioral economics and of human responses to particular tests demonstrated that humans are irrational in predictable ways. In making decisions, instead of using all of the available data, we rely on cues produced by the mind and jump to conclusions. Memory, too, is selective in its influence on our judgments: we remember how something starts, but what sticks, what creates the mental picture, is how the event ends. What happens in between matters less.

Our decision making centers on stereotypes we have created in our minds, and classification reinforces these stereotypes. That is why an NBA draft prospect with the physical features of a current champion in the league may be given preference. We lean on little information to draw significant conclusions: we place so much belief in the idea that a small sample represents the whole that whatever traits the sample possesses, we assume the whole possesses too.


Behavioral economics explains the role of heuristics or cognitive shortcuts in decision making

Although they often feel harmless, heuristics can prove deeply flawed. The first is anchoring and adjustment: when we estimate something, we tend to start from a readily available number, the "anchor," and adjust until we reach a plausible answer. In one example, subjects were asked to write down the last digits of their social security numbers and then to estimate how many African countries were in the United Nations. It turned out that people with higher digits leaned toward higher estimates.

Representativeness is another heuristic: people judge how likely something is by how closely it resembles a mental prototype, connecting things that aren't actually related. An example is an NBA player who, despite great talent, was dismissed because he was Asian-American and didn't fit the prototype. A third heuristic, availability, leads people to judge the likelihood of an event by how easily they can recall it happening.


Humans are smarter when they look into details that they usually would not

While writing their paper "On the Psychology of Prediction," Amos advised that one should never say yes right away when asked to do anything; give it some thought so you can make the right decision. In working out people's predictive ability, they first noted that prediction differs from judgment in that a prediction is a judgment made under uncertainty. For example, saying that a man looks like an Israeli officer is a judgment, while saying "he will make an excellent Israeli officer" is a prediction.

Humans are prone to judgments riddled with systematic errors because they leave other factors out of play. In one test, subjects were asked to predict which field a person would go on to study in graduate school. When presented with the percentage of students who enter each field (the base rates), subjects made sensible inferences. But when later given a made-up personality sketch, they made predictions from the sketch alone, paying no attention to the statistics of who studies each field in general.
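The pull of base rates can be made concrete with a quick Bayes calculation. Every number below is invented purely for illustration (none comes from the study), but the arithmetic shows how far a guess driven only by a personality sketch can drift from the statistically sound answer:

```python
# Illustrative, made-up numbers showing why base rates matter.
# Suppose only 3% of graduate students study library science, a "matching"
# personality sketch fits 80% of those students, but the same sketch also
# fits 10% of everyone else.

base_rate = 0.03           # P(library science)
match_given_ls = 0.80      # P(sketch matches | library science)
match_given_other = 0.10   # P(sketch matches | any other field)

# Bayes' rule: P(library science | sketch matches)
p_match = match_given_ls * base_rate + match_given_other * (1 - base_rate)
posterior = match_given_ls * base_rate / p_match

print(round(posterior, 3))  # ≈ 0.198 — far below what the sketch alone suggests
```

Even a sketch that fits the stereotype very well leaves the prediction unlikely, because so few students enter the field in the first place; ignoring that base rate is exactly the systematic error the test exposed.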

The cocktail party effect is the ability to filter a lot of noise for the sounds you wish to hear, as you do when listening to one person at a cocktail party. Danny used this idea to help determine which fighter-pilot candidates were better, because the ability to pick out the signal of looming danger and act swiftly was crucial in war.


The human mind contains errors that could lead to wrong judgments

Danny and Amos had already shown that people's ability to judge probabilities is distorted by the various mechanisms the mind uses when facing uncertainty. They believed they could use their new understanding of these systematic errors to improve people's judgment, and thus their decision making. Some of the errors include:

Belief in the law of small numbers: people mistake even a tiny part of a thing for the whole. This error is not limited to ordinary people; even statisticians make it. They treat any given sample of a large population as more representative of that population than it actually is.

The gambler's fallacy: people believe that when a coin lands on heads several times in a row, the next flip is more likely to come up tails. In reality, the coin has no memory: the probability of either side is 50/50 on every flip.
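The fallacy is easy to test empirically. This sketch (flip count and streak length chosen arbitrarily) looks at every flip that follows a run of three heads; if tails were ever "due," the conditional frequency would exceed 0.5, but it stays at the coin's memoryless 50/50:

```python
import random

random.seed(1)

# Simulate a long run of fair coin flips.
flips = [random.choice("HT") for _ in range(200_000)]

# Collect every flip that immediately follows three heads in a row —
# the moment the gambler's fallacy says tails is "due".
after_streak = [flips[i] for i in range(3, len(flips))
                if flips[i - 3:i] == ["H", "H", "H"]]

p_tails = after_streak.count("T") / len(after_streak)
print(round(p_tails, 2))  # ≈ 0.5 — the streak changes nothing
```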

The power of ignorance: people don't know what they don't know, and they don't bother to factor their ignorance into their judgments.


The Isolation effect influences the direction a decision goes

The psychologists, in their study of the isolation effect, developed related ideas such as value theory, framing, and loss aversion.

Value theory explains how a response can be shaped by a subject's perception of gains and losses. Saying that someone has a 10% chance of losing is logically the same as saying he has a 90% chance of winning, yet far more people are receptive to the latter framing than the former. All else equal, everyone prefers more money to less, just as one would choose less pain over more.

One phenomenon that illustrates value theory well is framing. By changing the description of a situation, making a win look like a loss, you can flip people's attitude toward risk entirely. The most famous demonstration is the Asian Disease Problem.

The test consisted of two problems, given separately to two different groups of subjects innocent of the power of framing.

Problem 1: Suppose the U.S. is preparing for the outbreak of an unusual Asian disease, which is expected to kill 600 people. Two alternative programs to combat the disease have been proposed. Assume the exact scientific estimates of the consequences of the programs are as follows.

If Program A is adopted, 200 people will be saved.

If Program B is adopted, there is a 1/3 chance that 600 people will be saved and a 2/3 chance that no one will be saved.

Which of the two programs would you favor?

The second group got this: if Program C is adopted, 400 people will die.

If Program D is adopted, there is a 1/3 chance that no one will die and a 2/3 likelihood that 600 people will die.

In the first group, the majority opted for Program A, focusing on the 200 people saved for sure and forgetting that it also meant 400 people would die. In the second group, the vast majority went for Program D, where it seemed less certain that 400 people would die. They overlooked that a 2/3 chance that all 600 die is exactly the same gamble, only framed differently.
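Working the arithmetic confirms that the two framings describe identical gambles; measured by the same yardstick, expected survivors out of 600, all four programs are the same:

```python
# Expected number of survivors (out of 600) for each program in the
# Asian Disease Problem.

ev_a = 200                       # "200 people will be saved" — certain
ev_b = (1/3) * 600 + (2/3) * 0   # 1/3 chance all 600 saved, 2/3 chance none
ev_c = 600 - 400                 # "400 people will die" is 200 saved
ev_d = (1/3) * 600 + (2/3) * 0   # "1/3 chance no one dies" is the same gamble as B

print(ev_a, ev_b, ev_c, ev_d)  # 200 200.0 200 200.0 — identical outcomes
```

A is the same program as C, and B the same as D; only the gain frame ("saved") versus the loss frame ("die") separates them, which is why the preference flip is a framing effect rather than a change in the facts.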


We maximize mental happiness, not utility

When people make decisions, they seek to increase or reduce the chances of emotional states such as happiness and regret, rather than the chances of success (utility) itself. Kahneman and Tversky set out to map our regret-driven tendencies, finding that:

• We regret what has already happened more than what we have yet to do.
• We regret actions that depart from our usual behavior more than those we would normally have done.
• We regret our own actions more than the actions of others.
• We regret situations we feel we had more control over.
• We feel more regret when we were closer to achieving something.

The Undoing Project was so named because the mind tries to "undo" events that cause regret. People who anticipate the disappointment a decision could bring are more prone to make the seemingly safe choice without giving it much thought. The rules of undoing, or the imagination rules, state that:

• The more items you need to undo to create an alternative reality, the less likely your mind is to undo them.
• An event becomes gradually less changeable as it recedes into the past.
• It is easier to undo the unusual parts of a story.

The shadow theory states that the context of alternatives, the possibility set, determines our expectations, interpretations, recollection, and attribution of reality, as well as the affective states it induces. Toward the end of his thinking on the subject, Danny concluded: "Reality is a cloud of possibility, not a point."


How the relationship between Daniel Kahneman and Amos Tversky ended

Amos didn't like prizes. He felt they exaggerated the difference between two people and did more harm than good. Several articles were published praising him as if he had done the work alone, and when others spoke of their joint work, they put Danny's name second, if they mentioned it at all. Amos wrote back to publications telling them that if they couldn't give the two of them equal recognition, he'd prefer they remove his name.

He was once wrongly credited with noticing the illusory sense of effectiveness felt by Israeli Air Force flight instructors after they'd criticized a pilot. He wrote the author back, saying he was uncomfortable with the label "Tversky effect"; in fact, it should be called the "Kahneman effect," because Danny was the one who had observed the pilots.

Amos received a great deal of attention and invitations to conferences of economists, computer scientists, and linguists. Danny couldn't help but envy the attention Amos was receiving for work they had done together. Problems crept into their relationship until they could no longer work together, yet they kept publishing under both their names, so no one noticed.

Later, the two came together to write about what they called the "conjunction fallacy," which explained how and why people violate the simplest and most basic quantitative law of probability: a combination of events can never be more probable than either event alone. People choose the more detailed description, even though it is less probable, because it is more "representative."
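The law itself is plain arithmetic: a joint probability is a marginal probability multiplied by a conditional probability, and since the conditional is at most 1, the product can only shrink. The numbers below are hypothetical, chosen only to illustrate the rule:

```python
# Hypothetical probabilities illustrating the conjunction rule:
# P(A and B) = P(A) * P(B | A) <= P(A), because P(B | A) <= 1.

p_event = 0.05                # P(A): the plain, less detailed description
p_detail_given_event = 0.30   # P(B | A): chance the extra detail also holds

p_both = p_event * p_detail_given_event  # P(A and B)

# The richer, more "representative" description is necessarily less probable.
print(p_both <= p_event)  # True, for any valid probabilities
```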

Danny felt the paper was less an exploration of a new phenomenon than a weapon Amos planned to use to win an argument. Danny pictured their relationship as a Venn diagram in which the rapid expansion of Amos's circle had pushed Danny's borders further away. Danny moved to Berkeley to get away from Amos, and soon after fell into a depression, claiming he had no ideas left.

They both had problems with how they perceived each other. Amos would make comments that seemed insulting to Danny, and Danny felt he had been too willing to accept a situation that kept him in Amos's shadow. Danny was bothered that Amos's view of him had changed; their minds had grown so out of sync that they couldn't even organize a chapter together.

A German psychologist, Gerd Gigerenzer, was bent on discrediting their joint work, and Amos wanted to shut him up for good. He begged Danny, as a friend, to join him, even though Danny was more inclined to reason with the opposition. They quarreled constantly while writing the response to Gigerenzer: Amos's words were too harsh for Danny, and Danny's too soft for Amos.

One night while staying in Amos's apartment in New York, Danny had a dream in which doctors told him he had only six months to live. Danny told his friend it would be terrible to spend his last six months working on garbage like an attack on a fellow psychologist. Amos insisted it was the right thing to do. Not long after, Danny asked why his friend had never put him forward for any of the boards he was on, and Amos dismissed the question as Danny being weak.

Danny, offended by the remark, said that's not how friends behave, and left. Three days later, Amos called Danny to tell him that doctors had found a growth in his eye. It was diagnosed as malignant melanoma, and his body was riddled with cancer; he had, at best, six months to live. He ended the conversation by saying, "We're friends, whatever you think we are." And that made Danny cave in.

After Amos died, attention to Danny and their joint work grew. People no longer referred to the work as "Tversky and Kahneman" but as "Kahneman and Tversky." Danny then struggled to prove his own worth, not just that of their work together. Years later, when he least expected it, he got a phone call: he had won a Nobel Prize.


Conclusion

Danny and Amos concluded that the human mind is fallible because it reaches conclusions through heuristics and ready-made stereotypes. Therefore, to improve people's decision-making skills, the errors of the mind must be corrected.

Even when correlations between data and events appear, don't jump to conclusions. Study and reason with all the information available.




