The two systems that guide our lives
Psychologist Daniel Kahneman, one of the best-known experts in cognition and a pioneer of behavioral economics, spent more than four decades studying the decision-making mechanisms of the human brain and identified numerous cognitive errors that influence our decisions without our realizing it. In 2002, Kahneman was awarded the Nobel Prize in Economics for work showing that man is not the "rational actor" many economists claimed, but a creature subject to numerous pitfalls of intuition. His Nobel was also a first: it was the first time the top prize in economics went to a specialist from another field (in this case, psychology).
Kahneman argues that human thinking is governed by two systems. System 1, which he calls "fast thinking," is unconscious, intuitive and requires no voluntary effort or control; System 2, "slow thinking," is conscious, relies on deductive reasoning and demands considerable effort.
Noticing that the person in a photograph is angry requires no conscious effort; the realization is instant and involuntary, an example of the fast thinking typical of System 1. Solving the multiplication 17 × 24, by contrast, requires directing conscious attention to a voluntary effort, without which the answer cannot be obtained. That is an example of System 2 at work.
System 1 is innate, a consequence of evolution and of long adaptation to the environment, while System 2 is specifically human. In fact, what we perceive as the self is System 2: the conscious, rational self that manages beliefs, makes choices and takes decisions.
Although we live under the impression that System 2 is responsible for most of the decisions we make, our lives are largely controlled by System 1. The reason? We have to make so many decisions every day that it is impossible to use System 2 for most of them. Because rational decisions require time for analysis and inference, and that effort consumes energy, System 2 is engaged only sparingly.
In most cases, System 1 generates suggestions for System 2 (impressions, intuitions, intentions and feelings), which the latter adopts without modification. System 2 steps in when System 1 cannot provide an immediate answer (as with the problem 17 × 24) or when it detects that an error is about to occur (as when we hold back from reacting badly to a difficult situation: System 2's control mechanism blocks the impulse generated by System 1). Both systems have their limits, however: researchers have found that when a person is occupied with a problem requiring System 2, their capacity for self-control decreases, and they are more likely to yield to temptation.
System 1, however, exhibits certain systematic errors: cognitive biases that often lead to wrong decisions. In his latest book, Thinking, Fast and Slow, Daniel Kahneman describes some of these errors of thought, hoping that by naming them he can help others identify them and better understand their own decisions.
Because System 1 is active all the time (unlike System 2, which requires conscious effort), we are more prone to its cognitive errors. An example of System 1's autonomy is the Müller-Lyer optical illusion, in which two parallel lines appear to have different lengths. Even if we measure the two lines and convince ourselves (with the help of System 2) that their length is the same, System 1 will continue to perceive them as unequal.
Like optical illusions, cognitive illusions are difficult to overcome, but the first step out from under their domination is becoming aware of them. When people are in a moment of crisis or an uncertain situation, decisions are made by System 1; it is therefore essential to know its weaknesses.
Cognitive mistakes that influence our decisions
The key is to understand that no one is immune to the weaknesses of System 1. This is demonstrated by a simple test that Kahneman has applied thousands of times: "A baseball bat and a ball together cost $1.10. The bat costs a dollar more than the ball. How much does the ball cost?" Even among the most intelligent students, such as those at Harvard and Princeton, more than half gave the obvious answer offered by System 1, which is also the wrong one: 10 cents. The correct answer is, of course, 5 cents.
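A quick check of the arithmetic makes the trap visible (writing x for the price of the ball, a notation used here only for convenience): the bat costs x + $1.00, so together they cost x + (x + $1.00) = $1.10, which gives 2x = $0.10 and x = 5 cents. The intuitive answer of 10 cents fails the check, because the bat would then cost $1.10 and the pair would come to $1.20.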
One of the most common cognitive errors is the overconfidence bias: the tendency to place excessive trust in one's own abilities. Statistics show that the chance of a new company founded in the U.S. surviving for five years is approximately 35%. Yet a survey of entrepreneurs showed that they tend to estimate the chances of success of a new company at 60%, and the chances of their own company at 81%. Kahneman argues that optimism is the engine of capitalism, which is confirmed by the fact that leaders, inventors and others who influence the lives of large numbers of people tend to be optimists, taking risks because they are convinced they will succeed.
Another cognitive error identified by Kahneman is the planning fallacy: the error of estimation in planning. The psychologist first encountered the problem in 1970, when the Israeli Ministry of Education asked him to design a textbook and a study program on decision making. Kahneman assembled a team of specialists, among them an expert in curriculum design, and after a year of work he asked his colleagues to estimate how much longer they thought the project would take. Most estimated completion in about two more years, with a margin of error of six months. Kahneman then asked the expert how long similar projects had taken on average. He explained that their average duration was 7 to 10 years, and that about 40% of them were never completed. Even though he knew this, the expert himself had forecast another two years of work. In the end, the project was completed in eight years, by which time the Ministry of Education was no longer interested.
Another example of the planning fallacy comes from the U.S. A survey of homeowners found that they expected to spend, on average, $18,500 on a kitchen renovation; the actual average cost turned out to be $39,000. An example from Scotland shows that the gap can be even larger: in 1997, when the plan for a new Parliament building was unveiled, cost estimates came to 40 million pounds. By 2004, when construction was completed, the total cost had reached 431 million pounds.
Another error of thinking is what Kahneman calls the availability bias: the tendency to judge based on what comes to mind most easily. A survey of Americans revealed that they believe death in an accident is 300 times more likely than death from diabetes, although the actual ratio is 1:4. Kahneman considers this proof that the media shape how we perceive risks, which can have negative consequences for our lives. A study conducted after the terrorist attacks of September 11, 2001 showed that many Americans chose that year to drive long distances instead of taking the plane. Of these, about 1,500 died in traffic accidents, having underestimated the risk of traveling by car and overestimated that of a terrorist attack.
A serious cognitive error is the anchoring effect. One study that illustrates this mental shortcut involved a group of German judges with over 15 years of experience. In the experiment, they were read a description of a case in which the defendant had been caught shoplifting, and before deciding on a punishment the judges were asked to throw two dice. The dice were rigged to add up to either 3 or 9. The judges were then asked to decide the appropriate punishment for the defendant. Although the dice should not have affected the decisions of experienced specialists, the researchers found that judges whose dice totaled 9 handed down, on average, a sentence of eight months in prison, while those whose dice totaled 3 gave an average sentence of five months. The effectiveness of this bias has led to its constant exploitation in commerce, where it is used to shape buyers' price expectations. For example, a company might offer three versions of the same service so that the cheapest option seems more attractive by comparison with the more expensive alternatives than it would if it were the only one on offer. For the same reason, auctions usually set a starting price.
Another important feature of System 1 is that, when faced with a difficult question, it tends to answer a different, simpler question without our realizing it. Kahneman gives the example of a study conducted on a group of German students. Some of them received the following two questions, in this order: "How happy are you?" and "How many romantic dates did you have last month?". Others received the questions in the reverse order. In the first case there was no correlation between the answers; in the second, a clear correlation appeared between the number of dates and the happiness the students reported. Kahneman explains: "To answer the question 'How happy are you?' correctly, we need to think hard. The students who were asked first about romantic dates did not feel the need to think, because they substituted the answer to that question with the answer to another one: 'How happy am I with my love life?'. The students know that their love life is not the only thing that matters to them, but System 1 offered an easy answer, and they used it."
When it comes to happiness, Kahneman says, memories play an important role. Each of us has not one self but two: the experiencing self and the remembering self.
Most people are guided by the second. To illustrate, Kahneman asks his readers a question: would you be willing to pay for a wonderful vacation if, at its end, you had to drink a potion that would erase every memory of the trip, and no photos or videos would remain either? Probably not.
To illustrate the difference between memories and experiences, Kahneman recounts a conversation he had with a member of the audience after a lecture. The man told him about listening, enraptured, to an exceptional symphony, at the end of which there was a horrible screeching noise because the disc was scratched. "The ending ruined the whole experience," he said. Kahneman explained that, in fact, the experience had not been ruined, because he had enjoyed the music for 20 minutes. What had been spoiled was the memory of that experience.
Confusing the two is a cognitive error that can have unpleasant consequences. This was demonstrated in an experiment in which volunteers were subjected to two painful experiences. When asked to choose which of the two would be repeated, they chose the one that was more painful overall, which lasted longer but ended with less intense pain, because it had left a better memory. "The experiencing self tends not to have a strong enough voice when we plan our activities. When people make decisions they do not ask themselves 'What will I feel, and for how long?', and they tend to neglect the experience in favor of the memory that will remain," says Kahneman. "From the past we learn, usually, to maximize the quality of our future memories, not the quality of our future experiences. I call this 'the tyranny of the remembering self,'" writes Kahneman.
How can we avoid these errors of thinking?
Only if we understand its weaknesses will we be better prepared to identify errors of thinking, even if we will not always manage to avoid them. "To counter System 1's errors there is, in principle, a simple solution: recognize the signs that we are in a delicate situation, slow down the decision making and call on System 2," suggests the psychologist. "We cannot do this all the time, but when it comes to an important decision we must pause and ask ourselves whether we might be falling into a trap of thinking," adds Kahneman.
Another thing to bear in mind is that using System 2 consumes resources, and when those resources run out System 1 takes over. An example is a study conducted in Israel on a group of eight judges whose job was to rule on the parole requests of convicted prisoners. The predominant decision was rejection: only 35% of requests were approved. The researchers found that favorable decisions were made mostly in the period immediately after the judges' lunch break, with the approval rate gradually dropping to a minimum just before the next break. The researchers' conclusion was that tired and hungry judges tend to fall back on the most common decision, the one that does not require the use of System 2: refusing parole.
Even if we know these errors of thinking, we are not safe from them. "My thinking is as prone to these errors as it was before I began studying them," admits Daniel Kahneman. What can we do, then? Kahneman's advice is to bring the phrases that describe these errors into everyday vocabulary and then to enlist our friends. Because it is easier for people to recognize other people's mistakes than their own, conversations with those close to us, using a vocabulary that names these cognitive errors, can help us avoid them. "The purpose of my book is to enrich everyday conversation, to make people think in a more complex way when judging the decisions of others. If we had a society in which people used a richer language when discussing these things, I think it would have an indirect effect on our own decisions, because we always come to understand what others think of us," concludes Kahneman.