Natalia Rossingol

Thinking, Fast and Slow: a Ten Minute Summary

Required reading for anyone interested in how we think! In this summary of Thinking, Fast and Slow, we'll dive into the concepts that have made Daniel Kahneman's book an absolute classic of modern psychology.

If you meet a shy, helpful man who has little interest in the day-to-day goings-on of the world, and who also has a need for order and structure – what would you think he’s likely to be: a librarian or a farmer?

Of course, he does sound like a stereotypical librarian. But don’t forget about statistics: there are more than 20 farmers for each male librarian in the United States. So yes, he’s probably a farmer.
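To see why the base rate wins, here is a minimal Bayes’ rule sketch in Python. The 20-to-1 ratio is the figure from the book; the probabilities of fitting the description are invented purely for illustration.

```python
# Bayes' rule for the librarian/farmer question.
# The 20:1 base rate comes from the book; the probabilities of
# "fitting the description" are hypothetical, for illustration only.

p_librarian = 1 / 21            # prior: 1 male librarian per 20 farmers
p_farmer = 20 / 21

p_desc_given_librarian = 0.40   # assumed: 40% of librarians fit the description
p_desc_given_farmer = 0.10      # assumed: 10% of farmers fit it

# Total probability of meeting a man who fits the description
p_desc = p_desc_given_librarian * p_librarian + p_desc_given_farmer * p_farmer

# Posterior probability that he is a librarian
p_librarian_given_desc = p_desc_given_librarian * p_librarian / p_desc
print(f"P(librarian | description) = {p_librarian_given_desc:.2f}")  # ~0.17
# Even a strongly librarian-like description leaves the farmer as the
# better bet, because the base rate dominates.
```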

We often overestimate our ability to make correct decisions and judgments – and Daniel Kahneman, psychologist and winner of the Nobel Prize in economics, can explain why.

In his book “Thinking, Fast and Slow,” Kahneman reveals the limitations of intuition – the “inner voice” we romanticize as something almost sacred, but which often lets us down.

Describing our mental life from the viewpoint of “two systems” – that is, fast and slow thinking – he tells us why we shouldn’t blindly believe everything that comes to mind.

Here is our short summary of the book.

Part I. Two systems

To begin with, we should mention that the term “systems,” which Kahneman uses to explain our thoughts and actions, is not used in the traditional sense. The systems he talks about are not entities consisting of parts that interact with each other.

Kahneman does this on purpose: by describing the systems as agents, he lets readers perceive them as fictional characters with individual personalities, abilities, and limitations – which makes his point easier to grasp.

Consider two examples:

1. You see a woman’s face. Her mouth is wide open, as if she’s about to shout; her brows are furrowed. It takes you less than a second to conclude that she’s angry. That’s an instance of fast thinking – and that’s System 1.

2. You see the problem 17 × 24. You immediately know it’s a multiplication problem, and you can solve it – with paper and pencil, or without – but relatively slowly. This is a case of slow thinking – and this is System 2.

As we can see, System 1 represents our automatic thinking. It operates quickly, with little or no voluntary control. System 2, by contrast, handles effortful mental activities. And even though we identify with System 2, assuming it’s our conscious self that makes our choices and decisions, that’s not quite the case.

System 1 generates ideas for System 2: intuitions, impressions, and feelings. It processes situations, and it’s usually good at it, since our initial reactions are generally appropriate.

However, System 1 is biased and prone to making systematic errors. The task of System 2, which is in charge of self-control, is to overcome the impulses of System 1:

“System 2 is the only one that can follow rules, compare objects on several attributes, and make deliberate choices between options. The automatic System 1 does not have these capabilities. System 1 detects simple relations (“they are all alike,” “the son is much taller than the father”) and excels at integrating information about one thing, but it does not deal with multiple distinct topics at once.”

Attention and effort

We believe that our thoughts and actions are chosen by System 2, but in fact, they are the result of System 1's work. 

This phenomenon has a long evolutionary history: the ability to respond to threats immediately and take self-protective actions is still encoded in our memory. For example, if your car starts skidding on oil, you’ll respond before you’re fully conscious of it.

However, to make serious judgments, we need to hold several ideas in memory at once – and that does require effort. Combining intuition with knowledge makes us work harder, and often we simply don’t bother.

A bias to believe and confirm

System 1 is gullible and biased to believe, while System 2 doubts and checks everything. The problem is that System 2 is often busy, or even lazy, so when it comes to forming an opinion or making a prediction, the work is done by System 1.

Unfortunately, the associative nature of System 1 makes it test hypotheses the wrong way: not by trying to refute them, as philosophers of science advise, but by deliberately searching for data compatible with the beliefs we already hold.

Not only does System 1 accept information uncritically, it also tends to exaggerate the likelihood of extreme and improbable events. If someone asks you to estimate the likelihood of a tsunami hitting California, you will probably overestimate it – for the simple reason that the images that come to mind are dramatic and terrible.

Assessing normality

We have a very clear idea of what’s normal in our own personal world, and System 1 is responsible for the way our mind links events, circumstances, and outcomes that occur together. These links form a pattern – a structure of events – that determines our interpretation of the present as well as our expectations of the future.

Whatever is normal doesn’t surprise us, and whatever is abnormal does.

For example, the question “How many animals of each kind did Moses take into the ark?” can be misleading. Of course, you know it was Noah, not Moses, who took the animals into the ark. But Moses and Noah are close enough in category (both are biblical characters), so the question doesn’t surprise you enough to make you alert and notice the flaw.

This peculiarity of our perception can be an obstacle to an objective interpretation of the world.

Ease, mood, and intuition 

Good mood, intuition, creativity, and gullibility – the hallmarks of System 1 – come as a cluster. This means that when you’re in a good mood, you’ll rely more on your intuition and be more creative – but at the same time, you’ll be less vigilant and more prone to logical errors, because your System 2 loosens its grip.

According to Kahneman, there is a link between positive affect and cognitive ease – the ease with which we perceive information. 

When we decide that anything requiring extra effort isn’t worth our attention, we fall into a trap and miss good opportunities. So if, for example, you’re tempted to dismiss a business plan because the font is hard to read, reconsider your decision.

Part II. Heuristics and biases

Heuristics – problem-solving shortcuts based on previous experience – lead to numerous mistakes in judgment, and to biases.

Kahneman discusses these biases from the viewpoint of the two Systems, saying that people who let themselves be guided by System 1 are more susceptible to biases.

Cause and chance

People tend to think that everything has a cause. But causal thinking blinds us to “the randomness of truly random events”: randomness is often mistaken for regularity, and that means we overcomplicate things.

For example, during the bombing of London in World War II, there was a suspicion that German spies were located in the unharmed areas, because the gaps looked too conspicuous. 

Causal thinking made people draw the wrong conclusion: statistical analysis revealed that the bombing actually was random, even if it didn’t look like it.
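A quick simulation makes the point: even perfectly random hits leave conspicuous gaps. The grid size and number of strikes below are made up; only the uniform randomness matters.

```python
# Drop bomb hits uniformly at random on a grid of districts and count
# how many districts escape untouched. Grid size and hit count are
# made up; the point is that pure chance produces conspicuous gaps.
import random

random.seed(1)
GRID, HITS = 10, 100            # 10 x 10 districts, 100 random strikes

cells = [[0] * GRID for _ in range(GRID)]
for _ in range(HITS):
    x, y = random.randrange(GRID), random.randrange(GRID)
    cells[y][x] += 1

untouched = sum(row.count(0) for row in cells)
print(f"Untouched districts: {untouched} of {GRID * GRID}")
# Expect roughly (1 - 1/100)**100, about 37%, to escape entirely:
# "safe zones" that need no spies to explain them.
```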

The law of small numbers

According to this law, people often assume that a small sample can faithfully represent the bigger picture. In reality, it makes us see the world as simpler than it is.

To counteract this illusion, we should not trust statements made on the basis of a small amount of data. For example:

“Yes, the studio has had three successful films since the new CEO took over. But it is too early to declare he has a hot hand.”
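To see how easily luck manufactures a “hot hand,” here is a small simulation that assumes – purely for illustration – that every film is an independent 50/50 bet:

```python
# Simulate a CEO whose every film is an independent 50/50 bet (an
# assumption made purely for illustration) and ask how often pure
# luck produces three hits in a row.
import random

random.seed(42)
TRIALS = 100_000

streaks = sum(
    all(random.random() < 0.5 for _ in range(3))   # three hits in a row?
    for _ in range(TRIALS)
)
print(f"P(3 straight hits by luck alone) ~ {streaks / TRIALS:.3f}")  # ~0.125
# One unremarkable CEO in eight opens with a "hot hand" streak, which
# is why three data points prove very little.
```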

Anchors

The anchoring effect occurs when we consider a particular value before estimating an unknown quantity: any number offered to you as a possible solution to an estimation problem becomes an anchor, because your own estimate will end up close to it.

For example, if the asking price of a house is high, you will perceive the house as more valuable – even if you’re trying to resist the influence of the anchor (the original price).

There is a technique for resisting the anchoring effect. If the seller sets a totally outrageous price for a house, you shouldn’t respond with an equally outrageous counteroffer.

What Kahneman suggests is unusual but effective: instead, leave the room, or at least threaten to do so, if the other side insists on those numbers. By mobilizing your System 2 this way, you both protect yourself and make it clear to the other party that the number just doesn’t work for you.

The science of availability

Our personal experiences and vivid images from the past are more present in our minds than incidents that happened to other people – even if these incidents are similar to what happened to us. If a judge makes a mistake that affects you, it will bother you more than a similar incident you read about in a newspaper.

This is why self-serving biases are so common: we remember our own contributions better than the contributions of others – even when theirs are just as valuable.

There is also the phenomenon of the availability cascade – a self-reinforcing chain of events that may start with a minor incident (or with media reports of more serious ones) and build into public panic.

According to Kahneman, the most significant practitioners of availability cascades are terrorists: even though the number of casualties from terrorism is small compared to other causes of death, and even though the probability of an attack is tiny, the images of disaster replayed by the media make us panic. The reason is the repetition and vividness of images that speak to System 1.

The failure of System 2 to process information in time makes us trust our first impressions and fall victim to the biases mentioned above – and many others. System 2 requires special training: we should learn to doubt our first impressions. This is possible, but tiresome.

Part III. Overconfidence

“Paradoxically, it is easier to construct a coherent story when you know little, when there are fewer pieces to fit into the puzzle. Our comforting conviction that the world makes sense rests on a secure foundation: our almost unlimited ability to ignore our ignorance.”

People have a strong erroneous conviction that they can understand the past. Logically, this makes them believe they can predict the future. This is an illusion.

Kahneman says he has heard many people claim that they “knew the 2008 crisis was inevitable.” Pay attention to the choice of words: they said they “knew,” not “thought” or “felt.” And it is not true. They may have thought a crisis was possible, but they had no information that it was going to happen for sure.

The truth is, we cannot reconstruct our past beliefs accurately. Very often we succumb to hindsight bias – the “I-knew-it-all-along” effect, which results from our inability to remember how surprised we were when we first learned the outcome of an event.

The illusion of pundits

The future is unpredictable – but the ease with which the past is explained makes us think it isn’t. Think of the pundits in business and politics who are invited onto TV shows to comment on recent events and predict future ones. The irony, says Kahneman, is that those who know the most are often less reliable – because they are overconfident.

Philip Tetlock, a psychologist at the University of Pennsylvania, interviewed 284 people who made their living commenting on political and economic trends. He asked them questions both in areas they specialized in and in areas where they had less knowledge. The results were striking: the experts’ predictions were barely better than chance.

However, as the author points out, it is not the experts’ fault – the world is simply difficult. This is why, first, errors of prediction are inevitable, and second, subjective confidence should not be trusted.

Intuition and formulas

Intuition is not reliable; but what is?

Orley Ashenfelter, a Princeton economist, built a statistical formula to predict wine prices. Alongside the age of the vintage, the formula includes three features of the weather: the average temperature over the summer growing season, the amount of rain at harvest time, and the total rainfall during the previous winter.

The formula is remarkably accurate: the correlation between its predictions and actual prices is above .90.
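For readers curious about the shape of such a model, here is a sketch of a linear regression on the same kinds of features. The data is synthetic and the fitted coefficients are not Ashenfelter’s published ones; only the structure – price explained by vintage age plus three weather variables – follows the book.

```python
# Fit a linear model of log(price) on vintage age and three weather
# features. The data is synthetic; the coefficients below are NOT
# Ashenfelter's published ones. Only the model's structure is real.
import numpy as np

rng = np.random.default_rng(0)
n = 50
age = rng.uniform(1, 30, n)              # years since vintage
summer_temp = rng.uniform(15, 20, n)     # avg growing-season temperature (C)
harvest_rain = rng.uniform(50, 300, n)   # rain at harvest time (mm)
winter_rain = rng.uniform(300, 800, n)   # rain the previous winter (mm)

# Synthetic "true" prices follow the qualitative pattern the book
# describes: warm summers and wet winters help, harvest rain hurts.
log_price = (0.02 * age + 0.6 * summer_temp - 0.004 * harvest_rain
             + 0.001 * winter_rain + rng.normal(0, 0.1, n))

X = np.column_stack([np.ones(n), age, summer_temp, harvest_rain, winter_rain])
coef, *_ = np.linalg.lstsq(X, log_price, rcond=None)
print("intercept, age, summer_temp, harvest_rain, winter_rain:")
print(np.round(coef, 4))
```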

Hostility to algorithms

Yes, formulas are effective and reliable, but people don’t like them. The main reason is our tendency to value decisions made by humans over those made by machines – or by any other mechanical procedure.

On the one hand, we really can make the right decisions, especially if we are specialists in a narrow area. Psychologists can have hunches about the reaction of their patients during therapy sessions – and yes, very often they will be right. 

However, this concerns only short-term predictions, underlines Kahneman. When it comes to long-term ones, we usually fail. This is why he recommends using formulas wherever possible.

Optimists

We cannot choose to be optimists or pessimists – this disposition is largely inherited. Optimists are highly confident and can sustain that feeling in others; confidence also helps them act when action is needed. This might lead us to the hypothesis that optimists have the largest influence on our lives – but is that so?

The resilience to overcome difficulties, characteristic of optimists, is definitely a good thing. But their style also includes a less virtuous feature: they take credit for successes but little blame for failures, which can give them a distorted picture of reality.

Part IV. Choices

A theory of utility, created by the Swiss mathematician Daniel Bernoulli, states that people evaluate gambles not by dollar amounts but by the psychological value, or utility, of the wealth they would end up with – and since each extra dollar brings less utility the wealthier you are, the same gamble looks different depending on your starting wealth.

But Kahneman argues that Bernoulli’s theory is missing a crucial component: a reference point. Prospect theory adds this component, together with the concept of loss aversion, which makes it a more complete and realistic model.

To show how prospect theory works, Kahneman and his colleague Amos Tversky created the fourfold pattern – a table of high- and low-probability scenarios that explains how people attach value to gains and losses in each case (a code sketch of the underlying math follows the list):

1. Top left cell: high probability, gains (95% chance to win $10,000); risk averse. This is what Bernoulli talked about: when people consider the prospect of a large gain, they are averse to risk – and are even willing to accept less than the expected value of the gamble to lock in a sure thing.

2. Bottom left cell: low probability, gains (5% chance to win $10,000); risk seeking. This cell explains why lotteries are so popular: when there is a chance to win, even a tiny one, the temptation is hard to resist.

3. Bottom right cell: low probability, losses (5% chance to lose $10,000); risk averse. Here we can see how insurance companies make money: people pay more than the expected loss, not just for protection but to eliminate a worry.

4. Top right cell: high probability, losses (95% chance to lose $10,000); risk seeking. This is where people make dangerous choices: “Risk taking of this kind often turns manageable failures into disasters.”
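For the mathematically curious, the whole pattern falls out of two ingredients: a value function that is concave for gains, convex and steeper for losses, and a weighting function that overweights small probabilities. The sketch below uses the parameter estimates from Tversky and Kahneman’s 1992 paper, purely for illustration:

```python
# Reproduce the fourfold pattern from prospect theory's two ingredients:
# an S-shaped value function and an inverse-S probability weighting
# function. Parameters are the Tversky & Kahneman (1992) estimates,
# used here purely for illustration.

def value(x, alpha=0.88, lam=2.25):
    """Psychological value of a gain or loss relative to the reference point."""
    return x ** alpha if x >= 0 else -lam * (-x) ** alpha

def weight(p, gamma):
    """Decision weight: small p overweighted, large p underweighted."""
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

def gamble_value(p, x):
    gamma = 0.61 if x >= 0 else 0.69     # separate curvature for gains/losses
    return weight(p, gamma) * value(x)

for p, x in [(0.95, 10_000), (0.05, 10_000), (0.05, -10_000), (0.95, -10_000)]:
    sure = value(p * x)                  # taking the expected value for certain
    risky = gamble_value(p, x)
    attitude = "risk averse" if sure > risky else "risk seeking"
    print(f"{p:.0%} chance of {x:+,}: {attitude}")
# Prints exactly the four cells: risk averse for probable gains and
# improbable losses, risk seeking for improbable gains and probable losses.
```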

Regret

Speaking of choices, Kahneman also discusses regret, because it is so often experienced by decision makers. We feel regret when we realize there were alternatives to the choice we made – if it turned out to be the wrong one, of course. And regret carries its own biases.

Consider the following example: Mr. Brown never picks up hitchhikers, but yesterday he did – and got robbed. Mr. Smith always picks up hitchhikers; yesterday he also picked someone up – and also got robbed.

Mr. Brown will experience greater regret, because the event was abnormal for him. And for that same reason – because Mr. Brown did something out of character – people will blame him more than Mr. Smith. Which is ridiculous: after all, Mr. Smith exposes himself to robbery far more often. Shouldn’t he be the one with more to regret?

When making a decision, we often anticipate the pain of regret and try to avoid it. But susceptibility to regret shouldn’t poison our lives; there is a way to adjust to it. When things go badly, remind yourself that you did consider the possibility of regret when you made the choice.

Part V. Two selves

In addition to his two systems, Kahneman describes two selves – the experiencing self and the remembering self. They are different, and they are often in conflict – because we confuse one for the other.

For example, if a person listens to a symphony on a disc that is scratched at the end, producing a horrible sound, he will probably say the ending ruined the whole experience. But that’s not true: the experience already took place, and the bad ending couldn’t retroactively ruin it. In saying that it did, the person mistakes memory for experience:

“It is a compelling cognitive illusion—and it is the substitution that makes us believe a past experience can be ruined. The experiencing self does not have a voice. The remembering self is sometimes wrong, but it is the one that keeps score and governs what we learn from living.”

Rules of memory determine which past choice we remember liking more – and when we face a similar choice again, we pick the one we remember more fondly. Sometimes this leads us to genuinely bad choices.

In one of their experiments, Kahneman and his colleagues had people immerse a hand in cold water. The first episode lasted 60 seconds, with water at 14° Celsius. The longer one lasted 90 seconds; after the first 60 seconds, the temperature of the water rose by 1°, which caused a slight decrease in pain.

The participants were then told the experiment would continue and were given a choice of which session to repeat – the 60-second one or the 90-second one.

From the viewpoint of the experiencing self, the longer trial was worse. Yet 80% of the participants who reported that their pain diminished during the second phase chose to repeat it – even though it meant 30 seconds of unnecessary pain.

This experiment demonstrated the power of the remembering self: the longer trial left a better memory because it ended less painfully. It also revealed duration neglect – the difference between 60 and 90 seconds was simply ignored.
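Kahneman explains this result with the peak-end rule: the remembering self scores an episode by the average of its worst moment and its final moment, ignoring duration. Here is a toy sketch, with per-second pain scores invented for illustration:

```python
# Score each trial by the peak-end rule: memory keeps the average of
# the worst moment and the final moment, and ignores duration.
# The per-second pain scores are invented for illustration.

def remembered_pain(samples):
    """Peak-end rule: (peak + end) / 2, duration ignored."""
    return (max(samples) + samples[-1]) / 2

short_trial = [8] * 60              # 60 s at 14 C: constant pain of 8/10
long_trial = [8] * 60 + [7] * 30    # same 60 s, plus 30 s slightly warmer

print("total pain:", sum(short_trial), "vs", sum(long_trial))      # 480 vs 690
print("remembered:", remembered_pain(short_trial),
      "vs", remembered_pain(long_trial))                           # 8.0 vs 7.5
# The longer trial delivers strictly more total pain yet leaves a
# milder memory, which is why most participants chose to repeat it.
```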

Our remembering self often makes us feel worse than we should. So if you say an experience – a vacation, for example – was horrible, ask yourself: was there a good part? Maybe you’re dwelling on a bad moment that happened at the very end, like the horrible sound at the end of a symphony?

Intuition is misleading – yet it’s very hard to resist our nature. Resisting System 1 seems like a simple task: recognize the signs that you’re in a trap and call on System 2. Unfortunately, it doesn’t work this way – especially in stressful situations, when we react immediately and involuntarily.

This is why Kahneman says he wrote this book not for decision makers, but for critics. 

Indeed, it’s easier to spot the flaws in someone else’s decisions than in your own. But there is an upside: even when we can’t hear the voice of our own doubt, we can still anticipate the judgment of others – and that often saves us from unnecessary mistakes.

 
