• Alex Vikner

Thinking, Fast and Slow: How to Make Better Decisions

Daniel Kahneman, an expert in the psychology of decision-making and a Nobel Prize winner in behavioural economics, teaches us about how we think and act in this dense yet brilliant book.


1. The Two Systems

We have two primary systems for thinking. System 1 is fast and intuition-based, while System 2 is slow, thoughtful, and deliberate. Success relies on slow thinking and fast action.


System 1 provides the impressions that often turn into our beliefs. Our actions and emotions can be primed by events that we are not aware of. In other words, ideas prompt other ideas later on without our conscious involvement.


System 2 is responsible for monitoring and controlling suggestions from System 1 but is often lazy and places too much faith in intuition.


Cognitive strain mobilises System 2: you will make fewer errors but also be less creative. Cognitive ease, on the other hand, makes you more creative but also more prone to errors.


The main function of System 1 is to maintain and update a model of your personal world, which represents what is normal in it.


System 1 jumps to conclusions based on limited evidence. Intuitive thinking operates around WYSIATI (what you see is all there is), which is how we make sense of partial information in a complex world. However, WYSIATI is also behind many of the biases we fall victim to.


We are naturally drawn to easy solutions that demand as little mental effort as possible. We avoid mental overload by dividing tasks into multiple easy steps and committing intermediate results to long-term memory instead of the easily overloaded working memory.


When making judgements, we tend either to compute more information than we need or to match values across unrelated scales of intensity, a.k.a. intensity matching. This yields predictions that are as extreme as the evidence they are based on.


All forms of intuitive estimation are affected by a bias towards extreme answers. This occurs because the estimator starts from their own experience and then uses intensity matching.


When we can't find a satisfactory answer to a hard question quickly, System 1 will find a related question that is easier and answer it instead.


2. Heuristics and Biases

Heuristics are mental shortcuts we use to solve problems. Think of biases as prejudices.


The law of small numbers refers to the incorrect belief that small samples ought to resemble the population from which they are drawn.
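
To make this concrete, here is a minimal simulation sketch (not from the book; the coin-flip setup is my own illustration) showing how much more small samples wander from the population they are drawn from:

```python
import random

# Law of small numbers: small samples stray far from the true
# proportion (0.5 for a fair coin); large samples hug it.
random.seed(42)

def heads_share(sample_size: int) -> float:
    """Proportion of heads in one sample of fair coin flips."""
    flips = [random.random() < 0.5 for _ in range(sample_size)]
    return sum(flips) / sample_size

for n in (10, 100, 10_000):
    shares = [heads_share(n) for _ in range(1_000)]
    print(f"n={n:>6}: share of heads ranged from {min(shares):.2f} to {max(shares):.2f}")
```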


The anchoring effect occurs when a particular value for an unknown quantity influences your estimate of that quantity. For example, if you were asked whether Gandhi was more than 114 years old when he died, you would end up with a much higher estimate of his age than you would if the anchoring question referred to a death at 35.


The availability heuristic is the process of judging frequency by the ease with which instances come to mind. For example, we are currently living in the least violent time in history, yet most people are shocked when they hear that. Why do we hear about wars, viruses, murders and crime every day? Because this is also the best-reported time in history.


We try to simplify our lives by creating a world that is much tidier than reality; in the real world, we often face painful tradeoffs between benefits and costs.


Another bad tendency we have is to overweight evidence and underweight base rates. To discipline our intuitions, we need to recognise that base rates matter and that the diagnosticity of evidence is often exaggerated.
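
As a hedged illustration of why base rates matter (the test numbers below are invented for the example, not taken from the book), Bayes' rule shows how impressive-sounding evidence can still leave the base rate in charge:

```python
# Invented example: a rare condition (1-in-1,000 base rate) and a test
# with a 90% hit rate and a 5% false-positive rate.
base_rate = 0.001
p_pos_if_present = 0.90   # how "diagnostic" the evidence feels
p_pos_if_absent = 0.05

# Bayes' rule: P(present | positive test)
p_pos = p_pos_if_present * base_rate + p_pos_if_absent * (1 - base_rate)
posterior = p_pos_if_present * base_rate / p_pos

print(f"P(condition | positive test) = {posterior:.1%}")  # about 1.8%
# Intuition latches onto the 90% hit rate; the tiny base rate keeps the
# true probability below 2%.
```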


Adding detail to scenarios makes them more persuasive, but less likely to come true. This conjunction fallacy occurs when we judge the conjunction of two events to be more probable than one of the events alone in a direct comparison. Essentially, we replace probability with plausibility.


We are more likely to learn something from an individual case or example than from facts and statistics. Indeed, our unwillingness to deduce the particular from the general is matched only by our willingness to infer the general from the particular.


Assuming that something has returned to normal because of corrective actions taken while it was abnormal is called the regression fallacy. It fails to account for natural fluctuations. To illustrate this, draw a circle on a blackboard and ask some friends to throw a piece of chalk at the centre without looking. Those who did well on their first try tend to do worse on their second, and vice versa. The change in performance occurs naturally!
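
A minimal simulation sketch (my own illustration; the skill-plus-luck model and its numbers are assumptions, not from the book) shows the same regression appearing with no correction or learning at all:

```python
import random

# Each throw's score is a fixed skill level plus pure luck;
# nothing changes between the first and second throws.
random.seed(0)
n_players = 10_000
skill = [random.gauss(0, 1) for _ in range(n_players)]
first = [s + random.gauss(0, 1) for s in skill]
second = [s + random.gauss(0, 1) for s in skill]

# Players whose first throw landed in the top 10%.
cutoff = sorted(first, reverse=True)[n_players // 10]
top = [i for i in range(n_players) if first[i] >= cutoff]

def avg(xs):
    return sum(xs) / len(xs)

print(f"top group, first throw:  {avg([first[i] for i in top]):.2f}")
print(f"top group, second throw: {avg([second[i] for i in top]):.2f}")  # lower
# The drop is pure statistics: the top group's luck was unusually good
# the first time and merely average the second time.
```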


Rewards for improved performance work better than punishment of mistakes!


3. Overconfidence

When looking back at a problem, the solution seems so simple and obvious! That is called hindsight bias, and it often leads us to overestimate our ability to have predicted an outcome that could not possibly have been predicted.


In business writing, the demand for illusory certainty is met by two genres: histories of the rise and fall of particular individuals and companies, and analyses of differences between successful and less successful firms. However, the significant role of luck is often ignored!


To maximise predictive accuracy, final decisions should be left to formulas, especially in low-validity environments. Whenever we can replace intuition with a structured yet simple formula, we should at least consider it.
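
As a hedged sketch of what such a structured yet simple formula might look like (the traits, weights, and candidate scores below are invented for illustration, not Kahneman's), an equal-weights score can replace a global impression:

```python
# Hypothetical hiring example: rate a few pre-chosen traits from 1 to 5
# and add them up with equal weights; intuition only breaks ties.
TRAITS = ("technical skill", "reliability", "communication")

def score(ratings: dict) -> int:
    """Equal-weights sum of 1-to-5 ratings over the predefined traits."""
    return sum(ratings[trait] for trait in TRAITS)

candidate_a = {"technical skill": 4, "reliability": 5, "communication": 2}
candidate_b = {"technical skill": 3, "reliability": 4, "communication": 5}

print("Candidate A:", score(candidate_a))  # 11
print("Candidate B:", score(candidate_b))  # 12
# The ranking comes from the formula, not from a global impression of
# either candidate.
```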


Under normal conditions, an expert’s intuition can be trusted, but when dealing with less regular environments, be more sceptical.


The planning fallacy describes our tendency to plan projects based on best-case scenarios and without taking into account statistics of similar cases.


Inadequate appreciation of the uncertainty of the environment inevitably leads many of us to take risks that we should avoid. We become overconfident as a result of WYSIATI. When we estimate a quantity, we rely on information that comes to mind and construct a coherent story in which the estimate makes sense.


4. Choices

Bernoulli’s expected utility model lacks the idea of a reference point, yet the value of something largely depends on a person’s current situation.


Prospect theory teaches us that in mixed gambles, where both a gain and a loss are possible, we are naturally risk-averse, meaning that we dislike losing more than we like winning. But in bad choices, where the alternative to a gamble is a sure loss, we are more likely to seek risk.
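
A minimal sketch of a prospect-theory-style value function (the curvature and loss-aversion parameters below are the commonly cited Tversky-Kahneman estimates, used here purely for illustration):

```python
# Outcomes are gains or losses relative to a reference point,
# and losses loom larger than gains.
ALPHA = 0.88    # diminishing sensitivity for gains
BETA = 0.88     # diminishing sensitivity for losses
LAMBDA = 2.25   # loss aversion: losses weigh roughly 2.25x as much

def value(outcome: float, reference: float = 0.0) -> float:
    """Subjective value of an outcome measured against a reference point."""
    x = outcome - reference
    if x >= 0:
        return x ** ALPHA
    return -LAMBDA * ((-x) ** BETA)

print(round(value(100), 1))   # 57.5   -> pleasure of gaining 100
print(round(value(-100), 1))  # -129.5 -> pain of losing 100 is far larger
# A 50/50 bet to win or lose 100 therefore feels like a bad deal,
# even though its expected monetary value is zero.
```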


Linked to this is the endowment effect which posits that we naturally assign more value to things just because we own them.


We humans, like most animals, have brains that contain a mechanism designed to give priority to bad news. As a result, we tend to work harder to avoid losses than we do to secure gains. This asymmetric intensity of the motives to avoid losses and achieve gains shows up almost everywhere. For example, the fear of failing to reach a goal is much stronger than the desire to exceed it.


In fact, we're just as risk-seeking in the domain of losses as we are risk-averse in the domain of gains.


We often overestimate the probabilities of unlikely events, which causes us to overweight them in our decisions. That is why terrorism is so effective: it induces an availability cascade.


To avoid exaggerated caution induced by loss aversion, we should take a broad frame and think as if the decision is just one of many.


Rewards and punishments shape our preferences and motivate our actions. Different mental accounts keep track of all this.


People expect to have stronger emotional reactions such as regret to an outcome that is produced by action than to the same outcome when it is produced by inaction.


Single evaluations call upon the emotional responses of System 1, whereas comparisons involve more careful assessment, typically by System 2.


Different statements can evoke different reactions depending on how they are framed. For example, these two descriptions refer to the same outcome: (1) the 1-month survival rate is 90%; (2) there is 10% mortality in the first month. Yet 84% of physicians favour #1. Why? Because System 1 is rarely indifferent to emotional words: "mortality" is bad, "survival" is good. A 90% survival rate is encouraging, while 10% mortality is frightening.


5. Two Selves

We have an experiencing self that knows only the present moment, and a remembering self which keeps score and governs what we learn in order to make decisions.


Most people are indifferent to their experiencing self, caring only about the memories they collect to fuel different narratives.


When we judge the unpleasantness of a painful experience, we neglect its duration and instead focus on how we felt at its peak (the most intense point) and at its end: the peak-end rule. This causes a bias that favours a short period of intense joy over a long period of moderate happiness, and it can explain why you might prefer immediate over delayed gratification.
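
A hedged sketch of the peak-end rule as a simple score (the minute-by-minute pain numbers are invented for illustration):

```python
# Remembered pain tracks the worst moment and the final moment,
# not duration or total pain. Pain scores below use a 0-10 scale.
def remembered_pain(pain_by_minute: list) -> float:
    """Peak-end score: average of the worst moment and the last moment."""
    return (max(pain_by_minute) + pain_by_minute[-1]) / 2

short_and_abrupt = [2, 8, 8]              # ends at its worst moment
longer_with_taper = [2, 8, 8, 5, 3, 1]    # same peak, gentler ending added

print(sum(short_and_abrupt), sum(longer_with_taper))  # 18 vs 27 total pain
print(remembered_pain(short_and_abrupt))              # 8.0
print(remembered_pain(longer_with_taper))             # 4.5
# The longer episode contains more total pain yet is remembered as less
# unpleasant: duration neglect in action.
```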


Remember

Information without action is meaningless. Try to implement some of these lessons in your life to make more mindful and unbiased decisions.