"Every significant choice we make in life carries with it some uncertainty."
- Thinking, Fast and Slow, page 270
"The premise of this book is that it is easier to recognize other people's mistakes than our own."- Thinking Fast and Slow, page 28
Though this weighty tome (500+ pages including appendices) is scientific in nature, Kahneman’s thrust is pointedly human. A recurrent theme throughout is the immeasurable benefit of relationships. In a deep exploration of our fundamentally non-rational thinking, I was pleasantly surprised to find friendship presented not just as the primary source of happiness (“It is only a slight exaggeration to say that happiness is the experience of spending time with the people you love and who love you”) but also as a useful tool for self-exploration and personal growth.
Kahneman introduces two systems, the two ways we think, which he simply calls System 1 and System 2. They are, essentially, our unconscious emotional self (System 1) and our conscious logical self (System 2).
System 1 is automated and intuitive. When you see two objects, System 1 determines which is closer, which is larger, which is human and which is not. You don’t have to do anything; it just happens. System 1 reads the emotions on another person’s face. When you see “2 + 2” it says “4” without the effort of calculation. When you see the phrase “the capital of France” it blurts out “Paris” without being asked. It’s reading these words right now.
System 2 is cognition with intent. When you see “17 x 32” System 1 tells you this is a math problem, and even indicates whether you can solve it easily or will need a pencil or a calculator. Either way, when you actually work out the answer, System 2 is doing the work. Finding the shortest route to Duluth or defining “abstruse” or filling out your tax forms: all System 2 tasks.
System 1 is always on, always looking for answers, busy as a bee. It is everything we do automatically, instinctively, unconsciously, effortlessly. Because it is unconscious and effortless, it constantly serves up answers, whether we’ve asked for them or not. It creates our prejudices. It lets us drive without paying constant attention to every detail. System 1 tells us the couch we’re buying will fit in our living room because it looks so small in the showroom. It tells us that going 5 miles per hour over the speed limit will make up the 5 minutes we’re late for work.
System 2 is lazy. So lazy, in fact, that we are incapable of fully escaping our heuristic biases. Even Kahneman, who knows these things better than any of us, admits that knowing the facts isn’t enough: he regularly catches himself in the wrong thinking caused by System 2’s lazy dependence on System 1’s ability to quickly and authoritatively serve up answers, whether or not they’re correct.
The only recourse is to incite System 2 to do the work necessary to vet System 1’s input. If we’re smart, we’ll measure the living room and the couch before we buy it. System 2 can tell us that a 96” couch won’t fit in an 88” space, no matter what System 1 says about how small it looks in the showroom. System 2 can do the math to calculate that speeding by 5 mph will, even in perfect circumstances, only shave 30 seconds off our commute, despite our continued urge to drive faster.
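Here is a minimal sketch of that System 2 arithmetic. The commute length and speed limit are my own illustrative assumptions; the book doesn’t specify any numbers:

```python
# Back-of-the-envelope check on the speeding intuition.
# Assumes a hypothetical 6.5-mile commute with a 60 mph speed limit.

def minutes(distance_miles: float, speed_mph: float) -> float:
    """Travel time in minutes at a constant speed."""
    return distance_miles / speed_mph * 60

commute_miles = 6.5  # illustrative, not from the book
at_the_limit = minutes(commute_miles, 60)
five_over = minutes(commute_miles, 65)

print(f"At the limit: {at_the_limit:.2f} min")
print(f"5 mph over:   {five_over:.2f} min")
print(f"Time saved:   {(at_the_limit - five_over) * 60:.0f} seconds")
# Prints roughly 30 seconds saved -- nowhere near the 5 minutes we're late.
```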
Like optical illusions, where we can convince ourselves that, yes, the two lines are the same length, but can never stop seeing them as different lengths, our cognitive illusions cannot fully be overcome. They can, however, be observed by others. The last page of the book points out that it is “much easier to identify a minefield when you see others wandering into it than when you are about to do so.”
Intimate, trusting relationships are our greatest protection against the mistakes, prejudices, and biases of System 1. When we have a support network of friends and family, we have access to observers who can nudge us to check with System 2 when it seems System 1 is leading us astray.
The insights in Thinking, Fast and Slow are certainly worth the time and effort the book takes to read, and they go far beyond the two Gems I detail below.
"[T]he [bias] errors that individuals make are independent of the errors made by others, and (in the absence of a systematic bias) they tend to average to zero."- Thinking Fast and Slow, page 84
In The Wisdom of Crowds, James Surowiecki shows that, while guessing the number of jelly beans in a jar is hard for individuals, large groups tend to be remarkably accurate. One requirement, though, is that members of the group not be allowed to influence each other.
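Kahneman’s point about independent errors averaging to zero is easy to see in a toy simulation. The jar size and the spread of the guesses below are invented purely for illustration, but they show why a crowd of wildly wrong individuals can still be collectively accurate:

```python
# Toy simulation: independent, unbiased guessing errors average out.
# The true count and the error spread are invented for illustration.
import random

random.seed(42)
true_count = 850  # jelly beans actually in the jar
guesses = [true_count + random.gauss(0, 250) for _ in range(1000)]

crowd_average = sum(guesses) / len(guesses)
typical_miss = sum(abs(g - true_count) for g in guesses) / len(guesses)

print(f"Crowd average:           {crowd_average:.0f}")
print(f"Typical individual miss: {typical_miss:.0f}")
# Individuals miss by roughly 200 beans, yet the crowd average lands
# within a few percent of 850 -- so long as the errors stay independent.
```

If the guessers influence each other, their errors become correlated and no longer cancel out, which is exactly why Surowiecki’s independence requirement matters.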
Reinforcing research reported in Jonah Lehrer’s Imagine, Kahneman offers a suggestion for gathering the best insights when groups make decisions. Before an issue is discussed, all members of the group should write a brief summary of their perspective. Rather than giving preference to those who speak loudest or most eloquently, this makes better use of the diversity of knowledge within the group.
"A question we considered early was how many instances must be retrieved to get an impression of the ease with which they come to mind. We now know the answer: none."- Thinking Fast and Slow, page 130
If you were to list 6 specific instances where you were polite to people, you would finish the task feeling rather good about yourself. Suppose you were to list 12 instances instead. Twice as good?
Not even close. In fact, you would probably end up with the distinct impression that you are not polite enough.
The ease with which we recall events contributes to a powerful bias: we assume that ease of recall equals frequency. When specific instances are easy to recall, we consider the event more common overall than when specific instances seem difficult to recall. We’re likely to overestimate the frequency of celebrity divorce, for instance, because the media ensures that each celebrity divorce becomes memorable.
When we’re listing 6 instances of our good manners, they are about as easy to retrieve as we expect. The next 6, however, are frequently a bit harder to dredge up, simply because we don’t usually commit those events to memory. Our unconscious System 1 is surprised and interprets this surprise as evidence of infrequency: we’re not polite as often as we thought we were.
The bias can be eliminated by shifting to System 2: be aware of the ease-of-recall effect, and accept in advance that additional instances of a behaviour will be difficult to remember.
Imagine how your next T-chart, pros on the right, cons on the left, will be changed by focusing on the number and quality of entries on each side rather than the ease of recall.
I’ll close with selected quotes that gave me that goose-bumps, “write that down now” feeling: