Thinking Fast and Slow is fundamentally a book about biases. So, the first question is, what is a bias? A bias, as Kahneman defines it, is a systematic error in thinking. The important word here is systematic. Not all errors are biases. For an error to be a bias, it has to be reproducible — that is to say, it must recur predictably in particular circumstances. What those errors are, and, just as importantly, which circumstances cause those errors to occur, is the topic of this book.
So why did Kahneman decide to study biases? His interest goes back to a seminar that he held with an “up and coming” psychological researcher named Amos Tversky. Their research started with a study of whether people are intuitively good statisticians. It turned out that people are actually quite poor at calculating statistics intuitively. In fact, even trained statisticians will overestimate the power of small sample sizes or the significance of a given statistical result, despite having years of experience in the field.
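The small-sample point can be made concrete with a quick simulation (my own illustrative sketch, not an example from the book — the function name and threshold are arbitrary choices). Flipping a fair coin, small samples produce “striking” lopsided results far more often than large ones, even though nothing about the coin changes:

```python
import random

random.seed(42)  # fixed seed so the simulation is repeatable

def extreme_share(sample_size, trials=10_000, threshold=0.7):
    """Fraction of samples of a fair coin whose heads-proportion is
    'extreme': at or beyond the threshold in either direction."""
    extreme = 0
    for _ in range(trials):
        heads = sum(random.random() < 0.5 for _ in range(sample_size))
        proportion = heads / sample_size
        if proportion >= threshold or proportion <= 1 - threshold:
            extreme += 1
    return extreme / trials

# Roughly a third of 10-flip samples look lopsided (>= 70% heads or
# tails), while 1000-flip samples essentially never do.
for n in (10, 100, 1000):
    print(n, extreme_share(n))
```

An intuitive statistician who trusts a 10-observation sample is, in effect, mistaking this kind of noise for signal — which is exactly the error Kahneman and Tversky found even among professionals.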
Until that point, the cognitive science view of the mind held that people were basically rational, and that departures from rationality could be attributed to emotion. Kahneman and Tversky’s results with statisticians were the first chink in that worldview. As they pursued their research, they would find that people were not basically rational, and that departures from rationality were actually the norm in many situations, even when strong emotions weren’t involved. This finding opened up new avenues of research in psychology, sociology, economics and many other sciences, and eventually earned Kahneman the Nobel Memorial Prize in Economics.
Chapter 1: The Characters of the Story
Kahneman builds upon the two-systems view of the mind initially proposed by Keith Stanovich and Richard West. In this view, our cognition can be divided into two semi-autonomous systems, titled, unsurprisingly, System 1 and System 2. System 1 operates quickly, automatically, and unconsciously, whereas System 2 is slow, effortful, and conscious. Stated otherwise, System 1 is the system that is in use most of the time, for day-to-day activities, whereas System 2 is the system that’s in use when we’re conscious of thinking or mental effort. This thinking effort can actually be measured. When System 2 goes into action, people’s pupils dilate, their muscles tense, and their blood pressure, heart rate and respiration all increase.
Because of this, our brains prefer to run System 1 activities whenever possible, and switch to System 2 only when there is a stimulus that System 1 cannot handle. However, this leads to a paradox, because in order to detect such a stimulus, we have to be paying some level of attention to our surroundings. This paradox is the fundamental source of bias: our brains miss a stimulus, and, in addition, do not know that they missed that stimulus. We don’t know, and we don’t know that we don’t know.
The “invisible gorilla” video demonstrates this perfectly. The video consists of two basketball teams, wearing white and black shirts, respectively, passing the ball amongst themselves. Viewers are asked to count the number of passes made by the team wearing white. In the background, a researcher wearing a gorilla suit walks into the scene, thumps her chest and walks out. When the viewers are asked whether they saw anything unusual in the film, more than half deny seeing anything besides two teams passing basketballs around. Moreover, even when pressed, they deny that there was anything unusual going on and insist that they would have noticed something as outlandish as a person in a gorilla suit walking across the background. Yet, when the same video is shown without the accompanying instructions, the gorilla is plainly visible to everyone. This demonstrates the core premise of the book: not only do humans have cognitive blind spots, but we’re also blind to where those blind spots are.
So, what is to be done? Are we doomed to a lifetime of mistakes and irrationality, caused by cognitive illusions that we are not even aware of? Unfortunately, we cannot just turn off System 1. It runs unconsciously and automatically as long as we are awake. And even if we could, we wouldn’t want to turn off System 1. System 1 is fast, efficient, and, most of the time, correct. The goal of Thinking Fast and Slow is to help us recognize the situations in which System 1 is not correct, so we can approach those situations with System 2 “on standby”, ready to override our initial, incorrect impulses.
Finally, it is important to note that System 1 and System 2 are not literal, physical systems in the brain. Rather, they’re abstractions for modes of thinking, allowing us to refer to fast, intuitive thinking and slow, effortful thinking with the snappier titles System 1 and System 2, thus making it easier to reason further about them.