I came relatively late to the pop-science subgenre of behavioral economics. It was 2013, and the inimitable Maia Bittner had just loaned me her copy of Predictably Irrational. Dan Ariely’s book is an entertaining (if merciless) skewering of the fantastic nature of rational thought, but it left me with more questions than answers. That we’re not the rational, self-interested actors of mainstream economics–Thaler’s Econs, or Pareto’s Homo economicus–wasn’t in doubt. But which of the mechanisms buried inside our all-too-Human brains underpin our predictably irrational behaviors?
Thinking, Fast and Slow provides some answers. In Daniel Kahneman’s telling, our thoughts are the product of a two-stage engine dependent both on intuition (the limbic brain of Paul MacLean, which Kahneman calls “System 1”), and the deliberate (though slower) machinations of our conscious mind (“System 2”).
System 1 automatically models familiar situations and deploys a variety of heuristic techniques to enable swift, subconscious decisions. This sort of mechanical thinking enables an expert pianist to thoughtlessly translate sheet music into a deft touch of the keys. It’s also the system we use every day to cope with decisions that need a quick (and generally correct) model of the world.
System 2 requires more effort. When you or I are struggling to play the piano, our intense concentration is System 2 at work. This sort of thinking is more attentive, more deliberate, and better able to check its own reasoning, but–as anyone who’s suffered the pain of learning will know–it’s small surprise that we’ve evolved faster ways to think.
Together, the two systems strike an important balance. Most of our thinking takes place in System 1, as quickly and effortlessly as possible, while System 2 still allows us to absorb new information and think critically about it.
The trouble begins when our intuition tackles a new situation without enlisting our slower, more deliberate mind. In his bright prose, Kahneman recounts the clever (and often devious) experiments that he and collaborator Amos Tversky used to reveal our cognitive shortcomings. Biases abound, illusions feel real, and overconfidence is the order of the day. It’s an unsettling read–at least, it should feel unsettling–but the results will delight armchair psychologists and needle freshwater economists alike.
The summary of all this research is the glum conclusion that “System 1 is not readily educable.” Overcoming errors of intuition, then, means engaging the slow, effortful System 2. But since these same errors often arise precisely as we feel under pressure to act, we often lack the cognitive capacity needed to recognize the “cognitive minefield” and extricate ourselves from it.
Recognizing this, Kahneman proposes that “Humans, unlike Econs, need help to make good decisions, and there are informed and unintrusive ways to provide that help.”
The prescription goes like this: organizations seeking to make good decisions, whether in business or society at large, must recognize and correct the inherent biases of their individual members. Whether through checklists, forecasts, or other quality-control mechanisms, these communities must actively seek to improve their collective decision making.
This final dagger through the fairytale of rational thought gives unexpected affirmation to our ability to help one another improve. While I didn’t pick up the book expecting a think-piece on social cohesion, the idea of treating irrational decisions as an organizational challenge was one of the most memorable takeaways.
If the book’s length feels off-putting, fear not: if you’re willing to take the science for granted, the cliff notes in the ten-page conclusion hit the salient points. But if you’re as interested in the research itself as in its increasingly familiar results, Thinking, Fast and Slow offers a window into the limits of our animal brains. I enjoyed it immensely. I hope you will, too!