I recently finished reading Daniel Kahneman’s Thinking, Fast and Slow. Four stars out of five. It’s a fascinating book full of insights from decades of research, interspersed with anecdotes about how those ideas developed over time. Great for the non-specialist. It’s also long, weighing in at 500 pages. Read it — but learn to read quickly.
I won’t try to summarize or draw general lessons here (plenty of other reviews have done so already). I just want to highlight one idea that stood out as I’ve been thinking about management and decision-making in contexts of complexity: the flaws of confidence, intuition and expert judgment.
Much of the book explores biases in human reasoning, especially the quick judgments we make (using what Kahneman calls System 1, in contrast to the slower and more deliberate reasoning of System 2). Yet these quick judgments can also be powerful, especially when they come from experienced professionals. The standard anecdote to illustrate this is the story of the firefighting commander who pulled his team out of a building just moments before it collapsed, based on the intuition that something was wrong. Such examples come from the research of psychologist Gary Klein, and have been popularized by writers like Malcolm Gladwell.
So on the one hand, we have the successes of intuition; on the other, the failures caused by cognitive biases. What distinguishes the two? In a chapter on expert intuition, Kahneman recounts his work with Klein to find the border between intuition’s pitfalls and promises.
When to trust intuition: Not confidence or experience alone, but context too
From Kahneman’s other work on cognitive biases, he and Klein knew that one’s confidence in a belief is poorly correlated with the belief’s accuracy. We tend to be confident in stories that come to mind easily (the availability heuristic), regardless of how strongly the evidence supports them. Many studies have shown that experts in a variety of fields are just as susceptible to this as laypeople. Someone’s experience and technical knowledge are therefore insufficient reasons to trust their intuition.
To dig deeper, Kahneman and Klein worked from a psychological model of recognition-primed decision making. In this model, intuition results from a process of recognizing patterns and similarities to past situations that have either been experienced directly or learned from others. (Of course, not all intuition is about a quick decision made in a collapsing building. Often it arises as a “gut instinct” influencing a more deliberate analytical process. Intuition serves different ends in those two settings but operates similarly in both. Either way, one of its strengths is to quickly weigh data of different types, from different sources, and combine them into a coherent judgment. “Thinking fast” refers to the speed of the thinking, not the tempo of the situation we are in.)
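To see what the model claims, here is a deliberately crude caricature of recognition-primed decision making as nearest-neighbour retrieval over remembered cases. This sketch is my own illustration; Klein’s model is psychological, not an algorithm, and the cues and actions below are invented:

```python
from math import dist

# A crude caricature of recognition-primed decision making as
# nearest-neighbour retrieval: match the cues of the current situation
# against remembered cases and reuse the action that worked before.
# Cues and actions here are invented for illustration only.

# Each remembered case: (cue vector, action that worked that time)
memory = [
    ((0.9, 0.1, 0.7), "evacuate the building"),
    ((0.2, 0.8, 0.3), "ventilate the roof"),
    ((0.1, 0.2, 0.1), "attack the fire directly"),
]

def intuit(cues):
    # "Recognition": retrieve the most similar past situation.
    _, action = min(memory, key=lambda case: dist(case[0], cues))
    return action

print(intuit((0.8, 0.2, 0.6)))  # -> "evacuate the building"
```

The caricature makes the model’s key dependency explicit: the judgment can only be as good as the library of remembered cases it draws on.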
However, this model leads to valid intuitions only under certain conditions, which Kahneman summarizes as two requirements: first, an environment that is regular enough to be predictable; and second, prolonged practice that allows an expert to learn those regularities.
This means that context matters. In an irregular or unpredictable environment, we should always be skeptical of intuitive judgments (including our own). Firefighters, chess players, doctors, and even athletes face repeated situations that are orderly enough to be predictable and to allow the acquisition of skill. Political commentators and stockbrokers? Not so much. Nassim Taleb explored similar ideas in his analysis of the financial industry in Fooled by Randomness: apparent expertise is often an illusion better explained by chance and survivorship bias.
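Taleb’s point is easy to demonstrate with a quick simulation (my own toy example, not one from either book): let thousands of “stock pickers” make purely random calls, then look only at the survivors.

```python
import random

# Toy survivorship-bias demo: 10,000 "stock pickers" each make 10
# yes/no market calls completely at random. In hindsight, the lucky
# few with perfect records look exactly like experts.

random.seed(1)  # reproducible illustration
pickers, calls = 10_000, 10
perfect = sum(
    all(random.random() < 0.5 for _ in range(calls))  # ten lucky guesses?
    for _ in range(pickers)
)
print(f"{perfect} of {pickers} pickers were right on all {calls} calls")
# Expected value: 10_000 / 2**10, i.e. roughly 10 flawless "experts"
# whose entire track record is chance.
```

Roughly ten of the ten thousand survive with flawless records through luck alone, and those ten are the ones who end up looking like visionaries.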
For better or worse, this line of reasoning takes us toward the pessimistic view that intuition is basically useless in situations of complexity. I’m not sure if I fully believe that.
Can complexity science help?
A 2008 working paper from ODI (h/t Harry Jones) mentioned that complexity concepts and related tools (such as agent-based modelling) can provide support for intuition. Complexity tools serve to make situations more navigable, helping us to study and understand them in ways we couldn’t before. I think the gist of the argument is that we can combine the insights of complexity tools with our intuition in a given situation. This leverages the strength I mentioned above: intuitive judgment allows us to combine data of multiple types and sources and make decisions when more rigorous analytical models might be impractical.
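For a flavour of what such a tool looks like in practice, here is a minimal agent-based model: a toy Schelling-style segregation grid. This is my own illustration (the ODI paper doesn’t prescribe any particular model), but it shows the essential move: a few local rules generate a global pattern that is hard to anticipate analytically.

```python
import random

# A minimal agent-based model: a toy Schelling-style segregation grid.
# All names and parameters are illustrative, not taken from the ODI paper.

SIZE = 20          # the world is a SIZE x SIZE grid (wrapping at the edges)
EMPTY = 0.2        # fraction of cells left empty
THRESHOLD = 0.3    # an agent is unhappy if under 30% of neighbours match it

def make_grid():
    def cell():
        r = random.random()
        if r < EMPTY:
            return None                          # empty cell
        return "A" if r < (1 + EMPTY) / 2 else "B"
    return [[cell() for _ in range(SIZE)] for _ in range(SIZE)]

def unhappy(grid, x, y):
    agent = grid[y][x]
    if agent is None:
        return False
    same = total = 0
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if (dx, dy) == (0, 0):
                continue
            neighbour = grid[(y + dy) % SIZE][(x + dx) % SIZE]
            if neighbour is not None:
                total += 1
                same += neighbour == agent
    return total > 0 and same / total < THRESHOLD

def step(grid):
    """Move each unhappy agent to a random empty cell; return how many moved."""
    movers = [(x, y) for y in range(SIZE) for x in range(SIZE) if unhappy(grid, x, y)]
    empties = [(x, y) for y in range(SIZE) for x in range(SIZE) if grid[y][x] is None]
    random.shuffle(empties)
    for (x, y), (ex, ey) in zip(movers, empties):
        grid[ey][ex], grid[y][x] = grid[y][x], None
    return len(movers)

grid = make_grid()
for t in range(100):
    if step(grid) == 0:       # everyone is content: clusters have formed
        print(f"settled after {t} steps")
        break
```

Even this toy reproduces Schelling’s famous result: mildly clannish individual preferences produce heavily segregated neighbourhoods that nobody intended. That non-obvious macro insight is the kind of output a complexity tool can feed back into a decision-maker’s intuition.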
However, the tools and lenses of complexity science stop short of actually simplifying situations that are otherwise complex. We never get the regularity that (according to Kahneman and Klein’s model) our intuitions need in order to become trustworthy. So if we’re going to use complexity tools to guide our intuitions, we still need to proceed cautiously — or perhaps not at all.
Here are a few other excerpts from the book that readers of this blog might appreciate. I especially like this bit about how organizational processes and bureaucracy might lead to better decisions, precisely because they slow decision-making down. Of course, the quality of the output depends on the design of the process. The image of organization-as-decision-factory is one I may use again:
On the topic of complexity, I could have written a whole other post on this bit about the “knowability” of the world: