Thinking, Fast and Slow

I recently finished reading Daniel Kahneman’s Thinking, Fast and Slow. Four stars out of five. It’s a fascinating book full of insights from decades of research, interspersed with anecdotes about how those ideas developed over time. Great for the non-specialist. It’s also long, weighing in at 500 pages. Read it — but learn to read quickly.

I won’t try to summarize or draw general lessons here (plenty of other reviews have done so already). I just want to highlight one idea that stood out as I’ve been thinking about management and decision-making in contexts of complexity: the flaws of confidence, intuition and expert judgment.

Much of the book explores biases in human reasoning, especially the quick judgments we make (using what Kahneman calls System 1, in contrast to the slower and more deliberate reasoning of System 2). Yet these quick judgments can also be powerful, especially when they come from experienced professionals. The standard anecdote to illustrate this is the story of the firefighting commander who pulled his team out of a building just moments before it collapsed, based on the intuition that something was wrong. Such examples come from the research of psychologist Gary Klein, and have been popularized by writers like Malcolm Gladwell.

So on the one hand, we have the successes of intuition. On the other hand, the failures of cognitive biases. What distinguishes the two? In a chapter on expert intuition, Kahneman recounts his work with Klein to find the border between intuition’s pitfalls and promises.

When to trust intuition: not confidence, or even experience, but context

Based on Kahneman’s other work on cognitive biases, he and Klein knew that one’s confidence in a belief is poorly correlated with the belief’s accuracy. We tend to be confident in stories that come to mind easily (availability heuristic), regardless of how strongly the evidence supports them. Many studies have shown that experts in a variety of fields are just as susceptible to this as laypeople. Someone’s experience and technical knowledge are insufficient reasons to trust their intuition.

To dig deeper, Kahneman and Klein worked from a psychological model of recognition-primed decision making. In this description, intuition results from a process of recognizing patterns and similarities to past situations that have either been experienced directly or learned from others. (Of course, not all intuition is about a quick decision made in a collapsing building. Often it arises as a “gut instinct” influencing a more deliberate analytical process. Intuition serves different ends in these two modes, but it operates similarly in both. Either way, one of its strengths is to quickly weigh different types of data from different sources into a coherent judgment. “Thinking fast” is about the speed of the thinking, not the tempo of the situation we are in.)

However, this model can only lead to valid intuitions under certain conditions. First, a regular and predictable environment is needed, and second, prolonged practice allows an expert to learn these regularities. Here’s how Kahneman summarizes it:

Memory also holds the vast repertory of skills we have acquired in a lifetime of practice, which automatically produce adequate solutions to challenges as they arise, from walking around a large stone on the path to averting the incipient outburst of a customer. The acquisition of skills requires a regular environment, an adequate opportunity to practice, and rapid and unequivocal feedback about the correctness of thoughts and actions. When these conditions are fulfilled, skill eventually develops, and the intuitive judgments and choices that quickly come to mind will mostly be accurate. (p. 416)

This means the context matters. In an irregular or unpredictable environment, we should always be skeptical of intuitive judgments (including our own). Firefighters, chess players, doctors, and even athletes face repeated situations that are fundamentally orderly enough to be predictable and to allow the acquisition of skill. Political commentators and stockbrokers? Not so much. Nassim Taleb explored similar ideas in his analysis of the financial industry in Fooled by Randomness: expertise is often an illusion better explained by chance and survivorship bias.

For better or worse, this line of reasoning takes us toward the pessimistic view that intuition is basically useless in situations of complexity. I’m not sure if I fully believe that.

Can complexity science help?

A 2008 working paper from ODI (h/t Harry Jones) suggested that complexity concepts and related tools (such as agent-based modelling) can provide support for intuition. Complexity tools serve to make situations more navigable, helping us to study and understand them in ways we couldn’t before. I think the gist of the argument is that we can combine the insights of complexity tools with our intuition in a given situation. This leverages the strength I mentioned above: intuitive judgment allows us to combine data of multiple types and sources and to make decisions when more rigorous analytical models might be impractical.

However, the tools and lenses of complexity science stop short of actually simplifying situations that are otherwise complex. We never get the regularity that (according to Kahneman and Klein’s model) our intuitions need in order to become trustworthy. So if we’re going to use complexity tools to guide our intuitions, we still need to proceed cautiously — or perhaps not at all.



Here are a few other excerpts from the book that readers of this blog might appreciate. I especially like this bit about how organizational processes and bureaucracy might lead to better decisions — precisely because they force slow decisions. Of course, the quality of the output depends on the design of the process. The image of organization-as-decision-factory is one I may use again:

Organizations are better than individuals when it comes to avoiding errors, because they naturally think more slowly and have the power to impose orderly procedures. Organizations can institute and enforce the application of useful checklists, as well as more elaborate exercises, such as reference-class forecasting and the premortem. At least in part by providing a distinctive vocabulary, organizations can also encourage a culture in which people watch out for one another as they approach minefields. Whatever else it produces, an organization is a factory that manufactures judgments and decisions. Every factory must have ways to ensure the quality of its products in the initial design, in fabrication, and in final inspections. The corresponding stages in the production of decisions are the framing of the problem that is to be solved, the collection of relevant information leading to a decision, and reflection and review. An organization that seeks to improve its decision product should routinely look for efficiency improvements at each of these stages. The operative concept is routine. Constant quality control is an alternative to the wholesale reviews of processes that organizations commonly undertake in the wake of disasters. There is much to be done to improve decision making. One example out of many is the remarkable absence of systematic training for the essential skill of conducting efficient meetings. (p. 417-418)

(Emphasis added.)

On the topic of complexity, I could have written a whole other post on this bit about the “knowability” of the world:

I have heard of too many people who “knew well before it happened that the 2008 financial crisis was inevitable.” This sentence contains a highly objectionable word, which should be removed from our vocabulary in discussions of major events. The word is, of course, knew. Some people thought well in advance that there would be a crisis, but they did not know it. They now say they knew it because the crisis did in fact happen. This is a misuse of an important concept. In everyday language, we apply the word know only when what was known is true and can be shown to be true. We can know something only if it is both true and knowable. But the people who thought there would be a crisis (and there are fewer of them than now remember thinking it) could not conclusively show it at the time. Many intelligent and well-informed people were keenly interested in the future of the economy and did not believe a catastrophe was imminent; I infer from this fact that the crisis was not knowable. What is perverse about the use of know in this context is not that some individuals get credit for prescience that they do not deserve. It is that the language implies that the world is more knowable than it is. (p. 201)

(Emphasis added.)

  1. Great post. I was a psychology major before turning to international development studies. I love this stuff – when you realize that disciplines overlap like this, you learn to keep your eyes open.

    I wonder how this influences areas where split second decisions count a lot? First thing that came to mind was health in humanitarian situations.

    Please do write about “knowability”. There are a lot of important implications there, especially in the language part as you emphasised.


  2. On my way home from a speech I gave about the limits of what data can tell us and the need to not dismiss our intuition in grantmaking, I read recently about a new field of medicine, neurogastroenterology. Scientists are studying the web of neurons lining the gastrointestinal tract as an independent brain. Gives “trust your gut” a whole new meaning. And man, I wish I had read it before the speech!



