Chapter 18, “Taming Intuitive Predictions,” starts by clarifying the difference between the kind of instant “Blink” decisions that come from expertise (see Gladwell, 2005) and instant decisions that come from heuristics or intuition. The heuristics of intuition often come into play when a question is too hard to answer, so people unconsciously substitute an easier question… or consciously substitute a preferable one. We’ve seen a lot of this on the news lately, when politicians or regular people are asked hard questions that they either can’t answer or don’t want to answer. Possibly out of unconscious self-preservation, they reinterpret the question as a simpler one, one they are more comfortable with or more practiced at, and reply from there. How often we hear, “That was not the question.” Kahneman states that intuitive judgments are often made with “high confidence” even when based on “weak evidence” (p. 185).
Kahneman’s examples are far more complicated and surprising. They are wonderful illustrations of how people with only a limited amount of knowledge in a particular domain can access some pertinent bit of information that is remotely associated with the more complicated and difficult question. The leftover, unanswered part of the question may then be swapped for an easier associated question through heuristics, and the intensity of the resulting impression is translated onto the answer scale (what Kahneman calls “intensity matching”) to shape the answer. The person “eventually settles on the most coherent solution” with a fair degree of conviction, despite a lack of evidence and the obvious substitution of the question (p. 187). This is becoming fairly commonplace in politics, while the frustrated viewer watches and wonders whether the speaker substituted the question on purpose or unconsciously.
So, how do we correct for these sorts of intuitive predictions or judgments? I would suggest a regression analysis with careful selection of predictive factors, but Kahneman offers an interesting rough, regression-like procedure built from “estimates” of averages and correlations and one’s “impression of the evidence”: start from the baseline (the average outcome), form the intuitive prediction the evidence suggests, estimate the correlation between the evidence and the outcome, and then move from the baseline toward the intuitive prediction only in proportion to that correlation. He states that you may still have error, but it will be smaller, less biased, and less extreme. He also notes that any attempt to correct intuitive predictions takes effort, which is unlikely to be spent unless “stakes are high and when you are particularly keen not to make mistakes” (p. 192). He further notes to always remember regression to the mean when estimating, the importance of base rates, and the overconfidence of intuitive predictions.
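Read operationally, that procedure amounts to a simple shrinkage formula: corrected prediction = baseline + correlation × (intuitive prediction − baseline). Below is a minimal Python sketch of that idea; the function name and the GPA figures are my own illustrative assumptions, not numbers from the chapter.

```python
def corrected_prediction(baseline, intuitive_estimate, correlation):
    """Regress an intuitive prediction toward the baseline.

    Moves from the baseline (the average outcome) toward the intuitive
    estimate in proportion to the estimated correlation between the
    evidence and the outcome. A correlation of 0 returns the baseline;
    a correlation of 1 returns the intuitive estimate unchanged.
    """
    return baseline + correlation * (intuitive_estimate - baseline)


# Hypothetical example: predicting a student's GPA from an early test score.
# Assume the class average GPA is 3.0, the intuitive (intensity-matched)
# prediction is 3.8, and the evidence correlates about 0.3 with the outcome.
print(corrected_prediction(baseline=3.0, intuitive_estimate=3.8, correlation=0.3))
# -> 3.24, a less extreme estimate than the intuitive 3.8
```

The point of the sketch is simply that weak evidence (a low correlation) should pull the prediction only a little way from the average, which is exactly the discipline Kahneman says System 1 resists.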
He also notes that some extreme predictions may be warranted, depending on the application, such as when venture capitalists are looking for those few extreme cases to invest in, or when a conservative banker is overly cautious in an attempt to avoid lending money to anyone who might declare bankruptcy. Kahneman sums up the chapter nicely when he states that “extreme predictions and a willingness to predict rare events from weak evidence are both manifestations of System 1,” and that following our intuitions is “more natural, and somehow more pleasant, than acting against them.” Regression, by contrast, is “a problem for System 2”: the very idea of regression to the mean is “alien and difficult to” grasp, and we will likely give it “a causal interpretation” “that is almost always wrong” (pp. 194–195).
References
Kahneman, D. (2011). Taming intuitive predictions. In Thinking, Fast and Slow (pp. 185–195). New York, NY: Farrar, Straus and Giroux.
Gladwell, M. (2005). Blink: The Power of Thinking Without Thinking. Boston, MA: Little, Brown and Company.