Gary Klein, a psychologist and chief scientist at Applied Research Associates in Fairborn, Ohio, has for the past 25 years studied how people make real-life, critical decisions under extreme time pressure. In his 2009 book Streetlights and Shadows: Searching for the Keys to Adaptive Decision Making (MIT Press), Klein discusses 10 surprising ways effective thinkers deal with ambiguous situations. Staff writer Bruce Bower, who writes on Page 26 of this issue about risk and decisions, recently spoke with Klein about good decision making.
What is tacit knowledge and why do you consider it so important?
Unlike explicit knowledge of names, facts and rules, tacit knowledge is being able to do things without being able to explain how. Tacit knowledge feels mysterious when we use it. It’s a fountain for our intuitions. With experience, we learn to see things that others don’t notice. Knowing when to make a left turn in traffic separates experienced drivers from 16-year-olds. Only an experienced lawyer knows how to read a contract to spot potential problems for a client. Tacit knowledge includes the ability to recognize typical and unusual situations based on one’s experience. Good decision makers then construct a mental story to understand what’s going on. Laying out all the evidence or following standard procedures interferes with tacit knowledge.
Does it ever make sense to jump to conclusions?
It always makes sense to jump to conclusions. It’s impossible not to jump to conclusions; we’re wired for speed that way. People direct their search for information about ambiguous situations based on expectations informed by their experience. We still need to test our initial expectations against the current situation to avoid fooling ourselves. But we shouldn’t follow advice to keep an open mind. How will we know when we have seen all the evidence? How long are we supposed to wait while we leave our minds on idle? Keeping an open mind makes it harder to figure out what’s happening.
We have to stop treating decisions as gambles. Successful decision makers actively manage a situation and shape their options rather than passively awaiting the outcome of a gamble that has specific probabilities, risks and benefits.
What is the role of making mistakes in effective decision making?
No one wants to make mistakes, but there is great value in them. My colleagues and I have found that the only time people become open to formulating better mental models of a situation is after having made a critical mistake. In one case, an expert Navy F-4 fighter pilot wanted to advance his career by learning to fly another aircraft, the A-6. But he badly botched a series of A-6 aircraft carrier landings required for certification. Only then did he consider the landing signal officer’s advice to think about differences between the F-4 and the A-6. He realized that a small shift in where he sat in the A-6 cockpit caused him to misalign the plane with the carrier runway. The next day he nailed his landings.
Is it possible to predict and protect against rare and unpredictable Black Swan events that many people have no previous experience with and can’t imagine before they happen?
Most people will be victimized by a Black Swan event, especially those who think that they can analyze information so carefully that they will be protected. Even when warning signs are clear, risk calculations can be terribly inaccurate and deceptive. Today’s global financial crisis was triggered by miscalculations that observers had warned about for years. Even in the early stages of the crisis, many people still thought that we were OK. These types of events are hard to comprehend, so we tend to explain them away.
Organizations in particular need to prepare to react and adapt quickly to unexpected events. Think of this as risk management by discovery.
What do you tell those who have lost trust in experts and the whole notion of expertise?
One way to tell real experts from people who look like experts is to ask them about the last mistake that they made. The expert is still chewing over that last mistake and asking “What should I have been watching for?” The nonexpert will dismiss a mistake as due to bad luck or say that it wasn’t a mistake at all, but due to uncontrollable circumstances.
I hate the notion of the mind as an assembly line where data get refined and processed until we can “connect the dots.” That trivializes the expertise of people, from firefighters to intensive-care unit nurses, who know how to identify the dots in the first place so that they can make sense of a situation.