Wednesday, April 5, 2023

Decision-Making and Risk

Hasard Lee gives us an exciting read about decision-making in his upcoming book, The Art of Clear Thinking. It’s easy to get lost in the Top Gun stories and miss his important lessons about clear thinking. Many of us don’t face life-or-death situations that require split-second decision-making, so the author intersperses business and other examples to elucidate his points. He teaches us that the fighter pilot’s mantra applies to all of us: “No situation is so bad I can’t make it worse.” Especially if I don’t follow the author’s guidelines. Those who have studied leadership response mechanisms might be familiar with Boyd’s OODA Loop, Deming’s PDCA, and the like. Lee simplifies these mechanisms into a more easily remembered framework.

His experience complements the teaching while addressing an important weakness that cripples a lot of us: a lack of mental toughness, which, he argues, can be taught. We might find ourselves in traffic situations or business situations where things are spiraling out of control. Everything we try fails. All we can foresee is disaster. Stress is maxed. Our clear thinking is diminished. We get “tunnel vision” and become so desperate to gain an edge that we can actually sabotage our desired result: more money, more time, more social connections. In our desperation we become asset wasters, time wasters, and socially awkward. Hasard Lee gives us some tools to identify and counter the stress that clouds our thinking.

Those of us who have evolved workplace cultures into high-performance, highly reliable teams will recognize many of the key elements here: get input from diverse thinkers, gather as much information as you can, and then execute well. Just as importantly, debrief the implementation with everyone on equal footing. I learned this from seeing flight deck crews on aircraft carriers chide a superior officer for poor orders or decisions. In a manufacturing workplace, we incorporated this by making sure our machine operators had the opportunity and encouragement to voice not only what they knew but what they thought the decision should be, and then what worked and what didn’t from their perspective, which was just as valid as any manager’s or engineer’s.
 

If you’re familiar with General McChrystal’s book “Risk,” many of the principles in Lee’s book will resonate. McChrystal would add a detection phase before Lee’s first stage. He acknowledges the many forms of bias that can fog our ability to detect: we see what we believe. The authors of “The Invisible Gorilla” have pointed this out. If we’re not expecting something, we don’t see it, whether it’s a gorilla in the midst of basketball players or a motorcycle in traffic. If we don’t expect to see the small, industry-disruptive, indirect competitor, we’ll miss the threat. Biases can prevent us from detecting both the threat and our own organizational vulnerabilities. Even if we detect a threat, we might categorize the “foe” based on past experience, missing the important details and overlaying assumed details we never actually observed. Nokia did this when confronted with the first smartphones and their fragile screens, failing to notice that consumers were willing to risk a fractured screen to gain the immense functionality of a nano-computer-in-a-pocket. Lee doesn’t help us much with overcoming our observational biases, except to warn us against assuming phenomena follow a linear path when many are exponential. Recognizing and scaling up a small distinction can lead to tremendous results.


I would use Lee’s book as the foundation for teaching good decision-making to my “apprentices.” It is a worthwhile addition to any leadership library.
