
The most critical skill in aviation safety is making good decisions, both before flight, when time is plentiful, and in flight, when circumstances change and we may be rushed. The ability to generate and decide between diverse options (often with incomplete information and in the crunch) is essential to mitigating risk and achieving a safe outcome. This critical skill set is the central focus of the FAA Safety Program, which includes the Aeronautical Decision Making (ADM) module as a required core subject at all levels.

I often compare each flight to a football game when coaching a student pilot to manage expectations. Our careful and essential plans, made beforehand in the huddle, are often outdated the moment the ball is snapped and the opposing team breaks through the line. By their nature, both environments are fluid, and change is almost the norm. Pilots, like quarterbacks, must be ready to decide on the fly and embrace flexible decision-making. Suddenly it is time for a new plan and some fancy footwork!


Can pilots learn something from a decision-making class at Stanford?

I recently completed the Stanford Strategic Management Course in decision-making and would like to share some insights from the business world that I think any pilot will find useful. This program at Stanford is highly regarded in the business community and has been proven under real-world pressure, earning many companies, most notably Chevron Energy, impressive increases in efficiency (read “profits”) as a result of employing trained decision makers at all levels of management. Chevron has deployed more than 4,000 trained decision makers inside the company and requires their participation in all higher-level choices, in the boardroom and in the field. The results are astounding.

The central takeaway from this course is that, without training, we humans are pretty bad at making and evaluating decisions. Behavioral scientists have provided incontrovertible evidence that the human mind is “predictably irrational.” We are, by nature, subject to an amazing number of debilitating cognitive biases, and we also tend to depend on results or outcomes to evaluate our decisions as good or bad.

This process of using the results to judge the decision might seem obvious and valid to most people as a best practice in life: “let’s see how this turns out,” and then iterate. This ongoing cycle of change and validation builds the heuristics, or rules of thumb, by which we construct our lives and guide our future actions. Sometimes this is conscious and involves higher-order thinking processes, but most often it operates almost reflexively and is built into our human operating system. Daniel Kahneman labels this “System One thinking” in his book Thinking, Fast and Slow, and he points out how the results usually escape review by the higher-level auditor of the conscious mind. I highly recommend this book for more depth on this subject (though he did win a Nobel Prize for his work in this field, so it is a bit dense).

As a decision-making example, let’s say you had a bit too much to drink at a party but, despite being impaired, you drove home and arrived safely. Did this happy outcome validate your decision to drive? Absolutely not; it was still a bad decision. But unless you consciously address this and understand it, you might develop a tolerance for this risk based on luck and persist in the behavior (we all know people who do). Conversely, suppose in the same impaired state you instead decided to use a sober designated driver but ended up in an accident on the way home. Would this poor outcome make that a bad decision?

Again, no: in this case the decision was sound, but circumstances beyond your control led to disaster. In summary, our decision-making process needs to stand completely independent of the outcomes to be useful, and decisions can and should be evaluated entirely on their structure and internal merits. This is not how we usually conduct ourselves in life and also, unfortunately, not how we proceed in our world of aviation.


You made it through that weather – was it a good decision or just luck?

Suppose we press on into deteriorating weather and make it home successfully by flying through lower-than-expected conditions, below personal minimums and maybe even on the edge of our comfort level. Without our thinking too much about it, human nature expands our range of what is acceptable and safe and enlarges our operating envelope.

The successful result validates the poor decision and reinforces the future erroneous behavior.

This process of “human accommodation” is built into our operating system, and it makes the exotic and unusual the new normal. Unless we scare ourselves really badly, this new standard is welcomed into our comfortable repertoire quite quickly. Instead of our evaluating the original decision against objective standards, the outcome validates the bad decision, and it becomes part of our future operating instructions. Worse, we might even congratulate ourselves on our skill or cleverness and make this an even more durable imprint.

Seen in the light of day, this process is simply trusting luck to establish our new SOP. If we stopped and analyzed it, we would clearly realize that luck should never be part of the planning process and that any formula involving “maybe” should be discarded in aviation. Unfortunately, this automatic reinforcement process escapes the higher-order thinking skills of analysis and evaluation (Kahneman’s “System Two”). We develop a new heuristic to guide our actions without ever really deciding at all.

This is the classic erosion of standards we find so often in NTSB reports: supposedly smart pilots doing very dumb things! To improve our game, the correct procedure is to carefully and consciously evaluate every critical flight very soon after landing. It is especially important to ask the question “Was that a result of skill or luck?” and put the focus back on the original decision process. We can stop this automatic reinforcement process fairly quickly, but doing so takes time and discipline.

Closely related to this automatic decision trap, but operating on a higher, more conscious level, is “normalization of deviance.” This is defined as “the gradual process through which unacceptable practice or standards become acceptable. As the deviant behavior is repeated without catastrophic results, it becomes the social norm for the organization.” This was the major player in NASA’s flawed decisions to launch Space Shuttles with leaking O-rings and to accept foam shedding from fuel tanks. These occurrences became “acceptable practice” as the process kept working, and a new normal was generated.

In both of these accidents (Challenger and Columbia), creeping standards in a very conscious decision process opened the door to huge risks that led to highly public tragedies. Examine your own flying and life activities and see if some of these same forces are at play. As mentioned, most traps are embedded in the fabric of everyday experience and operate automatically on a subconscious level. Kahneman also points out that these forces are similar to optical illusions: even when we are aware of the correct answer, we are physiologically compelled down the wrong path. Good decisions require hard work and discipline.

David St. George
20 replies
  1. Duane says:

    Your last sentence reads, “Good decisions require hard work and discipline.” True enough, but to better summarize the points you bring up, let me rewrite your last sentence as a set of bullet points:

    Good decisions require:

    1) Knowledge

    2) Practice

    3) Periodic testing and verification

    4) Humility … or “recognition of our own personal fallibilities”

    The “hard work and discipline” you wrote of requires all four of these traits or conditions.

    It takes 1, 2, and 3 just to legally obtain and keep a license to fly, but number 4 is the one that usually trips up most of us pilots, and the one that results in the most accidents or (if we’re fortunate) near misses.

    The series of posts published on http://www.flyingmag.com by John and Martha King this past year focuses mostly on pilots’ tendency to neglect the humility factor. It’s good, if painful, reading for all of us pilots.

    So why do we pilots so often fail to recognize our own fallibilities?

    Well, that’s a topic for libraries full of psychological studies and essays. And that’s why we’ll never eliminate pilot error as the single largest cause of aviation accidents, particularly for personal aviation.

    In my case, having reached the ripe old age of 60 (in private aviation pilot circles, that might easily be considered mere “journeyman” status), unlike in earlier decades of my life, I am more aware of my weaknesses … and quicker to acknowledge specific errors in my judgment. I expect that’s a factor of aging that benefits most of us, as long as we manage to survive our earlier failures of judgment.

    Yet, even in recognizing my own fallibility, I am also painfully aware of a number of professional pilots I’ve known who were older, more skilled, and (I always thought) wiser jockeys of the stick and rudder than me … yet who still eventually succumbed to errors of judgment that ultimately killed them.

    So I cannot smugly assume that my supposed heightened sense of humility will necessarily buy me safer transport in the skies.

    Therefore all I can add is that, aside from the requirement to never stop learning more, never stop practicing, never stop being tested, and never stop questioning my own judgment … it must be said that no safe pilot can ever have too much humility.

    • David St. George says:

      Excellent points, Duane! Human fallibility is indeed the root cause of most aviation accidents. Caution, currency and humility are central ingredients in aviation safety. Unfortunately, though, we add *ego* to the mix, and we all know where that leaves many pilots. You have to be somewhat bold or you stay home. The balance between the confidence to go and the humility (wisdom) to stay is as old as the Greek “Golden Mean”…and if I remember correctly, the first aviation fatality was a Greek dude with wax and feather wings!

  2. Larry Baum says:

    You are only as good as your next set of decisions. As David has written and Duane has summarized very well, the knowledge, practice, and verification of your skills are “table stakes.” You need to have them, or you can’t play (or are extremely foolhardy to make the attempt).

    Add in experience, but be open to tempering that experience with enough humility to know and understand when you just “got away” with something. More than once over the years, I’ve found myself saying after a successful trip that had issues: “Boy, was that stupid!” And that got filed away along with everything else learned, so that the next time my decisions took the total experience into account.

  3. Chris says:

    Very pleased with the article.
    My Greek ancestor Icarus (Ίκαρος) died so we can learn a lesson.
    He failed to follow Daedalus’ (Δαίδαλος) advice.
    The two mythological figures represent the experienced and the inexperienced aviator.
    Experienced meaning always safe!
    We are as safe as the last safe flight.
    The next flight will be better, or not as safe as the last.
    “Safety is no accident.”

    • David St. George says:

      Thanks. Aviation safety is an ongoing enigma: very smart, talented people can have incredibly dumb accidents, e.g. running out of fuel in a Cirrus, where the fuel state is calibrated perfectly and displayed elegantly and obviously. The problem is in the “wetware” and probably often involves some element of hubris, but also this “cognitive trap” built into the neurology of every pilot. Stand by for a follow-up article attempting “the answers!”

      • Chris says:

        Some aviators write articles involving “surgical operations” around inclement weather, using the “latest” electronics as a safety tool.
        The scenarios involve low-time, barely legal, non-proficient pilots.
        I welcome your article as a great educational tool toward safety and am looking forward to the follow-up.
        Thanks

        • David St. George says:

          Great point, Chris: every new technology promises “increased safety,” but pilots often cut their margins thinner and defeat any safety advantage…that human factors problem!

  4. Mark Finkelstein says:

    I was fortunate to learn to fly at the East Hill Flying School, where David is Chief Instructor. His excellent article included mention of “supposedly smart pilots doing very dumb things,” which reminded me of a personal incident and of a larger point.

    Shortly after getting my Sport Pilot certificate, I was taxiing the club’s Flight Design LSA down to KITH’s Runway 14 and was getting a series of alternator warnings. I had seen something like this before, and decided to ignore the repeated warnings. Shortly after take-off I had a complete electrical failure. I was lucky to have been close enough to remain in the pattern and, being NORDO and unable to see the light gun, to gingerly complete the pattern and land without flaps. How incredibly dumb of me!

    It makes me think that perhaps one of the most dangerous moments for a pilot is precisely the point that I had been at: you’ve made the decision to fly that day. You’re in the plane. Perhaps you have a passenger. You have plans for the day and the flight. The plane started right up. Perhaps you’ve already started to taxi, when suddenly a warning light shows up. Ugh. Do you really want to taxi back? Have to deal with mechanics, expenses, lost time, perhaps scrubbing your plans for the day? No! The little devil on your shoulder tells you that there are a dozen explanations for the warning light, all of them harmless. Anyhow, you, or someone you’ve heard of, has had the same thing happen before, and it turned out to be nothing. So on you go, to . . .

    • David St. George says:

      Hi Mark, thanks for sharing…and confessing. I too have ignored “all the signs” and come to grief! Step one to safety is admitting errors and learning…hopefully without pain! Sharing is even better, because others can also benefit (without the pain). Newton’s (unrecorded) fourth law: “a pilot in motion tends to stay in motion…”

  5. Lauren says:

    Good decision making requires risk analysis. Risk is made up of two components – probability of a successful outcome and consequences.

    If I ask you to play a game and tell you the odds are 5 out of 6 that you’ll win, that sounds pretty good. If I tell you the game is Russian Roulette, you won’t play.

    If you always fly at the edge of the envelope, sooner or later you’ll wreck the plane.
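
    To put rough numbers on that last point, here is a minimal sketch of how a fixed per-flight gamble compounds over repeated flights (the odds used are purely illustrative assumptions, not real accident statistics):

    ```python
    # Minimal sketch: how a small per-flight risk compounds over many flights.
    # The 1-in-6 "Russian Roulette" odds and the 1% figure are illustrative
    # assumptions, not real accident statistics.

    def survival_probability(p_mishap: float, flights: int) -> float:
        """Probability of completing every flight without a mishap."""
        return (1.0 - p_mishap) ** flights

    for p, n in [(1 / 6, 1), (1 / 6, 10), (0.01, 100), (0.01, 500)]:
        print(f"per-flight risk {p:.3f}, {n:4d} flights -> "
              f"{survival_probability(p, n):.1%} chance of a clean record")
    ```

    Even a seemingly small 1% gamble per flight leaves only about a one-in-three chance of a clean record after 100 flights; that is the arithmetic behind “sooner or later.”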

    • David St. George says:

      The X-Y axes of risk analysis usually are probability and severity. Don’t sweat the little stuff (if you are busy) because it will not kill you, but watch “the killers” (which is why reading accident analysis is so valuable). Kahneman points out that we do not fear something described as having a 90% success rate as much as the same thing described as having a 10% failure rate: we do not evaluate clinically, but emotionally. This makes all our show of objective analysis a little questionable. In flying we want to get there…and this really colors the calculus.
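
      For the curious, here is a minimal sketch of that probability-severity grid (the category labels follow common FAA risk-matrix usage, but the numeric weights are purely illustrative, not official values):

      ```python
      # Illustrative probability-severity risk matrix.
      # Category labels follow common FAA risk-matrix usage; the numeric
      # weights are arbitrary for illustration, not official values.
      LIKELIHOOD = {"probable": 4, "occasional": 3, "remote": 2, "improbable": 1}
      SEVERITY = {"catastrophic": 4, "critical": 3, "marginal": 2, "negligible": 1}

      def risk_score(likelihood: str, severity: str) -> int:
          """Combine the two axes into a single relative ranking."""
          return LIKELIHOOD[likelihood] * SEVERITY[severity]

      # A remote-but-catastrophic "killer" outranks a probable nuisance:
      print(risk_score("remote", "catastrophic"))  # 8
      print(risk_score("probable", "negligible"))  # 4
      ```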

  6. Bob says:

    Seems to me we’ve evolved to value outcomes over the process of getting there. At the end of the day, killing the mastodon or outrunning the sabre-toothed tiger was all that mattered, not whether luck or skill got you there or whether you managed “safely.” Our heads still rate a successful outcome as more important than how we got there.

    • David St. George says:

      True, Bob, survival is a “biological imperative” at the most basic level (a good day is when the tiger goes hungry) and it has conditioned our reptilian brain. Hopefully we have evolved far enough to reach the higher cognitive level (and our flights are not daily life-threatening)! Awareness of our fallibility, coupled with some humility and after-action analysis (and the discipline to engage in the “process”), can certainly result in safer flying. You are right, though: it is hard to “sell” safety if it always works…“no need, no sale!”

      • Bob says:

        Dave:

        “Awareness of our fallibility, coupled with some humility and after-action analysis (and the discipline to engage in the “process”), can certainly result in safer flying.”

        Hopefully we gain this “awareness and humility” with age and hours. That’s why there are not many old, bold pilots.

  7. Doug says:

    Great article! Interesting, as Kahneman was mainly studying decision making and judgment, particularly under stress. This has so many implications for flying, and I am glad you are bringing it to our attention.

    • David St. George says:

      Hi Doug, thanks. When I discovered Kahneman it was immediately clear his work had huge applications to aviation. There are now many others expanding his work, especially in the area of decision-making under constraints (called “fast and frugal”). We as pilots do not have the luxury of “white board analysis” in the cockpit. We always seem to have limited time and information, and the consequences of error are obviously life-threatening. My follow-up article is here: http://wp.me/pUZLB-gk

  8. William "Pete" Hodges says:

    I enjoyed this article very much. I especially liked the recommendation to post-analyze all critical flights. That technique, applied to my own flying, is what led me to make the true airspeed cheat sheet I described in “Knowing True Airspeed for Fuel Management.”

    Thanks for this really good article!

    • David St. George says:

      Hi Pete, thanks for logging in with the comment. Conducting an honest post flight analysis might be the most important (and most difficult) part of improving our safety!

  9. John says:

    One of the better articles on cognition and risk I’ve seen recently. Commentary is equally insightful. Thanks to all for your contributions to this discussion and thread. My comments would be redundant to the thoughtful observations made previously, so I will merely express my appreciation.
