
Decisions are made from a set of assumptions. Some assumptions are derived from hypotheses generated to drive those decisions, others from prior experience and understanding, and still others from a combination of both. In the confluence of these, one learns new things and generates newer ideas that create new paradigms and change the world around oneself.

The Wright brothers had a set of priors from unpowered flight and the motor vehicle that they applied to the Wright Flyer. They learned by experimenting with their model aircraft. Errors were corrected through trial and error rather than assumptions, although some assumptions were made along the way, and not all of them held up. We have come a long way from there.

In flight, assumptions are the Achilles' heel of safety. One cannot press on with the assumption that all is well when a crushing burden of mounting evidence is screaming against further pursuit. The fallacy of not knowing the unknowns ahead leads one to despair. Understanding teaches us to use our prior experiences, cobbled together with knowledge, to make wiser decisions. Experience without knowledge and knowledge without experience serve different masters, and neither is wise nor helpful to pursue.

Flying with a pilot on a murky day with low visibility and low ceilings, we crossed the initial approach fix. He, the pilot in command, was busy pressing buttons on the glass displays, setting up for a smooth approach into the airport. I, on the other hand, was watching his moves. He was careful with his presses and waited for the expected response. This went on for a few agonizing seconds that, even in memory, seem like minutes. Meanwhile, the A36 kept hurtling through the minor, annoying, turbulent clouds.

We were straight and level, awaiting the “outer marker” announcement, still three miles in front of us. The button pushing kept on at its steady pace for a few more interminable seconds, and then, exasperated, he turned to me. “We’ve got to turn around and do it again.” He didn’t wait to proceed deeper into the approach. He was firm in his conviction to call it off. “Let me work this out in the hold,” he said.

G1000 panel: Don’t assume those avionics will get you home.

And I, appreciative of his decision and his calm demeanor, thought, “Now that is a person I would like to fly with.” He avoided the quicksand of a faulty display. He did not subjugate his decision-making to complete obedience to “the approach must go on.” He did not cross the bridge of ascribing virtue to a set of false assumptions. That decision, I thought, required a fine finesse of understanding and wisdom. He must have been there before and learned a lesson in his past. I would ask that of him later. “Bravo!” I said quietly in my mind. “Bravo!”

The elemental fault of relying on assumptions has long been the bane of human existence. Unfortunately, this practice has taken hold in the minds of some “bold pilots,” who become comfortable talking about the time when this or that happened and they fixed the problem on the fly. Most did not live to tell the tale. I am leery of the soft, comfortable, leathered armchair crowd that might denounce this as wimpy. I am, as you should be, aware that the mental quicksand of assumptions is forever unforgiving.

Assumptions, in themselves, are neither good nor bad. If good, they must be based on hypotheses to be confirmed through empirical means and grounded in priors, which allows us better understanding. If bad, they must not be relied upon with any degree of gravity, because that will lead to failures and bad outcomes. So, in reality, assumptions belong in an experimental mode, used only when one is trying to assess a response and determine the outcome of a particular set of hypotheticals. They have no place in the normal course of a mentally challenging activity in which the priors have already established the known hazards and risks, as in flying an aircraft. The past remains the constant and the future is undiscovered territory. History, through the NTSB database (especially its record of poor form and decision-making), teaches us lessons if we wish to learn and be safer pilots for it.

Perhaps in an untoward circumstance where things are not going according to plan, it behooves us to take a deep breath, allow an ounce of silence and a drop of solitude, and then measure the fault before us. This helps reconfigure our mindset and repurpose our decisions. Slowing the mind expands the time for careful decision-making.

We need more awareness rather than building an army of guideline-following robots. We need more critical thinking in aviation. We need reason and logic based on priors and a good deal of fundamental knowledge. We need to prick the bubble of comfort that endlessly surrounds us!

When we solve this problem, which underlies so many avoidable accidents in aviation, then and only then can we begin to call our skepticism “scientific inquiry into human behaviors leading to accidents.” It then becomes a real inquiry into the fundamentals of aerodynamics, knowledge, and pilot behavior. Unless we can reconcile inquiry, our priors, and our understanding as pilots, we will continue to exist in the dangerous bubble of IFTTT (if this then that) and not know which decisions will keep us safe.

Parvez Dara
2 replies

John Opalko says:

    It’s not just aviation that needs a strong infusion of awareness, critical thinking, reason, logic, and fundamental knowledge. Those are skills, abilities, and resources that seem to be sorely lacking in life, in general.
    And, yes, we do need to “prick the bubble of comfort.”
