The Great Debate: pilots are just backup

In the wake of the Air France 447 crash, many have suggested that basic stick and rudder skills have declined among airline pilots to the point where it creates a serious safety issue. With increasing cockpit automation, from autopilots and autothrottles to Flight Management Systems and fly-by-wire controls, there is a strong sense that the pilot is not in control.

But there is another perspective. Bill Voss, of the influential Flight Safety Foundation, recently made a bold statement at a safety seminar in San Antonio:

Five years ago we passed the point where automation was there to back up pilots… Clearly, today, the pilot is there to be the backup to the automation… This is simply a realistic assessment of the world today, except we are not training pilots to be backups to automation. We have to own up to the fact that we need to develop new kinds of pilot training… This is not just about better stick-and-rudder skills, though. What you die from is not understanding what configuration will keep the aircraft in the air safely.

Do you agree or disagree? Are modern airline pilots just there to monitor the automation and take over in an emergency? If so, what kind of changes need to take place in the training process? And how much of this is applicable to general aviation pilots? Add your answer below.


73 Comments

  • Is there such a thing as too much stick and rudder skill? This is something that every aviator needs to practice continually, no matter what they fly. We can all get better at flying an airplane, automated or not.

  • My tailwheel instructor, also a 767 captain, told me that this is precisely what is happening. Too little emphasis is placed on stick and rudder skills, and it has been the cause of some accidents in recent years. I believe it. What he taught me during tailwheel training wasn’t just how to land and take off in a tailwheel. My first lesson was ground reference, stalls, falling leaf, pylon eights… and I got better and better. It made me a much better pilot. I think every pilot should go through tailwheel training.

  • Let’s not rush to throw basic stick and rudder skills under the bus. Years ago, at an NBAA dinner honoring William T. Piper, Sr., the MC asked everyone in the room full of heavy iron pilots who had soloed in a J-3 Cub to stand. Well over half did; after a pause, everyone else stood, and the big crowd gave Mr. Piper and his little yellow airplane a standing ovation. That basic flying in a basic airplane was the foundation for today’s outstanding safety record in corporate flying.
    Training today needs to be in three phases, like military pilot training in WWII: primary, which today should be done in new tailwheel airplanes based on the Cub design; basic, which should be done in airplanes like the Skyhawk and should include instrument flying; and advanced, which should be done in technically advanced airplanes. The Air France 447 crash is often used as an example of pilots not understanding automation. That is perhaps true, but if the pilots had been able to recognize a stall they could have recovered. All bases simply must be covered in flight training, and that would best be done with three phases of flight training.

    • Mr. Collins is right. My first “big jets” were DC-9’s and B-727’s. Pretty automated we thought at the time, but now they look like something out of a Jules Verne novel. It was most noticeable when I transitioned from the 747 Classic to the -400. At that point I really believed I had become a custodian of the automation, and the 787/747-8 is even more so.

      Mr. Collins’ suggested training protocols are correct…and expensive. In today’s “pay to play” environment wherein prospective aviators must pay tens of thousands of dollars to become qualified for thoroughly mediocre jobs, I hope there is a way Collins’ ideas might be incorporated into modern training programs.

    • Amen, Dick! Flying skill is an acquired and perishable commodity. Pilots today (including myself at times!) tend to rely on the technology to save their arses rather than using their piloting skills to confirm and evaluate the accuracy of the information displayed. Tech has a valid and important place in the cockpit, but a pilot’s competent judgment is the final arbiter of success or failure.

  • I agree that not respecting the aircraft configuration will bring deadly results. However, I do not agree that pilots are the backup to automation as an across-the-board truth for all airline pilots.

    Being an airline pilot flying a Regional Jet, I enjoy flying and hand fly as much as I can, workload permitting. Aside from enjoying the task, I want to keep my hand flying skills sharp. I think that maintaining proficiency (with aircraft systems and stick and rudder skills) is each pilot’s responsibility and should not be taken lightly. I don’t think training needs to be fundamentally changed, but the fundamentals do need to be reinforced.

  • As I see it, the problem is not too much automation – it’s too little. Designers should just cut to the chase, and take human pilots completely out of the loop.

    Although it may offend our human egos, the fact is that 95% of accidents are caused by the least reliable component in every aircraft – its pilot.

    A flight director is a classic example of stupid-think design. It’s a machine that figures out what the aircraft needs to do, then goes to great lengths to explain what it already knows to a human pilot. Ironically, the pilot can then follow the step-by-step “commands” of the flight director – or s/he can couple the aircraft’s autopilot to the flight director, basically eliminating the (unnecessary, IMO) middleman.

    The military already has lots of experience with truly autonomous aircraft. These are not RPVs (remotely-piloted vehicles). They are completely autonomous machines – assign a mission, and they will figure out how to get the job done.

    Machines do not suffer from fatigue or poor nutrition/hydration. They don’t take rest breaks. They’re just as “alert” at the end of a 13-hour flight, as they were at its start. Most importantly, machines don’t improvise, get clever, or bend/break rules – ever.

    Although poorly-designed man-machine interfaces abound, the best solution is to eliminate that interface completely – by doing away with the requirement for the human.

    • I disagree with the idea that pilots should be taken out of the cockpit. It is fine to have partial automation, but the failure rate of automated systems is too high to risk it. Besides, there is no way a computer can make up for a human’s judgment or a human’s intuition. Automated systems, I believe, should stay as they were a few years ago: a pilot aid. I first learned to fly in a wooden motor glider made in the late 60s that had no automated systems – the only electronics it had were an electric starter, a radio, and a transponder. It taught me that stick and rudder flying is the most important thing a pilot needs to know. Everything else is secondary. I personally think that more time needs to be spent with a trainee pilot in control of the aircraft instead of on autopilot.

      • I, too, learned to fly before radios, lights, batteries, starters, and the myriad other cockpit goodies. My first experience with an autopilot was in a 250-hp Piper Comanche, coupled to the old directional gyro, which tumbled and pressed the aircraft into a dive. My last personal airplane (I had five at the time, as it was my business) was a turbocharged Cessna 206 on Wipline amphibious floats. I had added many additional features to an already complete IFR panel. But since my first tumble with the Comanche, I’ve never again used an autopilot, nor even a wing leveler. My habit is to avoid, as much as possible, trying to catch up with something that has gone wrong, and to head it off before it happens if possible. Hand flying usually does that for me, and my 20,000 safe flying hours (18,000 of them in the Alaska outback) support that policy. Now flying, occasionally, with the Garmin G1000 system, I don’t trust a system that requires inordinate attention to two color TV screens when I can do much more with an old-fashioned panel scan.

    • Automation is great until something unexpected happens and it suffers an exception error. At least with pilots on board and reversion to simpler modes, you have a chance to control the aircraft. That of course did not happen with AF 447, but I would challenge UAV designers to have found a way of controlling the aircraft given the paucity of data available to the system. Designers cannot cover every permutation and possibility, and half of the problems occur when engineers get too fancy and create two problems for every one they solve.

      UAVs hit the ground hard a lot. You can go on the automated airlines, enjoy your ride, and hope the designers covered every eventuality. I’ll ride with a pair of well-trained stick-and-rudder meat servos behind the controls any day.

    • I agree that this is the future–eventually. But we’re a long way from that (at least for passenger travel). So what do we do in the interim? That’s the harder question. We are in a bit of a transition period. Having said that, I agree that it’s too early to ditch stick and rudder training completely. Airplanes still fly for the same reasons, no matter how automated.

    • Where would the US Airways plane have landed had Sully and Skiles not been on board? It was not a computer problem, it was an external factor (birds) and the plane would NOT have made it back to LGA. Where would the computer have landed? How about United 232? Would a computer have figured out the control issue and been able to navigate and aviate to a landing? I love the technology, but there is a place for both, and sometimes man is better than machine.

      • The computer would be flying by reference to its satellite-derived position and a terrain map. It would be aware of its position, its engine-less flight capabilities, and all surrounding terrain. Where an airport landing is unattainable, the computer would land the plane in the most-hospitable available terrain. In your example case, that would be the Hudson River. An important point here is that ALL such autonomous vehicles (employing the same control system) would react identically to identical situations. Not every pilot is a Sullenberger, but all of the control systems would possess the same capabilities.

  • Every pilot, whether trained as a civilian aviator (which I was) or via a military track, needs to begin on an underpowered, unsophisticated aircraft. The basics will translate to larger equipment, and the fundamentals learned in a taildragger/152/Cherokee/Tweet can save an airliner one day. As a captain at a major airline, I see co-pilots who come from all walks, and the common thread to a great aviator is someone who wants to fly the aircraft. Not someone who is interested in pushing buttons to begin a process. Yes, the airspace we operate in is more crowded and has been made more complex. Yes, the manufacturers have offered more automated products and avionics to keep us out of the rocks and rain. But the one common link is the guys and gals in the seat up front – either acting as a manager of compliance or as an aviator. I hope for me and the crew I spend the trip with, it is the latter.

    • Simon:

      I’m old enough to remember when the question was: “would you actually get into a box hanging by a cable in a shaft in a tall building, if there was no trained professional elevator operator aboard?”

      But to answer your question, yes – I would board such an aircraft. As long as its software was not of French origin.

    • I have worked with avionics/electronics/computers for years. Believe me, I won’t be getting into any pilotless aircraft. Even fly by wire makes me nervous.

  • I can see some truth in Mr Voss’ remark. But if the time comes when it really is true I vote we all hang up the electronics and go back to a Taylorcraft on a grass strip. I, for one, have no desire to be the backup to a computer.

  • I know that we’re all supposed to decry the state of “stick and rudder” flying today (whatever that means), but I don’t buy it. I think Voss is right. Let’s admit where we are, and more importantly where we’re going. Let’s train for automation failure, not try to recapture the lost art of hand flying. An Airbus ain’t no Cub anyway…

    • Gabe:

      The fact is, the automation almost never fails. The pilots do. Take the most failure-prone element out of the loop, and you’ll improve reliability and safety by a couple of orders of magnitude.

      The 600-pound gorilla in this discussion room is this: in modern transport-category aircraft (and increasingly, in business jets), there are NO cables, pulleys, pushrods, or bellcranks between the human pilots and the vehicles’ control surfaces. None. The machine already is flying the plane. 100% of the time. That war was lost years ago.

      The biggest flaw in contemporary control system design is the inclusion of a human in the near-real-time loop. Doing so introduces a complex requirement for the machine to “explain” to the human what the machine already knows (and what the machine already is capable of doing on its own, if permitted to do so). Lots of opportunities here for the human to misinterpret what the machine is trying to “say,” and lots of opportunities for the machine to say things very poorly. In technology parlance, the “man-machine interface” is a critical weak point in the overall design.

      The most reliable way to eliminate defects in a component is to eliminate that component entirely – where it is practical to do so. Crappy MMIs would go away if there was no need for an MMI at all – as is the case with autonomous aircraft control systems.

      • So given that argument, maybe we shouldn’t be so concerned about a dwindling pilot population…don’t need em…? little scary to me…

        • John:

          I don’t know. It’s scary only if you conclude that the arrival of autonomous aircraft will mean the end of piloted aircraft. I don’t think that it will. There always will be a place for piloted vehicles, just as there still is a place for sport fishermen in an age of ocean-going factory seafood ships.

          In terms of a Renaissance for the GA manufacturing industry, removing the biggest barrier to entry (the need to obtain a pilot certificate) would go a long way toward unlocking demand for personal aircraft.

      • Yeah, but when the automation fails, the pilots have to know what to do or everybody dies. The recent AF crash is a perfect example of that. If the pilots had gone back to the basic skill of instrument cross-check the airplane would not have crashed.

        Programmers can’t anticipate everything. When the system says “Beats me!” and throws up its electronic hands, that’s when an experienced and properly trained pilot needs to take over.

        • The problem IS the pilots – or at least the presence of them in the control loop. If the control system’s designers knew that no humans would be in the loop, they’d design accordingly.

          “Programmers can’t anticipate everything.” Correct. Neither can even the best trained, most experienced pilots. But programmers who are teamed with experienced pilots can anticipate far more than any given pilot ever could hope to. They can capture and leverage the accumulated knowledge and wisdom of thousands of pilots – and make that available to all.

          A well-designed control system NEVER “throws up its electronic hands,” because it’s not programmed to do so. People quit – machines don’t.

          Human pilots introduce variability into the process. From a quality perspective, that’s a very bad thing. Another commenter praised pilots’ “intuition.” Webster’s Unabridged Dictionary defines intuition as: “the act or process of coming to direct knowledge or certainty without reasoning or inferring: immediate cognizance or conviction without rational thought.” One of the nicest things about machines is that they lack intuition.

          Trouble happens when pilots abandon procedures and rational thought, and instead “get clever.” Or do something stupid like run out of fuel, penetrate weather, bust minimums, etc. Nice thing about a machine – it NEVER will do something that it’s not programmed to do. To a machine rules are rules – not suggestions.

          The pilots are the problem. Well-designed autonomous aircraft are the solution, precisely because they eliminate the problem.

          • “…A well-designed control system NEVER ‘throws up its electronic hands,’ because it’s not programmed to do so…”

            But Tom, that is exactly what the AF autopilot did – it bailed out…

          • Tom,

            You cannot eliminate the human unless you have designed a machine that can build machines. A human has to design it, program it, build it, etc. You aren’t going to get away from that anytime soon.
            Secondly, I’ve seen way too many UAVs end up as a smoking hole because the computer couldn’t figure out the problem. It was never programmed to. You can give it information, programming, and sensors, but what you can’t give it is thought! The computer I’m typing on has no idea it is sitting on a desk, what the temperature is outside, or why my wife just left the room. Sure, you can build in the sensors and inputs for it, but it has no idea that my wife left the room to tell the kids to come in because it started raining.
            I can see the time when you might have a single pilot, but to replace the only thinking machine in the cockpit invites disaster!

  • I think I’ll let Boeing and Airbus figure this one out for the airlines. For non-corporate GA, full automation that could not only takeoff and land, but also recognize and appropriately react to weather conditions would probably make that form of transportation much safer. Notice I said it’s transportation, not flying. For those that want to fly, well, that’s what Cubs are for.

  • I’m not sure what we are trying to accomplish. There are lots of blogs and people with opinions. Is there supposed to be a consensus at some point?
    As a pilot, you fly the airplane. If you are monitoring a system, so be it. If the system isn’t doing what you want, fly the machine.
    Don’t forget to watch that the machine is doing its thing so you can turn it off!
    I feel somewhere someone needs to define what the problem is so we can fix it…I feel I can define at least one real big problem. How to fly!

    • Nah, no consensus, just pondering various ideas.
      So consider this hypothetical situation:
      You are in a fully automated airplane, “monitoring the system”. But this airplane is fully capable of making the flight from gate to gate. It has three computers: a primary, a secondary, and a voter. These computers have been programmed by PhDs in everything from aerodynamics to meteorology (there would be some supervised high schoolers in there programming also, just to make the programs work better).

      You are in weather and the computer has been silently calculating the trajectory of a storm cell for the last hour and makes a turn to the left to avoid weather. To you, visually and on the radar, it looks like going right would be better. What do you do? Turn off the machine and go to the right?

        • Correct! And the reason is that a Systems Monitor would be more of a liability than an asset because he cannot have all the information but has the ability to turn off the machine that does, based on his experience which may be flawed; the monitor becomes an unknown quantity within a well defined system. That’s probably why “HAL” bypassed the OFF switch.
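The triple-computer arrangement in the hypothetical above (a primary, a secondary, and a voter) can be sketched as plain two-out-of-three voting. This is an illustrative toy, not any certified avionics design; the function name and the tolerance value are invented:

```python
def vote(primary, secondary, tertiary, tolerance=0.5):
    """Toy 2-of-3 voter: accept a command only when at least two
    channels agree within tolerance; otherwise report a miscompare."""
    for a, b in [(primary, secondary), (primary, tertiary), (secondary, tertiary)]:
        if abs(a - b) <= tolerance:
            return (a + b) / 2, None   # two channels agree: average them
    return None, "MISCOMPARE"          # no agreement: revert to a simpler mode

# One channel has failed high; the other two outvote it.
cmd, fault = vote(10.0, 10.2, 47.0)
```

The point of the voter is that a single failed channel never reaches the control surfaces; only when no two channels agree does the system have to degrade.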

  • Google has launched a real time, mixed into real traffic demo/test of autopiloted (computer driven) cars in Nevada. The Army has demonstrated autopiloted (computer flown) logistical helicopters. MIT has ongoing research and demonstrations of coordinated autopilot maneuvering of small UAS craft. We are rapidly moving to a world where “pilots” of the flesh and blood variety are obsolete. This reminds me of the venerable Firemen who for nearly half a century continued to occupy the cabs of railroad locomotives. Now, go to any railroad switch yard and the engines are remotely operated (or autopilot driven). AF447, Colgan Air 3407 and other similar accidents in Part 121 or Part 91 flying illustrate that the solution may not be better “pilots”, but rather better hardware and software. Sad. But the end of a long and glorious era is always sad.

  • I read a lot of comments involving “lack of stick and rudder skills” in regards to this accident. I agree that more skill is better than less, but based on the accident reconstructions I’ve read, I believe that the lack of systems knowledge and faulty ADM is likely to be more to blame than the stick and rudder flying fundamentals.

  • Given that the last four percent of air crash causes are due to the biologicals, the degree of automation isn’t yet complete. Once ATC is ruled by computers too, the elimination of air crashes will be complete.

  • @Steve Phoenix – Your comment about transportation and flying being different things was right on. I love to fly – that is I love the whole package, controlling the machine, making decisions, being in the air, and going from one place to another under my command. When I want to get somewhere on time and on schedule (transportation)? I could see the usefulness for full automation.

  • Richard Collins said it best in “IFR Crew of One”: an autopilot is critical to safe operations. If you fly a glider or taildragger you’ll learn good stick-and-rudder techniques, but in today’s modern aircraft you’d better know the systems and electronics. Yes, you need to know how to land in a crosswind. But I’d be more impressed by your ability to program a GNS-530W or G-1000 to an approach down to minimums.

  • The type rating training syllabus for jet transport aircraft such as the 737 and A320 is flawed from the beginning. In Western Europe and Asia there are 300-hour cadet pilots second in command on big jets like the A340 and 777. All their simulator training emphasizes full automation. No wonder they have appalling loss of control accidents: they lack hand flying skills because they receive no practice in the simulator during the type rating.

    What should happen is that the first few sessions should be flown with all automatics switched off – and that means flight directors, auto-throttles, and autopilots all off. There should be no MAP mode flying at this point. The student is then taught how to fly by hand properly, and that includes high and low altitude stall recoveries, unusual attitude recoveries, limited panel flying (no primary ADI), crosswind takeoffs and landings in strong crosswinds, GPWS pull-up maneuvers, low level circling approaches flown by hand, raw data ILS/VOR/NDB approaches, and profile descents using only DME versus height – all raw data, by hand. This teaches them attitude flying rather than blind concentration on flight directors.

    Once the student has mastered the skill of basic jet handling and could pass an instrument rating test flown manually on raw data, then and only then should the automatics be introduced into the syllabus. Until that training is adopted in response to loss of control accidents, more such accidents are inevitable.

  • Basic airmanship skills will absolutely be necessary for a successful outcome during a significant incident. Recognition of when an aircraft being controlled by an automated flight control system is not responding appropriately will require aircrew training and discipline. Experience has taught many of us when a particular condition is likely to require superior airmanship skills (stick-rudder-throttle); therefore, discipline is required for us to have the appropriate situational awareness at the appropriate times. Perhaps increased time in an advanced simulator will assist in maintenance of airmanship skills as well as incident recognition skills.

  • Automation fails, period. I know because I repair highly automated machines where I work. No matter how many redundancies are built into a system, it can never replace a person’s will to survive and gut instincts. The computer responds to instructions and feedback from sensors to coordinate responses. When sensors conflict, or no instructions are valid, then what? Are you going to rely on some guy who’s sitting safely at his computer in an office somewhere writing code to control a plane your family is going to be flying in? And what about hackers? Terrorists gain access through a back door in the firewall code and it’s 9/11 on an unimaginable scale. Every plane becomes a guided missile with no bad guys on board and no warning. Think it isn’t possible?

    • Neil:
      There’s simply no comparison between PLC machinery and a well-designed control system. “Will to survive” is motivation – it’s a poor substitute for skill. Gut instinct is another description of “intuition,” which was addressed above.

      Sensors often deliver what turns out to be conflicting information. Good software/hardware accounts for this as SOP.

      The threat that hackers present to autonomous systems is no different from the threat presented to any modern aircraft control system. Autonomy brings no increased exposure or risk. In fact, by eliminating the wide-open doorway of pilot input, the risk to occupants is reduced. Remember that EgyptAir 767 event?

      You don’t need a machine to hijack a plane. Box cutters and committed perpetrators did the job quite effectively. But they would have been screwed if the aircraft had no cockpit.
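On the sensor point above – that conflicting sensor data is handled “as SOP” – the standard pattern is redundancy plus outlier rejection: take the median of several channels and discard anything that strays too far from it. A toy sketch; the rejection limit and the airspeed figures are made up for illustration:

```python
import statistics

def fuse(readings, limit=5.0):
    """Toy sensor cross-check: take the median of redundant channels
    and discard any channel that disagrees with it by more than limit."""
    m = statistics.median(readings)
    good = [r for r in readings if abs(r - m) <= limit]
    # Average the surviving channels; report how many were rejected.
    return sum(good) / len(good), len(readings) - len(good)

# Three airspeed channels; one pitot tube has iced over and reads low.
airspeed, rejected = fuse([252.0, 251.0, 80.0])   # keeps 252 and 251, rejects 80
```

With only two channels, of course, a miscompare tells you something is wrong but not which channel to believe – which is roughly the situation the AF 447 crew inherited.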

  • Edd:
    “But Tom, that is exactly what the AF autopilot did, bailed out…”
    The AF aircraft (Airbus) control system never was designed to be autonomous. Autonomous control systems may look like “autopilots,” but they are fundamentally different. Apples and oranges.

  • Andy:

    The goal is not to eliminate human designers (“machines that can design machines”). The goal is to enhance safety. Pilot error is the cause of 95% of accidents. Eliminate the human pilots (not the control system designers), and you have the potential to embody the most significant increase in vehicle safety in the history of the industry.

    Military UAVs (the majority of which are remotely-piloted, rather than autonomous) are designed to observe and to kill. And they explicitly are considered to be expendable. When your assignment is to design an autonomous control system where human (occupant) safety is the highest priority, the end result is radically different from what you design for (example) a Predator.

    Comparing a home PC with a sophisticated control system is fruit salad (apples, oranges, etc.). If the control system needs to know when your wife leaves the room, its design will include multiple sensors that can capture that knowledge, and cross-check the data to identify unreliable sensor information and to deal with that algorithmically.

    I say stuff like this, not because I don’t enjoy flying, but because (as Steve Phoenix pointed out above) for many missions, the objective simply is safe transportation. And because I’ve spent a good portion of my engineering career designing embedded control systems for “mission critical” applications. (As used here, mission critical means that if the control system fails, people die and/or multi-million-dollar equipment goes “poof.”)

    Autonomous aircraft are coming, whether pilots like it or not. Their arrival won’t mean that people cannot continue to fly aircraft “the old-fashioned way.” But contemporary aircraft control systems have no direct connection between the pilots and the control surfaces anyway. Any assertion that the pilot is in control is a fantasy. He is simply a decision-maker. The pilot is relying upon the machine to interpret his will, and to make the vehicle comply. If that ever fails to happen, it won’t matter if your captain is Sullenberger or Mickey Mouse.

  • All this reminds me of the young architect who has just graduated from the engineering college of a large university. He’s a first-class CAD (computer-assisted design) operator, but can’t design a project that works, nor meet a budget. Neither can most join two boards together with four nails and a hammer. I’ll bet that both Hoover and Yeager, having flown on both sides of the fence, would agree with the need for pretty firm stick and rudder skills. And quick, accurate, effective personal decision making, too…

  • Gentlemen,
    This is sort of a waste of an argument. Will there be an increase in automation in the future? Sure. As we strive to be more efficient, the computer can make adjustments and recommendations that can increase efficiency like we cannot.
    The argument that we must replace the human with a machine in order to reduce accident rates is silly. The discussion has been centered on the fact that 90-95% of accidents are caused by human error. The problem is we are looking through a soda straw. The fact is that flying a commercial airliner is safer than just about any other form of transportation. The chances of being killed on a Part 121 carrier (I assume these are US numbers, since Wikipedia said 121 carrier) are about 4 in every million flight hours! That equates to about 3.8 deaths from human error per million flight hours. You have a much better chance of being killed driving to the grocery store. So if it is lives you want to save, I suggest making cars autonomous.
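The arithmetic behind that comparison is just the claimed fatal rate scaled by the claimed human-error share (figures as asserted above, not independently verified here):

```python
fatal_rate = 4.0           # claimed deaths per million Part 121 flight hours
human_error_share = 0.95   # claimed share of accidents caused by pilot error

# Deaths per million flight hours attributable to human error.
deaths_from_human_error = fatal_rate * human_error_share
print(round(deaths_from_human_error, 1))   # 3.8
```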

    • It’s always interesting to watch as someone asserts that “flying already is safe enough.”
      121 ops aside, the biggest opportunity for autonomous aircraft is in Part 91 personal operations, where the requirement to obtain a pilot certificate is a huge barrier-to-entry.
      And Google and others already are preparing to introduce autonomous cars, which, yes, will be a tremendous life-saving advance.

      • Interesting. I don’t believe I ever said “flying already is safe enough.” I pointed out that your assertion and I will quote, ”The fact is, the automation almost never fails. The pilots do. Take the most failure-prone element out of the loop, and you’ll improve reliability and safety by a couple of orders of magnitude” is not true. You cannot increase safety by an order of magnitude when there are only 4 deaths in every million flight hours unless you play statistics within the soda straw.
        Another assertion you made is “Pilot error is the cause of 95% of accidents. Eliminate the human pilots (not the control system designers), and you have the potential to embody the most significant increase in vehicle safety in the history of the industry.” There have been a number of advancements since the beginning of flying. Aircraft accident investigation techniques in the late fifties and early sixties probably had more impact than anything else (but that is another argument), because they allowed designers to correct design flaws that went unnoticed.
        (see chart on page 17 of following webpage)
        http://www.boeing.com/news/techissues/pdf/statsum.pdf
        The British Comet and the Lockheed Electra were both plagued by design flaws that took a new approach to accident investigation to unlock. That new approach led to the foundation of modern accident investigation. Now we have new and better ways to find pilot error. Before the evolution of HFACS, accident investigators stopped at “pilot error”. Now we delve into the culture, supervision and other factors that led to the “pilot error”. Sometimes we blame the pilot when really he was in a situation he wasn’t supposed to be in (American Airlines 587). But I digress…
        I cited 121 ops because I thought that was the scope of the discussion, per your quote “The 600-pound gorilla in this discussion room is this: in modern transport-category aircraft (and increasingly, in business jets), …” If we now wish to discuss Part 91, that is a different story.

        • Andy:

          As time has gone by, we’ve identified and largely eliminated the mechanical and procedural issues you cited. That leaves fallible humans holding a larger and larger share of a thankfully smaller bag of accident causes. Fatigue, failure to adhere to rules/procedures, and just plain dumb decisions are elements of danger that will be well-addressed by machines that always follow rules and don’t try to get clever by going off-script.

          Part 91 certainly is the low-hanging fruit, and it’s where we will see at least an order of magnitude in safety benefits. That’s principally what I have been talking about. My comment about modern aircraft control systems and the gorilla goes to the (widely unacknowledged herein) existence of fly-by-wire systems in which there is absolutely NO direct connection between cockpit flight controls and airframe control surfaces. The machine already is flying such vehicles, in which human pilots are in-loop decision-makers. Consequently, control-system designers have been tasked with providing information to the pilots; hoping that they understand what is presented; awaiting instructions from the pilots; and hoping that the pilot’s instructions are not ambiguous, misinterpreted, or unwise.

          American 587 is a great example of crappy software design, and that assessment is independent of the presence of a human in the control loop. It’s absolutely unconscionable that any firm would deliver control software that would permit the vehicle to disassemble itself in flight – with or without the help of a human pilot. But don’t get me started on French software…

          A huge benefit of autonomous aircraft for Part 91 operations is one that’s been resisted by many in this space – enabling non-pilots to participate in GA, and thus vastly expanding that extremely limited market segment. As was stated above, it’s not flying – it’s transportation. But that’s not a threat to flying, any more than ocean-going factory ships are a threat to sport fishing. Some people like to fish, but a lot of people just like to eat fish. A lot less fish would be consumed if every seafood meal had to be caught by the diners. Automatic transmissions sold a lot of cars just because they lowered a barrier-to-entry. Autonomous airplanes could do the same thing for GA. But that wouldn’t stop “real pilots” from practicing their craft.

          • Tom,

            I read the references you provided. Thx.

            I’m aware you are involved in “embedded control systems”.

            I have over 30 years experience with “embedded control systems” operating in critical medical life-support systems, now retired.

            In all honesty, and with respect to your optimistic thinking, I doubt we will get to where we will have ‘autonomous’ aircraft carrying the general public in the next ~50 years.

            You’re probably aware that life-support medical devices need to be approved by the FDA before marketing, somewhat akin to FAA requirements. I know what is involved in this process. The FDA has been reviewing such devices since 1978. They have learned how to ask the right questions and how to prove out the code for the needed functions.

            I’m not sure where the FAA is with regard to analyzing ‘autonomous’ software/hardware systems. I will suggest, however, they are far behind the FDA.

            I’m reasonably certain that if I proposed an “embedded control system” of the complexity needed to provide for the safe transportation of the general public from point-to-point in reasonable weather conditions, etc. to the FAA, there would have to be a huge investment in time and monetary resources from both sides before an approval would be granted.

            The development side and the regulatory side will evolve over time, but will not happen in our lifetimes.

  • Computers will eventually replace pilots. I have been a Professional Flight Engineer for many years and have of course seen my position replaced by a computer. Airline travel as many have indicated is a very safe mode of transport but where pilots are more in control, ie General Aviation, accidents are on average 20 to 30 per month with some fatal. I know driving was mentioned but that accident rate averages out to about a 747 pax load every three days just in the USA alone. I always told everyone that the most dangerous part of my flying job was driving to the airport. I know people will say that I would not fly without a pilot but reduce the airfare and they will. Look what has happened to cabin service, bag charges etc and they still fly.

    • I was recently involved with training Chinese pilots to be first officers on the B737. They had 250 hours total time and told me that they had no interest in flying but were literally press-ganged by their “authorities” into being pilots. A group of them studying at University were given aircrew medicals, and those that passed were told some would be military pilots and the other part of the group would be airline pilots. During their 737 simulator training they learned by rote which buttons to push on the automatics. They all had great trouble learning to fly by hand on raw data. Asked what action they would take as second in command if the captain persisted in a seriously unstable approach, they all replied along the same lines: that was to advise the captain if he was too high or too low – but they would never take control from the captain no matter how dire the situation. This is 2000 years of culture at work, where seniority is revered above logical common sense.
      Because of the blind emphasis on automation at these operators, the problem occurs when a copilot knows he should take control if the situation is dire, but lacks the skill to fall back on hand flying.

  • I wonder what would have taken place if the plane that hit the birds near the Hudson River was on autopilot. I bet it would not have chosen to land on the river.

    • The plane’s existing autopilot is not empowered to “choose” to land anywhere. An autonomous aircraft control system absolutely would have chosen the Hudson River, because it is the only unobstructed flat terrain within gliding distance of the point in space (and the vehicle energy level) at which the loss-of-thrust event occurred. And one thread of its software continuously determines the optimum selection of a landing location, taking into account thousands of variables – and it does it all at a rate of hundreds of times each second. That’s a whole lotta “what-ifs.”
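
        The continuous landing-site evaluation described above can be sketched, in grossly simplified form, as a loop that ranks reachable sites by glide margin. This is a toy illustration only; the glide ratio, coordinates, and site list are made-up assumptions, not data from any real control system:

```python
import math

# Hypothetical sketch of continuous landing-site ranking.
# GLIDE_RATIO and all site data below are illustrative assumptions.
GLIDE_RATIO = 17.0  # approximate engines-out glide ratio for an airliner

def glide_range_m(altitude_agl_m):
    """Still-air distance the aircraft can cover from this altitude."""
    return altitude_agl_m * GLIDE_RATIO

def rank_landing_sites(position, altitude_agl_m, sites):
    """Return reachable sites sorted best-first by remaining glide margin."""
    reachable = []
    for site in sites:
        dist = math.dist(position, site["xy_m"])
        margin = glide_range_m(altitude_agl_m) - dist
        if margin > 0:  # discard sites beyond glide range
            reachable.append((margin, site["name"]))
    return sorted(reachable, reverse=True)

sites = [
    {"name": "LaGuardia Rwy 13", "xy_m": (9000.0, 4000.0)},
    {"name": "Teterboro Rwy 19", "xy_m": (-11000.0, 6000.0)},
    {"name": "Hudson River",     "xy_m": (2000.0, -1000.0)},
]
# From 850 m AGL over midtown, the Hudson ranks first on margin.
print(rank_landing_sites((0.0, 0.0), 850.0, sites))
```

        A real system would re-run this ranking hundreds of times a second against a terrain database, with wind, energy state, and obstructions folded in; the point is only that "choosing the river" is an ordinary ranking computation, not a leap of intuition.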

      • One word: Bull. Assuming the autonomous aircraft control system knows there is a river down there (since currently they don’t “see”), and determines that it is the only option (which was true), then it splashes down in the river. What if that day there had been a ferry right there? Sully can see it and make control inputs to correct the aircraft’s glideslope and path. Although the autonomous aircraft control system can correct glideslope and path, it has to know the ferry is there to do so. Result: man saves ferry and maybe passengers. Machine kills them both.
        The human machine is not perfect, but it does take in literally thousands of pieces of information and can correlate the data to produce a decision. Little things like the ability to see are huge. Why don’t we have a car that drives down the interstate? The sensors and programming required are huge investments, that’s why. I have no doubt we could build one, eventually, but the cost would be staggering. That is just to replicate the ability to see. Programming the computer to think, well, that is a whole other issue.
        Bottom line: Just because we can build a machine to manage the aircraft systems does not mean we can “teach” it to think. When you have let me know. Then we’ll have a discussion about decision making and philosophy.

        • Andy:

          If you substitute “runway” for “river,” you’ll see how silly your first statement is. Aircraft control systems that don’t “see” runways (or anything else, for that matter) land airplanes on runways, in zero-zero conditions – hundreds of times every day. That technology has been in commercial use for the last 40 years. Last time I checked, these “blind” systems don’t “think,” either. And yet the FAA allows them to do what they prohibit humans from doing – land the vehicle in zero-zero conditions. Somewhere here, there’s a lesson.
          A machine doesn’t need to “see” in order to know its position relative to a geographic feature. Terrain databases include information about enduring features (mountains, rivers, buildings, etc.). LIDAR is a common sensing technology that detects features that are not mapped (deer on runways, ferries in rivers, etc.). Google already has cars that drive along highways (and even on city streets) autonomously. And machines don’t have to “think” in order to fly an airplane.
          Nine out of ten accidents happen because the pilot screws up. Autonomous aircraft control systems will be the biggest safety improvement in the history of aviation.

          • Normally, I would let this go, but I can’t let you spew falsehoods.
            First. The control systems can land on a hyper-surveyed piece of concrete in zero-zero conditions under VERY controlled conditions. The runway doesn’t move, have waves, or have other traffic to deal with. When you have designed a control system that can ditch a plane while reading the waves, wind and surface traffic, let me know. As one who has trained Naval aviators to do just that, I can say it is an incredibly difficult task. Doing it with an airliner in New York, where EVERYONE walked away – that was a miracle. Believing you can “substitute river for runway” – that is truly naive.
            Second. This technology has NOT been around for the last forty years. Forty years ago the most advanced technological commercial aircraft was the Lockheed Tristar. I think everyone will agree it could not land itself!
            Third. I never said the machine had to think to fly an airplane, but anyone who has flown one will tell you it sure does help! I can teach a monkey to fly, but teaching it to come up with a plan when things go wrong is beyond its ability.
            Fourth. As I have already discussed, “the biggest safety improvement in aviation history” was probably advanced accident investigation techniques that allowed designers to correct design flaws that had previously gone unnoticed. Even if you could make it so a plane NEVER crashed again, it would only reduce the accident rate from 4 deaths in every million flight hours to 0 (again based on 121 ops). We have come a long way to get it to 4/1,000,000.
            Autonomous aircraft can “fly” (and I use the word loosely) when things are working fine. It’s when things go wrong that you need someone to figure out a new plan. You cannot program every possibility into the computer. The day you do, you’ll find something else you didn’t think of; examples include United 232, American 587, US Air 1549, and there are dozens more.
            Do your best. Make it safer. PLEASE. But never get arrogant enough to believe you can control what happens in the wide blue yonder.

  • What a load of horse hockey! Fly the damned plane or get back in your Prius! Machines can do a lot – and a lot of it is nearly magical – but don’t ever make the mistake of assuming (with your life and your passengers’ lives) that you can recline in the cabin without a care. Until quantum computers are as common as bugs on wings, and somebody besides Microsoft has written the software, I think I’ll just keep doing my job as PIC.

  • Andy:

    You may find this Wikipedia article to be of some interest. It’s narrow with regard to guidance technology, but it’s an easy read. http://en.wikipedia.org/wiki/Autoland

    All of the training that you have done with your naval aviators was designed to train HUMANS. When you’re designing a control system to do something without supervisory human control, you approach the problem from an entirely different perspective.

    “Teaching” a machine what to do in the event of an unscheduled event (“emergency”) is just part and parcel of good system design. As for figuring out how to “program every possibility into a computer,” I can say this: you can program for a vast number of conditions AND you can craft code that will account for every mathematically possible combination of such conditions. Done correctly, such a machine will outperform its human counterpart, consistently. That includes the four examples you’ve referenced here.

    You conflate arrogance and ignorance. It’s true that the combination is deadly – especially in an aircraft. But this international-award-winning design engineer has more than three decades of experience teaching people how to fly AND designing mission-critical control systems. While you certainly have the right to consider me to be ignorant of what it actually would take to design an autonomous aircraft control system, at least my ignorance has the benefit of actual applicable experience. I suppose that it’s tempting to conflate considered confidence with arrogance; that’s your privilege.

    Not every pilot is a Sullenberger. And Sully had the benefit of VFR conditions. If it had been a foggy day, or if it had been midnight, the outcome likely would have been very different. Here’s where a machine’s lack of dependence upon anthropomorphic “vision” is a big plus: its performance need not be degraded by a lack of visual conditions. It’s an unfair advantage – the palette of sensors that can be employed by a machine exceeds humans’ biology – but I’ll certainly take advantage of that disparity to achieve an increase in safety.

    One of the neatest things about software-based systems is that software can be improved – we can keep adding to the machine-based store of knowledge. That means that, over time, the machines become “smarter” and “more experienced.” If only human pilots would do the same with any consistency. The accident record shows that they don’t. The reasons that pilots kill themselves today are largely unchanged from the reasons they employed a half-century ago. Different humans; same screw-ups. That should offer a lesson.

    In another thread on this site, the readership discussed the safety record of Cirrus SR-2x pilots. No one can argue that, as a group, those pilots are not acquiring lots of experience in weather flying. One germane issue is whether or not they’re learning anything as a result of their experiences. See my paragraph immediately above; ibid.

    I’ve made it very plain in this thread that I consider general aviation to be the low-hanging-fruit market for autonomous aircraft; not the airlines. In that realm, 90%+ of accidents are caused by the least-reliable component in the vehicle – its pilot. And I certainly remember that you’ve pointed out that it wouldn’t be worth the bother to introduce autonomous control systems into airline flying. It’s a fascinating perspective, that’s for sure.

    If I had ten bucks for every person who’s told me “machines will never be able to [fill in the blank],” and who ultimately was proven to be totally wrong about that, I’d be able to afford one of those nifty little single-engine mini-jets that’s been argued about in yet another thread here.

    Examples abound. Anti-lock brakes, stability-augmentation systems, CNC machine tools, active flight-control systems. The genius of the human is not to wield the machine’s levers. The genius of the human is to invent the machine. All the better if the machine can wield its own levers, thus freeing up the genius to go do something else that machines cannot do… yet.

    • Tom,

      Read the article. I will grant you the technology has existed for 40+ years, but you said “that the technology has been in commercial use for the last 40 years.” It has not been in commercial use for the last 40 years.
      Interesting, even your article lays out an accident caused by a failure in the system during autoland. Of course, the pilots were blamed for not executing a missed approach. Humans are supposed to know better.
            The problem is that you can only program the system with what you know. It cannot correlate things it knows into things it doesn’t, like humans can. I’ll use my wind shear training as an example. I used to give initial students a wind shear during landing scenario. The training was designed to teach the aviator to recognize wind shear and then recover according to the procedures. When the aviator would come back for refresher training, I’d give the exact same scenario, except this time I’d use the “Extreme Windshear” setting instead of moderate. The Extreme setting represented what a microburst can do to an aircraft. Given under the same conditions as the first one, every aviator crashed, because they did the procedure as outlined in the manual. At which point I would ask them, “When did you know you were going to die?” They answered at about “500 feet AGL.” To which I’d ask them, “Then why didn’t you do something different?” They all replied, “Because that was what the manual said. Besides, nobody can recover from that situation.” At which point I put them in charge of the sim and I got into the pilot seat. I told them to execute “Extreme Windshear” at anything above 800 feet (the settings I had just used on them). Of course they all gave it to me at exactly 800 feet, but I had to recognize windshear, declare I was in wind shear, and then do the procedures (actually slowing my recovery). I would execute the procedures, and then at about 500 feet, when it was obvious that the plane was going to crash if I didn’t do something, I’d say, “We’re going to die if I don’t do something.” Then I overtemped all the engines and pulled the nose up as high as I could, in order to fly right at the edge of stall. In a microburst situation, time is everything. Every second you are still flying is another second towards escaping it. The aircraft bucked, but I never lost it, and eventually it started gaining altitude on the other side of the microburst.
            At this point every aviator would cry foul. “You overtemped the engines!” they would say. To which I replied, “Yes, but maintenance can inspect them tomorrow, since I landed the airplane. In your case, the accident board would have made another notch in the ‘Human Error’ statistics.” Here is an instance where “thinking” saves the aircraft. An automatic control system would have crashed along with my aviators. Sometimes the procedures don’t work. Every year procedures are updated based on some new experience. The automatic system can be updated as well, but nowhere in the system programming does it say that the manuals “provide the best available operating instructions for most circumstances, but no manual is a substitute for sound judgment. Compound emergencies, available facilities, adverse weather or terrain, or considerations affecting the lives and property of others may require modification of the procedures contained herein. Read this manual from cover to cover. It is the air crewman’s responsibility to have a complete knowledge of its contents.” That is a quote that is in the front of every airframe manual the US Navy operates (the NATOPS manual). Where are they going to program “sound judgement”?
      There will ALWAYS be something new that we haven’t programmed the box to handle. Without someone there that has the experience to correlate known into unknown (in other words think), bad things happen.
      In my opinion, Cirrus has the problem of pilots believing the aircraft will save them. Therefore they push into a condition that they themselves cannot handle, and then “push the button” for the parachute when it all goes wrong. Of course the aircraft is not designed to save them in EVERY situation.
      There are a number of problems with adding it to GA. #1 is the cost. Have you priced a new GA aircraft today? A Cessna 172 is about $400k. How much would an autonomous system add to that? #2 is that in GA most people don’t do it for transportation. They do it for fun. It isn’t much fun to go to the airport, park your car, file a flight plan, load up the aircraft, push a button and go to sleep. Besides, the original discussion was started in light of the Air France accident and whether we need more stick and rudder skills or new training to deal with the automation. To which the answer is BOTH.
      By the way, I never called you ignorant or arrogant. The point at which you call someone names just confirms you have nothing more to say.
      Good luck with your designs. Maybe I’ll see them in my airplane before I retire.

      • Andy:

        “An automatic control system would have crashed along with my aviators.” Not a good one.
        Clearly, this space is a bad venue for doing “Embedded Control 101;” I won’t even try. But I will say that the windshear procedure that you demonstrated to your charges is EXACTLY what one of “my” control systems would be programmed to do. That’s no coincidence. But its response wouldn’t be a “windshear routine.” It would be emblematic of its “fly the airplane” routine.
        To a software guy like me, this stuff really isn’t all that tricky. What a pilot might see as 1,000 unique situations really is just a bunch of variations (permutations, really) on a surprisingly limited hierarchy of operations.
        Apart from a vast list of “housekeeping” chores, what do you want the vehicle to do? Assuming that you’ve told it where you want to go, the vehicle has a Maslow’s hierarchy, of sorts: don’t collide with other vehicles; don’t collide with terrain; don’t run out of fuel; don’t penetrate (defined) weather; maintain awareness of knowable external factors (your destination airport has closed, etc.); adhere to operating limitations, except where exceeding them is required in order to prevent loss of the vehicle (your windshear example, and not saving the engines for the accident investigators). Feel free to add “don’t get shot down” to the list of objectives!
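
        The “Maslow’s hierarchy” of vehicle objectives described above could be sketched as a prioritized rule list, where the highest-priority violated constraint dictates the response. The objective names follow the comment; the priorities, sensor fields, and thresholds are purely illustrative assumptions:

```python
# Toy sketch of a prioritized objective hierarchy for an autonomous vehicle.
# Field names and numeric thresholds are invented for illustration.

OBJECTIVES = [
    # (priority, name, predicate that returns True when the objective is violated)
    (1, "avoid traffic",  lambda s: s["traffic_conflict"]),
    (2, "avoid terrain",  lambda s: s["terrain_closure_s"] < 30),
    (3, "manage fuel",    lambda s: s["fuel_min"] < s["min_to_alternate"]),
    (4, "avoid weather",  lambda s: s["wx_penetration"]),
    (5, "respect limits", lambda s: s["exceeding_limits"]),
]

def highest_priority_violation(state):
    """Return the most urgent violated objective, or None if all are satisfied."""
    for priority, name, violated in sorted(OBJECTIVES):
        if violated(state):
            return name
    return None

state = {
    "traffic_conflict": False,
    "terrain_closure_s": 22,      # seconds to terrain impact on current path
    "fuel_min": 45,
    "min_to_alternate": 30,
    "wx_penetration": False,
    "exceeding_limits": True,     # e.g. overtemp accepted to save the vehicle
}
print(highest_priority_violation(state))  # terrain outranks the limit exceedance
```

        Note how the windshear example falls out naturally: “respect limits” sits below “avoid terrain” in the hierarchy, so overtemping the engines to stay out of the ground is the machine’s choice too.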
        While it’s more detailed than that, proper code nonetheless is objective-oriented (that’s not the same thing as object-oriented code) and hierarchic. Actions are based on deviations from desired/required performance. The selection and amplitude of corrective efforts are based on rate-of-deviation and upon projected consequences of current action (or inaction, as it may be). Background routines continuously and skeptically monitor and evaluate the reliability of sensor data. “Knowledge” always is derived data, based upon information received from multiple sensors – not just multiple examples of one type of sensor, but different types altogether. Example: “airspeed” can be derived from multiple air data sensors, but it should include confirming/refuting information from position-sensing apparatus (GPS, inertial, etc.) and active processing that subjects the gathered data to “reality checks” in real time.
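
        That airspeed “reality check” might be sketched as a cross-check between the pitot-derived value and a GPS-derived estimate. The function name, the crude one-dimensional wind model, and the 30-knot tolerance are all assumptions for illustration, not any certified algorithm:

```python
def fused_airspeed(pitot_kts, gps_groundspeed_kts, wind_estimate_kts,
                   tolerance_kts=30.0):
    """
    Cross-check pitot airspeed against a GPS-derived estimate.
    Returns (value, trusted): trusted is False when the two sources
    disagree beyond tolerance -- e.g. iced-over pitot tubes, as on AF447.
    """
    # Crude 1-D model: true airspeed ~ groundspeed minus tailwind component.
    gps_airspeed = gps_groundspeed_kts - wind_estimate_kts
    if abs(pitot_kts - gps_airspeed) <= tolerance_kts:
        return pitot_kts, True
    # Disagreement: fall back to the derived value and flag distrust.
    return gps_airspeed, False

# Healthy case: the sensors agree, so the pitot value is used.
print(fused_airspeed(250.0, 255.0, 10.0))   # (250.0, True)
# Iced pitot: indicated airspeed collapses while groundspeed does not.
print(fused_airspeed(60.0, 255.0, 10.0))    # (245.0, False)
```

        The design choice worth noticing is that the fallback value comes from a physically independent sensor type, so a common-mode air-data failure cannot silently poison the derived “knowledge.”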
        Situational awareness combined with the ability to project outcomes and rank alternatives is what comprises “good judgment.” A well-designed machine would exhibit superior judgment.
        And one really nice thing about such an autonomous system is that a vehicle occupied by the rankest amateur would display the same good judgment as one occupied by the most experienced, talented, and wise professional.
        You raised cost as an objection. Moore’s Law, higher volumes, greater paybacks – all will reduce the cost of autonomous control systems. The biggest constraint to GA’s health is not cost – it’s lack of participation (low volumes). You may say the lack of participation is caused by high costs. I’d argue the inverse. I agree that the majority of TODAY’s pilots fly for reasons other than transportation. That’s my point. We need to enable the hundreds of thousands of potential GA customers who would join the ranks in pursuit of safe, reliable, on-demand transportation – if we removed the most significant barrier-to-entry: the requirement to obtain a Pilot’s Certificate.

  • Andy:
    Indeed, we do. http://www.avweb.com/avwebflash/news/pc12_crash_flight_break_crash_six_children_family_206798-1.html
    One difficulty in using military “drones” as an example of autonomous aircraft, is that they are designed with accepted expendability. Civilian occupants and a non-theatre environment preclude that deliberate option.
    The recent introduction of “straight-and-level” buttons – and full-time, “always on” operation of that functionality – is a shot over the bow. The biggest obstacle I see to widespread use of autonomous technology in GA is ego-driven resistance from pilots. I suspect that many of them fear that guys like me want to take away their rights to fly an airplane “the old-fashioned way.”
    I’m more like the guy who invents seat belts, but opposes laws that require you to wear one.
    Meanwhile, the equipment manufacturers may see a disincentive, in the form of not wanting to offend the sensibilities of many of their existing customers who may be in the “no autonomous airplanes, ever!” group.
    Cirrus could go a long way toward pushing Garmin off the dime, by abandoning their complex and expensive efforts to create a parachute-based whole-aircraft recovery system for their SF-50 jet; and by giving Garmin an RFP for a control system that, once “told” to land, has the autonomy and the capability of doing so. If Garmin built that (properly), it would be little more than a formality to turn it into an engine-start-to-shutdown autonomous control system.
    But that’s always one frustration for engineers like me – just because we’re capable of doing something, doesn’t mean that we will do it – or even that we should do it. (Perceived) demand is what drives management to authorize product development programs. My advocacy (diatribe?) makes two points:
    1. “If you build it, they (customers) will come.”
    2. It can be built.
    Opponents’ leading arguments are:
    1. It can’t be built.
    2. “Over my dead body.”
    Exactly.
    Andy, I hope that we both live long enough to see widespread use of autonomous aircraft in GA. /Spock

  • Why does everyone blatantly shy away from the money argument?
    In our modern world and especially in aviation, see airline, everything and I mean everything, is offered up on the altar of “profit”!
    If I as the owner/CEO/COO/CFO can bring down and deride and de-elevate the position and dream and aspiration and love of the position of the airline pilot, I can pay him less money and have it rather for myself or the shareholders (all 5 or 1500?)! I assuage my conscience by verbalising: “If you want money, go open your own business”. Or the favourite “But you fly for the love of it, hence I don’t need to pay you a proper salary”. Or finally, “It’s all about supply and demand. If there are too many people wanting to be pilots I can therefore pay you less!”
    Let me assure you, if my children ever mention wanting to be a pilot, I will lambaste them – which is in itself utterly rare – because they will not follow an occupation which lends itself so easily to utter abuse by the operator, so that he/they can have more at your cost, the pilot!

  • Tom Yarsley, what do you do for a living? What have you done in the past for a living?
    What is your take on the Sioux City crash, where the highly intelligent people with thick glasses and starchy white coats assured all that you CANNOT lose all 3 hydraulic systems on an airliner!!
    What would the intelligent computer have done in that case?
    Very very curious to hear……
